While the possibilities for an interesting Virtual Reality project are endless, only five weeks remain in this quarter. Finding a project that is both interesting and feasible in that time is the greater challenge. One possibility that I believe meets both criteria is a tool that presents search results as a navigable, three-dimensional world. One example of such a tool--albeit a complex one--is "Cat-a-Cone," a medical information retrieval tool developed by Xerox PARC and the Stanford University School of Medicine, Information Technologies (medIT).
Conventional web search engines such as Alta-Vista and go.com present their search results in a single dimension: ranked from top to bottom by an internal matching score. Yahoo provides an additional dimension by grouping search results into categories. Clearly, though, there are many more dimensions in which search results could be presented and compared. These include the location of a document's host, both geographic and within the Internet domain hierarchy; the size of a document; rankings of subcomponents of a query; rankings from competing search engines; links; title text; author name; and so on.
With this in mind, I propose a group project to brainstorm, design and develop a metasearch interface--that is, a front-end interface to one or more of the existing search engines on the web. This interface will accept queries from a web user and present the results of each query as an interactive, virtual world. I further propose that the intelligence behind this interface be distributed so that, while the server hosting the interface continues to collect and assimilate search results, the user can begin navigating the early results. Subsequent results will then appear in the client's world as the server makes them available.
Establish an understanding of the issues surrounding distributed virtual worlds
Create a useful and impressive system on the web
Generally increase awareness about VRML and virtual reality technology
The technologies that I believe will come into play are:
Virtual Reality Modeling Language (VRML)
Java on both client and server (possibly J++ on the server)
Active Server Pages and COM
The following processes will also require particular focus:
Parsing--or otherwise obtaining--search results from the web search engine(s)
Multithreaded queue access: query results are queued and dequeued independently.
Quality test planning and execution
Aesthetic and behavioral design for the virtual world
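The multithreaded queue access named above can be sketched in Java. This is a minimal, hypothetical sketch--the class and method names are my own, not part of any existing design--showing how a collection thread and a serving thread might share results safely using wait/notify:

```java
// ResultQueue: a hypothetical thread-safe queue sketch for search results.
// The collection thread enqueues results as it parses them; the serving
// thread dequeues them independently, blocking until one is available.
import java.util.LinkedList;

public class ResultQueue {
    private final LinkedList results = new LinkedList();
    private boolean closed = false;          // set when collection has finished

    // Called by the collection thread as results arrive.
    public synchronized void enqueue(Object result) {
        results.addLast(result);
        notifyAll();                         // wake any waiting dequeuer
    }

    // Called by the collection thread when no more results will arrive.
    public synchronized void close() {
        closed = true;
        notifyAll();
    }

    // Called by the serving thread; blocks until a result is available.
    // Returns null to signal end-of-stream.
    public synchronized Object dequeue() throws InterruptedException {
        while (results.isEmpty() && !closed) {
            wait();                          // release lock until notified
        }
        return results.isEmpty() ? null : results.removeFirst();
    }
}
```

A null return doubles here as the end-of-stream signal mentioned later; a real design might prefer a distinct sentinel object.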
The typical search process works as follows:
While a search engine typically executes the user's query against its own catalogs, a metasearch agent delegates the search to one or more other agents for execution. It then collects, evaluates and assimilates the results from those agents. The diagram below outlines an example of a metasearch agent extending the basic search process by delegating query execution to Alta-Vista.
In the web environment, then, the goal of the metasearch agent is to ensure execution of the queries in its charge, and to report the results back to its client. In the diagram above, the metasearch agent submits a query and collects the results. As part of the query submission it may specify a starting index number; in the context of Alta-Vista, this would be the index of the first item--of up to ten--that will appear on the "results" page.
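To make the starting-index mechanism concrete, here is a hypothetical sketch of how the agent might form the URL it submits. The host and parameter names ("q", "start") are placeholders of my own, not any engine's actual CGI interface:

```java
// QueryBuilder: a hypothetical sketch of query-URL construction for the
// metasearch agent. Parameter names are assumptions, not a real engine's API.
import java.net.URLEncoder;

public class QueryBuilder {
    public static String buildQuery(String engineBase, String terms, int startIndex) {
        // URL-encode the user's terms so spaces and punctuation survive transit.
        String encoded = URLEncoder.encode(terms);
        // startIndex is the index of the first result wanted on the page; the
        // agent would advance it by the page size (e.g. 10) on each request.
        return engineBase + "?q=" + encoded + "&start=" + startIndex;
    }
}
```

The agent's collection thread would repeatedly fetch such URLs, bumping the start index, until the engine reports no further results.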
As noted above, the advantage that virtual reality brings to the presentation of search results is added dimensionality. The three-dimensional position of a search result node is only part of this enhancement; other VR features such as color, shape and animation behavior effectively allow expression of more than three dimensions.
The diagram below illustrates extension of the basic search process using VRML elements. The search results world here is made up of two main VRML elements: a group of search results and a script node that adds results to the group as they become available. This model represents each search result as an anchor node.
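Since the server (or the script node) will ultimately have to emit VRML for each result, a sketch of that generation step may help. This is a hypothetical illustration--the class name and the rank-to-position mapping are my own--producing a VRML 2.0 Anchor wrapped in a Transform:

```java
// VrmlResult: a hypothetical sketch of rendering one search result as a
// VRML 2.0 Anchor node. Clicking the child shape navigates to the result's
// URL; the Transform positions the result in the world by its rank.
public class VrmlResult {
    public static String toAnchor(String url, String title, int rank) {
        // Space results two units apart along the z axis by rank; a richer
        // world would map more result attributes onto position, color, etc.
        return "Transform {\n"
             + "  translation 0 0 " + (-2 * rank) + "\n"
             + "  children Anchor {\n"
             + "    url \"" + url + "\"\n"
             + "    description \"" + title + "\"\n"
             + "    children Shape { geometry Box {} }\n"
             + "  }\n"
             + "}\n";
    }
}
```

The script node would add each such fragment to the results group as the server delivers it.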
Assuming that the metasearch agent is collecting search results, and the script node on the client is presenting them, all that remains is to convey the search results from one to the other. The diagram below shows a client/server extension of the basic search use case, "Obtain Next Result from Collection." In this model, the VRML script node is the client, which repeatedly requests the "next" search result from the server until an end-of-stream is signalled. The metasearch agent is the server, which waits for client requests. Upon receiving a client request, the agent dequeues (or waits for) the next search result queued by its collection thread. It then marshals that result--or an end-of-stream signal--for transmission back to the client.
An asynchronous adaptation of this model would do away with the client request and continually attempt to transmit results to the client as they became available. While this would offer performance benefits, it might create enough of a management problem that it would not be a worthwhile feature for this quarter.
Creation of a metasearch interface using virtual reality appears to be both an interesting process and a goal that--with some effort--can be achieved this quarter. The big challenges here are interacting with one or more search engines, packaging the results for transmission from server to client, and presenting these results as elements in a virtual world. And all of this is predicated on the process of dreaming up what this world will actually look like and how it will behave. I welcome anyone interested in exploring these questions to join me in this effort.
10/18/99 - Revision 1.2 - Added "High Level Project Plan"
10/16/99 - Revision 1.1 - Miscellaneous Corrections to Original Draft
10/14/99 - First Draft
Copyright © 1999 - Scott Parent - All Rights Reserved