Aspen Movie Map
The Aspen Movie Map was a revolutionary hypermedia system developed at MIT in 1978 by a team working with Andrew Lippman, with funding from ARPA.
Features
The Aspen Movie Map enabled the user to take a virtual tour through the city of Aspen, Colorado (that is, a form of surrogate travel). It is an early example of a hypermedia system.
A gyroscopic stabilizer with four 16mm stop-frame film cameras was mounted on top of a car, with an encoder that triggered the cameras every ten feet. The distance was measured by an optical sensor attached to the hub of a bicycle wheel dragged behind the vehicle. The cameras were mounted so as to capture front, back, and side views as the car made its way through the city. Filming took place daily between 10 a.m. and 2 p.m. to minimize lighting discrepancies. The car was carefully driven down the center of every street in Aspen to enable registered match cuts.
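A minimal sketch of the distance-triggered capture logic described above; the pulses-per-revolution and wheel-circumference values are illustrative assumptions, not documented figures.

```python
# Sketch: fire the four stop-frame cameras every ten feet of travel, with
# distance derived from an optical encoder on the trailing bicycle wheel.
# PULSES_PER_REV and WHEEL_CIRCUMFERENCE_FT are assumed values for illustration.

PULSES_PER_REV = 64            # encoder pulses per wheel revolution (assumed)
WHEEL_CIRCUMFERENCE_FT = 6.86  # roughly a 26-inch bicycle wheel (assumed)
CAPTURE_INTERVAL_FT = 10.0     # one exposure every ten feet (from the article)

def captures_due(pulse_count: int) -> int:
    """Number of ten-foot capture points passed after pulse_count pulses."""
    distance_ft = pulse_count * WHEEL_CIRCUMFERENCE_FT / PULSES_PER_REV
    return int(distance_ft // CAPTURE_INTERVAL_FT)

captured = 0
def on_encoder_pulse(total_pulses: int, trigger_cameras) -> None:
    """Call trigger_cameras() once for each new ten-foot boundary crossed."""
    global captured
    while captured < captures_due(total_pulses):
        trigger_cameras()      # exposes the front, back, left, and right cameras
        captured += 1
```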
The film was assembled into a collection of discontinuous scenes (one segment per view per city block) and then transferred to laserdisc, the analog-video precursor to modern digital optical disc storage technologies such as DVDs. A database was built that correlated the layout of the video on the disc with the two-dimensional street plan. Thus linked, the user was able to choose an arbitrary path through the city, the only restrictions being the need to stay in the center of the street, to move ten feet between steps, and to view the street from one of the four orthogonal views.
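A minimal sketch of how such a correlation database might be organized; the segment identifiers, frame numbers, and function names below are hypothetical, chosen only to illustrate mapping a street-plan position and view direction to a span of frames on the disc.

```python
# Hypothetical correlation database: each (street segment, view direction)
# maps to a contiguous run of laserdisc frames captured every ten feet.
# Segment IDs and frame numbers are invented for illustration.

DISC_INDEX = {
    # (segment_id, view): (first_frame, last_frame)
    ("main_st_100", "front"): (10_000, 10_052),
    ("main_st_100", "back"):  (10_053, 10_105),
    ("main_st_100", "left"):  (10_106, 10_158),
    ("main_st_100", "right"): (10_159, 10_211),
}

STEP_FT = 10  # one frame was exposed every ten feet of travel

def frame_for(segment_id: str, view: str, feet_along_segment: float) -> int:
    """Return the disc frame showing `view` at a point along a street segment."""
    first, last = DISC_INDEX[(segment_id, view)]
    step = int(feet_along_segment // STEP_FT)
    return min(first + step, last)   # clamp to the end of the block

# Example: the frame looking forward, 40 feet into the hypothetical segment.
print(frame_for("main_st_100", "front", 40))   # -> 10004
```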
The interaction was controlled through a dynamically generated menu overlaid on top of the video image: speed and viewing angle were modified by selecting the appropriate icon through a touch-screen interface, a harbinger of the ubiquitous interactive-video kiosk. Commands were sent from the client process handling the user input and overlay graphics to a server that accessed the database and controlled the laserdisc players. Another interface feature was the ability to touch any building in the current field of view and, in a manner similar to the ISMAP feature of web browsers, jump to a façade of that building. Selected buildings contained additional data: interior shots, historical images, restaurant menus, video interviews of city officials, and so on, allowing the user to take a virtual tour through those buildings.
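The client/server split described above might be sketched roughly as follows; the command names, message format, and the `disc_index` and `player` objects are assumptions for illustration, not the system's actual protocol.

```python
# Rough sketch of the division of labor: the client turns touch input into
# commands; the server consults the database and drives a laserdisc player.
# The wire format, command names, and helper objects are hypothetical.
import json
import socket

def send_command(sock: socket.socket, command: str, **args) -> None:
    """Client side: forward a user action to the navigation server."""
    sock.sendall((json.dumps({"cmd": command, **args}) + "\n").encode())

def handle_command(message: str, disc_index, player) -> None:
    """Server side: translate a command into laserdisc player actions."""
    msg = json.loads(message)
    if msg["cmd"] == "step":              # move ten feet along the street
        frame = disc_index.next_frame(msg["segment"], msg["view"])
        player.seek(frame)
    elif msg["cmd"] == "turn":            # switch to another orthogonal view
        frame = disc_index.frame_at(msg["segment"], msg["view"], msg["offset"])
        player.seek(frame)
    elif msg["cmd"] == "touch_building":  # ISMAP-style jump to a building façade
        frame = disc_index.facade_frame(msg["building_id"])
        player.seek(frame)
```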
In a later implementation, the metadata, which was in large part automatically extracted from the animation database, was encoded as a digital signal in the analog video. The data encoded in each frame contained all the necessary information to enable a full-featured surrogate-travel experience.
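As an illustration of the idea of self-describing frames, here is a minimal, hypothetical layout for per-frame metadata; the fields and binary packing are assumptions, not the format actually encoded in the video signal.

```python
# Hypothetical per-frame metadata record: enough information, carried with the
# frame itself, to keep navigating without consulting an external database.
# The field choices and binary layout are illustrative assumptions.
import struct
from dataclasses import dataclass

FMT = "<IhhB IIII"   # frame id, x, y, heading, and four neighbor frame numbers

@dataclass
class FrameMeta:
    frame_id: int
    x: int            # position on the two-dimensional street plan (arbitrary units)
    y: int
    heading: int      # 0=N, 1=E, 2=S, 3=W (one of the four orthogonal views)
    neighbors: tuple  # frame to jump to for (forward, back, left, right)

    def pack(self) -> bytes:
        return struct.pack(FMT, self.frame_id, self.x, self.y,
                           self.heading, *self.neighbors)

    @classmethod
    def unpack(cls, blob: bytes) -> "FrameMeta":
        f, x, y, h, a, b, c, d = struct.unpack(FMT, blob)
        return cls(f, x, y, h, (a, b, c, d))
```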
Another feature of the system was a navigation map overlaid above the horizon at the top of the frame; the map served both to indicate the user’s current position in the city (along with a trace of streets previously explored) and to let the user jump to a two-dimensional city map, which offered an alternative way of moving through the city. Additional features of the map interface included the ability to jump back and forth between correlated aerial photographs and cartoon renderings with routes and landmarks highlighted, and to zoom in and out à la Charles Eames’s Powers of Ten film.
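A minimal sketch of the bookkeeping such an overlay implies; the overlay size, city extent, and data structures are assumptions chosen only to illustrate tracking the current position and the trail of explored streets.

```python
# Sketch: track the user's position and which street segments have been
# visited, and convert street-plan coordinates to overlay-map pixels.
# The overlay dimensions and city extent are illustrative assumptions.

MAP_W, MAP_H = 160, 60             # overlay strip above the horizon (assumed)
CITY_W_FT, CITY_H_FT = 8000, 8000  # extent of the mapped area (assumed)

visited_segments: set[str] = set()

def to_overlay(x_ft: float, y_ft: float) -> tuple[int, int]:
    """Project a street-plan position into overlay-map pixel coordinates."""
    return (int(x_ft / CITY_W_FT * (MAP_W - 1)),
            int(y_ft / CITY_H_FT * (MAP_H - 1)))

def record_step(segment_id: str, x_ft: float, y_ft: float) -> tuple[int, int]:
    """Mark the segment as explored and return the cursor position to draw."""
    visited_segments.add(segment_id)
    return to_overlay(x_ft, y_ft)
```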
Aspen was filmed in early fall and in winter. The user could change seasons on demand, in situ, while moving down the street or looking at a façade. A three-dimensional polygonal model of the city was also generated using the Quick and Dirty Animation System (QADAS), which featured three-dimensional texture mapping of the façades of landmark buildings, using an algorithm designed by Paul Heckbert. These computer-graphic images, also stored on the laserdisc, were correlated to the video, enabling the user to view an abstract rendering of the city in real time.
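The article cites Heckbert’s texture-mapping work but does not describe the algorithm itself; below is only a generic sketch of projective (perspective-correct) texture mapping of a rectangular façade photograph onto a screen-space quadrilateral via a fitted homography. The function names, corner ordering, and simple inside test are illustrative assumptions, and the sketch assumes a well-behaved, front-facing quad.

```python
# Generic sketch of projective texture mapping: fit a homography that sends
# the screen-space corners of a façade quad to the corners of the texture,
# then inverse-map each covered pixel to a texel. Not Heckbert's actual code.
import numpy as np

def homography(src_pts, dst_pts):
    """Solve for the 3x3 H (with H[2,2] = 1) mapping 4 src points to 4 dst points."""
    A, b = [], []
    for (x, y), (u, v) in zip(src_pts, dst_pts):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y])
        b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y])
        b.append(v)
    h = np.linalg.solve(np.array(A, float), np.array(b, float))
    return np.append(h, 1.0).reshape(3, 3)

def draw_facade(frame, texture, quad):
    """Paint `texture` onto the convex screen quad `quad`, whose corners are
    given in the order corresponding to texture (0,0), (1,0), (1,1), (0,1)."""
    th, tw = texture.shape[:2]
    H = homography(quad, [(0, 0), (1, 0), (1, 1), (0, 1)])  # screen -> unit square
    xs = [p[0] for p in quad]
    ys = [p[1] for p in quad]
    for y in range(int(min(ys)), int(max(ys)) + 1):
        for x in range(int(min(xs)), int(max(xs)) + 1):
            p = H @ np.array([x, y, 1.0])
            if abs(p[2]) < 1e-9:
                continue
            u, v = p[0] / p[2], p[1] / p[2]        # perspective divide
            if 0.0 <= u <= 1.0 and 0.0 <= v <= 1.0:  # pixel lies inside the quad
                frame[y, x] = texture[int(v * (th - 1)), int(u * (tw - 1))]
```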
Credits
MIT undergraduate Peter Clay, with help from Bob Mohl and Michael Naimark, filmed the hallways of MIT with a camera mounted on a cart. The film was transferred to a laserdisc as part of a collection of projects being done at the Architecture Machine Group (ArcMac).
The Aspen Movie Map was filmed in the fall of 1978, in the winter of 1979, and briefly again (with an active gyro stabilizer) in the fall of 1979. The first version was operational in early spring of 1979.
Many people were involved in the production, most notably:
- Nicholas Negroponte, founder and director of the Architecture Machine Group, who found support for the project from the Cybernetics Technology Office of DARPA
- Andrew Lippman, principal investigator
- Bob Mohl, who designed the map overlay system and ran user studies of the efficacy of the system for his PhD thesis
- Richard Leacock (Ricky), who headed the MIT Film/Video section and, with MS student Marek Zalewski, shot the cinéma vérité interviews placed behind the façades of key buildings
- John Borden, of Peace River Films in Cambridge, MA, who designed the stabilization rig
- Kristina Hooper Woolsey of UCSC
- Rebecca Allen
- Scott Fisher, who matched photos of Aspen in its silver-mining days, from the historical society, to the same scenes in Aspen in 1978, and who experimented with anamorphic imaging of the city (using a Volpe lens)
- Walter Bender, who designed and built the interface, the client/server model, and the animation system
- Steve Gregory
- Stan Sasaki, who built much of the electronics
- Steve Yelick, who worked on the laserdisc interface and anamorphic rendering
- Eric "Smokehouse" Brown, who built the metadata encoder/decoder
- Paul Heckbert, who worked on the animation system
- Mark Shirley and Paul Trevithick, who also worked on the animation
- Ken Carson
- Howard Eglowstein
- Michael Naimark, who was at the Center for Advanced Visual Studies and was responsible for the cinematography design and production
Purpose and applications
ARPA funding during the late 1970s was subject to the military-application requirements of the Mansfield Amendment, introduced by Senator Mike Mansfield, which had severely limited funding for hypertext researchers such as Douglas Engelbart.
The Aspen Movie Map's military application was to solve the problem of quickly familiarizing soldiers with new territory. The Department of Defense had been deeply impressed by the success of Operation Entebbe in 1976, in which Israeli commandos had quickly built a crude replica of the airport and practiced in it before attacking the real thing. The DOD hoped that the Movie Map would show the way to a future in which computers could instantly create a three-dimensional simulation of a hostile environment at much lower cost and in less time (see virtual reality).
While the Movie Map has been referred to as an early example of interactive video, it is perhaps more accurate to describe it as a pioneering example of interactive computing. Video, audio, still images, and metadata were retrieved from a database and assembled on the fly by the computer (an Interdata minicomputer running the MagicSix operating system), which redirected its actions based upon user input; video was the principal, but not the sole, affordance of the interaction.
See also
- Google Street View
- EveryScape
- Eye2eye Software
- Mapillary – crowdsourced street-level photos
Further reading
- Video: The Interactive Movie Map: A Surrogate Travel System, January 1981, The Architecture Machine, at the MIT Media Lab Speech Interface Group; YouTube copy.
- Bender, Walter, Computer animation via optical video disc, Thesis Arch 1980 M.S.V.S., Massachusetts Institute of Technology.
- Brand, Stewart, The Media Lab, Inventing the Future at MIT (New York: Penguin Books, 1989), 141.
- Brown, Eric, Digital data bases on optical videodiscs, Thesis E.E. 1981 B.S., Massachusetts Institute of Technology.
- Clay, Peter, Surrogate travel via optical videodisc, Thesis Urb.Stud 1978 B.S., Massachusetts Institute of Technology.
- Heckbert, Paul, "Survey of Texture Mapping," IEEE Computer Graphics and Applications, Nov. 1986, pp. 56–67.
- Lippman, Andrew, "Movie-maps: An application of the optical videodisc to computer graphics," Proceedings of the 7th annual conference on Computer graphics and interactive techniques, Seattle, Washington, United States, 1980, pp. 32–42.
- Mohl, Robert, Cognitive space in the interactive movie map : an investigation of spatial learning in virtual environments, Thesis Arch 1982 Ph.D., Massachusetts Institute of Technology.
- Naimark, Michael, "Aspen the Verb: Musings on Heritage and Virtuality," Presence: Teleoperators and Virtual Environments, Special Issue on Virtual Heritage, MIT Press Journals, Vol. 15, No. 3, June 2006.
- Yelick, Steven, Anamorphic image processing, Thesis E.E. 1980 B.S., Massachusetts Institute of Technology.
External links
- EveryScape: Aspen, Colorado
- Aspen Movie Map on Michael Naimark’s website (includes video demo).