SixthSense
SixthSense is a gesture-based wearable computer system developed at the MIT Media Lab by Steve Mann in 1994 and 1997 (headworn gestural interface) and in 1998 (neckworn version), and further developed by Pranav Mistry, also at the MIT Media Lab, in 2009; both developed hardware and software for headworn and neckworn versions of the system. It comprises a headworn or neck-worn pendant that contains both a data projector and a camera. Headworn versions built by Mann at the MIT Media Lab in 1997 combined cameras and illumination systems for interactive photographic art, and also included gesture recognition (e.g., finger-tracking using colored tape on the fingers).[3]
SixthSense is also a name for extra information supplied by a wearable computer, as in devices such as EyeTap (Mann), Telepointer (Mann), and "WuW" (Wear yoUr World) by Pranav Mistry et al.[6][7]
Origin of the name
Sixth Sense technology (a camera combined with a light source) was developed in 1997 as a headworn device, and in 1998 as a neckworn object, but the name for this work was not published until 2001, when Mann coined the term "Sixth Sense" to describe such devices.[8][9]
Mann referred to this wearable computing technology as affording a "Synthetic Synesthesia of the Sixth Sense", believing that wearable computing and digital information could act in addition to the five traditional senses.[10] Ten years later, Pattie Maes, also with MIT Media Lab, used the term "Sixth Sense" in this same context, in a TED talk.
Subsequently, other inventors have used the term sixth-sense technology to describe new capabilities that augment the traditional five human senses. For example, in their 2012–13 patent applications, Timo Platt et al. refer to their new communications invention as creating a new social and personal sense, i.e., a "metaphorical sixth sense", enabling users (while retaining their privacy and anonymity) to sense and share the "stories" and other attributes and information of those around them.
Construction and workings
The SixthSense technology comprises a pocket projector and a camera contained in a head-mounted, handheld, or pendant-like wearable device. Both the projector and the camera are connected to a mobile computing device in the user's pocket. The projector projects visual information, enabling surfaces, walls, and physical objects around the user to be used as interfaces, while the camera recognizes and tracks the user's hand gestures and physical objects using computer-vision-based techniques. The software processes the video stream captured by the camera and tracks the locations of the colored markers (visual tracking fiducials) at the tips of the user's fingers. The movements and arrangements of these fiducials are interpreted as gestures that act as interaction instructions for the projected application interfaces. SixthSense supports multi-touch and multi-user interaction.
Mann has described how the SixthSense apparatus can allow a body-worn computer to recognise gestures. If the user attaches colored tape to his or her fingertips, of a color distinct from the background, the software can track the position of those fingers.[11]
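The marker-tracking approach described above can be illustrated with a minimal sketch. This is not the actual SixthSense code; it is a simplified, hypothetical `track_marker` function that thresholds a frame against a known marker color and returns the centroid of the matching pixels, which is the basic idea behind tracking colored tape on a fingertip:

```python
import numpy as np

def track_marker(frame, target_rgb, tol=30):
    """Return the (row, col) centroid of pixels within `tol` of target_rgb,
    or None if no pixel matches. `frame` is an HxWx3 uint8 RGB image."""
    # Per-pixel distance to the target color: the largest channel difference.
    diff = np.abs(frame.astype(int) - np.array(target_rgb, dtype=int)).max(axis=2)
    mask = diff <= tol
    if not mask.any():
        return None
    rows, cols = np.nonzero(mask)
    return rows.mean(), cols.mean()

# Synthetic 100x100 frame: grey background with a red "tape" blob
# (rows 40-49, cols 60-69) standing in for a marked fingertip.
frame = np.full((100, 100, 3), 128, dtype=np.uint8)
frame[40:50, 60:70] = (255, 0, 0)

pos = track_marker(frame, (255, 0, 0))  # centroid of the red blob
```

A real implementation would work in a color space less sensitive to lighting (e.g., HSV) and smooth the tracked position across frames, but the core step, segmenting by marker color and reducing the blob to a point, is the same.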
Example applications
During a 2009 TED talk given by Professor Pattie Maes,[12] she showed a video demonstrating a number of applications of the SixthSense system. Those applications include:
- Four colored cursors are controlled in real time by four fingers wearing different-colored markers. The projector displays video feedback to the user on a vertical wall.
- The projector displays a map on the wall, which the user controls using zoom and pan gestures.
- The user can make a frame gesture to instruct the camera to take a picture. It is hinted that the photo will be automatically cropped to remove the user's hands.
- The system could project multiple photos on a wall, and the user could sort, re-size and organize them with gestures. This application was called Reality Window Manager (RWM) in Mann's headworn implementation of Sixth Sense.[13]
- A number pad is projected onto the user's palm, and the user can dial a phone number by touching the palm with a finger. It was hinted that the system is able to pinpoint the location of the palm, and that the camera and projector are able to adjust themselves for surfaces that are not horizontal.
- The user can pick up a product in a supermarket (e.g. a package of paper towels), and the system could display related information (e.g. the amount of bleach used) back on the product itself.
- The system can recognize any book picked up by the user and display its Amazon rating on the book cover.
- As the user opens a book, the system can display additional information such as reader's comments.
- The system is able to recognize individual pages of a book and display annotations made by the user's friends. This demo also suggested the system would be able to handle tilted surfaces.
- The system is able to recognize newspaper articles and project the most recent video about the news event onto a blank region of the newspaper.
- The system is able to recognize people by their appearance and project a word cloud of related information, retrieved from the internet, onto the person's body.
- The system is able to recognize a boarding pass and display related information such as flight delay and gate change.
- The user can draw a circle on his or her wrist, and the system will project a clock on it.
Despite wearing the device during the presentation, Professor Maes did not give a live demonstration of the technology. During the talk she emphasized repeatedly that the SixthSense technology was a work in progress; however, it was never clarified whether the video demos showed real working prototypes or merely staged examples illustrating the concept.
This concern was expressed by M. C. Elish, "One relatively famous video to emerge from the Media Lab is Pranav Mistry’s original Sixth Sense video demo.... the video demo is misleading... raising the question of ethically responsible communication of research... misconstrue the reality of a technology."[14]
Advantages
- Portable:
One of the main advantages of the Sixth Sense device is its small size and portability. It can be carried around easily, and the prototype was designed with portability as a priority: all the components are lightweight, and the smartphone fits easily into the user's pocket.
- Supports multi-touch and multi-user interaction:
Multi-touch and multi-user interaction is another feature of Sixth Sense devices. Multi-sensing allows the user to interact with the system using more than one finger at a time, and Sixth Sense devices also incorporate multi-user functionality. This is typically useful for large interaction scenarios such as interactive tabletops and walls.
- Cost Effective:
The cost of constructing the Sixth Sense prototype is quite low, as it was assembled from parts of common devices; a typical Sixth Sense device costs up to $300. Sixth Sense devices have not yet been manufactured at commercial scale; once that happens, the price is likely to drop considerably.
- Data access directly from the machines in real time:
With the help of a Sixth Sense device, the user can easily access data from any machine in real time, without needing a conventional human–machine interface. Accessing data through hand-gesture recognition is easier and more user-friendly than a text or graphical user interface that requires a keyboard or mouse.
- Open Source Software:
According to its inventor, the software used to interpret and analyze the data collected by the device will be made open source. This will enable other developers to contribute to the development of the system.
References
- ↑ "Telepointer: Hands-Free Completely Self Contained Wearable Visual Augmented Reality without Headwear and without any Infrastructural Reliance", IEEE International Symposium on Wearable Computing (ISWC00), pp. 177, 2000, Los Alamitos, CA, USA
- ↑ "WUW – wear Ur world: a wearable gestural interface", Proceedings of CHI EA '09 Extended Abstracts on Human Factors in Computing Systems Pages 4111-4116, ACM New York, NY, USA
- ↑ IEEE Computer, Vol. 30, No. 2, February 1997, Wearable Computing: A First Step Toward Personal Imaging, pp25-32
- ↑ Wearable, tetherless computer–mediated reality, Steve Mann. February 1996. In Presentation at the American Association of Artificial Intelligence, 1996 Symposium; early draft appears as MIT Media Lab Technical Report 260, December 1994
- ↑ IEEE Computer, Vol. 30, No. 2, February 1997, Wearable Computing: A First Step Toward Personal Imaging, pp25-32
- ↑ "IEEE ISWC P. 177" (PDF). Retrieved 2013-10-07.
- ↑ "Cyborg: Digital Destiny and Human Possibility in the Age of the Wearable Computer", Steve Mann with Hal Niedzviecki, ISBN 0-385-65825-7 (Hardcover), Random House Inc, 304 pages, 2001.
- ↑ Cyborg, 2001
- ↑ Geary 2002
- ↑ An Anatomy of the New Bionic Senses [Hardcover], by James Geary, 2002, 214pp
- ↑ MIT Media Lab Technical Report 260, December 1994
- ↑ "Pattie Maes + Pranav Mistry: Meet the SixthSense interaction", video, TED.com. Retrieved 2013-12-09.
- ↑ Intelligent Image Processing, Wiley, 2001
- ↑ Elish, M. C. (2011, January). Responsible storytelling: communicating research in video demos. In Proceedings of the fifth international conference on Tangible, embedded, and embodied interaction (pp. 25-28). ACM.
External links
- Sixthsense Tutorials
- Steve Mann's SixthSense site
- Pranav Mistry's SixthSense homepage
- SixthSense Google code site
- SixthSense Github repository