Non-Visual Augmented Reality

From Cyborg Anthropology
Latest revision as of 22:00, 1 August 2011

Definition

Non-Visual Augmented Reality describes an augmented reality system that does not rely on or produce primarily visual stimuli or images in order to function. Non-visual augmented reality is anything that adds to the user's environment without demanding the user's visual attention. Rather, other senses are augmented and stimulated by the system.

"The trouble with visual augmented reality," says Robert Rice, "is that one has to sacrifice a certain amount of reality in order to experience the AR". Non-visual augmented reality is a way to experience a stream of information without diminishing that reality.

Non-visual augmented reality is a critique of the friction and lag inherent in visual augmented reality systems, which rely on image processing, network connections, and load times. Non-visual augmented reality removes the need to hold a phone up, instead pushing relevant information to the user when it is needed. Visual augmented reality, by contrast, often amounts to an app that one holds up to one's face, looking ridiculous while waving it around.

As Charlotte Magnusson points out, "we can use other senses than just vision. For example, if you’re visiting an archaeological site you could hear people from the past doing their labour or people chatting about that big mountain that might have some volcanic action going on." Magnusson runs a project called HaptiMap that seeks to combine tactile sensations with maps. For example, sensors attached to the body could deliver impulses with specific meanings and thereby guide the wearer: "Imagine biking through a city for the first time and instead of a map for directions you get impulses on your arms that tell you which way to turn next. You can have both hands on the handlebars and focus on dodging buses and other dangerous traffic."[1]

Haptic, or Touch-Based Augmented Reality

Starting in Fall 2004, Udo Wächter, a systems administrator at the University of Osnabrück in Germany, started wearing a belt that gave him an extra sense of direction. The belt was lined with 13 piezoelectric pads that encircled his waist when worn. The pad facing north at any given moment would constantly buzz, letting Wächter always know which direction he was facing. He wore the belt for six weeks straight.[2] This "feelSpace belt", originally invented by Osnabrück cognitive scientist Peter König, was part of the feelSpace project, whose primary purpose "was to investigate the effects of long-term stimulation with orientation information on humans".[3]
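The belt's core behavior can be sketched as a simple mapping from the wearer's compass heading to whichever pad currently points north. This is a hypothetical reconstruction under stated assumptions (evenly spaced pads, pad 0 at the wearer's front), not the actual feelSpace firmware:

```python
def north_pad(heading_deg: float, num_pads: int = 13) -> int:
    """Return the index of the pad currently facing north.

    heading_deg is the wearer's compass heading in degrees (0 = facing
    north). Pads are assumed evenly spaced clockwise around the waist,
    with pad 0 at the wearer's front.
    """
    # If the wearer faces heading_deg, north sits (360 - heading_deg)
    # degrees clockwise from their front; buzz the nearest pad.
    bearing_to_north = (360.0 - heading_deg) % 360.0
    pad_width = 360.0 / num_pads
    return round(bearing_to_north / pad_width) % num_pads
```

Facing north activates the front pad (`north_pad(0.0)` → `0`); as the wearer turns, the buzz migrates around the waist, which is what produced the continuous sense of direction the study describes.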

A company called Sensebridge sells a DIY haptic compass belt kit called North Paw. It was created out of Noisebridge, a hackerspace in San Francisco, California, and hacklab.to, a hackerspace in Toronto, Canada, by Adam Skory, David Allen, Eric Boyd, Mikolaj Habryn and Rachel McConnell.[4] The company also builds other sensory augmentation devices such as the Heart Spark[5], a heart-beat tracking pendant. Boyd and Skory also created an ankle-mounted compass because they liked the idea of Wächter's belt but felt that its physical size might be impractical.[6]

Location-Based Augmented Reality

One location-based AR system, developed by Aaron Parecki, performs automatic check-ins based on GPS data, with updates sent by SMS. Locations are defined as circles on a map, and an SMS message is triggered when one enters the area defined by a circle. One can define neighborhoods, areas, and blocks. Taking automatic check-ins further, one can add streaming data, allowing one's device to receive SMS messages for hyperlocal areas without the need for QR codes or any visuals. Privacy is an enormous issue with systems like this: one does not always want SMS updates, open GPS map data, or text notifications of another's proximity.
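The circular-geofence trigger described above can be sketched as a distance check against each circle, with a small piece of state so a message fires on entry rather than on every GPS fix. Fence names, coordinates, and message texts are illustrative, not Parecki's actual implementation:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two GPS fixes, in meters."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def check_in(position, fences, inside):
    """Return SMS texts for any circular fence just entered.

    fences maps a name to (lat, lon, radius_m, sms_text); `inside` is
    mutable state so a message fires on entry, not on every fix.
    """
    triggered = []
    for name, (lat, lon, radius_m, text) in fences.items():
        now_inside = haversine_m(position[0], position[1], lat, lon) <= radius_m
        if now_inside and not inside.get(name, False):
            triggered.append(text)
        inside[name] = now_inside
    return triggered
```

The entry-edge detection matters: without the `inside` state, a phone sitting still inside a circle would re-trigger the same SMS on every GPS update.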

The basic case here is the meeting. Person A and Person B need to meet each other, but GPS data is shared between them only while they have a scheduled meeting. When the meeting ends, the data wall closes off, giving them back their privacy, rather like a wormhole of temporary transparency between two people. This solves the problem of extreme bouts of "check-in-ism", as well as the issue of one's whereabouts being exposed all the time.[7]

Audio-Based Augmented Reality

Audio-based augmented reality systems have been built for mobile phones and other physical devices, allowing for audio-based tours of everyday reality or sound-based notification systems. "The movements of users through their workplace can trigger the transmission of auditory cues".[8] In this case messages, events, and locations can trigger different sounds based on priority level and data type.
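The mapping from event type and priority to a sound can be sketched as a simple lookup table. This is a hypothetical illustration of the kind of mapping an Audio Aura-style system uses; the categories and sound file names are invented, not the Xerox PARC implementation:

```python
# Map (event type, priority) to an auditory cue; entries are invented
# placeholders illustrating the technique, not real system assets.
SOUND_MAP = {
    ("email", "high"): "urgent_chime.wav",
    ("email", "low"): "soft_chime.wav",
    ("calendar", "high"): "bell.wav",
    ("location", "low"): "ambient_hum.wav",
}

def cue_for(event_type: str, priority: str, default: str = "generic_tick.wav") -> str:
    """Pick the auditory cue for an event, falling back to a neutral tick."""
    return SOUND_MAP.get((event_type, priority), default)
```

Keeping the mapping in data rather than code means new event types or priority levels only require new table entries, which suits the ambient, low-attention character these systems aim for.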

References

  1. Non-visual Augmented Reality. Martin Thörnkvist. Published 27 April, 2010. Accessed 22 April, 2011. http://mediaevolution.se/nordicgame/non-visual-augmented-reality/
  2. Udo Wächter's Directional Belt. Wired Magazine 15.04. Accessed 22 April 2011. http://www.wired.com/wired/archive/15.04/esp.html
  3. feelSpace: Report of a Study Project. Universität Osnabrück. Institute of Cognitive Science Department of Neurobiopsychology. May 2005. Accessed 22 April 2011. http://cogsci.uni-osnabrueck.de/~feelspace/downloads/feelSpace_finalReport.pdf
  4. Sensebridge. North Paw Instructions. Accessed 22 April 2011. http://sensebridge.net/projects/northpaw/instructions/
  5. http://sensebridge.net/projects/heart-spark/
  6. Compass Vibro Anklet. Noisebridge.net Accessed 22 April 2011. https://www.noisebridge.net/wiki/Compass_Vibro_Anklet
  7. Geonotes, Proximal Notification Systems, and Automatic Check-ins with GPS and SMS. Published 06 April 2010. Accessed 22 April 2011. http://oakhazelnut.com/2010/04/06/geonotes-proximal-notification-systems-and-automatic-check-ins-with-gps-and-sms/
  8. Audio Aura: Light-Weight Audio Augmented Reality. Mynatt, Back, Wandt and Frederick. Xerox Palo Alto Research Center. 1997. Accessed April 22, 2011. http://www.roywant.com/cv/papers/pubs/1997-11%20(ICAD97)%20Audio%20Aura%20AR.pdf