[[Image:affective-computing-Maggie-Nichols.jpg|center|600px]]
 
===Definition===  
 
'''Affective Computing''' is a term used to describe the process of developing computing architectures that account for human concerns such as usability, touch, access, persona, emotion, and history. Those who build systems by these principles think of computing as a solution to, or a helper with, the problems and essences of human living, especially in an industrial world.
 
"Plutowski (2000) identifies three broad categories of research within the area of affective or emotional computing,: emotional expression programs that display and communicate simulated emotional affect; emotional detection programs that recognize and react to emotional expression in humans; and emotional action tendencies, instilling computer programs with emotional processes in order to make them more efficient and effective. Thus specific projects at MIT (www.media.mit.edu/affect) have included the development of affective wearables such as affective jewelry, expression glasses, and a conductors jacket to designed to extend a conductor's ability to express emotion and intentionally to the audience and to the orchestra."<ref>Plutowski, 2000, in [http://books.google.com/books?id=AazrLYOaetwC&dq=broad+categories+of+research+within+the+area+of+affective+or+emotional+computing+(&source=gbs_navlinks_s Knowing capitalism] By N. J. Thrift. SAGE, 2005 - Business & Economics - 256 pages.</ref>
 
MIT Media Lab's Rosalind Picard<ref>Picard (1997; 2000).</ref> defines affective computing as "computational systems with the ability to sense, recognize, understand and respond to human moods and emotions," and argues that to be truly interactive, computers must have the power to recognize, feel and express emotion.
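
To make the first two of Plutowski's categories concrete, the sketch below is a toy example, not drawn from the MIT projects, of an emotional detection and emotional expression loop: a small keyword lexicon guesses a user's mood from a line of text, and the program answers with a simulated emotional response. The lexicon, the mood labels, and the canned replies are all invented for illustration.

<source lang="python">
# Toy emotional-detection / emotional-expression loop.
# The keyword lexicon and canned responses are invented for illustration only.
MOOD_KEYWORDS = {
    "frustrated": ["stuck", "broken", "again", "ugh"],
    "happy": ["great", "thanks", "love", "finally"],
    "sad": ["tired", "lost", "miss", "alone"],
}

RESPONSES = {
    "frustrated": "That sounds annoying. Want me to slow down and retry?",
    "happy": "Glad that worked! Carrying on.",
    "sad": "I'm here. Take your time.",
    "neutral": "Okay.",
}

def detect_mood(text):
    """Guess a mood label by counting keyword hits (emotional detection)."""
    words = text.lower().split()
    scores = {mood: sum(w in words for w in kws) for mood, kws in MOOD_KEYWORDS.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "neutral"

def respond(text):
    """Pick a simulated emotional reply to the detected mood (emotional expression)."""
    return RESPONSES[detect_mood(text)]

print(respond("ugh the upload is broken again"))  # frustrated response
print(respond("finally it works, thanks"))        # happy response
</source>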
 
{{clear}}
 
[[File:Kelly-Dobson-Blender.jpg|200px|thumb|right|Kelly Dobson controlling her blender with her voice.]]
 
===Learning the Language of Machines===
  
Many examples of affective computing exist. One of the most notable was built by [[Kelly Dobson]] while she was at the MIT Media Lab. Instead of teaching machines to understand humans, Dobson programmed a blender to understand voice activation, but not the typical voice one uses.<ref>Dobson, Kelly. Blendie. MIT Media Lab. 2003-2004. http://web.media.mit.edu/~monster/blendie/ Accessed 02 July 2011.</ref> Dobson's work called into question the notion that machines should always be built to understand human commands, rather than commands closer to their own native machine language.
  
Instead of saying "Blender, ON!", Dobson made an auditory model of a machine voice. If she wanted the blender to begin, she simply made blending noises at it. The low-pitched "Rrrrrrrrr" she made turned the blender on low. If she wanted to increase the speed of the machine, she raised her voice to "RRRRRRRRRRR!", and the machine increased in intensity. In this way, the machine responded to volume and intensity rather than to a human voice.
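
A minimal sketch of the same idea, assuming a simple loudness-to-speed mapping rather than Dobson's actual Blendie implementation, is shown below: the RMS amplitude of a chunk of microphone samples is mapped onto a motor speed. The <code>speed_from_growl</code> function and its thresholds are invented for illustration.

<source lang="python">
import math

def rms(samples):
    """Root-mean-square amplitude of one chunk of audio samples (floats in [-1, 1])."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def speed_from_growl(samples, quiet=0.05, low=0.15, high=0.4):
    """Map the loudness of a vocal growl to a blender speed (0 = off, 3 = high).

    The thresholds are made-up illustration values, not taken from Blendie.
    """
    level = rms(samples)
    if level < quiet:
        return 0  # near silence: blender off
    if level < low:
        return 1  # soft "rrrrr": low speed
    if level < high:
        return 2  # louder growl: medium speed
    return 3      # full-volume "RRRRR!": high speed

# Simulated microphone chunks standing in for a soft growl and a loud one.
soft_growl = [0.1 * math.sin(i / 5.0) for i in range(1024)]
loud_growl = [0.8 * math.sin(i / 5.0) for i in range(1024)]

print(speed_from_growl(soft_growl))  # 1
print(speed_from_growl(loud_growl))  # 3
</source>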
  
The principles of affective computing represent a next step in making computers that are responsive and helpful to humans. At Carnegie Mellon University, researchers developed a pillow that could store a person's hug for future playback.<ref>Foo, Juniper. Hug Pillow. CNET News Asia. Published 23 Nov 2004. http://asia.cnet.com/crave/hug-pillow-62100099.htm Accessed 02 June 2011.</ref> The pillow allowed one to feel a hug across distances as well as leave haptic recordings: if one's grandmother used the device to store her own hug and died two months later, her family members could still replay the hug after she was gone.
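
A hedged sketch of how such a haptic recording could work in principle is given below; the sensor and actuator interfaces are hypothetical stand-ins, not the CMU pillow's actual hardware. Pressure readings are stored with timestamps and later replayed through an actuator callback.

<source lang="python">
import time

class HugRecorder:
    """Toy haptic recorder: store timestamped pressure readings and replay them.

    The pressure samples and the actuator callback are hypothetical stand-ins
    for the pillow's real squeeze sensor and inflation hardware.
    """

    def __init__(self):
        self.recording = []  # list of (seconds_since_start, pressure) pairs

    def record(self, pressure_samples, interval=0.05):
        """Store a sequence of pressure readings taken every `interval` seconds."""
        self.recording = [(i * interval, p) for i, p in enumerate(pressure_samples)]

    def replay(self, actuator):
        """Play the stored hug back through an actuator callback, preserving timing."""
        start = time.monotonic()
        for t, pressure in self.recording:
            delay = t - (time.monotonic() - start)
            if delay > 0:
                time.sleep(delay)
            actuator(pressure)

# A simulated hug: pressure ramps up, holds, and releases.
hug = [0.0, 0.2, 0.5, 0.9, 1.0, 1.0, 0.8, 0.4, 0.1, 0.0]

recorder = HugRecorder()
recorder.record(hug)
recorder.replay(lambda p: print(f"inflate pillow to {p:.1f}"))
</source>
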
===Related Reading===
*[[Media Lab at MIT]]
*[[Kelly Dobson]]
*[[Rosalind Picard]]
  
{{clear}}
 
 
==References==

<references />
  
 
[[Category:Book Pages]]
[[Category:Finished]]
[[Category:Illustrated]]
  
 
__NOTOC__
