Affective Computing

From Cyborg Anthropology
Revision as of 23:29, 2 July 2011 by Caseorganic (Talk | contribs)



Affective Computing is a term describing the development of computing architectures that account for human factors such as usability, touch, access, personal characteristics, emotion, and background. Those who build systems by these principles treat computing as a solution to, or a helper for, the problems and essences of human living, especially in an industrial world.

Instead of teaching a machine to understand human speech, MIT's Kelly Dobson programmed a blender to respond to voice activation of an unusual kind.[1] Rather than saying "Blender, ON!", she made an auditory model of a machine voice. If she wanted the blender to start, she simply made blending noises at it. A low-pitched "Rrrrrrrrr" turned the blender on low; if she raised her voice to a louder "RRRRRRRRRRR!", the machine increased in intensity. In this way the machine responded to volume and pitch rather than to human words. Her work called into question the notion that machines should always be built to understand human commands: instead, a machine might understand commands given in something like its own native machine language.
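The interaction described above can be sketched in code. The following is a minimal, hypothetical illustration of Blendie-style control, in which motor speed tracks the loudness of an incoming machine-like growl rather than a parsed spoken command. The function names, thresholds, and sample values are assumptions for illustration, not details of Dobson's actual implementation.

```python
# Hypothetical sketch: map the loudness of an audio frame to a blender
# speed level, so a quiet "rrrr" yields a low speed and a loud "RRRR!"
# a high one. Thresholds and names are illustrative assumptions.

def rms(samples):
    """Root-mean-square amplitude of one audio frame (list of floats)."""
    return (sum(s * s for s in samples) / len(samples)) ** 0.5

def speed_for_frame(samples, thresholds=(0.1, 0.3, 0.6)):
    """Return a motor speed level 0-3: count how many loudness
    thresholds the frame's RMS amplitude meets or exceeds."""
    loudness = rms(samples)
    return sum(loudness >= t for t in thresholds)

# Toy frames standing in for microphone input:
quiet_growl = [0.15 * (-1) ** i for i in range(100)]  # low "Rrrrrrrrr"
loud_growl = [0.8 * (-1) ** i for i in range(100)]    # loud "RRRRRRRRRRR!"

print(speed_for_frame(quiet_growl))  # low speed
print(speed_for_frame(loud_growl))   # high speed
```

A fuller sketch would also track pitch, since Dobson's blender responded to both, but the loudness mapping alone captures the core idea: the control signal is the character of the sound itself, not a recognized word.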


  1. Dobson, Kelly. Blendie. MIT Media Lab, 2003-2004. Accessed 2 July 2011.