Information

Definition

In our modern techno-cultural discourse, “information” has become a ubiquitous term for all types of knowledge and communication, but this was not always the case. The cultural seeding of “information” took place in WWII, the war that saw the first programmable computer (used to break Nazi ciphers) and the first systematic use of information warfare. Hitler used media to disseminate his message across all of Germany, most notably harnessing the broadcasting power of the PA system during rallies and the radio for national addresses. The US also practiced information warfare, with an unprecedented use of radio to destabilize the Axis powers and strict monitoring of civilian activities. Although technology has played a key role in every war, in WWII technology entered the American cultural imagination to an unprecedented degree. The Nazis’ V-2 rockets (invented by Wernher von Braun, who later designed most of NASA’s rockets) terrorized London, while the Allies’ victory hinged on Alan Turing’s (father of the computer and of AI) codebreaking work against the Nazi Enigma cipher and on the Manhattan Project’s systematic, coordinated research, which pioneered “Big Science” as we know it today. By the end of the war, it was widely understood that our technological prowess and control of information had kept us from all speaking German, and would continue to keep us from speaking Russian.

Information was first formulated as a mathematical-theoretical entity by Claude Shannon in his 1948 paper “A Mathematical Theory of Communication”. In this paper, information is posited as a probability function of a given message, allowing mathematicians to compute the most efficient way to encode any message. In Shannon’s theory, all communication can in principle be reduced to binary code, allowing “meaningful” communication to transcend the “noise”, chaos, or entropy that is understood as the backdrop to communication.51 It was a theory designed to maximize efficiency in communication channels, enabling the global telecommunications networks and internet we know today.
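To make Shannon’s measure concrete, here is a minimal sketch (an illustration added for clarity, not drawn from Shannon’s paper): a source of N equally likely messages carries I = log₂N bits per message, and for unequal probabilities this generalizes to the Shannon entropy H = −Σ pᵢ log₂ pᵢ.

```python
import math

def information_equiprobable(n):
    # Information (in bits) of one message drawn from n equally likely
    # possibilities: I = log2(n).
    return math.log2(n)

def shannon_entropy(probabilities):
    # Average information per message, H = -sum(p * log2(p)),
    # for a source with the given probability distribution.
    # Terms with p == 0 contribute nothing to the sum.
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

print(information_equiprobable(32))        # 5.0 bits (see note 51)
print(shannon_entropy([0.5, 0.25, 0.25]))  # 1.5 bits for a skewed source
```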

There are several implications inherent in the idea of “information”. The first is a metaphysics of pattern/randomness that underlies information theory. Successful communication entails a message being formulated, encoded, transferred, decoded, and understood. If I send an e-mail from a Mac to a PC, my patterned communication might be disrupted, turning my apostrophe into “#*%”. This would be an example of the encoding/decoding process failing and entropic noise interfering with my patterned communication. Entropy/noise (the two are equated in Wiener’s formulation of information) is the constant enemy of successful information transfer. But entropic noise is also necessary for successful communication, for if there is no ambiguity there is no lack for the information transfer to fill. If I know it is Thursday, an interlocutor stating, “It is Thursday” has little information value.52 Thus information theory is about preserving the pattern of communication against the onslaught of noise, a noise that is simultaneously required by and working in opposition to that pattern.
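The Thursday example can be put in numbers (a gloss added here, not Hayles’s own): a single message that occurs with probability p carries −log₂p bits, which falls to zero as the message becomes certain.

```python
import math

def surprisal(p):
    # Information (in bits) carried by a message of probability p: -log2(p).
    return -math.log2(p)

print(surprisal(1.0))    # 0.0 bits: "It is Thursday" when I already know the day
print(surprisal(1 / 7))  # ~2.81 bits: the same sentence when any day is equally likely
```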

Perhaps the most profound implication of “information” is that it was formulated with no reference to “meaning”. Because it was formulated in reference to maximum efficiency in communication technology, information value needed to be autonomous from the specific contexts in which it is enacted. For example, the e-mail “We are all going to die today” is “informationally” the same whether it is a joke or a warning. Mathematically this made sense for Shannon and Wiener: accounting for the range of variables involved in determining how much meaning, and what meaning, someone will derive from a communication would yield a vastly more complex theory.53 Semantic meaning is always dependent on context, yet information theory ignores context in favor of mathematical ease.

Shannon was aware of the limited applications of his theory of information. When scholars from other fields tried to use “information” as a means for conceptualizing social phenomena, Shannon warned that he did not see “too close a connection between the notion of information as we use it in communication engineering and what you are doing here … the problem here is not so much finding the best encoding of symbols … but, rather, the determination of the semantic question of what to send and to whom to send it.”54

Despite Shannon’s care in applying information theory, information proved too strong a meme to contain, leading many scholars to adopt the term without considering the larger metaphysical implications of the concept. It is one of the central paradoxes of modern techno-culture that it reduces all value to information, despite information’s inherent lack of reference to value itself.

Wiener himself often conflated information with meaning. In The Human Use of Human Beings, one of the central documents in the foundation of cybernetics, Wiener contrasts information (associated with patterned organization, communication, form, and coherence) with entropy (the force of disorder, randomness, and disintegration).55 This move was partially justified by Shannon’s equation for information exactly matching Ludwig Boltzmann’s equation for entropy, the quantity at the heart of the second law of thermodynamics (the formulas are set out after the quotation below).56 Asking “is this devil [entropy] Manichaean or Augustinian?”, Wiener concludes that entropy, and by extension Western science, is Augustinian. We do not have to accept this conclusion to note the significance of Wiener’s use of a theodicean metaphor to explain the nature of entropy and information. Entropy here is understood as evil, the (non-)entity that marks the absence of information, meaning, and the good. Erik Davis explains how Wiener’s theodicy gives information, and derivatively cybernetics, cosmic significance:

The devil that the scientist fights is simply confusion, the lack of information, and not an organized resistance waged by some dark trickster. “Nature offers resistance to decoding, but it does not show ingenuity in finding new and undecipherable methods for jamming our communication with the outer world.” The enemy is dumb and blind, Wiener says, “defeated by our intelligence as thoroughly as by a sprinkle of holy water.”57

With Wiener, information becomes the cybernetic “holy water” that defeats unintelligent nature and is linked “to meaning, value, and life itself. Wiener even suggests that the order- and form-generating power of information systems is basically analogous to what some people call God.”58 Information is our primary weapon against “nature’s [entropy’s] tendency to degrade the organized and to destroy the meaningful.”59 Thus the reification of information becomes complete in Wiener’s cosmological extension of Shannon’s laws of efficient communication.
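For reference, the formal match mentioned above can be stated in standard textbook form (these are the conventional formulas, not quotations from Wiener or Davis); Wiener’s reversal, described in note 56, amounts to flipping the sign and treating information as −H.

```latex
% Shannon's entropy of a message source with probabilities p_i (in bits)
H = -\sum_i p_i \log_2 p_i
% Boltzmann-Gibbs entropy over microstate probabilities p_i,
% with k_B the Boltzmann constant
S = -k_B \sum_i p_i \ln p_i
```

The two expressions are identical up to the base of the logarithm and the physical constant k_B.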

References

N. Katherine Hayles, How We Became Posthuman: Virtual Bodies in Cybernetics, Literature, and Informatics (Chicago: University of Chicago Press, 1999). p. 8

51. Hayles gives a good example of how Shannon’s information theory works: imagine a paranoid bookie who has a code for callers placing bets. When the caller places a bet, a message demands a binary response (“If your number is between 1 and 16, press 1; if between 17 and 32, press 2”). Through five such binary questions the program will know your number (is it between 1 and 8? between 1 and 4? is it 1 or 2? is it 1?), because 32 = 2⁵. Information can therefore be understood as I = log₂N, where N is the number of possibilities in a given communication, assuming equal probability of each option. (A code sketch of this guessing game follows these notes.)

52. Since there is never a real-life situation in which ambiguity is nonexistent, some type of information can always be extrapolated from an utterance. Perhaps my friend said, “It is Thursday” because he thinks I have my head in the clouds, or because there are two Thursdays this particular week, causing general confusion.

53. Donald MacKay’s information theory tries to do exactly this, and it became the dominant paradigm in British information theory. MacKay’s theory was soon displaced as Wiener and Shannon’s American theory became the industry standard. Hayles, How We Became Posthuman, p. 56.

54. Ibid. p. 54.

55. Erik Davis, Techgnosis: Myth, Magic + Mysticism in the Age of Information (New York: Three Rivers Press, 1998). p. 86.

56. There is a very interesting story behind this math. When Shannon found the equation for information to be the same as the equation for entropy, he equated the two and saw information as entropy. Wiener took the same equation and reversed it, defining information as the opposite of entropy. Mathematically this is only the difference between a plus and a minus sign, but Wiener’s interpretation became the dominant paradigm for understanding information, one that allowed him to make the larger metaphysical and cosmological metaphors that eventually became the basis of transhumanist metaphysics. Ibid.

57. Ibid. p. 90, quoting Norbert Wiener, Cybernetics; or, Control and Communication in the Animal and the Machine (New York: M.I.T. Press, 1961). pp. 36, 34.
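As a gloss on note 51 (a sketch added here, not part of Hayles’s text), the bookie’s guessing game is a binary search: each yes/no question halves the remaining range, so 2⁵ = 32 possibilities are resolved in exactly five questions.

```python
def identify(number, low=1, high=32):
    # Locate `number` by repeated halving, as in the bookie example:
    # each yes/no question halves the remaining range, so 32 = 2**5
    # possibilities require exactly 5 questions.
    questions = 0
    while low < high:
        mid = (low + high) // 2
        questions += 1
        if number <= mid:   # "Is your number between low and mid?"
            high = mid
        else:
            low = mid + 1
    return low, questions

print(identify(23))  # (23, 5): five binary questions pick one number out of 32
```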