Entropy


The following is a guest post by Kristen Curtis and originally appeared on her blog. Kristen is currently a student at The Flatiron School. You can follow her on Twitter here.

Every time I successfully launch something I wrote from the command line, I like to celebrate by mentally singing along to some Weird Science: "Bits and pieces, bits and pieces. It's my creation, is it reaaaal?" Then I quickly follow up with, "Yes, yes it's real!" before maniacally laughing at the birth of my creation. But tonight was slightly different. Post-song, I began thinking about the process of bringing information into existence and what that means in an age when so many of us are already suffering from information overload. Where did these bits and pieces come from?


Luckily, I’m not the first to tackle such heady questions. In fact, there’s an entire field dedicated to the study of information theory. But my favorite introduction to the subject is science and technology author James Gleick’s book, The Information. Gleick masterfully paints a picture of the rich history of information, from the talking drums of Africa, to the written word, to the telegraph, to DNA, to quantum mechanics, all while weaving in the stories of the brilliant men and women who pioneered the field, people such as Babbage, Lovelace, Turing, and Crick. But the star of the book is none other than my favorite autodidact, Claude Shannon.

Shannon, who is credited with founding digital circuit design theory in addition to information theory, ushered in the digital revolution by figuring out how to quantify information. Information, as it turns out, is a complex notion to define, let alone quantify, and yet it’s something that we all seem to naturally create. We’re all programmers in that sense, communicating in code: from creating words, to constructing alphabets, to abstracting these foundations into electrical signals. Creating information is, arguably, what we do best. The desire to create and communicate in order to manifest meaning is, well… life.
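Concretely, the quantity Shannon landed on is called entropy, which is where this post gets its title. For a source whose symbols appear with probabilities p_1 through p_n, the standard textbook definition (this is the usual formula from information theory, not a quote from Gleick) is:

```latex
% Shannon entropy: the average information, in bits per symbol,
% produced by a source whose symbols appear with probabilities p_1, ..., p_n.
H = -\sum_{i=1}^{n} p_i \log_2 p_i
```

A fair coin flip carries exactly 1 bit (H = 1), while a two-headed coin carries 0 bits, since its outcome is never a surprise. The less predictable a message is, the more information it holds.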


Shannon began to notice that there were inherent patterns in the way we communicate. When these patterns, or messages, were separated from their meaning, they could be translated into the language of mathematics and measured in “bits”. [Note: “bit” is short for binary digit, a term coined by John Tukey, though Shannon was the first ever to use it in print.] Expanding on this observation, Shannon hypothesized that communication must be riddled with redundancy. To test this theory he devised an experiment in which he showed groups of people passages of text that were cut off at different points and asked them to guess what letter came next. By and large, people guessed the next letters correctly. From this he concluded that if you were to strip away the unnecessary bits, you could compress a message and still have it be readable, much like how we now write txt msgs. Mathematically compressing a digital message in this way allows one to transmit information at a faster (and cheaper) rate.
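To see how that plays out on an actual string, here’s a minimal sketch in Python (the language, the function name, and the sample sentences are my own illustration, not Shannon’s actual experiment) that estimates bits per character from a message’s letter frequencies and totals them up for a full sentence versus its stripped-down txt-msg twin:

```python
from collections import Counter
from math import log2

def entropy_per_char(message: str) -> float:
    """Crude zeroth-order estimate: Shannon entropy, in bits per
    character, of the message's own character frequencies. (It ignores
    the letter-to-letter dependencies Shannon's guessing game exploited.)"""
    total = len(message)
    return -sum((n / total) * log2(n / total)
                for n in Counter(message).values())

# A redundant English sentence vs. a stripped-down "txt msg" version.
full = "see you at the coffee shop at seven tonight"
txt = "c u at cofe shop at 7 2nite"

for label, msg in [("full", full), ("txt", txt)]:
    h = entropy_per_char(msg)
    print(f"{label:>4}: {len(msg):2d} chars, "
          f"{h:.2f} bits/char, ~{h * len(msg):.0f} bits total")
```

For these two strings, the stripped version weighs in at roughly a third fewer bits overall yet is still readable. That’s redundancy being squeezed out, and it’s the same trade every compression scheme makes.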


While Shannon’s ideas ushered in a new wave of digital communication, not all of his contemporaries were pleased with the notion of messages without meaning. Though Shannon’s work answered a problem of engineering, applying it to a broader spectrum of disciplines meant sacrificing “the very quality that gives information its value and purpose.” Or as the historian of cybernetics Jean-Pierre Dupuy puts it, “…a paradox is at work here: ours is a world about which we pretend to have more and more information but which seems to us increasingly devoid of meaning.”

This sentiment is echoed in today’s world when we talk about how technology is making us feel “alone together,” or how we speak of “unplugging” to escape information overload. And while more information may equate to more uncertainty within us, I ultimately feel that the balance between information and meaning is one we must struggle with in order to keep up with the pace of modern life.

Disclaimer: The information in this blog is current as of June 28, 2013. Current policies, offerings, procedures, and programs may differ.
