Neither snow nor rain nor heat nor gloom of night stays these couriers from the swift completion of their appointed rounds

Wednesday, January 20, 2010


In December, Roger Bohn and James Short of the UCSD Global Information Industry Center released their "How Much Information?" report, which measures American information consumption in 2008 (I suppose it takes a whole year to analyze the previous year's information use...). The paper makes for good reading, if only because it contains such mind-bending phrases as, “The alphabet is a very compact way to transmit words,” and “conversation is very high bandwidth,” or “the overwhelmingly preferred way to receive words on the internet.” Bohn and Short found that in 2008, Americans collectively consumed 3.6 zettabytes (a zettabyte is a million million gigabytes) of information in sheer volume, or about 34 gigabytes per person per day, and we spent 1.3 trillion hours doing it--that's 12 hours for each of us every day (read the full text here, New York Times synopsis here). Examples of “consuming information” include listening to “Car Talk,” reading The Post or a biography of Arthur Conan Doyle, checking craigslist missed connections, watching Taylor Swift music videos on YouTube, playing computer games, or sobbing through “The Young Victoria” at Bay Ridge Alpine Cinemas.
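For the curious, the per-person figures check out on the back of an envelope. The population and day counts below are my rough assumptions, not numbers from the report:

```python
# Back-of-the-envelope check of the report's per-person figures.
# Assumptions (mine, not the report's): US population of roughly
# 300 million people and a 365-day year.
ZETTABYTE_GB = 1_000_000_000_000  # a zettabyte is a million million gigabytes

total_gb = 3.6 * ZETTABYTE_GB     # 3.6 zettabytes consumed in 2008
total_hours = 1.3e12              # 1.3 trillion hours
population = 300_000_000
days = 365

gb_per_person_per_day = total_gb / (population * days)
hours_per_person_per_day = total_hours / (population * days)

print(round(gb_per_person_per_day))        # roughly 33, near the report's 34
print(round(hours_per_person_per_day, 1))  # roughly 11.9, near "12 hours"
```

Close enough: the report's 34 gigabytes and 12 hours per person per day fall right out of the totals.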

It never occurred to me that I was consuming information when I did these things. Surely I was analyzing, processing, chattering about, changing what I saw, heard, and experienced—not just becoming a container for it. If all I'm really doing with nearly half my time is taking things out of the world and putting them, whole and undifferentiated, into my head—well, that suggests that the culture, other people's opinions, “facts,” have far more influence over me than the other way around.

To say that we “consume” information implies, not just eating, but joyless, compulsive eating (something like this). It reminds me of The Matrix, in which the programming flowing to the captives' minds is symbolically linked to the foul liquid forced through their feeding tubes. The report estimates that, since 1980, our consumption of information as measured in compressed bytes has increased five-fold. This increase in consumption is a function of the increased availability of information (via the internet), which is, in turn, a function of the ever-plummeting price of storage. According to Bohn and Short, in 1982, the cost per megabyte for hard disk storage on a 10 megabyte drive was about $50; today it's less than $1 per gigabyte for a drive of 100 gigabytes—that's 50,000 times less expensive.
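Spelled out, that comparison looks like this. The dollar figures come straight from Bohn and Short; the unit conversion is mine:

```python
# The report's storage-cost comparison, spelled out.
cost_1982_per_mb = 50.0                     # dollars per megabyte in 1982
cost_1982_per_gb = cost_1982_per_mb * 1000  # = $50,000 per gigabyte
cost_today_per_gb = 1.0                     # "less than $1 per gigabyte"

ratio = cost_1982_per_gb / cost_today_per_gb
print(ratio)  # 50000.0 -- the report's "50,000 times less expensive"
```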

Another unit of consumption which has gotten cheaper over time is the calorie. In The Omnivore's Dilemma, Michael Pollan reports that since 1977, the American daily intake of calories has increased by more than 10 percent. Pollan writes that the cheapest calories to consume are also the unhealthiest. He cites a study in The American Journal of Clinical Nutrition which found that a dollar could buy 1,200 calories of potato chips and cookies, but only 250 calories' worth of carrots. The same dollar can buy 875 calories of soda, but only 170 calories of fruit juice.
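The study's dollar-for-dollar comparison, as a quick sketch (the calorie figures are as Pollan reports them; the ratios are my arithmetic):

```python
# Calories per dollar, from the AJCN study Pollan cites.
chips_and_cookies = 1200  # calories per dollar
carrots = 250
soda = 875
fruit_juice = 170

print(chips_and_cookies / carrots)  # 4.8 -- junk beats carrots nearly five-fold
print(round(soda / fruit_juice, 1))  # 5.1 -- soda beats juice about five-fold
```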

The question implied in the UCSD report's title is: how much information is too much information? We understand that cheap calories have hidden costs—what are the hidden costs of cheap information? Will we one day discover that, at greater and greater volumes of bytes, the brain suffers from a kind of overload, an inability to process? Could we call it databetes? Will we have to carefully ration our intake of information, the way we now ration sugar? One hour of high-resolution television gets you six hours on the couch, blindfolded, with cotton stuffed in your ears? Can you purge information by writing or talking? Unlikely, since writers and speakers generally re-read their words as they write them and listen as they speak—is that the cerebral equivalent of eating your own puke?

The report's final section discusses trends and speculates on the future of American information use. It argues that our consumption of interactive forms of information—like computer games and programs, or internet communications—has increased the most. Bohn and Short write, “Before the Internet, the only ways to have a two-way exchange without being in the same room were telephone and first-class letters.” Of course, that's not strictly true—there was plenty of variety in two-way communication before the internet. What about tapping codes, without which the 1970 classic “Knock Three Times” by Tony Orlando and Dawn would make absolutely no sense? After all, clunk clunk on the pipe means you ain't gonna show. And what about smoke signals? Semaphore? Signal mirrors? The intricate flag language employed by the British navy during the Napoleonic wars? Or the constant back-and-forth of passive aggressive pamphlets among wits like Jonathan Swift, who would annotate one another's essays, then re-release their own polemics with annotations on their rivals' annotations?

People used to compose riddles and poems to one another. Cheaters and paramours used to go to elaborate lengths to cover their tracks; now, you just open a secret email account under the username likes2bspanked (likes2bspanked is not available. Suggest likes2bspanked101849381, or likes2bspanked101849382) and you sneak along without getting caught until the day you forget to wipe your browser's history. Craigslist and email and instant messaging have made affairs so sordid and predictable. I long to receive a cryptic valentine, whose obscure verses are themselves clues to the sender's identity.

The report notes that, “A full fidelity video link between two locations, including stereo vision and sound is not possible with present technology—the observer will realize they are not physically in the same location. If we could do it, however, it would require conservatively 100 million bits per second.” While Bohn and Short make this type of technology sound far away in the realm of speculation, they do go so far as to assign “full fidelity” a finite number—at 100 million bits per second, they estimate, we could convincingly simulate the experience of “really” being in the same room with someone. Does it then follow that at 200 million bits per second, we could create an experience more “real” than reality? Twice as real, in fact? And what would that look like? The experience of seeing a film in a theater is so immersive because the picture quality is high and the screen is enormous. At a higher picture quality (that's a higher density of information), you're less conscious of the medium between you and the images, and the larger screen pushes the boundaries of the picture, the edges that define it as separate from reality, to the peripheries of your vision. I imagine that this is one of the closest things we have to the “full fidelity” experience Bohn and Short are referring to. Trying to imagine an experience at a higher resolution—visually, auditorily, texturally—than our understanding of the unmediated world through our eyes, ears, and fingertips, is like trying to imagine a fourth dimension, the way that Carl Sagan enjoined us to in that episode of “Cosmos.” I can't think how to wrap this up, so I'm going to ask Carl to do it for me—he of the exquisitely rounded syllables.
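One last bit of arithmetic, to put “full fidelity” in the report's own units (the conversion below is mine, not a figure from the paper):

```python
# Translating the report's "full fidelity" rate into gigabytes per hour.
bits_per_second = 100_000_000  # the report's conservative 100 million bits/sec
seconds_per_hour = 3600
bits_per_byte = 8

gb_per_hour = bits_per_second * seconds_per_hour / bits_per_byte / 1e9
print(gb_per_hour)  # 45.0 GB per hour
```

At that rate, a single hour of full-fidelity presence would weigh more than the report's entire 34-gigabyte average daily intake.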

1 comment:

  1. In hyper fidelity, you can see the space between the pixels that form the human soul.