In 2008, Americans consumed information for about 1.3 trillion hours, an average of almost 12 hours per day. Consumption totaled 3.6 zettabytes and 10,845 trillion words, corresponding to 100,500 words and 34 gigabytes for an average person on an average day. A zettabyte is 10 to the 21st power bytes, a million million gigabytes. (From a study done by the Global Information Industry Center, via an article in the CS Monitor.)

That sounds high to me: a DVD holds 4-4.5 gigabytes, so this rate of data consumption would be the equivalent of watching roughly eight full-resolution movies a day, every day. The first link is to the abstract and summary, where the PDF of the full study is available. I haven't looked at it yet, but I'd like to figure out how they conclude we consume better than 2 gigs of data every waking hour. Another mind-boggling way of expressing this incredible data rate was in the CS Monitor article:
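Out of curiosity, here's a quick back-of-the-envelope check of those comparisons. The DVD capacity (I'm using 4.25 GB for a single-layer disc) and the 16 waking hours are my assumptions, not figures from the study:

```python
# Sanity-check of the per-person figures quoted above.
# Assumptions (mine, not the study's): a single-layer DVD holds ~4.25 GB,
# and a "waking day" is about 16 hours.

GB = 10**9                     # the study uses decimal units (a zettabyte is 10**21 bytes)

daily_bytes = 34 * GB          # per-person daily consumption, per the study
dvd_bytes = 4.25 * GB          # assumed single-layer DVD capacity

movies_per_day = daily_bytes / dvd_bytes
gb_per_waking_hour = 34 / 16

print(f"{movies_per_day:.1f} DVDs per day")        # ~8.0
print(f"{gb_per_waking_hour:.1f} GB per waking hour")  # ~2.1
```

So both comparisons hold up arithmetically: 34 GB is exactly eight single-layer DVDs, and a bit over 2 GB for every waking hour.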
According to the study's authors, 3.6 zettabytes is equivalent to all of the "information in thick paperback novels stacked seven feet high over the entire United States, including Alaska."

I'm wondering if the issue here is "available to consume" versus "consumed." For example, there's some music playing right now that I'm not very fond of. It's not obnoxious, but I am more or less ignoring it. It's just background noise for me and the twenty or so others in here right now. So over the course of an hour, does that CD count as 0.7 GB of data consumed by me and each of the others present? I suspect that's the case. If so, I'm extremely curious as to what the "desirability ratio" is. In other words, how much of the data consumed is desired, and how much is simply inflicted upon us by an increasingly electronic and commercially driven world?
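That 0.7 GB figure for an hour of music checks out, at least roughly. CD audio is uncompressed PCM at a fixed rate, so the math is simple (the CD-audio parameters below are standard; the hour is my stand-in for the scenario above):

```python
# Rough check of "an hour of CD audio is about 0.7 GB."
# Red Book CD audio: 44,100 samples/s, 2 channels, 16 bits (2 bytes) per sample.

sample_rate = 44_100
channels = 2
bytes_per_sample = 2
seconds = 3600          # one hour of background music, per the example above

bytes_per_hour = sample_rate * channels * bytes_per_sample * seconds
print(bytes_per_hour / 10**9)  # ~0.635 GB
```

So an hour of raw CD audio is about 0.64 GB, close to the ~0.7 GB a full disc holds. Of course, if the study counted that data once per listener, the twenty of us in this room just "consumed" over 12 GB of music nobody asked for.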