Snagged a copy of The Economist at my club the other day, my eye caught by the cover, which promoted a special report on "the data deluge". (Free to read online, with copies available by online subscription and as reprints.)
Although the series of articles does not drill down to data centers and the infrastructure needed to store and analyze the flood of "big data", I can't help but highly recommend the report as a high-level introduction to this game-changing phenomenon.
The introductory article, "Data, data everywhere", fascinates, as does the accompanying sidebar "All too much", which explains units of data. We're all talking in petabytes right now, but the yottabyte has already been defined, and they say it is "currently too big to imagine"!
I have been telling audiences for some time that the explosion in data will bring a focus on energy-efficient storage technologies such as MAID arrays and de-duplication. In my view, this will be the next big focus for data center efficiency, and groups like SNIA, with its green storage initiative, are only just getting ahead of the curve.
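To make the de-duplication idea concrete, here is a minimal sketch of block-level dedup via content hashing: identical chunks are stored once and referenced many times, which is the core trick that lets storage arrays hold far more logical data than physical capacity. This is an illustrative toy (fixed-size chunks, in-memory dict), not any vendor's implementation.

```python
import hashlib

def dedupe_store(data: bytes, chunk_size: int = 4096):
    """Split data into fixed-size chunks; keep each unique chunk once.

    Returns (store, recipe): store maps a chunk's SHA-256 digest to its
    bytes; recipe is the ordered list of digests needed to rebuild data.
    """
    store: dict[str, bytes] = {}
    recipe: list[str] = []
    for i in range(0, len(data), chunk_size):
        chunk = data[i:i + chunk_size]
        digest = hashlib.sha256(chunk).hexdigest()
        store.setdefault(digest, chunk)  # duplicate chunks stored only once
        recipe.append(digest)
    return store, recipe

def rebuild(store: dict[str, bytes], recipe: list[str]) -> bytes:
    """Reassemble the original data from the chunk store and recipe."""
    return b"".join(store[d] for d in recipe)
```

Five chunks of highly repetitive data, for instance, might collapse to only two stored chunks plus a five-entry recipe; real systems refine this with variable-size (content-defined) chunking and on-disk indexes.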
On the more utility-centric front, the transition to the smart grid, and especially its metering and customer-facing end, will require the collection, processing, and analysis of a torrent of data that simply does not exist today.
Ask a utility CIO what they expect their next big challenge to be, and I would be surprised if they didn't put the need for massive new data center and data storage infrastructure to enable smart grid implementation at the top of the list.
Lastly, I love The Economist dearly but ended my subscription some time ago. I couldn't keep up with the "data deluge" of such a comprehensive magazine every week!