Modern data centers have been around since the 1990s, but for their first 20 years they were largely unknown to most of us. From about 2010 until just a few years ago, cloud computing and various mobile and software services became commonplace, and the number of data centers grew from hundreds to a couple of thousand. The current third era of data centers, driven by AI training and inference, has sharply increased both the number of centers and, especially, the amount of power they consume. In 2005, data centers consumed 20 GW of power; last year, that figure exceeded 114 GW, with an annual growth rate of over 17%.
Many see the scale at which financial capital, expertise, infrastructure, and other resources are being mobilized as a historical moment comparable to the Industrial Revolution.
Artificial intelligence, and the data centers that enable it, is changing both how we do things and what things we are doing on a similarly broad level.
The data center industry demands vastly more energy and upgraded regulations for electric grid access, and it is driving the adoption of new local laws in response to growing community opposition. Data centers were once literally hidden away in closets; now they fill enormous warehouses on sprawling campuses running on gigawatts of electricity.
Among Big Tech companies, a race to develop the most sophisticated AI systems is increasingly shaping investment markets, the environment, and the economy in general. How the growing data center revolution will turn out remains to be seen.