AI Will Not Be Taking Over Anytime Soon. Here's Why:
I attended an AI training class at my local university a few weeks ago. It raised the pros and cons many people are already wondering about, such as whether our power grids and resources can sustain AI. It became apparent that, given AI's sheer energy needs and its continued dependence on humans, we may not need to fear it as much as we think. It actually made me think of "Y2K," or the Year 2000. If you had the luxury of being alive then, you would have seen the chaos that Y2K caused in communities around the globe.
It was a fascinating case study of how a legitimate technical problem became a source of widespread public panic and, ultimately, a punchline. The overreaction to Y2K was not because the problem was a hoax, but because of a fundamental misunderstanding of the issue and the massive, unheralded efforts to solve it. While the tech community was working hard behind the scenes, public awareness was fueled by a media narrative of doomsday scenarios. When January 1, 2000, arrived, the world did not end. There were a few minor glitches, but there were no widespread power grid failures, no planes falling from the sky, and no financial meltdowns.
Basically, everyone was fearful of something that turned out not to be much of a threat…
Unsustainable Living
The amount of energy it takes to power an AI data center is massive, demanding billions of dollars and enormous quantities of electricity. Scaling this up indefinitely is unrealistic short of exhausting the planet's resources.
Energy Consumption: Training a large language model (LLM) can consume a staggering amount of electricity, equivalent to the annual power usage of a small city. This is because it requires thousands of powerful processors, such as GPUs, to run continuously for weeks or even months (see the rough estimate after this list). The electricity consumption of data centers is projected to more than double by 2030. This puts immense pressure on power grids and contributes to greenhouse gas emissions, especially when the electricity comes from fossil fuels.
Water Usage: Data centers require sophisticated cooling systems to prevent their equipment from overheating. A single data center can consume millions of gallons of fresh water annually, raising concerns in regions already facing water scarcity. As the number of data centers grows, so too will their demand for water.
Hardware and Resources: The specialized chips and components needed for AI, such as GPUs and TPUs, rely on raw materials and rare earth elements that are often obtained through environmentally destructive mining practices.
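To put the energy claim above in perspective, here is a rough, back-of-envelope sketch in Python. Every number in it (the GPU count, per-GPU power draw, training duration, and data-center overhead factor) is an illustrative assumption, not a figure from any specific model or facility:

```python
# Back-of-envelope estimate of the electricity needed to train a large model.
# Every figure below is an illustrative assumption, not a measured value.

GPU_COUNT = 10_000     # assumed number of accelerators in the training cluster
GPU_POWER_KW = 0.7     # assumed average draw per GPU (~700 W)
TRAINING_DAYS = 90     # assumed wall-clock training time
PUE = 1.2              # assumed overhead factor for cooling, networking, etc.

hours = TRAINING_DAYS * 24
energy_mwh = GPU_COUNT * GPU_POWER_KW * hours * PUE / 1_000  # kWh -> MWh

# A typical US household uses roughly 10 MWh of electricity per year.
equivalent_households = energy_mwh / 10

print(f"Estimated training energy: {energy_mwh:,.0f} MWh")
print(f"Roughly the annual usage of {equivalent_households:,.0f} US homes")
```

Even with these modest assumptions, the estimate lands around 18,000 MWh, on the order of what a couple thousand homes use in a year, and that is for a single training run before anyone ever types a prompt.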
Do We Need To Worry At All?
The true concern lies with Artificial General Intelligence (AGI), which is theoretical and may not even be possible:
Narrow AI: All of the AI systems we have today are "narrow." They are highly specialized and can only perform a single task or a limited set of tasks; ChatGPT and Gemini are examples.
Artificial General Intelligence (AGI): This is the hypothetical "human-level" AI that can reason, learn, and apply its intelligence to a wide variety of tasks, much like a human. Experts have differing views on when, or even if, AGI will be achieved. Some predict it could arrive within a few decades, while others believe it is a much more distant or impossible goal. This is because underlying every model is a power plug and a human hand.
My opinion:
Given the unsustainable nature of this endeavor, I do not believe AI is much of a threat. I think universities that are developing AI majors and AI curricula are digging themselves into a hole. The most you can really learn about AI can be taught in a one-hour class on developing prompts. How much more can we delve into prompt configuration or coding? We shall see.
