ChatGPT’s contribution to climate change

ChatGPT’s excessive use has become hazardous to the environment (Sophia Nagra / The Puma Prensa)

Written By: Sophia Kaur Nagra, Social Media Manager

Born in the 1950s and taking the world by storm come 2022, artificial intelligence has always been a source of both excitement and intimidation. With ChatGPT users and their queries increasing exponentially since its introduction to the public, AI has bred issues ranging from escalating plagiarism and misinformation to climate change.

The first commercial microprocessor – the "brain" of a computer, a small but powerful chip that performs calculations and executes instructions – was Intel's 4004, introduced in 1971 by the California-based tech company. It unified multiple processing operations on a single integrated chip and contained more, faster transistors – tiny electronic components that control electrical signals and power by amplifying or switching them – than previous technology. Because the 4004 was a reusable module, engineers could integrate it into various electronic devices, including computers.

As many remember, early computers relied on a bulky mainframe system, similar to an individual data center, to power the device. These mainframes ran hot, as the computational engine was housed in one central unit requiring substantial processing power. The heat generated by the high volume of calculations and data transfers had to be promptly dissipated to prevent performance degradation or hardware failure. To manage this, early machines incorporated bulky cooling solutions like fans.

As demand grew for more computational power in more compact products, so did the need for more efficient cooling and greater portability. This led to the rise of data processing centers, built in the 1960s and '70s to manage the larger volumes of data used primarily by business and government operations – just a decade after such facilities had no use at all. Over time, owning one, if not multiple, devices for work and personal life became the norm. By the 2000s, the development of enterprise-level data centers spiked to support cloud services, which essentially rent out computing power and software from a provider instead of requiring users to own and manage individual servers and software. These providers include major companies such as Amazon, Google, and Microsoft.

Fast-forward to December 2015, when OpenAI was founded as a nonprofit by Sam Altman and others; in 2019, Altman transitioned it into a for-profit company. That same year, 2019, would become the baseline against which Google later reported a 48% rise in its emissions.

With a direct tie to emissions, AI is consuming energy at an alarming rate. A single ChatGPT query uses nearly ten times the electricity of a Google search, and up to thirty-three times that of traditional software. With over 200 million queries per day, this amounts to about 621.4 megawatt-hours (MWh) daily – equivalent to 24 hours' worth of electricity for over 30,000 American homes.
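The arithmetic behind these figures can be sanity-checked against one another. The sketch below uses only the numbers above, plus one outside assumption: the commonly cited estimate of roughly 0.3 watt-hours per Google search, which does not come from this article.

```python
# Sanity check of the article's figures: ~200 million daily queries,
# ~621.4 MWh of electricity per day, and ~30,000 American homes.
DAILY_QUERIES = 200e6          # queries per day (from the article)
DAILY_ENERGY_MWH = 621.4       # MWh per day (from the article)
HOMES = 30_000                 # American homes (from the article)
GOOGLE_SEARCH_WH = 0.3         # assumption: a commonly cited estimate

# Implied energy per ChatGPT query, in watt-hours
wh_per_query = DAILY_ENERGY_MWH * 1e6 / DAILY_QUERIES
print(f"{wh_per_query:.2f} Wh per query")            # ≈ 3.11 Wh

# ~3.1 Wh per query is roughly ten times a ~0.3 Wh Google search,
# consistent with the "nearly ten times" claim above.
print(f"{wh_per_query / GOOGLE_SEARCH_WH:.1f}x a Google search")

# Implied daily electricity use per household
kwh_per_home = DAILY_ENERGY_MWH * 1000 / HOMES
print(f"{kwh_per_home:.1f} kWh per home per day")    # ≈ 20.7 kWh
```

The implied ~20.7 kWh per household per day is in the right range for American homes, so the article's three figures hang together.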

This astronomical energy usage is required to run and cool these data centers, and generating that electricity releases carbon dioxide into the atmosphere. In 2022, Microsoft reported that its data centers' emissions totaled 280,782 metric tons of carbon dioxide equivalent, a measure of the total warming impact of all greenhouse gases.

However, this is just the beginning: "The International Energy Agency forecasts that by 2026, power demand for AI applications could double, equating to the electricity used by the entire nation of Japan."

Although it is empowering to have the world and its answers at our fingertips, we must maintain a delicate hand, so that the planet may not suffocate.
