Energy Consumption Showdown: Bitcoin vs. AI
A recent study on energy use in the technology sector finds that operating artificial intelligence (AI) systems now requires more energy than Bitcoin mining. Yet this is hardly good news for Bitcoin miners, an industry already under scrutiny for its high electricity consumption.
AI’s Growing Energy Demand Outpaces Bitcoin
According to the research, AI has become a fierce competitor to Bitcoin for electricity and hardware: the abundant capital in the AI sector allows companies to outbid Bitcoin miners for the energy resources they need.
While the AI industry is still in its early stages, the energy demands of generative AI models are immense. Goldman Sachs estimates that a single ChatGPT query consumes nearly ten times the energy of a typical Google search. MIT Technology Review reports that creating AI-generated images can use as much energy as charging a smartphone.
Bitcoin mining’s heavy energy consumption has already drawn threats of bans in Europe and a moratorium in New York. BPI reports that Bitcoin mining facilities in the U.S. use around 121.13 TWh annually, while AI consumed between 20 and 125 TWh in 2023.
With the rapid growth of generative AI this year, however, the report estimates that AI will consume 169 TWh in 2024 and, growing faster than Bitcoin mining, around 240 TWh by 2027.
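As a rough sanity check, here is a small Python sketch of what those projections imply, using only the figures quoted above; the growth-rate calculation is illustrative and not part of the report itself.

```python
# Quick arithmetic on the report's projections quoted above (TWh per year).
ai_2024, ai_2027 = 169.0, 240.0

implied_growth = (ai_2027 / ai_2024) ** (1 / 3) - 1
print(f"The projection implies ~{implied_growth:.0%} annual growth in AI energy use, "
      f"adding {ai_2027 - ai_2024:.0f} TWh between 2024 and 2027")
```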
Water is another concern: Bitcoin mining in the U.S. alone is estimated to require between 10,000 and 13,000 liters of water annually, with each transaction said to use enough water to fill a swimming pool.
Pressure Mounts on Bitcoin Miners
AI’s profit margins are currently much higher than those of Bitcoin mining. Cryptocurrency mining generates revenue of roughly $0.17 to $0.20 per kWh of electricity, while Nvidia graphics processing units running AI workloads can bring in $3 to $5 per kWh, roughly 17 to 25 times more.
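For illustration, a minimal Python sketch of the arithmetic behind that ratio, using only the per-kWh revenue figures cited above:

```python
# Back-of-the-envelope check of the revenue gap, using the per-kWh figures
# quoted above (all values are the article's reported estimates).
btc_revenue_per_kwh = (0.17, 0.20)   # USD per kWh, Bitcoin mining
ai_revenue_per_kwh = (3.00, 5.00)    # USD per kWh, Nvidia GPUs running AI

low_ratio = ai_revenue_per_kwh[0] / btc_revenue_per_kwh[0]    # ~17.6x
high_ratio = ai_revenue_per_kwh[1] / btc_revenue_per_kwh[1]   # 25.0x
print(f"AI earns roughly {low_ratio:.1f}x to {high_ratio:.1f}x more per kWh")
```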
Given this disparity, why don’t Bitcoin mining companies repurpose their rigs to run AI and make more money?
Anibal Garrido, a crypto asset advisor and Bitcoin mining expert, explains that this leap is not easy. Bitcoin miners use application-specific integrated circuit (ASIC) machines designed solely to compute the SHA-256 hashes required by Bitcoin’s proof-of-work (PoW) protocol, making them unsuitable for other workloads such as AI.
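To illustrate how narrow that job is, here is a minimal Python sketch of the double-SHA-256 computation a Bitcoin ASIC is hardwired to repeat; the block header and difficulty target below are placeholders, not real network values:

```python
import hashlib

def double_sha256(data: bytes) -> bytes:
    """Bitcoin's proof-of-work hash: SHA-256 applied twice."""
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

# Placeholder 80-byte block header: a real header packs version, previous
# block hash, Merkle root, timestamp, and difficulty bits ahead of the nonce.
TOY_TARGET = 1 << 240            # far easier than the real network target

for nonce in range(1_000_000):
    header = bytes(76) + nonce.to_bytes(4, "little")
    if int.from_bytes(double_sha256(header), "big") < TOY_TARGET:
        print(f"nonce {nonce} satisfies the toy target")
        break
```

An ASIC does nothing but this one fixed computation, billions of times per second, which is exactly why it cannot be repurposed to run neural networks.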
Bitcoin mining rigs are much more flexible and can be turned off or on to take advantage of excess, waste, or cheap power. In contrast, AI requires 99.9% uptime for its models to function correctly. This demand could lead to the use of less environmentally friendly energy sources, as power plants that address sudden surges in demand often rely on fossil fuels, exacerbating environmental impacts.
This flexibility also allows mining operators to strike agreements with governments and grid operators to stop consuming energy when the grid is saturated. Once the grid stabilizes, miners can resume operations, helping the system maintain its balance.
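As a sketch of how such a curtailment arrangement can work in practice, the toy rule below pauses mining when the operator asks or when power becomes too expensive; the threshold, names, and signals are hypothetical, not taken from any real contract:

```python
from dataclasses import dataclass

# Illustrative threshold; real curtailment agreements negotiate these terms.
PRICE_CEILING_USD_PER_MWH = 120.0      # assumed break-even power price

@dataclass
class GridSignal:
    price_usd_per_mwh: float
    operator_requests_curtailment: bool

def should_mine(signal: GridSignal) -> bool:
    """Toy demand-response rule: pause when power is scarce or unprofitable."""
    if signal.operator_requests_curtailment:
        return False                   # honor the grid operator's request
    return signal.price_usd_per_mwh < PRICE_CEILING_USD_PER_MWH

print(should_mine(GridSignal(45.0, False)))    # True  -> keep hashing
print(should_mine(GridSignal(300.0, False)))   # False -> curtail on price
print(should_mine(GridSignal(45.0, True)))     # False -> curtail on request
```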
A BPI report shows that U.S. Bitcoin mining companies have halted operations 5% to 31% of the time when electricity prices were too high or when directed by grid operators.
The study, which collected data from eight U.S. mining facilities between July and September 2023, estimates that these interruptions prevented 13.6 million tons of CO2 emissions. This reduction is equivalent to removing 2,951 cars from the road.
Another key difference between the two technologies is their location requirements. Bitcoin mining is location-independent, while AI demands low latency to deliver rapid responses, which requires data centers to sit near major urban areas.
This means AI data centers must consume whatever energy is available at those specific locations, while Bitcoin miners can relocate to energy-surplus locations, such as renewable energy facilities in remote areas with abundant hydro, solar, or wind power.
Bitcoin mining’s location independence also allows it to tap energy that would otherwise be wasted: stranded hydroelectric power, excess methane emissions captured and used for generation, miners’ own waste heat reused for heating, or solar and wind output that would otherwise be curtailed because of transmission constraints.
AI developers are employing various techniques to enhance energy efficiency, including fine-tuning existing models, using smaller models for specific tasks, and leveraging cloud solutions that can significantly reduce overall energy consumption.
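As a rough illustration of why smaller task-specific models save energy, the sketch below compares inference compute for two assumed model sizes; the parameter counts, token count, and FLOPs rule of thumb are assumptions, not measurements:

```python
# Illustrative arithmetic only: inference compute scales with model size,
# so routing a simple task to a smaller model saves energy.
FLOPS_PER_TOKEN_PER_PARAM = 2          # common rule of thumb for inference

def inference_flops(params: float, tokens: int) -> float:
    return FLOPS_PER_TOKEN_PER_PARAM * params * tokens

large_model = 70e9    # assumed general-purpose model, 70B parameters
small_model = 3e9     # assumed task-specific model, 3B parameters
tokens = 500          # assumed tokens processed for one request

ratio = inference_flops(large_model, tokens) / inference_flops(small_model, tokens)
print(f"The smaller model does ~{ratio:.0f}x less compute for the same request")
```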
Advances in hardware may also play a crucial role. GPU makers such as Nvidia are at the forefront of developing specialized hardware that improves performance while consuming less energy. The combination of more efficient algorithms and advanced hardware could help address AI’s growing energy demands in a more sustainable way.