Oregon State University’s Breakthrough Chip Cuts AI Language Model Energy Use by Half
Oregon State University (OSU) researchers have developed new chip technology designed to cut the energy consumption of artificial intelligence's large language models (LLMs) by half, helping to address the shortfall in available electricity projected for 2026.
OSU Technology To Halve Electricity Use Of LLMs
Because LLMs such as Gemini and ChatGPT consume a great deal of energy, the International Energy Agency estimates that electricity consumption by data centers could reach roughly 1,000 terawatt-hours by 2026, about equal to Japan's current total electricity consumption.
OSU noted that demand for data transmission keeps rising, while the energy needed to move that data cannot keep pace.
OSU has developed a new chip that aims to reduce energy usage by recognizing and correcting errors in the data it receives. The technology embedded in the chip identifies errors and recovers the original data, and in doing so is expected to cut energy consumption by 50%.
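The article does not describe how the chip detects and corrects errors, but the general idea of recovering original data from a corrupted transmission can be illustrated with a classic error-correcting code. The Python sketch below is a minimal Hamming(7,4) encoder and decoder; it is a generic illustration of single-bit error correction under assumed conventions, not OSU's on-chip method, and the function names are hypothetical.

```python
def hamming74_encode(d):
    """Encode 4 data bits into a 7-bit Hamming(7,4) codeword."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4                 # parity over codeword positions 3, 5, 7
    p2 = d1 ^ d3 ^ d4                 # parity over codeword positions 3, 6, 7
    p3 = d2 ^ d3 ^ d4                 # parity over codeword positions 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]  # codeword positions 1..7

def hamming74_decode(r):
    """Detect and correct a single flipped bit, then return the 4 data bits."""
    r = list(r)
    s1 = r[0] ^ r[2] ^ r[4] ^ r[6]    # checks positions 1, 3, 5, 7
    s2 = r[1] ^ r[2] ^ r[5] ^ r[6]    # checks positions 2, 3, 6, 7
    s3 = r[3] ^ r[4] ^ r[5] ^ r[6]    # checks positions 4, 5, 6, 7
    error_pos = s1 + 2 * s2 + 4 * s3  # 0 means no error detected
    if error_pos:
        r[error_pos - 1] ^= 1         # flip the erroneous bit back
    return [r[2], r[4], r[5], r[6]]   # recovered data bits d1..d4

# Example: transmit 4 bits, flip one in transit, and recover the original.
data = [1, 0, 1, 1]
sent = hamming74_encode(data)
received = sent.copy()
received[4] ^= 1                      # simulate a channel error at position 5
assert hamming74_decode(received) == data
```

In this toy example, three added parity bits let the receiver locate and flip a single corrupted bit, recovering the original four data bits without retransmission; real high-speed links use far more sophisticated schemes, but the underlying goal of spending a little extra computation to avoid costly resends or signal loss is the same.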