Data centers can cut energy use by up to 30% with just about 30 lines of code, research shows

Research has found that data centers can reduce their energy usage by up to 30% simply by altering around 30 lines of code in the Linux kernel’s network stack. Scientists from the University of Waterloo in Canada identified inefficiencies in the way servers process incoming network traffic.
The breakthrough comes from interrupt request suspension, a technique that optimizes CPU power efficiency by reducing unnecessary interruptions during high-traffic conditions. Typically, when a new data packet arrives over the network, it triggers an interrupt request, causing the CPU core to pause its current task to process the data, slowing things down.
The new code reduces interrupt requests by allowing the system to actively check the network for new data packets when needed, instead of waiting for each individual interrupt. Because this polling approach is power-intensive, the system reverts to interrupt handling when traffic slows.
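The switching policy described above can be illustrated with a toy simulation. This is plain Python, not the actual kernel code; the discrete time-step model and the `suspend_budget` re-arm threshold are illustrative assumptions standing in for the kernel's real timeout mechanism:

```python
def run(trace, suspend_budget=3):
    """Toy model of IRQ suspension on one CPU servicing a NIC.

    trace[t] = number of packets arriving at time step t.
    While interrupts are suspended, the CPU polls the queue every
    step; after `suspend_budget` consecutive empty polls it decides
    traffic has slowed and re-arms interrupts.
    Returns (interrupts_taken, packets_processed).
    """
    interrupts = processed = 0
    suspended = False
    empty_polls = 0
    backlog = 0
    for arriving in trace:
        backlog += arriving
        if suspended:
            # Interrupts off: just poll the queue directly.
            if backlog:
                processed += backlog
                backlog = 0
                empty_polls = 0
            else:
                empty_polls += 1
                if empty_polls >= suspend_budget:
                    suspended = False  # traffic slowed; re-arm IRQs
        elif backlog:
            # Interrupt-driven path: one IRQ fires, then IRQs are
            # suspended so the rest of the burst is drained by polling.
            interrupts += 1
            processed += backlog
            backlog = 0
            suspended = True
            empty_polls = 0
    return interrupts, processed

# Sustained load: 50 busy steps cost a single interrupt.
busy = run([5] * 50 + [0] * 10)      # → (1, 250)
# Sparse traffic: each isolated packet pays its own interrupt.
sparse = run([1, 0, 0, 0, 0] * 10)   # → (10, 10)
```

Under heavy traffic the polling path absorbs the whole burst after a single interrupt, while sparse traffic falls back to the ordinary interrupt-per-packet behavior, which is the trade-off the researchers exploit.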
By refining how the kernel handles IRQs, data throughput improves by up to 45% while ensuring tail latency remains low. In other words, the system can handle more traffic without delays for the most time-sensitive operations. The modification has been incorporated into Linux kernel version 6.13.
“We didn’t add anything,” said Cheriton School of Computer Science professor Martin Karsten in a press release. “We just rearranged what is done when, which leads to a much better usage of the data center’s CPU caches. It’s kind of like rearranging the pipeline at a manufacturing plant, so that you don’t have people running around all the time.”
Data centers will be responsible for up to 4% of global power demand by 2030, driven at least in part by AI. Training OpenAI’s GPT-4, with 1.76 trillion parameters, consumed an amount of energy equivalent to the annual power usage of 5,000 US households. This figure does not even include the electricity required for inference, the process by which the AI generates outputs based on new data.
SEE: Sending one email with ChatGPT is the equivalent of consuming one bottle of water
Data center operators arguably have a responsibility to reduce their carbon footprint, yet it does not appear to be a priority. A report from the Uptime Institute found that fewer than half of data center owners and operators even track key sustainability metrics.
Individual businesses don’t appear to be motivated to take a stand against their data centers’ energy-intensive practices, either. In fact, recent research found that nearly half of businesses are relaxing sustainability goals to allow for their AI expansions.
Tech giants have also come under scrutiny. In July, Google came under fire after its annual environmental report revealed that its emissions had increased by 48% in four years, largely due to the expansion of its data centers to support AI developments.
Aoife Foley, senior member of the Institute of Electrical and Electronics Engineers and engineering professor at Queen’s University Belfast, told TechRepublic in an email that modern organisations generate and accumulate vast amounts of data. This includes routine activities across enterprise systems, machines, sensors, and demand-side digitalization.
“All of this data comes in multiple forms, whether redundant or critical. However, the majority is unstructured and inert content, commonly referred to as ‘dark data,’ which is becoming decidedly more prevalent. The result is a large volume of digital data that needs to be stored, most of which will not even be accessed later.
“Those managing data centers and server rooms must strive for a high standard of energy efficiency, demonstrated through aggressive power use effectiveness targets. Achieving sustainability means addressing environmental considerations during solution design as well as during the build. Solutions must meet pre-defined and agreed environmental sustainability criteria. This includes filtering dark data, removing unnecessary information from storage, and relying upon ‘greener’ energy sources.”