Boosting the ‘Brains’ of Computers with Less Wasted Energy

New computing approach could mimic the human brain, using memory and AI to help devices perform better and run cooler


Many internal components used in today’s computers reach temperatures that are hot enough to cook a Thanksgiving meal. The heat produced by the computations can easily burn human skin and tissue – and much of the heat is simply wasted energy, a byproduct of the computer’s internal functions.

Now, Purdue University researchers are working on more energy-efficient technology to better mimic functions of the human brain and produce only a fraction of the heat.

“The human brain is a wonderfully efficient machine that actually does much of the computing work within the memory,” said Kaushik Roy, Purdue’s Edward G. Tiedemann Jr. Distinguished Professor of Electrical and Computer Engineering. “Purdue is working with other leading research organizations to develop devices and algorithms that act like the human brain and reduce energy consumption by computing within the memory itself.”

An illustration of the von-Neumann bottleneck. Frequent to-and-fro data transfers between the processor and memory units consume large amounts of energy and limit throughput. "In-memory computing" performs computations on the data where it is stored, reducing unnecessary transfers of data to the processor. (Image provided)

The Purdue researchers are developing artificial intelligence algorithms that can be used with current computer applications but consume less energy. Researchers at Purdue's Center for Brain-inspired Computing Enabling Autonomous Intelligence (C-BRIC) are studying and developing algorithms that allow personal robots, self-driving cars and drones to interpret their environments and then make decisions based on what they perceive.

“The brain computes through a dense connection of neurons and synapses employing integrated memory and compute units, unlike the von-Neumann computing prevalent in all modern computers,” said Roy, who leads C-BRIC. “Seeking inspiration from the brain, we invent circuit techniques to perform computations within the memory itself, leading to energy-efficient implementation of algorithms.”           
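The energy cost Roy describes comes from shuttling data between memory and the processor. A toy sketch (not Purdue's circuit technique, just an illustration of the bookkeeping) makes the difference concrete: in a von-Neumann model every operand crosses the memory/processor boundary, while in an in-memory model the array computes locally and only the final result crosses it.

```python
# Toy illustration (an assumption for exposition, not C-BRIC's hardware):
# count memory-to-processor transfers for a dot product under the two models.

def von_neumann_dot(a, b):
    """Every operand element is fetched from memory to the processor."""
    transfers = 0
    acc = 0
    for x, y in zip(a, b):
        transfers += 2          # fetch one element of each vector
        acc += x * y            # multiply-accumulate in the processor
    transfers += 1              # write the result back to memory
    return acc, transfers

def in_memory_dot(a, b):
    """The memory array multiplies and accumulates in place; only the
    final scalar crosses the memory/processor boundary."""
    acc = sum(x * y for x, y in zip(a, b))  # computed "inside" the memory
    return acc, 1                           # one transfer: the result

a, b = [1, 2, 3, 4], [5, 6, 7, 8]
r1, t1 = von_neumann_dot(a, b)
r2, t2 = in_memory_dot(a, b)
print(r1, t1)  # 70, with 9 transfers
print(r2, t2)  # 70, with 1 transfer
```

The answer is identical; only the number of boundary crossings, which dominate the energy budget, changes.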

The Purdue algorithms use brain-like networks, called spiking neural networks, to convert voice and image inputs into a shared representation of spike patterns, much as the brain does. With the appropriate encoding and decoding, a voice input can then be used to recall the associated images.

Roy said this AI research has applications for personal robots, drones, smart vehicles and other devices that already perform functions similar to brain computations.
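One common way to turn an analog input into the spike patterns the article describes is rate coding, sketched below under the assumption (the article does not specify the encoding) that each input intensity sets the probability of a spike at each time step:

```python
# Hedged sketch of rate coding, one standard spike-encoding scheme
# (an assumption for illustration, not necessarily the C-BRIC encoder).
import random

def rate_encode(intensities, timesteps, seed=0):
    """Encode intensities in [0, 1] as a spike train: a timesteps-by-N
    binary matrix where stronger inputs fire more often."""
    rng = random.Random(seed)
    return [[1 if rng.random() < p else 0 for p in intensities]
            for _ in range(timesteps)]

pixels = [0.0, 0.2, 0.9]               # toy "image" intensities
train = rate_encode(pixels, timesteps=100)
rates = [sum(col) / 100 for col in zip(*train)]
print(rates)  # observed spike rates roughly track the input intensities
```

Downstream layers then communicate using only these sparse binary events, which is what makes spiking networks attractive for low-energy hardware.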

This image shows a spiking neural network that converts an audio input into a handwritten digit, using only spikes to communicate between layers. (Image provided)

C-BRIC involves 19 faculty and around 85 graduate students and postdoctoral researchers across several major universities. The center is supported by $32 million in funding from the Semiconductor Research Corp. through the Joint University Microelectronics Program, which provides funding from the Defense Advanced Research Projects Agency and a consortium of industrial sponsors.

Their work aligns with Purdue’s Giant Leaps celebration, acknowledging the university’s global advancements in artificial intelligence as part of Purdue’s 150th anniversary. It is one of the four themes of the yearlong celebration’s Ideas Festival, designed to showcase Purdue as an intellectual center solving real-world issues.

Roy has worked with the Purdue Research Foundation Office of Technology Commercialization on patented technologies that provide the basis for some of the research at C-BRIC. The office is seeking partners to license the technology.