Faculty-researcher explores new brain-inspired computing architectures to increase processing power

Undergraduate and graduate students on Dhireesha Kudithipudi’s (center) NanoComputing Research Lab team include (left to right) Swathika Ramakrishnan, Amanda Hartung, James Thesing, James Mnatzaganian, Nicholas Soures, Qutaiba Saleh.

Researchers developing next-generation computer systems at Rochester Institute of Technology are designing brain-inspired computer architectures using memristors that will have increased processing capabilities, greater robustness and improved energy efficiency.

Dhireesha Kudithipudi, a faculty-researcher in RIT’s Kate Gleason College of Engineering, received a grant award from the Department of the Air Force Research Laboratory, expected to total $605,639 over four years, to further develop the computer architecture for the project, “Reservoir computing and benchmarking of neuromorphic systems for size/weight/power constrained environments.”

Kudithipudi’s research is twofold: to develop architectural capabilities that accelerate processing of natural tasks, and to benchmark various brain-inspired systems against the energy, size and weight constraints of their computing environments.

Energy usage is a critical target for next-generation computing systems, and improved efficiency through neuromorphic systems could advance autonomous processing, critical for military or intelligence applications such as target identification or anomaly detection. Data from the benchmarking aspect of the research could be used by computer scientists to strategically build systems based on design knobs for specific tasks.

“Applications are really critical in this paradigm,” said Kudithipudi, an associate professor of computer engineering and an expert in neuromorphic systems, defined as an interdisciplinary approach to developing computing infrastructure inspired by how the human brain performs its complex functions.

“Today, conventional systems are powerful at extreme levels of number-crunching for scientific data. However, they are not efficient at understanding ill-posed problems or performing spatio-temporal processing on noisy inputs,” she explained. “Our approach is to design these new classes of computing systems, known as neuromemristive systems, which are very powerful at intelligent processing.”

A key component of this system is a device called a memristor, a nano-electronic circuit element that combines computing and memory in one substrate.
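The memristor's defining property, that its resistance depends on the history of charge that has flowed through it, is what lets one element both compute (through its conductance) and store state. A minimal sketch of this behavior, using the well-known HP linear-drift memristor model with illustrative parameter values (not taken from this project), might look like:

```python
import numpy as np

# Illustrative HP linear-drift memristor parameters (assumptions, not
# values from the RIT project).
R_ON, R_OFF = 100.0, 16e3   # fully doped / undoped resistance (ohms)
D = 10e-9                   # device thickness (m)
MU_V = 1e-14                # dopant mobility (m^2 / (s*V))

def simulate(v, dt, w0=0.1):
    """Apply a voltage waveform v; return the current trace and final state.

    w is the normalized width of the doped region, clipped to [0, 1].
    The resistance at each step depends on w, so the current response
    encodes the device's charge history -- computation and memory in
    one element.
    """
    w = w0
    currents = []
    for v_t in v:
        R = R_ON * w + R_OFF * (1.0 - w)     # state-dependent resistance
        i = v_t / R
        w += MU_V * R_ON / D**2 * i * dt     # linear dopant drift
        w = min(max(w, 0.0), 1.0)
        currents.append(i)
    return np.array(currents), w

# A slow sinusoidal drive moves the internal state back and forth,
# producing the history-dependent current that is the memristor's
# signature.
t = np.linspace(0.0, 1.0, 1000)
v = np.sin(2 * np.pi * t)
i, w_final = simulate(v, dt=t[1] - t[0])
```

Because the resistance never falls below `R_ON`, the current stays bounded by `|v| / R_ON`; the interesting behavior is that the same applied voltage can yield different currents depending on what the device has already seen.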

Conventional machine learning algorithms are very good at some tasks, but focus primarily on maximizing accuracy. Using neuromorphic computing, Kudithipudi said scientists would be able to maximize accuracy while minimizing the overall system footprint. The critical part of her work in this area is not just to build systems that are better at prediction, but to build a system that is energy efficient and compact.

“Part of our effort is to benchmark different cognitive tasks that can be performed in neuromorphic systems, not necessarily for memristors, but using general purpose systems that we have,” she said. “If somebody wants to apply these systems for prediction, they’d be able to say, ‘I can use this algorithm on this specific piece of hardware, and this is the power budget I have.’”

Reservoir computing is a type of recurrent artificial neural network with a simple training algorithm. Neuromorphic computing could improve multi-modal signal processing, the ability to acquire, manage and assess data efficiently from multiple input streams, which is not currently feasible with traditional computing systems. One characteristic of brain-inspired, or neuromorphic, computing systems is that they are robust to noise: variations in the data inputs to the system.
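What makes the training algorithm "simple" is that the recurrent part of the network, the reservoir, is left fixed and random; only a linear readout is fit, typically by ridge regression. A minimal echo state network sketch in NumPy (a toy illustration, not the project's architecture) shows the idea on a noisy sine-prediction task:

```python
import numpy as np

rng = np.random.default_rng(0)

# Fixed random reservoir: these weights are never trained.
n_in, n_res = 1, 100
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius < 1

def run_reservoir(u):
    """Drive the reservoir with input sequence u; collect its states."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W_in @ np.atleast_1d(u_t) + W @ x)
        states.append(x.copy())
    return np.array(states)

# Toy task: predict the next value of a noisy sine wave.
t = np.linspace(0, 8 * np.pi, 400)
u = np.sin(t) + 0.05 * rng.standard_normal(t.size)
X = run_reservoir(u[:-1])   # reservoir states
y = u[1:]                   # next-step targets

# The "simple training algorithm": ridge regression on the readout only.
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ y)
pred = X @ W_out
mse = np.mean((pred - y) ** 2)
```

The noise injected into the input here also illustrates the robustness point: the fading memory of the reservoir averages over input fluctuations, so the readout can still recover the underlying signal.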

“This is the right time for this kind of research. The technology is there, the momentum is there because of the insurmountable amount of data, and a lot of basic research has been established since the 1800s, so this is an opportune time to do research in this field and make an impact in building large-scale intelligent systems,” she said.

Her team has developed baseline architectures and circuits, along with an understanding of which types of system integration are efficient. The group has investigated both digital and mixed-signal approaches, providing the building blocks for the system it is looking to improve.

“There are a lot of parallel efforts going on in this area where people are trying to actually mimic the brain, but we are not trying to do that,” Kudithipudi said. “Our approach is to take inspiration from the human brain and build better systems from there.”