News Analysis

Neuromorphic Computing Mimics Human Brain

Perhaps the most impressive “computer” on Earth is the human brain, the complex organ that is more or less responsible for making a human a human. While comprising just a small percentage of a human’s mass, the brain controls and/or enables our movements, senses, emotions, and memories, and it makes rational thought possible. What if computers had the processing power of a human brain? What could be accomplished with a computer chip that operated more like the incredible organ that powers a human being?

In the emerging field of neuromorphic computing, researchers are working to design computer chips that mimic the way a human brain processes information. In January, engineers at MIT (Massachusetts Institute of Technology), www.mit.edu, announced progress in this arena: they designed an artificial synapse that allows precise control of the strength of the electric current flowing across it, which MIT says is similar to the way ions flow between neurons in a human brain.

While today’s digital computer chips carry out computations based on binary on/off signaling, neuromorphic chips send information in bursts of varying intensity—much like a brain—which could lead to processors that run machine-learning tasks with lower energy demands. Further, MIT says neuromorphic chips could efficiently process “millions of streams” of parallel computations that are currently only possible with supercomputers. The MIT team built a small “brain on a chip” with artificial synapses made from silicon germanium and found that, in simulations, the chip and its synapses could recognize samples of handwriting with 95% accuracy. The design represents forward progress in the quest to build portable, low-power neuromorphic chips that could be used for learning tasks such as pattern recognition. It is also considered a stepping stone toward portable AI (artificial intelligence) devices.
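To see how graded synapse strength differs from binary on/off signaling, consider a toy leaky integrate-and-fire neuron. This is only an illustrative sketch, not MIT's actual design: the `weight` parameter stands in for the analog synaptic strength a neuromorphic synapse controls, and the leak and threshold values are arbitrary assumptions.

```python
def simulate_lif(inputs, weight, leak=0.9, threshold=1.0):
    """Return the time steps at which the neuron fires.

    inputs:    sequence of presynaptic spikes (0 or 1)
    weight:    synaptic strength -- the analog quantity, unlike a
               binary on/off signal
    leak:      fraction of membrane potential retained each step
    threshold: potential at which the neuron fires and resets
    """
    potential = 0.0
    spikes = []
    for t, spike_in in enumerate(inputs):
        potential = potential * leak + weight * spike_in
        if potential >= threshold:
            spikes.append(t)
            potential = 0.0  # reset after firing
    return spikes

train = [1, 1, 1, 1, 1, 1]
print(simulate_lif(train, weight=0.6))  # stronger synapse: [1, 3, 5]
print(simulate_lif(train, weight=0.2))  # weaker synapse: no spikes, []
```

The same input stream produces different firing patterns depending solely on the continuous synaptic weight, which is the property that precise analog control of current across a synapse would exploit.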

Intel, www.intel.com, is also a pioneer in the neuromorphic-computing field. The company’s self-learning research chip, Loihi, mimics how the brain learns based on feedback from the environment. Intel’s equivalent of MIT’s handwriting test was an experiment showing that Loihi could rapidly distinguish among various items, including a rubber duck, an elephant figurine, and a bobblehead statue, based on a handful of photos. The chip accomplished this task in just four seconds, and Intel says the task leveraged a mere 1% of the chip’s resources. In March, Intel also announced the INRC (Intel Neuromorphic Research Community), a collaborative research initiative meant to encourage experimentation with the Loihi neuromorphic test chip.
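Intel has not published the internals of the Loihi demo here, but the idea of learning to tell objects apart from just a handful of examples can be sketched with a toy nearest-centroid classifier. The feature vectors and object names below are invented stand-ins for the photos in Intel's experiment.

```python
def centroid(vectors):
    """Average a list of equal-length feature vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def train(examples):
    """examples: {label: [feature_vector, ...]} -> {label: centroid}."""
    return {label: centroid(vecs) for label, vecs in examples.items()}

def classify(model, vector):
    """Return the label whose centroid is nearest (squared Euclidean)."""
    def dist(c):
        return sum((a - b) ** 2 for a, b in zip(c, vector))
    return min(model, key=lambda label: dist(model[label]))

# A "handful" of examples per object, as in Intel's demo.
examples = {
    "rubber duck": [[0.9, 0.1], [0.8, 0.2]],
    "elephant figurine": [[0.1, 0.9], [0.2, 0.8]],
}
model = train(examples)
print(classify(model, [0.85, 0.15]))  # -> rubber duck
```

A spiking chip like Loihi learns very differently at the hardware level, but the few-shot setup is the same: a couple of labeled examples per object, then classification of new inputs by similarity.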

Neuromorphic computing isn’t the only new computing paradigm that promises to drive innovation and solve real-world problems in AI, the IoT (Internet of Things), and beyond. Quantum computing may one day offer the computational power necessary to solve problems today’s classical computers can’t even touch. Intel’s neuromorphic computing program leader, Mike Davies, has predicted that robotics will be the killer app driving future adoption of neuromorphic computing chips. The computing paradigms of tomorrow, including both neuromorphic and quantum computing, will likely revolutionize everything from robotics and AI applications to smart security, smart-city infrastructure, and autonomous vehicles. For now, the race is on to deliver relevant proofs of concept that push the envelope in both realms.


April 18, 2018
