IBM's Systems of Neuromorphic Adaptive Plastic Scalable Electronics (SyNAPSE)

If you are not familiar with how neural networks work, check out "On Intelligence" by Jeff Hawkins or "Brain Bugs" by Dean Buonomano. A neural net is a mesh network of neurons connected by synapses. Information is stored in the strength of the connections between neurons (the synapses). Biological neural networks are plastic: the connections between neurons change as new associations are made (learning).
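To make that concrete, here is a minimal sketch (my own illustration, not IBM's design) of the core idea: a neuron's behavior is determined entirely by the strengths of its input synapses, not by any explicit program logic.

```python
# Illustrative sketch: a neuron fires when the weighted sum of its
# inputs crosses a threshold. The threshold and weights are assumed
# values chosen for the example.
def neuron_output(inputs, weights, threshold=1.0):
    """Return 1 (fire) if the weighted input sum reaches the threshold."""
    activation = sum(i * w for i, w in zip(inputs, weights))
    return 1 if activation >= threshold else 0

# Two input neurons connected to one output neuron by two synapses.
# The "knowledge" lives entirely in the synapse weights.
print(neuron_output([1, 1], [0.6, 0.6]))  # strong synapses -> fires (1)
print(neuron_output([1, 1], [0.2, 0.2]))  # weak synapses   -> silent (0)
```

Change the weights and you change what the network "knows" — that is all learning is in this model.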



Researchers at IBM have been working on a cognitive computing project called Systems of Neuromorphic Adaptive Plastic Scalable Electronics (SyNAPSE). By reproducing the structure and architecture of the brain—the way its elements receive sensory input, connect to each other, adapt these connections, and transmit motor output—the SyNAPSE project models computing systems that emulate the brain's computing efficiency, size, and power usage without being programmed. In the brain, neural connections are plastic; they change and adapt as learning occurs.


Modha: The memory holds the synapse-like state, and it can be adapted.



The saying "neurons that fire together, wire together" is derived from Hebb's rule. I wrote about Hebb's rule in my review of the book "Brain Bugs".
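Hebb's rule can be sketched in a few lines. This is a simplified illustration (the learning rate is an assumed constant, and real models add decay terms): when the pre- and post-synaptic neurons are active at the same time, the synapse between them strengthens.

```python
# Hebb's rule sketch: "neurons that fire together, wire together."
# Simplified form: delta_w = learning_rate * pre * post.
def hebbian_update(weight, pre_active, post_active, learning_rate=0.1):
    return weight + learning_rate * pre_active * post_active

w = 0.5
# Correlated activity strengthens the connection...
w = hebbian_update(w, pre_active=1, post_active=1)   # w becomes 0.6
# ...while uncorrelated activity leaves it unchanged.
w = hebbian_update(w, pre_active=1, post_active=0)   # w stays 0.6
print(round(w, 2))
```

The key point for SyNAPSE is that this kind of weight change is exactly the "made and unmade, stronger or weaker" adaptation Modha describes below.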


The new chip is an "array of neurons and synapses" connected via a network.

"IBM has released only limited details about the workings and performance of its new chips. But project leader Dharmendra Modha says the chips go beyond previous work in this area by mimicking two aspects of the brain: the proximity of parts responsible for memory and computation (mimicked by the hardware) and the fact that connections between these parts can be made and unmade, and become stronger or weaker over time (accomplished by the software).

The new chips contain 45-nanometer digital transistors built directly on top of a memory array. "It's like having data storage next to each logic gate within the processor," says Cornell University computer scientist Rajit Manohar, who's collaborating with IBM on hardware designs. Critically, this means the chips consume 45 picojoules per "event," mimicking the transmission of a pulse in a neural network. That's about 1,000 times less power than a conventional computer consumes, says Gert Cauwenberghs, director of the Institute for Neural Computation at the University of California, San Diego."
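To get a feel for what 45 picojoules per event means, here is a back-of-envelope calculation. The neuron count and firing rate below are my own illustrative assumptions, not figures from IBM.

```python
# Rough power estimate from the 45 pJ/event figure quoted above.
# Assumed workload: 1 million neurons, each spiking 10 times per
# second, with every spike counted as one "event".
PICOJOULE = 1e-12
energy_per_event = 45 * PICOJOULE          # joules per event
events_per_second = 1_000_000 * 10         # 10 million events/s

power_watts = energy_per_event * events_per_second
print(power_watts)  # 0.00045 W, i.e. under half a milliwatt
```

Under those assumptions the whole network runs on well under a milliwatt, which is why the roughly 1,000x advantage over conventional processors matters so much for brain-scale simulation.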


Modha's team has simulated neural networks as large as a monkey's brain but, even with supercomputers, could not get anywhere near real-time performance. The new chip will "execute" large neural networks in real time.


For additional information about how neural networks work, check out this page:



For further reading:






© 2017 Created by eric gregori.
