Scientists from IBM have unveiled a software system to support the programming of brain-inspired computer chips.
The chips in question would have an architecture “inspired by the function, low power and compact volume of the brain,” said Big Blue. The technology could “enable a new generation of intelligent sensor networks that mimic the brain’s abilities for perception, action, and cognition,” it added.
Such technology could provide sensory data input and on-board analytics capabilities for automobiles, medical imagers, healthcare devices, smartphones, cameras and robots, said IBM.
IBM’s new programming model differs from the sequential operation underlying today’s von Neumann architectures and computers. It is instead “tailored for a new class of distributed, highly interconnected, asynchronous, parallel, large-scale cognitive computing architectures,” IBM said.
Dr Dharmendra Modha, principal investigator and senior manager at IBM Research, said, “Architectures and programs are closely intertwined and a new architecture necessitates a new programming paradigm. We are working to create a FORTRAN for synaptic computing chips.
“While complementing today’s computers, this will bring forth a fundamentally new technological capability in terms of programming and applying emerging learning systems.”
To enable a programming ecosystem for these chips, IBM researchers have developed technology supporting every stage of the programming cycle, from design through development, debugging and deployment.
IBM has developed a multi-threaded, massively parallel and highly scalable functional software simulator of a cognitive computing architecture, comprising a network of neurosynaptic cores.
There is also a simple, digital, highly parameterised spiking neuron model that “forms a fundamental information processing unit of brain-like computation” and supports a wide range of deterministic and stochastic neural computations, codes and behaviours. A network of such neurons “can sense, remember and act upon a variety of spatio-temporal, multi-modal environmental stimuli.”
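To give a flavour of what a parameterised spiking neuron involves, the sketch below implements a toy leaky integrate-and-fire neuron with deterministic and stochastic firing modes. This is an illustrative assumption, not IBM’s actual neuron model: the class name, parameters and dynamics are hypothetical, chosen only to show how integration, leak, threshold and probabilistic spiking can be exposed as parameters.

```python
import random

class SpikingNeuron:
    """Toy leaky integrate-and-fire neuron (illustrative; not IBM's model)."""

    def __init__(self, threshold=1.0, leak=0.1, stochastic=False, seed=None):
        self.threshold = threshold    # potential at which the neuron fires
        self.leak = leak              # constant decay applied each time step
        self.stochastic = stochastic  # probabilistic vs deterministic firing
        self.rng = random.Random(seed)
        self.potential = 0.0

    def step(self, synaptic_input):
        # Integrate the incoming weighted input, then apply the leak.
        self.potential += synaptic_input
        self.potential = max(0.0, self.potential - self.leak)
        if self.stochastic:
            # Fire with probability proportional to how close the
            # potential is to the threshold.
            fired = self.rng.random() < self.potential / self.threshold
        else:
            fired = self.potential >= self.threshold
        if fired:
            self.potential = 0.0  # reset the membrane potential after a spike
        return fired

# A deterministic neuron spikes once enough input has accumulated.
n = SpikingNeuron(threshold=1.0, leak=0.1, stochastic=False)
spikes = [n.step(0.4) for _ in range(5)]
# spikes == [False, False, False, True, False]
```

Driving many such units from each other’s spike outputs is what turns individual neurons into the kind of network the article describes, one that reacts to streams of stimuli rather than executing a sequential program.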