
IBM scientists create a freaky brain-like computer architecture

Roboticists have long accepted that nature’s eons of R&D have already produced some of the best designs for autonomous moving creatures. So, it stands to reason that researchers creating the next generation of computer processors would do well to look at nature’s most powerful data cruncher: the human brain.

To that end, IBM scientists have just unveiled a “breakthrough software ecosystem” designed to work with a silicon chip architecture inspired by the function, low power, and compact volume of the human brain via a series of so-called “neurosynaptic cores.”

“Architectures and programs are closely intertwined and a new architecture necessitates a new programming paradigm,” said Dr. Dharmendra S. Modha, Principal Investigator and Senior Manager, IBM Research.

Since 2011, an IBM research project run in coordination with DARPA and dubbed SyNAPSE (Systems of Neuromorphic Adaptive Plastic Scalable Electronics) has been developing nanometer-scale electronic synaptic components analogous to those in the human brain. In conjunction with this synapse-like chip architecture, the new programming model would facilitate an electronic system whose parts all work together simultaneously, as in the brain.

According to the company, the new technology might enable intelligent sensor networks that mimic the brain’s abilities for perception, action, and cognition. “While complementing today’s computers, this will bring forth a fundamentally new technological capability in terms of programming and applying emerging learning systems,” said Modha.

Today’s computers kick ass at sequential operations based on if-x-then-y algorithms, but this new programming architecture would allow computers to analyze situations and apply solutions as those situations warrant.
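
To make that contrast concrete, here is a minimal, purely illustrative Python sketch; it is not IBM’s programming model, and every name, weight, and threshold in it is invented. It puts a hard-coded if-x-then-y rule next to a toy “leaky integrate-and-fire” unit of the kind neuromorphic designs draw on, which quietly accumulates weighted evidence from several inputs and only “spikes” once the total crosses a threshold.

```python
# Purely illustrative sketch; not IBM's neurosynaptic programming model.
# All weights, thresholds, and sensor values below are made up.

def rule_based_alarm(readings):
    """Classic sequential if-x-then-y logic: trip only if any single
    reading crosses a hard-coded limit."""
    return any(r > 0.95 for r in readings)

class ToyNeuron:
    """A leaky integrate-and-fire style unit: it accumulates weighted
    evidence from all of its inputs and 'spikes' when the running total
    crosses a threshold, instead of testing one condition at a time."""

    def __init__(self, weights, threshold=1.0, leak=0.9):
        self.weights = weights      # how strongly each input counts
        self.threshold = threshold  # firing threshold
        self.leak = leak            # old evidence slowly fades away
        self.potential = 0.0        # accumulated evidence so far

    def step(self, inputs):
        # Integrate every input in one go, then check the threshold.
        self.potential = self.leak * self.potential + sum(
            w * x for w, x in zip(self.weights, inputs))
        if self.potential >= self.threshold:
            self.potential = 0.0    # reset after firing
            return True             # spike: "something noteworthy happened"
        return False

# Three frames of noisy sensor data: no single value ever clears the
# rule's 0.95 limit, but the neuron fires once the evidence piles up.
neuron = ToyNeuron(weights=[0.4, 0.3, 0.3])
for frame in ([0.2, 0.1, 0.0], [0.5, 0.4, 0.3], [0.9, 0.8, 0.7]):
    print(rule_based_alarm(frame), neuron.step(frame))
# Prints: False False / False False / False True
```

The point of the toy is only the shape of the computation: many weighted inputs flowing into many simple units at once, rather than one branch being evaluated after another.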

The brain is still the best

While computers excel at crunching a tsunami’s worth of 0s and 1s quickly, for certain tasks good ol’ skull goo remains the most efficient and most capable machine around.

For example, the human brain can take in a terabyte of visual information over the course of a day and parse it nearly instantly for spatial relations, familiar faces, and impending danger. That is quite a feat of computing power, and it describes only a fraction of the activity the brain plows through in a day. And the whole thing happens in an organ smaller than a toaster that runs on less power than a 20-watt light bulb.

IBM foresees a day when this new neural-like programming model might be deployed with advanced sensors to capture and analyze many types of data in real time. For better or worse, these technologies are very likely to be used for real-time Big Data analysis, but they might also help impaired people regain aspects of their lost sensory function.

How Skynet is born

IBM’s long-term goal is to build a neurosynaptic chip system with ten billion “neurons” (which, for the record, would still pale in comparison with the average adult human brain and its roughly 86 billion neurons). The new technology might even be combined with coldly methodical systems like IBM’s Watson to create a right-brain, left-brain scenario: Watson (the left brain) would focus on language and analytical thinking, while the cognitive chip architecture (the right brain) would handle the senses and pattern recognition.

The new technology might rival (or perhaps complement) Google’s collaboration with Singularity star Ray Kurzweil to make a virtual AI assistant that could analyze massive amounts of data on an unparalleled scale.

The corporate rivalries of the near future may have less to do with any Android vs. Apple or Facebook vs. Google nonsense and more to do with which virtual intelligence will have the greater impact (and hopefully not dominance) on our lives.

[IBM]
