Mind blowing

Why we’re a long way from computers that really work like the human brain

August 15, 2013

IBM was in the news last week when it announced it had created “an entire computing architecture based on the brain” (at least, that’s how Gizmodo summed it up). A few days before that, a press release announced that scientists from Forschungszentrum Jülich, a research center in Germany, had run the world’s largest-ever computer simulation of a human brain using the K computer, a Japanese supercomputer that is the world’s fourth fastest. Both events were duly reported as if they were replications of the human brain. Yet a true replication remains a long way off.

IBM’s research, part of a project announced in 2008 called SyNAPSE, involves building chips that bring memory and processing power close together. That makes them more compact, faster, and less power-hungry than conventional chips, where the two functions are separate and data must shuttle back and forth between them. Last week’s announcement was in fact of a new programming language designed for such chips. The project has received over $100 million of funding so far, from the US military. Meanwhile German scientists, led by Dr Markus Diesmann, a computational neurophysicist, are part of the Human Brain Project, which, along with a graphene project, this year won the largest research award in history: €1 billion in total ($1.3 billion).

The trouble is that at the moment, no computer is powerful enough to run a program simulating the brain. One reason is the brain’s interconnected nature. In computing terms, the brain’s nerve cells, called neurons, are the processors, while synapses, the junctions where neurons meet and transmit information to each other, are analogous to memory. Our brains contain roughly 100 billion neurons; a powerful commercial chip holds billions of transistors. Yet a typical transistor has just three legs, or connections, while a neuron can have up to 10,000 points of connection, and a brain has some 100 trillion synapses. “There’s no chip technology which can represent this enormous amount of wires,” says Diesmann.
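The scale of that connectivity gap is easy to check with the article’s own figures. The sketch below is a back-of-envelope calculation, not anything from either research team; the transistor count for a “powerful commercial chip” is an assumption (a few billion, as the article says).

```python
# Back-of-envelope comparison of brain vs. chip connectivity,
# using the figures quoted in the article.
neurons = 100e9       # ~100 billion neurons in a human brain
synapses = 100e12     # ~100 trillion synapses
transistors = 5e9     # "billions of transistors" -- assumed ~5 billion here

# Average fan-out per neuron (individual neurons can reach up to 10,000)
connections_per_neuron = synapses / neurons
print(f"Average connections per neuron: {connections_per_neuron:,.0f}")

# A transistor has just 3 connections; synapses dwarf transistor counts
print(f"Synapses outnumber transistors on a big chip by {synapses / transistors:,.0f}x")
```

Even on average, each neuron has around a thousand connections, and the brain holds tens of thousands of times more synapses than a large chip holds transistors, which is the “enormous amount of wires” Diesmann refers to.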

Even deciding what counts as “simulating the brain” is tricky. Diesmann’s team used the K computer to simulate the activity of 1.73 billion nerve cells connected by 10.4 trillion synapses—or about 1% as many as in the human brain. By contrast, IBM last year simulated 530 billion neurons and 100 trillion synapses. But Diesmann’s experiment has been called “the largest general neuronal network simulation to date,” because his team’s version was more sophisticated.

Transistors have become smaller since the EDSAC II computer in 1958, but the separation of processing and memory remains constant. (Quartz/Leo Mirani)

Indeed, IBM’s researchers stressed (pdf) that they had “not built a biologically realistic simulation of the complete human brain.” There are several obstacles to doing so. Neurons have many characteristics and properties. Any simulation can represent only a few of them, in the hope that this makes for a reasonably realistic model. “We don’t know which level [of description] is the correct one,” says Diesmann.

A faithful simulation must also account for synapses as well as neurons. Diesmann’s project did this by allotting 24 bytes of memory to each synapse, which would allow it to “learn.” Then there is the brain’s “deterministic chaos”: all the activity that appears random but would reappear under identical conditions. And since the K computer is, despite its awesome power, still a traditional computer, it takes time for all that simulation to happen. It took 40 minutes to run one second’s worth of activity in the German team’s artificial 1% brain.
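Those two figures—24 bytes per synapse and 40 minutes per simulated second—imply striking totals. The arithmetic below is my own illustration of the article’s numbers, not a calculation published by the Jülich team.

```python
# Rough memory and time arithmetic for the Juelich simulation,
# using the article's figures.
synapses = 10.4e12        # synapses in the simulated 1% network
bytes_per_synapse = 24    # memory allotted so each synapse can "learn"

# Memory consumed by synapses alone, in terabytes
memory_tb = synapses * bytes_per_synapse / 1e12
print(f"Synapse memory alone: ~{memory_tb:.0f} TB")

# 40 minutes of wall-clock time per simulated second of brain activity
wall_clock_s = 40 * 60
print(f"Slowdown factor: {wall_clock_s:,}x slower than real time")
```

Storing the synapses of even a 1% brain takes on the order of 250 terabytes, and the simulation runs some 2,400 times slower than the biology it models—both hints at why a full-brain simulation is out of reach for conventional machines.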

IBM’s solution to this problem is a new kind of chip made up of 4,000 “corelets,” each of which comprises 256 “neurons” for processing, as well as memory and communications. Each chip would therefore have the equivalent of roughly 1 million neurons. IBM figures this could scale up to brain size.
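The scaling claim follows from simple multiplication. This sketch just works through the article’s figures; the number of chips needed to hit IBM’s 10-billion-neuron target is my own extrapolation, not an IBM figure.

```python
# Scaling arithmetic for IBM's proposed neurosynaptic chip,
# from the figures quoted in the article.
corelets_per_chip = 4000
neurons_per_corelet = 256

# ~1 million simulated neurons per chip
neurons_per_chip = corelets_per_chip * neurons_per_corelet
print(f"Neurons per chip: {neurons_per_chip:,}")

# Chips needed for IBM's stated goal of 10 billion neurons (extrapolation)
target_neurons = 10e9
chips_needed = target_neurons / neurons_per_chip
print(f"Chips for 10 billion neurons: ~{chips_needed:,.0f}")
```

At just over a million neurons per chip, the 10-billion-neuron system IBM describes would take on the order of 10,000 such chips.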

The end goal of all such projects is the same: to apply the principles of the human brain to computing, so that machines can work faster, use less power, and develop the ability to learn. Diesmann hopes that with the advent of exascale computing—processing power 1,000 times greater than currently exists—within the next decade, we might be able to better understand how the brain works. IBM hopes that its new form of chip will help it to “build a neurosynaptic chip system with 10 billion neurons and 100 trillion synapses, all while consuming only one kilowatt of power and occupying less than two liters of volume.” But a fully working simulation of the brain, Diesmann thinks, is still a decade or two away.
