by Jesse Arsenault, student, UMass Amherst (2018)
In the past few decades, research has exploded within both neuroscience and computer science. Continuous discoveries and inventions have revitalized our understanding of these fields, tempting us with the possibilities of super-efficient AI, brain-computer interfaces, and other snippets of sci-fi daydreams. While it is certainly tempting to draw analogies between the brain and the modern computer, this sort of comparison is not exactly airtight. For instance, the structure of a conventional computer circuit is immensely different from that of the brain, which is made up of neurons with hundreds or thousands of branching connections that allow signals to be passed around in creative nonlinear pathways.
Much as squishy and highly varied neurons make up the brain, transistors – the digital amplifiers of electric signals – are fundamental building blocks of most electronic devices, including computers. But their elegant, tightly controlled design makes them quite different from living, variable, and highly interconnected nerve cells. Regardless, they are staples of modern computing, helping to shuttle electronic signals back and forth between the regions of the computer designed for memory and calculation. This transfer process, though essential, presents a huge problem to computer engineers: its immense energy cost.
In fact, Dr. Qiangfei Xia, a professor of Electrical and Computer Engineering at UMass Amherst, says there are three major problems facing the modern transistor. First, there is the problem of miniaturization: “You only have a certain amount of space to make transistors smaller,” he says. Eventually, the individual components that make computation possible will reach this lower size limit, preventing significant improvements in computing power. The second problem, perhaps unsurprisingly, is financial cost: smaller transistors are far more expensive to produce. Finally, there is the aforementioned energy burden – for instance, Xia highlights the fact that the supercomputer-powered mining of bitcoin and other cryptocurrencies has already exceeded the yearly energy consumption of some developed countries.
Xia believes that one solution to this problem of modern computing lies in the makeup of the human brain. Along with a team of UMass colleagues, Xia is studying a newer, more flexible type of transistor called a “memristor.” As its name – a blend of “memory” and “resistor” – implies, a memristor combines the functions of memory and computation, storing transient information about the signals passed through it. First conceived at UC Berkeley in the 1970s, the memristor transitioned from concept to concrete product in 2008, with its first experimental implementation at Hewlett-Packard. This relatively recent advance has allowed researchers to seriously investigate its applications as a new tool for computation.
Memristors possess unique characteristics modeled after human neurons. When a neuron is triggered to “fire,” it opens up special pores along its surface, allowing charged ions like sodium and potassium to flow in, increasing its internal charge. This rapid shift in electric charge helps propel electrochemical signals between different regions of the brain at an incredible rate. The memristors studied by Xia and his colleagues rely on a similar movement of ions, introducing exciting new opportunities for innovating computer architecture. As an illustration, Xia points to the energy-demanding process of converting analog sensory data to digital 0s and 1s that can be understood by the computer. Memristors, he argues, may represent a more energy-efficient solution to this burdensome conversion process.
Ultimately, Xia makes sure to underline the highly interdisciplinary nature of his research. “We are collaborating with computer scientists, with neuroscientists, and with psychologists,” he says, pointing out that deep study in one field inevitably brings researchers into contact with many others. The challenge lies in the building of bridges between them: “An individual neuron processes information on the level of milliseconds,” says Xia, while a transistor processes that same information a million times faster. “We are a million times slower, but [our brains] can take advantage of massive parallelism. We win at parallelism,” he concludes, referring to the brain’s capacity to perform an enormous number of distinct operations at once. If Xia is right, and a dependable bridge can indeed be built between the living neuron and the artificial transistor, one day we may be able to pass such biological advantages on to our machines as well.