Researchers at the Massachusetts Institute of Technology (MIT) say they have created analog synapses that are one million times faster than those found in the human brain.
Just as our brains rely on synapses and neurons to link different regions, and digital processors rely on transistors, analog processors require programmable resistors. Arranged in the right configuration, these can be programmed to behave much like the human brain.
MIT's synapses are made from a specialised glass called PSG (phosphosilicate glass), which makes them extremely fast and efficient. The glass can withstand high voltages without breaking down, allowing researchers to drive protons through it at breakneck speeds with little resistance. These connections let AI systems produce neural responses at nanosecond speeds, far faster than the human brain.
“The action potential in biological cells rises and falls with a timescale of milliseconds, since the voltage difference of around 0.1 volt is constrained by the stability of water,” explained researcher Ju Li. “Here we apply up to ten volts across a special solid glass film of nanoscale thickness that conducts protons, without permanently damaging it.”
This development should enable even faster neural networks: algorithms that learn to recognise relationships in sets of data, loosely mirroring the way our brains process information.
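As a rough software illustration (this is a conventional digital sketch, not MIT's analog hardware), a neural network is just a set of weighted connections, loosely analogous to synapses, through which input signals are scaled and combined:

```python
import numpy as np

def relu(x):
    """Simple nonlinearity applied after each weighted sum."""
    return np.maximum(0.0, x)

# Toy two-layer network: the weight matrices play the role that
# analog "synapses" play in hardware -- each entry scales a signal
# travelling from one unit to the next. Values here are arbitrary.
rng = np.random.default_rng(0)
W1 = rng.normal(size=(4, 3))   # 3 inputs  -> 4 hidden units
W2 = rng.normal(size=(1, 4))   # 4 hidden  -> 1 output

def forward(x):
    hidden = relu(W1 @ x)      # weighted sums, then nonlinearity
    return W2 @ hidden         # output layer

y = forward(np.array([0.5, -1.0, 2.0]))
print(y.shape)  # one output value
```

In hardware analog deep learning, each of these multiply-accumulate operations would be carried out physically by a programmable resistor rather than by a digital processor, which is where the claimed speed and energy gains come from.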
This greater speed and efficiency will let researchers scale up neural networks while reducing the carbon footprint of the computers needed to run them. The increase in potential processing speed, researcher Murat Onen said, is not comparable to a faster car, but to a “spacecraft.”
In MIT’s research paper, the scientists express hope that the discovery will advance analog deep learning, an emerging field of artificial intelligence. The hope, Onen said, is that the new networks will achieve “unprecedented complexity” and outperform all other types of neural networks.