Creating machine learning technology from human neurons

A team of scientists has developed a novel artificial neuron modelled on the behaviour of human neurons, potentially advancing machine learning capabilities.

Created by researchers at the University of Liège, the Bistable Recurrent Cell (BRC) draws on the different modes of operation of human neurons to produce an artificial neuron with markedly improved memory capabilities.

In the team's experiments, the Bistable Recurrent Cell outperformed conventional approaches, allowing recurrent networks to learn temporal relationships spanning more than 1,000 discrete time steps, where standard methods fail after around 100. The research is published in the journal PLOS ONE.

Machine learning has evolved rapidly in recent years with the boom in artificial intelligence, offering a vast array of applications that we rely on in everyday life. One important class of data is time series – data in which time is an essential component – such as weather records, electroencephalograms, and stock prices.

Time series forecasting uses knowledge of past events to predict future ones, and a specific type of artificial neural network – the recurrent neural network (RNN) – has been established in recent years to store information over time so that such data can be processed correctly. The network updates its memory as new data is received, as sketched below; however, training these networks is extremely challenging, because their memory is limited in time.
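
To make the idea concrete, the following is a minimal sketch – not code from the study – of how a standard recurrent cell folds each new observation into a hidden 'memory' vector. The weight names and sizes here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (assumptions, not taken from the paper)
input_size, hidden_size = 1, 16

# Randomly initialised weights of a vanilla recurrent cell
W_x = rng.normal(scale=0.1, size=(hidden_size, input_size))   # input -> hidden
W_h = rng.normal(scale=0.1, size=(hidden_size, hidden_size))  # hidden -> hidden
b = np.zeros(hidden_size)

def rnn_step(x_t, h_prev):
    """Fold one new observation x_t into the hidden state (the network's memory)."""
    return np.tanh(W_x @ x_t + W_h @ h_prev + b)

# Process a toy time series one step at a time
series = np.sin(np.linspace(0, 10, 200)).reshape(-1, 1)
h = np.zeros(hidden_size)
for x_t in series:
    h = rnn_step(x_t, h)  # the memory is partly overwritten at every step
```

Because the hidden state is squashed and overwritten at every step, information from early observations is progressively diluted – the limited memory the quote below describes.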

Nicola Vecoven, a doctoral student in the Systems and Modeling lab at the University of Liège and first author of the study, said: “We can imagine the example of a network that receives new information every day, but after the fiftieth day, we notice that the information from the first day has already been forgotten.

“However, human neurons are capable of retaining information over an almost infinite period of time thanks to the bi-stability mechanism. This allows neurons to stabilise in two different states, depending on the history of the electrical currents they have been subjected to, and do this for an infinite period of time. In other words, thanks to this mechanism, human neurons can retain a bit (a binary value) of information for an infinite time.”
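
As an illustration of the principle – not the study's own model – a single recurrent unit whose self-excitation is stronger than one already has two stable states it can settle into, so the sign of a past input is remembered indefinitely. The gain value and input below are arbitrary choices for the demonstration.

```python
import numpy as np

def bistable_unit(h, x=0.0, a=2.0):
    """One step of a toy bistable unit: h_next = tanh(a*h + x).
    With self-excitation a > 1 the unit has two stable fixed points
    (one positive, one negative), so it can hold a binary value."""
    return np.tanh(a * h + x)

h = 0.0
h = bistable_unit(h, x=1.0)    # a brief positive input pushes the unit...
for _ in range(1000):          # ...and it settles near the positive state
    h = bistable_unit(h)       # with no further input at all
print(h)                       # ~0.96: the 'bit' is retained indefinitely
```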

By building this bi-stability mechanism into a new artificial neuron and incorporating it into recurrent neural networks, the researchers enabled the networks to learn temporal relationships spanning more than 1,000 time steps. The team is now working to extend the memory of RNNs even further.
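
One plausible reading of how such a cell can be wired is sketched below for a single unit: the input drives a feedback gain above 1 (switching the unit into a bistable regime) and a gate decides how much of the old state to keep. This is a simplified illustration; the exact parameterisation and training procedure in the published BRC work may differ.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class BistableRecurrentUnit:
    """Illustrative single bistability-inspired recurrent unit (a sketch,
    not necessarily the exact BRC formulation from the PLOS ONE paper)."""

    def __init__(self, rng=np.random.default_rng(0)):
        # Scalar weights for one unit (illustrative initialisation)
        self.u_a, self.w_a = rng.normal(), rng.normal()
        self.u_c, self.w_c = rng.normal(), rng.normal()
        self.u_h = rng.normal()

    def step(self, x_t, h_prev):
        a_t = 1.0 + np.tanh(self.u_a * x_t + self.w_a * h_prev)  # feedback gain in (0, 2)
        c_t = sigmoid(self.u_c * x_t + self.w_c * h_prev)        # update gate
        h_new = np.tanh(self.u_h * x_t + a_t * h_prev)           # candidate state
        return c_t * h_prev + (1.0 - c_t) * h_new

# Usage: one pulse of input, then a long quiet period
unit = BistableRecurrentUnit()
h = unit.step(1.0, 0.0)          # inject a signal
for _ in range(1000):
    h = unit.step(0.0, h)        # when a_t > 1 the unit can hold the signal
```

The key design idea, as described in the article, is that the gain a_t lets each unit switch between a forgetful mode and a bistable mode, so a value can be latched and carried across very long sequences.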
