What are the limitations of reservoir computing?

A new study has identified limitations of reservoir computing that have so far been overlooked by the research community.

In nonlinear dynamical systems, a small change in one place can trigger a huge change elsewhere.

Examples include the climate, the workings of the human brain, and the behaviour of the electric grid. Dynamical systems like these are extremely difficult to model because they are inherently unpredictable, changing dramatically over time.

However, researchers in the last two decades have reported success modelling high-dimensional chaotic behaviours with a Machine Learning approach called reservoir computing.

“Machine Learning is increasingly being used to learn some complex dynamic systems that we don’t have a good mathematical description for from data,” said Yuanzhao Zhang, an SFI Complexity Postdoctoral Fellow.

Recent papers have reported that reservoir computing is effective at predicting the trajectories of chaotic systems. It has been argued that these models remain efficient even after seeing little training data, and can determine from a system’s initial conditions where it will eventually end up.

Zhang aimed to find out if the reports were true.

In his research, conducted in collaboration with physicist Sean Cornelius at Toronto Metropolitan University, Zhang identified limitations of reservoir computing. He argues that these limitations amount to a ‘Catch-22’ that can prove hard to circumvent, especially for complicated dynamical systems.

“It’s one of those limitations that I think hasn’t been very well appreciated by the community,” said Zhang.

What is reservoir computing?

Reservoir computing, first proposed by computer scientists more than 20 years ago, is a nimble predictive model built with neural networks. The model is simpler and cheaper to train than other neural network frameworks, because only its final readout layer is fitted to data.
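
In broad strokes, a reservoir computer drives a large, fixed, random recurrent network with the input signal and trains only a linear readout on the network’s states. The sketch below shows that idea in Python; the reservoir size, spectral radius, toy signal, and ridge penalty are illustrative choices, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

N, dim = 300, 1                              # reservoir size, input dimension
W_in = rng.uniform(-0.5, 0.5, (N, dim))      # fixed random input weights
W = rng.normal(0, 1, (N, N))                 # fixed random reservoir weights
W *= 0.9 / max(abs(np.linalg.eigvals(W)))    # scale spectral radius below 1

def run_reservoir(u):
    """Drive the reservoir with an input sequence; collect its states."""
    r = np.zeros(N)
    states = []
    for u_t in u:
        r = np.tanh(W @ r + W_in @ u_t)      # the reservoir itself is never trained
        states.append(r)
    return np.array(states)

# Toy training signal; only the linear readout W_out is fitted (ridge regression).
t = np.linspace(0, 60, 3000)
u = (np.sin(t) * np.cos(0.7 * t))[:, None]
R = run_reservoir(u[:-1])
W_out = np.linalg.solve(R.T @ R + 1e-6 * np.eye(N), R.T @ u[1:])

# Closed-loop forecasting: feed predictions back in as inputs.
r, x, forecast = R[-1], u[-1], []
for _ in range(200):
    r = np.tanh(W @ r + W_in @ x)
    x = W_out.T @ r
    forecast.append(x)
```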

Next-generation reservoir computing was introduced in 2021. This model has several advantages over the conventional approach, including requiring less data to train.
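
Next-generation reservoir computing drops the random network entirely: its features are a handful of time-delayed observations together with hand-picked nonlinear combinations of them (commonly polynomials), again with only a linear readout trained. A minimal sketch under those assumptions, using quadratic features as the illustrative hand-picked nonlinearity:

```python
import numpy as np

k = 2  # number of time delays (an illustrative choice)

def features(window):
    """Constant + k delayed states + their pairwise products."""
    lin = np.concatenate(window)
    quad = np.outer(lin, lin)[np.triu_indices(lin.size)]
    return np.concatenate(([1.0], lin, quad))

# Toy 1-D trajectory standing in for training data.
t = np.linspace(0, 40, 2000)
x = (np.exp(-0.02 * t) * np.cos(t))[:, None]

# Build the feature matrix and next-step targets, then fit the readout
# by ridge regression -- the only trained part of the model.
X = np.array([features([x[i - 1], x[i]]) for i in range(1, len(x) - 1)])
Y = x[2:]
W = np.linalg.solve(X.T @ X + 1e-8 * np.eye(X.shape[1]), X.T @ Y)

# One-step-ahead prediction from the two most recent observations:
x_next = features([x[-2], x[-1]]) @ W
```

Note that the quadratic terms are chosen by hand before training begins: the model’s nonlinearity has to match the system’s, which is exactly the assumption Zhang and Cornelius probe below.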

Recent studies exploring their use in Machine Learning applications show that both reservoir computing and its next-generation variant can be powerful tools for modelling dynamical systems, even with little data.

However, the team’s further examination showed that both approaches still have problems.

Problems with the Machine Learning technology

The team tested next-generation reservoir computing on a simple chaotic dynamical system: a pendulum with a magnet attached at its end, swinging above three magnets fixed in a triangle on a flat surface.
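
This magnetic pendulum is a classic example of sensitive dependence: which magnet the bob eventually settles over depends delicately on where it starts. Below is a sketch of the standard toy model of such a pendulum, with illustrative parameter values rather than the ones used in the paper.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Three magnets at the vertices of a triangle in the plane.
magnets = np.array([[np.cos(a), np.sin(a)]
                    for a in 2 * np.pi * np.arange(3) / 3])
b, omega2, h = 0.2, 0.5, 0.3   # damping, restoring strength, bob height

def rhs(t, s):
    """Damped pendulum restoring force plus attraction to each magnet."""
    x, y, vx, vy = s
    ax, ay = -omega2 * x - b * vx, -omega2 * y - b * vy
    for mx, my in magnets:                 # inverse-cube attraction
        d3 = ((mx - x)**2 + (my - y)**2 + h**2) ** 1.5
        ax += (mx - x) / d3
        ay += (my - y) / d3
    return [vx, vy, ax, ay]

# The final resting magnet depends sensitively on the starting point.
sol = solve_ivp(rhs, (0, 100), [0.9, 0.3, 0.0, 0.0],
                t_eval=np.linspace(0, 100, 2000))
final = sol.y[:2, -1]
print("settles near magnet", np.argmin(np.linalg.norm(magnets - final, axis=1)))
```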

They discovered that if they gave the model information about the type of nonlinearity needed to describe the system, then it performed well.

“In a sense, you have this kind of information sneaked in before the training begins,” said Zhang.

But when the model was perturbed, the team found that it performed very poorly.

This suggests that the model cannot make accurate predictions unless crucial information about the answer has already been built in.

For conventional reservoir computing, the duo observed that correctly predicting the system requires a long warm-up time, almost as long as the motion of the pendulum it is trying to predict.
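
Warming up means feeding the model observed data until its internal state synchronises with the trajectory of interest; only then can it forecast on its own. A sketch of the idea, with illustrative sizes and a toy transient signal standing in for the pendulum’s swing:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 200
W_in = rng.uniform(-0.5, 0.5, (N, 1))
W = rng.normal(0, 1, (N, N))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))

def warm_up(observations):
    """Drive the reservoir with observed data so its state forgets its
    arbitrary initial condition before closed-loop forecasting starts."""
    r = np.zeros(N)
    for u in observations:
        r = np.tanh(W @ r + W_in @ u)
    return r

# For a transient signal like a pendulum swinging towards rest, the
# warm-up data needed can be nearly as long as the transient itself,
# leaving little of the motion left to predict.
t = np.linspace(0, 30, 1500)
signal = (np.exp(-0.1 * t) * np.cos(3 * t))[:, None]
r0 = warm_up(signal[:1200])   # most of the trajectory consumed as warm-up
```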

The team concluded that addressing these limitations could help researchers better use this emerging framework.
