Statistics > Machine Learning

Title: Separation capacity of linear reservoirs with random connectivity matrix

Abstract: We argue that the success of reservoir computing lies in the separation capacity of the reservoirs and show that the expected separation capacity of random linear reservoirs is fully characterised by the spectral decomposition of an associated generalised matrix of moments. Of particular interest are reservoirs with Gaussian matrices that are either symmetric or whose entries are all independent. In the symmetric case, we prove that the separation capacity always deteriorates with time, while for short inputs, separation with large reservoirs is best achieved when the entries of the matrix are scaled with a factor $\rho_T/\sqrt{N}$, where $N$ is the dimension of the reservoir and $\rho_T$ depends on the maximum length of the input time series. In the i.i.d. case, we establish that optimal separation with large reservoirs is consistently achieved when the entries of the reservoir matrix are scaled with the exact factor $1/\sqrt{N}$. We further give upper bounds on the quality of separation as a function of the length of the time series. We complement this analysis with an investigation of the likelihood of this separation and the impact of the chosen architecture on separation consistency.
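As a rough illustration of the setup the abstract describes, the sketch below simulates a linear reservoir with an i.i.d. Gaussian connectivity matrix scaled by $1/\sqrt{N}$ and compares the states reached from two different input series. The update rule $x_t = A x_{t-1} + v\,u_t$, the input weights $v$, and the distance-based proxy for separation are illustrative assumptions, not the paper's definitions.

```python
import numpy as np

def linear_reservoir_state(inputs, A, v):
    """Run the (assumed) linear reservoir update x_t = A x_{t-1} + v * u_t from x_0 = 0
    and return the final state."""
    x = np.zeros(A.shape[0])
    for u in inputs:
        x = A @ x + v * u
    return x

rng = np.random.default_rng(0)
N, T = 500, 20  # reservoir dimension and input length (arbitrary choices)

# i.i.d. Gaussian connectivity with entries scaled by 1/sqrt(N), as in the i.i.d. case above
A = rng.standard_normal((N, N)) / np.sqrt(N)
v = rng.standard_normal(N)  # hypothetical input weight vector

# two distinct scalar input time series of length T
u1 = rng.standard_normal(T)
u2 = rng.standard_normal(T)

x1 = linear_reservoir_state(u1, A, v)
x2 = linear_reservoir_state(u2, A, v)

# crude proxy for separation: distance between reservoir states relative to input distance
print(np.linalg.norm(x1 - x2) / np.linalg.norm(u1 - u2))
```

Repeating the experiment with a different scaling of the entries of $A$ (e.g. $c/\sqrt{N}$ for various $c$) gives a quick, informal feel for the role the scaling factor plays in the results stated above.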
Subjects: Machine Learning (stat.ML); Machine Learning (cs.LG); Probability (math.PR)
MSC classes: 68T07, 60B20, 37M10
Cite as: arXiv:2404.17429 [stat.ML]
  (or arXiv:2404.17429v2 [stat.ML] for this version)

Submission history

From: Youness Boutaib
[v1] Fri, 26 Apr 2024 14:10:55 GMT (813kb,D)
[v2] Wed, 1 May 2024 15:53:49 GMT (813kb,D)
