MIT Press; 2012
In: Neural Computation, MIT Press, Vol. 24, No. 4 (2012-04), p. 967–995
Abstract: In this work, a variational Bayesian framework for efficient training of echo state networks (ESNs) with automatic regularization and delay&sum (D&S) readout adaptation is proposed. The algorithm uses classical batch learning of ESNs. By treating the network echo states as fixed basis functions parameterized with delay parameters, we propose a variational Bayesian ESN training scheme. The variational approach allows for a seamless combination of sparse Bayesian learning ideas and a variational Bayesian space-alternating generalized expectation-maximization (VB-SAGE) algorithm for estimating the parameters of superimposed signals. While the former method realizes automatic regularization of ESNs, which also determines which echo states and input signals are relevant for “explaining” the desired signal, the latter method provides a basis for the joint estimation of D&S readout parameters. The proposed training algorithm can naturally be extended to ESNs with fixed filter neurons. It also generalizes the recently proposed expectation-maximization-based D&S readout adaptation method. The proposed algorithm was tested on synthetic data prediction tasks as well as on dynamic handwritten character recognition.
    Type of Medium: Online Resource
ISSN: 0899-7667, 1530-888X
    Language: English
    Publisher: MIT Press
    Publication Date: 2012
ZDB ID: 1025692-1, 1498403-9
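
For the algorithm summarized in the abstract, a minimal NumPy sketch of a plain echo state network follows. It is an illustrative simplification, not the paper's method: the variational Bayesian training with automatic regularization and D&S readout adaptation is replaced by a fixed ridge-regression readout, and all names and constants (make_reservoir, spectral_radius=0.9, ridge=1e-4) are assumptions.

    # Minimal echo state network sketch (NumPy only). Illustrative
    # simplification of the paper's setting: fixed ridge readout instead of
    # variational Bayesian learning with D&S delay adaptation.
    import numpy as np

    rng = np.random.default_rng(0)

    def make_reservoir(n_in, n_res, spectral_radius=0.9, density=0.1):
        # Random input and reservoir weights; the reservoir matrix is
        # rescaled to the requested spectral radius so the echo state
        # property (fading memory) holds.
        W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
        W = rng.uniform(-0.5, 0.5, (n_res, n_res))
        W *= rng.random((n_res, n_res)) < density  # sparsify
        W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))
        return W_in, W

    def run_reservoir(W_in, W, u):
        # Collect echo states x[t] = tanh(W_in u[t] + W x[t-1]);
        # input u has shape (T, n_in).
        T, n_res = u.shape[0], W.shape[0]
        X = np.zeros((T, n_res))
        x = np.zeros(n_res)
        for t in range(T):
            x = np.tanh(W_in @ u[t] + W @ x)
            X[t] = x
        return X

    def train_readout(X, y, ridge=1e-4):
        # Batch least-squares readout with a fixed Tikhonov penalty; the
        # paper instead infers the regularization automatically and jointly
        # estimates the delay&sum readout delays.
        return np.linalg.solve(X.T @ X + ridge * np.eye(X.shape[1]), X.T @ y)

    # Usage: one-step-ahead prediction of a noisy sine wave.
    t = np.linspace(0, 40 * np.pi, 4000)
    u = (np.sin(t) + 0.05 * rng.standard_normal(t.size))[:, None]
    W_in, W = make_reservoir(n_in=1, n_res=100)
    X = run_reservoir(W_in, W, u[:-1])
    w_out = train_readout(X[200:], u[1:, 0][200:])  # discard washout
    prediction = X @ w_out

The reservoir weights stay fixed after the spectral-radius rescaling; only the linear readout is trained, which is what makes the batch formulation (and, in the paper, its variational Bayesian treatment with the echo states as fixed basis functions) tractable.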