Export
  • 1
    UID: b3kat_BV035788032
    Format: 1 online resource (IX, 276 p., ill., graphs)
    ISBN: 3540522557, 0387522557
    Series Statement: Lecture notes in computer science 412.
    Language: English
    Subjects: Computer Science
    Keywords: Neurocomputer ; Computer network ; Sesimbra ; Neural network ; Conference proceedings
    URL: Full text (license required)
  • 2
    UID: almahu_BV003572595
    Format: IX, 276 p.: ill., graphs
    ISBN: 3-540-52255-7, 0-387-52255-7
    Series Statement: Lecture notes in computer science 412
    Language: English
    Subjects: Computer Science
    Keywords: Neurocomputer ; Computer network ; Neural network ; Conference proceedings
  • 3
    UID: almahu_9948621400902882
    Format: XIII, 279 p., online resource.
    Edition: 1st ed. 1990.
    ISBN: 9783540469391
    Series Statement: Lecture Notes in Computer Science, 412
    Content: The EURASIP workshop contributions collected in this volume have an interdisciplinary character. The authors include psychologists, biologists, engineers and mathematicians as well as computer scientists. The volume starts with two invited papers, by George Cybenko and by Eric Baum, on the formal study of the capabilities of neural networks. The following papers are organized into parts dealing with theory and algorithms, speech processing, image processing, and implementation. The workshop was sponsored by the European Association for Signal Processing without restriction on the origin of participants.
    Note: When are k-nearest neighbor and back propagation accurate for feasible sized sets of examples? -- Complexity theory of neural networks and classification problems -- Generalization performance of overtrained back-propagation networks -- Stability of the random neural network model -- Temporal pattern recognition using EBPS -- Markovian spatial properties of a random field describing a stochastic neural network: Sequential or parallel implementation? -- Chaos in neural networks -- The "moving targets" training algorithm -- Acceleration techniques for the backpropagation algorithm -- Rule-injection hints as a means of improving network performance and learning time -- Inversion in time -- Cellular neural networks: Dynamic properties and adaptive learning algorithm -- Improved simulated annealing, Boltzmann machine, and attributed graph matching -- Artificial dendritic learning -- A neural net model of human short-term memory development -- Large vocabulary speech recognition using neural-fuzzy and concept networks -- Speech feature extraction using neural networks -- Neural network based continuous speech recognition by combining self organizing feature maps and Hidden Markov Modeling -- Ultra-small implementation of a neural halftoning technique -- Application of self-organising networks to signal processing -- A study of neural network applications to signal processing -- Simulation machine and integrated implementation of neural networks -- VLSI implementation of an associative memory based on distributed storage of information.
    In: Springer Nature eBook
    Additional Edition: Printed edition: ISBN 9783662204917
    Additional Edition: Printed edition: ISBN 9783540522553
    Language: English
  • 4
    UID: almahu_9947920909702882
    Format: XIII, 279 p., online resource.
    ISBN: 9783540469391
    Series Statement: Lecture Notes in Computer Science, 412
    Content: The EURASIP workshop contributions collected in this volume have an interdisciplinary character. The authors include psychologists, biologists, engineers and mathematicians as well as computer scientists. The volume starts with two invited papers, by George Cybenko and by Eric Baum, on the formal study of the capabilities of neural networks. The following papers are organized into parts dealing with theory and algorithms, speech processing, image processing, and implementation. The workshop was sponsored by the European Association for Signal Processing without restriction on the origin of participants.
    Note: When are k-nearest neighbor and back propagation accurate for feasible sized sets of examples? -- Complexity theory of neural networks and classification problems -- Generalization performance of overtrained back-propagation networks -- Stability of the random neural network model -- Temporal pattern recognition using EBPS -- Markovian spatial properties of a random field describing a stochastic neural network: Sequential or parallel implementation? -- Chaos in neural networks -- The “moving targets” training algorithm -- Acceleration techniques for the backpropagation algorithm -- Rule-injection hints as a means of improving network performance and learning time -- Inversion in time -- Cellular neural networks: Dynamic properties and adaptive learning algorithm -- Improved simulated annealing, Boltzmann machine, and attributed graph matching -- Artificial dendritic learning -- A neural net model of human short-term memory development -- Large vocabulary speech recognition using neural-fuzzy and concept networks -- Speech feature extraction using neural networks -- Neural network based continuous speech recognition by combining self organizing feature maps and Hidden Markov Modeling -- Ultra-small implementation of a neural halftoning technique -- Application of self-organising networks to signal processing -- A study of neural network applications to signal processing -- Simulation machine and integrated implementation of neural networks -- VLSI implementation of an associative memory based on distributed storage of information.
    In: Springer eBooks
    Additional Edition: Printed edition: ISBN 9783540522553
    Language: English
  • 5
    UID: edocfu_9959186222202883
    Format: 1 online resource (XIII, 279 p.)
    Edition: 1st ed. 1990.
    Edition: Online edition Springer Lecture Notes Archive ; 041142-5
    ISBN: 3-540-46939-7
    Series Statement: Lecture Notes in Computer Science, 412
    Content: The EURASIP workshop contributions collected in this volume have an interdisciplinary character. The authors include psychologists, biologists, engineers and mathematicians as well as computer scientists. The volume starts with two invited papers, by George Cybenko and by Eric Baum, on the formal study of the capabilities of neural networks. The following papers are organized into parts dealing with theory and algorithms, speech processing, image processing, and implementation. The workshop was sponsored by the European Association for Signal Processing without restriction on the origin of participants.
    Note: When are k-nearest neighbor and back propagation accurate for feasible sized sets of examples? -- Complexity theory of neural networks and classification problems -- Generalization performance of overtrained back-propagation networks -- Stability of the random neural network model -- Temporal pattern recognition using EBPS -- Markovian spatial properties of a random field describing a stochastic neural network: Sequential or parallel implementation? -- Chaos in neural networks -- The “moving targets” training algorithm -- Acceleration techniques for the backpropagation algorithm -- Rule-injection hints as a means of improving network performance and learning time -- Inversion in time -- Cellular neural networks: Dynamic properties and adaptive learning algorithm -- Improved simulated annealing, Boltzmann machine, and attributed graph matching -- Artificial dendritic learning -- A neural net model of human short-term memory development -- Large vocabulary speech recognition using neural-fuzzy and concept networks -- Speech feature extraction using neural networks -- Neural network based continuous speech recognition by combining self organizing feature maps and Hidden Markov Modeling -- Ultra-small implementation of a neural halftoning technique -- Application of self-organising networks to signal processing -- A study of neural network applications to signal processing -- Simulation machine and integrated implementation of neural networks -- VLSI implementation of an associative memory based on distributed storage of information.
    In: Springer eBooks
    Additional Edition: ISBN 3-540-52255-7
    Language: English
  • 6
    UID: edoccha_9959186222202883
    Format: 1 online resource (XIII, 279 p.)
    Edition: 1st ed. 1990.
    Edition: Online edition Springer Lecture Notes Archive ; 041142-5
    ISBN: 3-540-46939-7
    Series Statement: Lecture Notes in Computer Science, 412
    Content: The EURASIP workshop contributions collected in this volume have an interdisciplinary character. The authors include psychologists, biologists, engineers and mathematicians as well as computer scientists. The volume starts with two invited papers, by George Cybenko and by Eric Baum, on the formal study of the capabilities of neural networks. The following papers are organized into parts dealing with theory and algorithms, speech processing, image processing, and implementation. The workshop was sponsored by the European Association for Signal Processing without restriction on the origin of participants.
    Note: When are k-nearest neighbor and back propagation accurate for feasible sized sets of examples? -- Complexity theory of neural networks and classification problems -- Generalization performance of overtrained back-propagation networks -- Stability of the random neural network model -- Temporal pattern recognition using EBPS -- Markovian spatial properties of a random field describing a stochastic neural network: Sequential or parallel implementation? -- Chaos in neural networks -- The “moving targets” training algorithm -- Acceleration techniques for the backpropagation algorithm -- Rule-injection hints as a means of improving network performance and learning time -- Inversion in time -- Cellular neural networks: Dynamic properties and adaptive learning algorithm -- Improved simulated annealing, Boltzmann machine, and attributed graph matching -- Artificial dendritic learning -- A neural net model of human short-term memory development -- Large vocabulary speech recognition using neural-fuzzy and concept networks -- Speech feature extraction using neural networks -- Neural network based continuous speech recognition by combining self organizing feature maps and Hidden Markov Modeling -- Ultra-small implementation of a neural halftoning technique -- Application of self-organising networks to signal processing -- A study of neural network applications to signal processing -- Simulation machine and integrated implementation of neural networks -- VLSI implementation of an associative memory based on distributed storage of information.
    In: Springer eBooks
    Additional Edition: ISBN 3-540-52255-7
    Language: English
  • 7
    UID: b3kat_BV003572595
    Format: IX, 276 p., ill., graphs
    ISBN: 3540522557, 0387522557
    Series Statement: Lecture notes in computer science 412
    Language: English
    Subjects: Computer Science
    Keywords: Neurocomputer ; Computer network ; Sesimbra ; Neural network ; Conference proceedings
  • 8
    UID: gbv_02522767X
    Format: IX, 276 p., ill., graphs, 25 cm
    ISBN: 3540522557, 0387522557
    Series Statement: Lecture notes in computer science 412
    Note: Includes bibliographical references
    Additional Edition: Online edition: Neural networks. Berlin [et al.]: Springer, 1990. ISBN 9783540469391
    Additional Edition: Also available as online edition: Almeida, Luis B., Neural Networks. Berlin, Heidelberg: Springer Berlin Heidelberg, 1990. ISBN 9783540469391
    Language: English
    Subjects: Computer Science
    Keywords: Neural network ; Conference proceedings
    URL: Cover
  • 9
    Online Resource
    Berlin, Heidelberg : Springer Berlin Heidelberg
    UID: gbv_1649280343
    Format: Online resource
    ISBN: 9783540469391
    Series Statement: Lecture Notes in Computer Science 412
    Additional Edition: ISBN 9783540522553
    Additional Edition: Print edition under the title: Neural networks. Berlin: Springer, 1990. ISBN 3540522557
    Additional Edition: ISBN 0387522557
    Language: English
    Subjects: Computer Science
    Keywords: Neural network ; Conference proceedings
    URL: Cover
  • 10
    Online Resource
    [San Rafael] : Morgan & Claypool Publishers
    UID: gbv_722652542
    Format: 1 online resource (114 pages)
    Edition: Also available in print
    ISBN: 1598290312, 9781598290318
    Series Statement: Synthesis Lectures on Signal Processing #2
    Content: The purpose of this lecture book is to present the state of the art in nonlinear blind source separation, in a form appropriate for students, researchers and developers. Source separation deals with the problem of recovering sources that are observed in a mixed condition. When we have little knowledge about the sources and about the mixture process, we speak of blind source separation. Linear blind source separation is a relatively well-studied subject. Nonlinear blind source separation is still at a less advanced stage, but has seen several significant developments in the last few years. This publication reviews the main nonlinear separation methods, including the separation of post-nonlinear mixtures, and the MISEP, ensemble learning and kTDSEP methods for generic mixtures. These methods are studied in significant depth. A historical overview is also presented, mentioning most of the relevant results on nonlinear blind source separation that have been presented over the years. (A minimal linear-separation sketch follows this record.)
    Content: Acknowledgments -- Notation -- Preface -- 1. Introduction. -- 1.1. Basic concepts -- 1.2. Summary -- 2. Linear source separation -- 2.1. Statement of the problem -- 2.2. INFOMAX -- 2.3. Exploiting the time-domain structure -- 2.4. Other methods : JADE and FastICA -- 2.5. Summary -- 3. Nonlinear separation -- 3.1. Post-nonlinear mixtures -- 3.2. Unconstrained nonlinear separation -- 3.3. Conclusion -- 4. Final comments -- A. Statistical concepts -- A.1. Passing a random variable through its cumulative distribution function -- A.2. Entropy -- A.3. Kullback-Leibler divergence -- A.4. Mutual information -- B. Online software and data
    Note: Description based upon print version of record. Also available in print. System requirements: Adobe Acrobat Reader. Mode of access: World Wide Web.
    Additional Edition: ISBN 1598290304
    Additional Edition: ISBN 9781598290301
    Additional Edition: Also available as print edition: Nonlinear Source Separation
    Language: English
    Keywords: Electronic books
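The abstract for record 10 above sketches the blind source separation problem: recovering source signals from observations that mix them, with little knowledge of either the sources or the mixing process. As a rough illustration of the linear case the abstract calls well studied, here is a minimal sketch using FastICA from scikit-learn (one of the linear methods named in the record's contents); the source signals and the mixing matrix are invented for the demo, not taken from the book:

import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
t = np.linspace(0, 8, 2000)

# Two independent, non-Gaussian sources (made up for the demo).
s1 = np.sign(np.sin(3 * t))      # square wave
s2 = rng.laplace(size=t.size)    # Laplacian noise
S = np.c_[s1, s2]

# Observations are an unknown linear mixture of the sources.
A = np.array([[1.0, 0.5],
              [0.4, 1.0]])       # hypothetical mixing matrix
X = S @ A.T

# FastICA estimates the sources up to permutation and scaling.
ica = FastICA(n_components=2, random_state=0)
S_hat = ica.fit_transform(X)     # estimated sources, shape (2000, 2)

Nonlinear mixtures, the actual subject of the book, require the more elaborate methods listed in its contents (post-nonlinear models, MISEP, ensemble learning, kTDSEP); the linear sketch above only illustrates the problem setup.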