  • 1
    UID:
    almahu_9948621309402882
    Format: XXIII, 295 p., 40 illus., online resource.
    Edition: 1st ed. 2001.
    ISBN: 9781447103592
    Series Statement: Advances in Computer Vision and Pattern Recognition
    Content: Automatic (machine) recognition, description, classification, and grouping of patterns are important problems in a variety of engineering and scientific disciplines such as biology, psychology, medicine, marketing, computer vision, artificial intelligence, and remote sensing. Given a pattern, its recognition/classification may consist of one of the following two tasks: (1) supervised classification (also called discriminant analysis), in which the input pattern is assigned to one of several predefined classes; (2) unsupervised classification (also called clustering), in which no pattern classes are defined a priori and patterns are grouped into clusters based on their similarity. Interest in the area of pattern recognition has been renewed recently due to emerging applications which are not only challenging but also computationally more demanding (e.g., bioinformatics, data mining, document classification, and multimedia database retrieval). Among the various frameworks in which pattern recognition has been traditionally formulated, the statistical approach has been most intensively studied and used in practice. More recently, neural network techniques and methods imported from statistical learning theory have received increased attention. Neural networks and statistical pattern recognition are two closely related disciplines which share several common research issues. Neural networks have not only provided a variety of novel or supplementary approaches for pattern recognition tasks, but have also offered architectures on which many well-known statistical pattern recognition algorithms can be mapped for efficient (hardware) implementation. On the other hand, neural networks can derive benefit from some well-known results in statistical pattern recognition.
    Note: 1. Quick Overview -- 1.1 The Classifier Design Problem -- 1.2 Single Layer and Multilayer Perceptrons -- 1.3 The SLP as the Euclidean Distance and the Fisher Linear Classifiers -- 1.4 The Generalisation Error of the EDC and the Fisher DF -- 1.5 Optimal Complexity - The Scissors Effect -- 1.6 Overtraining in Neural Networks -- 1.7 Bibliographical and Historical Remarks -- 2. Taxonomy of Pattern Classification Algorithms -- 2.1 Principles of Statistical Decision Theory -- 2.2 Four Parametric Statistical Classifiers -- 2.3 Structures of the Covariance Matrices -- 2.4 The Bayes Predictive Approach to Design Optimal Classification Rules -- 2.5 Modifications of the Standard Linear and Quadratic DF -- 2.6 Nonparametric Local Statistical Classifiers -- 2.7 Minimum Empirical Error and Maximal Margin Linear Classifiers -- 2.8 Piecewise-Linear Classifiers -- 2.9 Classifiers for Categorical Data -- 2.10 Bibliographical and Historical Remarks -- 3. Performance and the Generalisation Error -- 3.1 Bayes, Conditional, Expected, and Asymptotic Probabilities of Misclassification -- 3.2 Generalisation Error of the Euclidean Distance Classifier -- 3.3 Most Favourable and Least Favourable Distributions of the Data -- 3.4 Generalisation Errors for Modifications of the Standard Linear Classifier -- 3.5 Common Parameters in Different Competing Pattern Classes -- 3.6 Minimum Empirical Error and Maximal Margin Classifiers -- 3.7 Parzen Window Classifier -- 3.8 Multinomial Classifier -- 3.9 Bibliographical and Historical Remarks -- 4. Neural Network Classifiers -- 4.1 Training Dynamics of the Single Layer Perceptron -- 4.2 Non-linear Decision Boundaries -- 4.3 Training Peculiarities of the Perceptrons -- 4.4 Generalisation of the Perceptrons -- 4.5 Overtraining and Initialisation -- 4.6 Tools to Control Complexity -- 4.7 The Co-Operation of the Neural Networks -- 4.8 Bibliographical and Historical Remarks -- 5. Integration of Statistical and Neural Approaches -- 5.1 Statistical Methods or Neural Nets? -- 5.2 Positive and Negative Attributes of Statistical Pattern Recognition -- 5.3 Positive and Negative Attributes of Artificial Neural Networks -- 5.4 Merging Statistical Classifiers and Neural Networks -- 5.5 Data Transformations for the Integrated Approach -- 5.6 The Statistical Approach in Multilayer Feed-forward Networks -- 5.7 Concluding and Bibliographical Remarks -- 6. Model Selection -- 6.1 Classification Errors and their Estimation Methods -- 6.2 Simplified Performance Measures -- 6.3 Accuracy of Performance Estimates -- 6.4 Feature Ranking and the Optimal Number of Features -- 6.5 The Accuracy of the Model Selection -- 6.6 Additional Bibliographical Remarks -- Appendices -- A.1 Elements of Matrix Algebra -- A.2 The First Order Tree Type Dependence Model -- A.3 Temporal Dependence Models -- A.4 Pikelis Algorithm for Evaluating Means and Variances of the True, Apparent and Ideal Errors in Model Selection -- A.5 Matlab Codes (the Non-Linear SLP Training, the First Order Tree Dependence Model, and Data Whitening Transformation) -- References.
    In: Springer Nature eBook
    Additional Edition: Printed edition: ISBN 9781447110712
    Additional Edition: Printed edition: ISBN 9781447103608
    Additional Edition: Printed edition: ISBN 9781852332976
    Language: English
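The abstract above distinguishes supervised classification (discriminant analysis), where labels are known in advance, from unsupervised classification (clustering), where patterns are grouped only by similarity. A minimal sketch of both tasks, using hypothetical toy data and plain NumPy:

```python
import numpy as np

# Hypothetical toy data: two 2-D classes drawn around distinct means.
rng = np.random.default_rng(0)
class_a = rng.normal(loc=[0.0, 0.0], scale=0.5, size=(20, 2))
class_b = rng.normal(loc=[3.0, 3.0], scale=0.5, size=(20, 2))

# Supervised classification (discriminant analysis): labels are known, so we
# estimate each class mean and assign a new pattern to the nearest one.
means = np.stack([class_a.mean(axis=0), class_b.mean(axis=0)])

def classify(x):
    """Assign x to the class whose mean is nearest in Euclidean distance."""
    return int(np.argmin(np.linalg.norm(means - x, axis=1)))

# Unsupervised classification (clustering): no labels; patterns are grouped
# by similarity. A few iterations of k-means on the pooled, unlabelled data.
data = np.vstack([class_a, class_b])
centers = np.stack([data[0], data[-1]])  # deterministic initial guesses
for _ in range(10):
    labels = np.argmin(np.linalg.norm(data[:, None] - centers, axis=2), axis=1)
    centers = np.stack([data[labels == k].mean(axis=0) for k in range(2)])
```

The data, means, and the nearest-mean rule here are illustrative assumptions, not code from the book; the book's appendix A.5 provides its own Matlab implementations.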
  • 2
    Online Resource
    London : Springer
    UID:
    gbv_749178744
    Format: Online resource (XXIII, 295 p.), digital
    Edition: Springer eBook Collection. Computer Science
    ISBN: 9781447103592
    Series Statement: Advances in Pattern Recognition
    Content: The classification of patterns is an important area of research which is central to all pattern recognition fields, including speech, image, robotics, and data analysis. Neural networks have been used successfully in a number of these fields, but so far their application has been based on a "black box" approach, with no real understanding of how they work. In this book, Sarunas Raudys - an internationally respected researcher in the area - provides an excellent mathematical and applied introduction to how neural network classifiers work and how they should be used to optimal effect. Among the topics covered are: different types of neural network classifiers; a taxonomy of pattern classification algorithms; performance capabilities and measurement procedures; and which features should be extracted from raw data for the best classification results. This book will provide essential reading for anyone researching or studying relevant areas of pattern recognition (such as image processing, speech recognition, robotics, and multimedia). It will also be of interest to anyone studying or researching applied neural networks.
    Additional Edition: ISBN 9781852332976
    Additional Edition: Also available as printed edition: ISBN 9781447110712
    Additional Edition: Also available as printed edition: ISBN 9781447103608
    Additional Edition: Also available as printed edition: ISBN 9781852332976
    Language: English
    URL: Full text (license required)
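This second abstract promises a mathematical account of how neural network classifiers work, and the table of contents above (section 1.3) relates the single layer perceptron (SLP) to the Euclidean distance classifier (EDC). A hedged sketch of that identity, with made-up class means: expanding the two squared distances cancels the shared ||x||² term, so the nearest-mean rule collapses into a single linear threshold unit, exactly the function an SLP computes.

```python
import numpy as np

# Hypothetical class means for a two-class problem (not from the book).
mu0 = np.array([0.0, 0.0])
mu1 = np.array([3.0, 3.0])

def edc(x):
    """Euclidean distance classifier: pick the class with the nearer mean."""
    return int(np.linalg.norm(x - mu1) < np.linalg.norm(x - mu0))

# Expanding the squared distances cancels the ||x||^2 term, leaving the
# linear threshold unit w.x + b > 0 that a (weight-fixed) SLP computes.
w = mu1 - mu0
b = 0.5 * (mu0 @ mu0 - mu1 @ mu1)

def slp(x):
    """Linear discriminant equivalent to the nearest-mean rule."""
    return int(w @ x + b > 0)

# The two rules agree on arbitrary inputs.
for x in np.random.default_rng(1).normal(1.5, 2.0, size=(100, 2)):
    assert edc(x) == slp(x)
```

The book goes further, analysing what happens to this correspondence when the SLP weights are *trained* rather than fixed from the class means (chapters 1 and 4).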