Format:
Online resource (xv, 225 p.), ill., 24 cm
Edition:
Online edition, 2011. Electronic reproduction; available via World Wide Web
ISBN:
9814271063, 9789814271066
Series Statement:
Series in Machine Perception and Artificial Intelligence; v. 75
Content:
Researchers from various disciplines such as pattern recognition, statistics, and machine learning have explored the use of ensemble methodology since the late seventies. Given the growing interest in the field, they now face a wide variety of methods. This book aims to impose a degree of order on this diversity by presenting a coherent and unified repository of ensemble methods, theories, trends, challenges, and applications. The book describes the classical methods in detail, as well as the extensions and novel approaches developed recently. Along with algorithmic descriptions of ...
Note:
Includes bibliographical references (p. 185-222) and index
Preface; Contents; 1. Introduction to Pattern Classification; 1.1 Pattern Classification; 1.2 Induction Algorithms; 1.3 Rule Induction; 1.4 Decision Trees; 1.5 Bayesian Methods; 1.5.1 Overview; 1.5.2 Naïve Bayes; 1.5.2.1 The Basic Naïve Bayes Classifier; 1.5.2.2 Naïve Bayes Induction for Numeric Attributes; 1.5.2.3 Correction to the Probability Estimation; 1.5.2.4 Laplace Correction; 1.5.2.5 No Match; 1.5.3 Other Bayesian Methods; 1.6 Other Induction Methods; 1.6.1 Neural Networks; 1.6.2 Genetic Algorithms; 1.6.3 Instance-based Learning; 1.6.4 Support Vector Machines
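[Illustrative sketch, not part of the catalog record: the Laplace correction named in Sec. 1.5.2.4 above can be shown with a minimal from-scratch categorical Naïve Bayes classifier. The helper name train_nb and the toy weather data are hypothetical assumptions, not taken from the book.]

    import math
    from collections import Counter

    def train_nb(X, y, alpha=1.0):
        """Categorical Naive Bayes with Laplace (add-alpha) correction:
        unseen attribute values get a small nonzero probability, not zero."""
        classes = Counter(y)
        n_attrs = len(X[0])
        # counts[c][j][v] = number of class-c examples with attribute j equal to v
        counts = {c: [Counter() for _ in range(n_attrs)] for c in classes}
        values = [set() for _ in range(n_attrs)]
        for xi, yi in zip(X, y):
            for j, v in enumerate(xi):
                counts[yi][j][v] += 1
                values[j].add(v)

        def predict(x):
            best, best_score = None, float("-inf")
            for c, nc in classes.items():
                score = math.log(nc / len(y))  # log class prior
                for j, v in enumerate(x):
                    num = counts[c][j][v] + alpha        # Laplace-corrected count
                    den = nc + alpha * len(values[j])    # smoothed denominator
                    score += math.log(num / den)
                if score > best_score:
                    best, best_score = c, score
            return best
        return predict

    X = [("sunny", "hot"), ("sunny", "mild"), ("rain", "mild"), ("rain", "cool")]
    y = ["no", "no", "yes", "yes"]
    print(train_nb(X, y)(("rain", "hot")))  # -> "yes"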
2. Introduction to Ensemble Learning; 2.1 Back to the Roots; 2.2 The Wisdom of Crowds; 2.3 The Bagging Algorithm; 2.4 The Boosting Algorithm; 2.5 The AdaBoost Algorithm; 2.6 No Free Lunch Theorem and Ensemble Learning; 2.7 Bias-Variance Decomposition and Ensemble Learning; 2.8 Occam's Razor and Ensemble Learning; 2.9 Classifier Dependency; 2.9.1 Dependent Methods; 2.9.1.1 Model-guided Instance Selection; 2.9.1.2 Basic Boosting Algorithms; 2.9.1.3 Advanced Boosting Algorithms; 2.9.1.4 Incremental Batch Learning; 2.9.2 Independent Methods; 2.9.2.1 Bagging; 2.9.2.2 Wagging
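[Illustrative sketch, not part of the catalog record: a minimal comparison of bagging (Sec. 2.3), an independent method, and AdaBoost (Sec. 2.5), a dependent method, assuming scikit-learn is available; the synthetic dataset and hyperparameters are arbitrary choices.]

    from sklearn.datasets import make_classification
    from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier
    from sklearn.model_selection import cross_val_score
    from sklearn.tree import DecisionTreeClassifier

    X, y = make_classification(n_samples=500, random_state=0)

    # Bagging: members trained independently on bootstrap samples (Secs. 2.3, 2.9.2.1)
    bag = BaggingClassifier(DecisionTreeClassifier(), n_estimators=50, random_state=0)
    # AdaBoost: each member reweights the instances its predecessor got wrong (Sec. 2.5)
    boost = AdaBoostClassifier(n_estimators=50, random_state=0)

    for name, clf in [("bagging", bag), ("adaboost", boost)]:
        print(name, cross_val_score(clf, X, y, cv=5).mean())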
2.9.2.3 Random Forest and Random Subspace Projection; 2.9.2.4 Non-Linear Boosting Projection (NLBP); 2.9.2.5 Cross-validated Committees; 2.9.2.6 Robust Boosting; 2.10 Ensemble Methods for Advanced Classification Tasks; 2.10.1 Cost-Sensitive Classification; 2.10.2 Ensemble for Learning Concept Drift; 2.10.3 Reject Driven Classification; 3. Ensemble Classification; 3.1 Fusion Methods; 3.1.1 Weighting Methods; 3.1.2 Majority Voting; 3.1.3 Performance Weighting; 3.1.4 Distribution Summation; 3.1.5 Bayesian Combination; 3.1.6 Dempster-Shafer; 3.1.7 Vogging; 3.1.8 Naïve Bayes
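[Illustrative sketch, not part of the catalog record: majority voting (Sec. 3.1.2) fuses the outputs of heterogeneous members, shown here with scikit-learn's VotingClassifier; the three member models and the iris data are arbitrary choices.]

    from sklearn.datasets import load_iris
    from sklearn.ensemble import VotingClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.naive_bayes import GaussianNB
    from sklearn.tree import DecisionTreeClassifier

    X, y = load_iris(return_X_y=True)
    vote = VotingClassifier(
        estimators=[("lr", LogisticRegression(max_iter=1000)),
                    ("nb", GaussianNB()),
                    ("dt", DecisionTreeClassifier())],
        voting="hard",  # plain majority vote; "soft" would average probabilities
    )
    print(vote.fit(X, y).score(X, y))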
3.1.9 Entropy Weighting; 3.1.10 Density-based Weighting; 3.1.11 DEA Weighting Method; 3.1.12 Logarithmic Opinion Pool; 3.1.13 Order Statistics; 3.2 Selecting Classification; 3.2.1 Partitioning the Instance Space; 3.2.1.1 The K-Means Algorithm as a Decomposition Tool; 3.2.1.2 Determining the Number of Subsets; 3.2.1.3 The Basic K-Classifier Algorithm; 3.2.1.4 The Heterogeneity Detecting K-Classifier (HDK-Classifier); 3.2.1.5 Running-Time Complexity; 3.3 Mixture of Experts and Meta Learning; 3.3.1 Stacking; 3.3.2 Arbiter Trees; 3.3.3 Combiner Trees; 3.3.4 Grading; 3.3.5 Gating Network
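[Illustrative sketch, not part of the catalog record: stacking (Sec. 3.3.1) trains a meta-learner on the base classifiers' predictions, shown here with scikit-learn's StackingClassifier; the member models, data, and split are arbitrary choices.]

    from sklearn.datasets import load_breast_cancer
    from sklearn.ensemble import StackingClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split
    from sklearn.naive_bayes import GaussianNB
    from sklearn.tree import DecisionTreeClassifier

    X, y = load_breast_cancer(return_X_y=True)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    stack = StackingClassifier(
        estimators=[("nb", GaussianNB()), ("dt", DecisionTreeClassifier())],
        final_estimator=LogisticRegression(max_iter=1000),
        cv=5,  # out-of-fold base predictions feed the meta-learner, limiting leakage
    )
    print(stack.fit(X_tr, y_tr).score(X_te, y_te))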
4. Ensemble Diversity; 4.1 Overview; 4.2 Manipulating the Inducer; 4.2.1 Manipulation of the Inducer's Parameters; 4.2.2 Starting Point in Hypothesis Space; 4.2.3 Hypothesis Space Traversal; 4.3 Manipulating the Training Samples; 4.3.1 Resampling; 4.3.2 Creation; 4.3.3 Partitioning; 4.4 Manipulating the Target Attribute Representation; 4.4.1 Label Switching; 4.5 Partitioning the Search Space; 4.5.1 Divide and Conquer; 4.5.2 Feature Subset-based Ensemble Methods; 4.5.2.1 Random-based Strategy; 4.5.2.2 Reduct-based Strategy; 4.5.2.3 Collective-Performance-based Strategy; 4.5.2.4 Feature Set Partitioning
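[Illustrative sketch, not part of the catalog record: a feature subset-based ensemble following the random-based strategy (Sec. 4.5.2.1), approximated here with scikit-learn's BaggingClassifier drawing a random feature subset per member; all settings are arbitrary choices, not the book's experiments.]

    from sklearn.datasets import load_breast_cancer
    from sklearn.ensemble import BaggingClassifier
    from sklearn.model_selection import cross_val_score
    from sklearn.tree import DecisionTreeClassifier

    X, y = load_breast_cancer(return_X_y=True)
    subspace = BaggingClassifier(
        DecisionTreeClassifier(),
        n_estimators=50,
        bootstrap=False,    # keep every training instance ...
        max_features=0.5,   # ... but give each tree a random 50% of the features
        random_state=0,
    )
    print(cross_val_score(subspace, X, y, cv=5).mean())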
Additional Edition:
Print version: Pattern Classification Using Ensemble Methods
Language:
English