  • 1
    UID: almahu_9949292205402882
    Format: 1 online resource (xix, 248 pages)
    Edition: 1st ed. 2015.
    ISBN: 1-4302-5990-6
    Series Statement: The expert's voice in machine learning
    Content: Machine learning techniques provide cost-effective alternatives to traditional methods for extracting underlying relationships between information and data and for predicting future events by processing existing information to train models. Efficient Learning Machines explores the major topics of machine learning, including knowledge discovery, classifications, genetic algorithms, neural networking, kernel methods, and biologically-inspired techniques. Mariette Awad and Rahul Khanna’s synthetic approach weaves together the theoretical exposition, design principles, and practical applications of efficient machine learning. Their experiential emphasis, expressed in their close analysis of sample algorithms throughout the book, aims to equip engineers, students of engineering, and system designers to design and create new and more efficient machine learning systems. Readers of Efficient Learning Machines will learn how to recognize and analyze the problems that machine learning technology can solve for them, how to implement and deploy standard solutions to sample problems, and how to design new systems and solutions. Advances in computing performance, storage, memory, unstructured information retrieval, and cloud computing have coevolved with a new generation of machine learning paradigms and big data analytics, which the authors present in the conceptual context of their traditional precursors. Awad and Khanna explore current developments in the deep learning techniques of deep neural networks, hierarchical temporal memory, and cortical algorithms. Nature suggests sophisticated learning techniques that deploy simple rules to generate highly intelligent and organized behaviors with adaptive, evolutionary, and distributed properties. The authors examine the most popular biologically-inspired algorithms, together with a sample application to distributed datacenter management. They also discuss machine learning techniques for addressing problems of multi-objective optimization in which solutions in real-world systems are constrained and evaluated based on how well they perform with respect to multiple objectives in aggregate. Two chapters on support vector machines and their extensions focus on recent improvements to the classification and regression techniques at the core of machine learning.
    Note: Includes index. Contents at a Glance: Chapter 1: Machine Learning; Key Terminology; Developing a Learning Machine; Machine Learning Algorithms; Popular Machine Learning Algorithms; C4.5; k-Means; Support Vector Machines; Apriori; Estimation Maximization; PageRank; AdaBoost (Adaptive Boosting); k-Nearest Neighbors; Naive Bayes; Classification and Regression Trees; Challenging Problems in Data Mining Research; Scaling Up for High-Dimensional Data and High-Speed Data Streams; Mining Sequence Data and Time Series Data; Mining Complex Knowledge from Complex Data; Distributed Data Mining and Mining Multi-Agent Data; Data Mining Process-Related Problems; Security, Privacy, and Data Integrity; Dealing with Nonstatic, Unbalanced, and Cost-Sensitive Data; Summary; References; Chapter 2: Machine Learning and Knowledge Discovery; Knowledge Discovery; Classification; Clustering; Dimensionality Reduction; Collaborative Filtering; Machine Learning: Classification Algorithms; Logistic Regression; Random Forest; Hidden Markov Model; Multilayer Perceptron; Machine Learning: Clustering Algorithms; k-Means Clustering; Fuzzy k-Means (Fuzzy c-Means); Streaming k-Means; Streaming Step; Ball K-Means Step; Machine Learning: Dimensionality Reduction; Singular Value Decomposition; Principal Component Analysis; Lanczos Algorithm; Initialize; Algorithm; Machine Learning: Collaborative Filtering; User-Based Collaborative Filtering; Item-Based Collaborative Filtering; Alternating Least Squares with Weighted-l-Regularization; Machine Learning: Similarity Matrix; Pearson Correlation Coefficient; Spearman Rank Correlation Coefficient; Euclidean Distance; Jaccard Similarity Coefficient; Summary; References; Chapter 3: Support Vector Machines for Classification; SVM from a Geometric Perspective; SVM Main Properties; Hard-Margin SVM; Soft-Margin SVM; Kernel SVM; Multiclass SVM; SVM with Imbalanced Datasets; Improving SVM Computational Requirements; Case Study of SVM for Handwriting Recognition; Preprocessing; Feature Extraction; Hierarchical, Three-Stage SVM; Experimental Results; Complexity Analysis; References; Chapter 4: Support Vector Regression; SVR Overview; SVR: Concepts, Mathematical Model, and Graphical Representation; Kernel SVR and Different Loss Functions: Mathematical Model and Graphical Representation; Bayesian Linear Regression; Asymmetrical SVR for Power Prediction: Case Study; References; Chapter 5: Hidden Markov Model; Discrete Markov Process; Definition 1; Definition 2; Definition 3; Introduction to the Hidden Markov Model; Essentials of the Hidden Markov Model; The Three Basic Problems of HMM; Solutions to the Three Basic Problems of HMM; Solution to Problem 1; Forward Algorithm; Backward Algorithm; Scaling; Solution to Problem 2; Initialization; Recursion; Termination; State Sequence Backtracking; Solution to Problem 3
    Additional Edition: ISBN 1-4302-5989-2
    Language: English
  • 2
    UID: gbv_1778620280
    Format: 1 Online-Ressource (268 p.)
    ISBN: 9781430259893, 9781430259909
    Note: English
    Language: English
  • 3
    UID: almahu_9947388548402882
    Format: XIX, 268 p., 88 illus., online resource.
    ISBN: 9781430259909
    In: Springer eBooks
    Additional Edition: Printed edition: ISBN 9781430259893
    Language: English
  • 4
    UID: almahu_9949301483102882
    Format: 1 online resource (263 pages)
    ISBN: 9781430259909
    Note: Intro -- Contents at a Glance -- Contents -- About the Authors -- About the Technical Reviewers -- Acknowledgments -- Chapter 1: Machine Learning -- Key Terminology -- Developing a Learning Machine -- Machine Learning Algorithms -- Popular Machine Learning Algorithms -- C4.5 -- k-Means -- Support Vector Machines -- Apriori -- Estimation Maximization -- PageRank -- AdaBoost (Adaptive Boosting) -- k-Nearest Neighbors -- Naive Bayes -- Classification and Regression Trees -- Challenging Problems in Data Mining Research -- Scaling Up for High-Dimensional Data and High-Speed Data Streams -- Mining Sequence Data and Time Series Data -- Mining Complex Knowledge from Complex Data -- Distributed Data Mining and Mining Multi-Agent Data -- Data Mining Process-Related Problems -- Security, Privacy, and Data Integrity -- Dealing with Nonstatic, Unbalanced, and Cost-Sensitive Data -- Summary -- References -- Chapter 2: Machine Learning and Knowledge Discovery -- Knowledge Discovery -- Classification -- Clustering -- Dimensionality Reduction -- Collaborative Filtering -- Machine Learning: Classification Algorithms -- Logistic Regression -- Random Forest -- Hidden Markov Model -- Multilayer Perceptron -- Machine Learning: Clustering Algorithms -- k-Means Clustering -- Fuzzy k-Means (Fuzzy c-Means) -- Streaming k-Means -- Streaming Step -- Ball K-Means Step -- Machine Learning: Dimensionality Reduction -- Singular Value Decomposition -- Principal Component Analysis -- Lanczos Algorithm -- Initialize -- Algorithm -- Machine Learning: Collaborative Filtering -- User-Based Collaborative Filtering -- Item-Based Collaborative Filtering -- Alternating Least Squares with Weighted-l-Regularization -- Machine Learning: Similarity Matrix -- Pearson Correlation Coefficient -- Spearman Rank Correlation Coefficient -- Euclidean Distance -- Jaccard Similarity Coefficient -- Summary -- References -- Chapter 3: Support Vector Machines for Classification -- SVM from a Geometric Perspective -- SVM Main Properties -- Hard-Margin SVM -- Soft-Margin SVM -- Kernel SVM -- Multiclass SVM -- SVM with Imbalanced Datasets -- Improving SVM Computational Requirements -- Case Study of SVM for Handwriting Recognition -- Preprocessing -- Feature Extraction -- Hierarchical, Three-Stage SVM -- Experimental Results -- Complexity Analysis -- References -- Chapter 4: Support Vector Regression -- SVR Overview -- SVR: Concepts, Mathematical Model, and Graphical Representation -- Kernel SVR and Different Loss Functions: Mathematical Model and Graphical Representation -- Bayesian Linear Regression -- Asymmetrical SVR for Power Prediction: Case Study -- References -- Chapter 5: Hidden Markov Model -- Discrete Markov Process -- Definition 1 -- Definition 2 -- Definition 3 -- Introduction to the Hidden Markov Model -- Essentials of the Hidden Markov Model -- The Three Basic Problems of HMM -- Solutions to the Three Basic Problems of HMM -- Solution to Problem 1 -- Forward Algorithm -- Backward Algorithm -- Scaling -- Solution to Problem 2 -- Initialization -- Recursion -- Termination -- State Sequence Backtracking -- Solution to Problem 3 -- Continuous Observation HMM -- Multivariate Gaussian Mixture Model -- Example: Workload Phase Recognition -- Monitoring and Observations -- Workload and Phase -- Mixture Models for Phase Detection -- Sensor Block -- Model Reduction Block -- Emission Block -- Training Block -- Parameter Estimation Block -- Phase Prediction Model -- State Forecasting Block -- System Adaptation -- References -- Chapter 6: Bioinspired Computing: Swarm Intelligence -- Applications -- Evolvable Hardware -- Bioinspired Networking -- Datacenter Optimization -- Bioinspired Computing Algorithms -- Swarm Intelligence -- Ant Colony Optimization Algorithm -- Particle Swarm Optimization -- Artificial Bee Colony Algorithm -- Bacterial Foraging Optimization Algorithm -- Artificial Immune System -- Distributed Management in Datacenters -- Workload Characterization -- Thermal Optimization -- Load Balancing -- Algorithm Model -- References -- Chapter 7: Deep Neural Networks -- Introducing ANNs -- Early ANN Structures -- Classical ANN -- ANN Training and the Backpropagation Algorithm -- DBN Overview -- Restricted Boltzmann Machines -- DNN-Related Research -- DNN Applications -- Parallel Implementations to Speed Up DNN Training -- Deep Networks Similar to DBN -- References -- Chapter 8: Cortical Algorithms -- Cortical Algorithm Primer -- Cortical Algorithm Structure -- Training of Cortical Algorithms -- Unsupervised Feedforward -- Supervised Feedback -- Weight Update -- Experimental Results -- Modified Cortical Algorithms Applied to Arabic Spoken Digits: Case Study -- Entropy-Based Weight Update Rule -- Experimental Validation -- References -- Chapter 9: Deep Learning -- Overview of Hierarchical Temporal Memory -- Hierarchical Temporal Memory Generations -- Sparse Distributed Representation -- Algorithmic Implementation -- Spatial Pooler -- Temporal Pooler -- Related Work -- Overview of Spiking Neural Networks -- Hodgkin-Huxley Model -- Integrate-and-Fire Model -- Leaky Integrate-and-Fire Model -- Izhikevich Model -- Thorpe's Model -- Information Coding in SNN -- Learning in SNN -- SNN Variants and Extensions -- Evolving Spiking Neural Networks -- Reservoir-Based Evolving Spiking Neural Networks -- Dynamic Synaptic Evolving Spiking Neural Networks -- Probabilistic Spiking Neural Networks -- Conclusion -- References -- Chapter 10: Multiobjective Optimization -- Formal Definition -- Pareto Optimality -- Dominance Relationship -- Performance Measure -- Machine Learning: Evolutionary Algorithms -- Genetic Algorithm -- Genetic Programming -- Multiobjective Optimization: An Evolutionary Approach -- Weighted-Sum Approach -- Vector-Evaluated Genetic Algorithm -- Multiobjective Genetic Algorithm -- Niched Pareto Genetic Algorithm -- Nondominated Sorting Genetic Algorithm -- Strength Pareto Evolutionary Algorithm -- Strength of Solutions -- Fitness of P Solutions -- Clustering -- Strength Pareto Evolutionary Algorithm II -- Pareto Archived Evolutionary Strategy -- Pareto Envelope-Based Selection Algorithm -- Pareto Envelope-Based Selection Algorithm II -- Elitist Nondominated Sorting Genetic Algorithm -- Example: Multiobjective Optimization -- Objective Functions -- References -- Chapter 11: Machine Learning in Action: Examples -- Viable System Modeling -- Example 1: Workload Fingerprinting on a Compute Node -- Phase Determination -- Fingerprinting -- Size Attribute -- Phase Attribute -- Pattern Attribute -- Forecasting -- Example 2: Dynamic Energy Allocation -- Learning Process: Feature Selection -- Learning Process: Optimization Planning -- Learning Process: Monitoring -- Model Training: Procedure and Evaluation -- Example 3: System Approach to Intrusion Detection -- Modeling Scheme -- Observed (Emission) States -- Hidden States -- Intrusion Detection System Architecture -- Profiles and System Considerations -- Sensor Data Measurements -- Summary -- References -- Index.
    Additional Edition: Print version: Awad, Mariette. Efficient Learning Machines. Berkeley, CA: Apress L.P., c2015. ISBN 9781430259893
    Language: English
    Keywords: Electronic books.