UID:
almahu_9949983720802882
Extent:
1 online resource (538 pages)
ISBN:
9780128204283, 0128204281
Summary:
"Biosignal Processing and Classification Using Computational Learning and Intelligence: Principles, Algorithms and Applications posits an approach for biosignal processing and classification using computational learning and intelligence, highlighting that the term biosignal refers to all kinds of signals that can be continuously measured and monitored in living beings."--
Note:
Front Cover -- Biosignal Processing and Classification Using Computational Learning and Intelligence -- Copyright -- Contents -- List of figures -- List of contributors -- About the authors -- Alejandro A. Torres-García -- Carlos A. Reyes-García -- Luis Villaseñor-Pineda -- Omar Mendoza-Montoya
Part 1 Introduction -- 1 Introduction to this book -- 1.1 Brief description of the contents of this book -- 1.2 What are the applications and reviews presented in this book? -- 1.3 Who should read this book? -- 2 Biosignals analysis (heart, phonatory system, and muscles) -- 2.1 Introduction -- 2.2 Sensing the heart -- 2.2.1 Electrocardiogram (ECG) -- 2.2.1.1 History of the electrocardiogram -- 2.2.1.2 The electrocardiogram -- 2.2.1.3 The action potential -- 2.2.1.4 The heart's conduction system -- 2.2.1.5 ECG leads -- 2.2.2 Echocardiogram -- 2.3 Sensing human acoustics -- 2.3.1 Speech -- 2.3.2 Infant cry -- 2.4 Electromyography -- References -- 3 Neuroimaging techniques -- 3.1 Introduction -- 3.2 The electroencephalogram -- 3.2.1 Recording -- 3.2.1.1 Digital recordings -- 3.2.1.2 Electrodes and montages -- 3.2.1.3 Artifacts -- 3.2.2 Main EEG characteristics -- 3.2.3 Effect of age -- 3.2.4 Characteristics of the EEG during sleep -- 3.2.5 Factors that affect negatively the EEG at rest -- 3.3 Functional near-infrared spectroscopy (fNIRS) -- 3.3.1 Introduction to optical imaging -- 3.3.2 Differences with other technologies -- 3.3.3 Technical considerations and experimental designs -- 3.3.4 Registration -- 3.3.5 Data analysis -- 3.4 Biosignals in conventional and functional magnetic resonance imaging -- 3.4.1 Basic physical principles -- 3.4.1.1 Spin -- 3.4.1.2 Nuclear magnetic resonance -- 3.4.1.3 Proton excitation and signal reception -- 3.4.1.4 Relaxation: transverse and longitudinal -- 3.4.2 Signal generation in magnetic resonance imaging -- 3.4.3 Contrast -- 3.4.3.1 Types of contrast -- 3.4.4 Pulse sequences -- 3.4.5 Functional magnetic resonance imaging -- 3.4.5.1 Task-based fMRI -- 3.4.5.2 Resting-state fMRI -- 3.4.6 Clinical applications -- 3.5 Conclusion -- Acknowledgments -- References
Part 2 Biosignal processing: From biosignals to features' datasets -- 4 Pre-processing and feature extraction -- 4.1 Preprocessing -- 4.1.1 Filtering -- 4.1.2 Normalization -- 4.1.3 Standardization -- 4.1.4 Independent component analysis -- 4.1.5 Common average reference -- 4.2 Time-domain feature extraction techniques -- 4.2.1 Features extracted from the waveform -- 4.2.1.1 Zero crossing rate -- 4.2.1.2 Slope sign changes -- 4.2.1.3 Waveform length -- 4.2.2 Statistical values -- 4.2.2.1 Skewness -- 4.2.2.2 Kurtosis -- 4.2.3 Features from chaos and fractal theory -- 4.2.3.1 Shannon entropy -- 4.2.3.2 Higuchi fractal dimension -- 4.2.3.3 Katz fractal dimension -- 4.2.3.4 Generalized Hurst exponent (GHE) -- 4.2.4 Auto-regressive models -- 4.3 Frequency-domain feature extraction techniques -- 4.3.1 Discrete Fourier transform -- 4.3.2 Power spectral density -- 4.3.3 Autoregressive model-based spectrum estimation -- 4.4 Time-frequency-domain feature extraction techniques -- 4.4.1 Short-time Fourier transform (STFT) -- 4.4.2 Morlet wavelet (MW) -- 4.4.3 Filter-based Hilbert transform (FHT) -- 4.5 Wavelet transform -- 4.5.1 Continuous wavelet transform -- 4.5.2 Discrete wavelet transform -- 4.6 Empirical mode decomposition -- 4.7 Extracting features -- 4.7.1 Relative wavelet energy (RWE) -- 4.7.2 Instantaneous wavelet energy (IWE) -- 4.7.3 Teager wavelet energy (TWE) -- 4.7.4 Hierarchical wavelet energy (HWE) -- References -- 5 Dimensionality reduction -- 5.1 Introduction -- 5.2 Background -- 5.2.1 Supervised learning -- 5.2.2 Dimensionality reduction -- 5.2.3 Challenges of dimensionality reduction -- 5.2.4 Discussion -- 5.3 Feature selection -- 5.3.1 Filter-based feature selection -- 5.3.2 Wrappers for feature selection -- 5.3.3 Embedded methods -- 5.3.4 Unsupervised feature selection -- 5.3.5 Discussion -- 5.4 Feature transformation -- 5.4.1 Projection-based approaches -- 5.4.2 Preserving structure methods -- 5.4.3 Feature construction -- 5.4.4 Representation learning approaches -- 5.4.5 Discussion -- 5.5 Final remarks -- Acknowledgments -- References
Part 3 Computational learning (machine learning) -- 6 A brief introduction to supervised, unsupervised, and reinforcement learning -- 6.1 Brief history of the area -- 6.2 Machine learning -- 6.2.1 Supervised learning -- 6.2.1.1 Classification -- 6.2.1.2 Decision trees -- 6.2.1.3 Support vector machines -- 6.2.1.4 Instance-based learning -- 6.2.2 Dependency analysis -- 6.2.2.1 Regression and deep neural networks -- 6.2.3 Unsupervised learning -- 6.2.4 Reinforcement learning -- 6.2.5 Other approaches -- 6.3 Conclusions -- References -- 7 Assessing classifier's performance -- 7.1 Introduction -- 7.2 Evaluation methods -- 7.2.1 Hold-out -- 7.2.2 Cross-validation -- 7.2.3 Leave-one-out -- 7.3 Bootstrapping sampling -- 7.3.1 Confidence intervals -- 7.3.2 Bootstrapping aggregating (bagging) -- 7.3.3 Some remarks on data splitting -- 7.4 Evaluation metrics of a classifier -- 7.4.1 Confusion matrix -- 7.4.2 Sensitivity -- 7.4.3 Specificity -- 7.4.4 False negative rate -- 7.4.5 False positive rate -- 7.4.6 Accuracy -- 7.4.7 Precision -- 7.4.8 Fβ score -- 7.4.9 F1 score in multi-class problem -- 7.4.10 Receiver operating characteristic (ROC) curve -- 7.5 Statistical significance -- 7.6 Interpreting results -- 7.6.1 Variance and bias errors -- 7.6.2 Error analysis -- 7.7 A brief summary -- References
Part 4 Computational intelligence -- 8 Fuzzy logic and fuzzy systems -- 8.1 Fuzzy logic -- 8.1.1 Fuzzy set -- 8.1.2 Characteristics of a fuzzy set -- 8.1.3 Difference between fuzzy and crisp sets -- 8.1.4 Operations of fuzzy sets -- 8.1.5 Geometry of fuzzy sets -- 8.1.6 Midpoint paradox -- 8.1.7 Linguistic variables -- 8.1.8 Linguistic hedges -- 8.2 Fuzzy relations -- 8.2.1 Relations in mathematics -- 8.2.2 Inclusion operations -- 8.2.3 Relational products -- 8.2.4 Fuzzy relations -- 8.2.5 Fuzzy implication operators used in fuzzy relational products -- 8.3 Fuzzy inference systems -- 8.3.1 Mamdani fuzzy inference system -- 8.3.2 Sugeno fuzzy inference system -- 8.3.2.1 Fuzzifying method -- 8.3.2.2 Base of fuzzy rules -- 8.3.2.3 Fuzzy inference mechanism -- 8.3.3 Example of the design of a FIS -- 8.3.3.1 Guidelines to obtain fuzzy rules -- 8.3.4 Applications and benefits of FIS -- References -- 9 Neural networks and deep learning -- 9.1 Introduction -- 9.1.1 Basic principles -- 9.2 Convolutional neural networks -- 9.3 Recurrent neural networks -- 9.3.1 Gated recurrent units -- 9.3.2 Bidirectional gated recurrent unit -- 9.3.3 Long short-term memory (LSTMs) -- 9.3.4 Attention mechanisms -- 9.4 Conclusions -- References -- 10 Spiking neural networks and dendrite morphological neural networks: an introduction -- 10.1 Introduction -- 10.2 Spiking neural networks -- 10.2.1 Differences between second and third generation of ANN -- 10.2.2 Spiking neural models -- 10.2.2.1 Leaky integrate and fire (LIF) model -- 10.2.2.2 Hodgkin-Huxley model -- 10.2.2.3 Izhikevich model -- 10.2.3 Coding and decoding pulses train -- 10.2.3.1 Coding: difference between static and temporal patterns -- 10.2.3.2 Decoding using frequency analysis -- 10.2.4 Training -- 10.3 Dendrite morphological neural networks -- 10.3.1 Differences between classical perceptron and dendritic neuron -- 10.3.2 What can we do with this kind of machine? -- 10.3.3 Training -- 10.3.3.1 Ritter's method -- 10.3.3.2 Divide and conquer method (DCM) -- 10.3.3.3 Differential evolution -- 10.4 Chapter conclusions -- Acknowledgments -- References -- 11 Bio-inspired algorithms -- 11.1 Introduction -- 11.2 Genetic algorithms (GA) -- 11.2.1 The GA computational procedure -- 11.2.2 Initialization procedure -- 11.2.3 Selection procedure -- 11.2.3.1 Roulette wheel selection -- 11.2.3.2 Tournament selection -- 11.2.4 Crossover procedure -- 11.2.5 Mutation procedure -- 11.2.6 Replacement procedure -- 11.3 Particle swarm optimization (PSO) -- 11.3.1 The PSO computational procedure -- 11.4 Ant colony optimization (ACO) -- 11.4.1 The ACO computational procedure -- 11.4.2 Pheromone representation -- 11.4.3 Ant based solution construction -- 11.4.4 Pheromone update -- 11.5 Cuckoo search (CS) -- 11.5.1 Lévy flights -- 11.5.2 Cuckoo search using Lévy flights -- 11.5.2.1 Movement via Lévy flights -- 11.5.2.2 Construction of the new solution -- 11.5.2.3 Selection strategy -- 11.6 Artificial bee colony (ABC) -- 11.6.1 The ABC computational procedure -- 11.6.2 Initialization phase -- 11.6.3 Employed bees' phase -- 11.6.4 Onlooker bees' phase -- 11.6.5 Scout bees' phase -- 11.7 Flower pollination algorithm (FPA) -- 11.7.1 Computational procedure for FPA -- 11.7.1.1 Global rule through Lévy flights procedure -- 11.7.1.2 Local rule procedure considering the flower constancy -- 11.7.1.3 Elitist selection procedure -- 11.8 Summary -- References
Part 5 Applications and reviews -- 12 A survey on EEG-based imagined speech classification -- 12.1 Introduction -- 12.2 Background: the indirect methods -- 12.3 Imagined speech classification -- 12.3.1 Approach based on vowels, syllables, or phonemes -- 12.3.2 Approach based on words -- 12.4 Discussion -- 12.5 Conclusions -- 12.6 Future work -- Acknowledgments -- References -- 13 P300-based brain-computer interface for communication and control.
Other edition:
ISBN 9780128201251
Other edition:
ISBN 0128201258
Language:
English