    Online resource
    Cambridge, England: Cambridge University Press
    UID:
    edocfu_9961532112802883
    Extent: 1 online resource (294 pages), digital, PDF file(s).
    Edition: First edition.
    ISBN: 1-009-00770-X, 1-009-00847-1, 1-009-00676-2
    Content: Multivariate biomarker discovery is increasingly important in the realm of biomedical research and is poised to become a crucial facet of personalized medicine. This will prompt demand for a myriad of novel biomarkers representing distinct 'omic' biosignatures, allowing treatments to be selected and tailored to the individual characteristics of each patient. This concise and self-contained book covers all aspects of predictive modeling for biomarker discovery based on high-dimensional data, as well as modern data science methods for the identification of parsimonious and robust multivariate biomarkers for medical diagnosis, prognosis, and personalized medicine. It provides a detailed description of state-of-the-art methods for parallel multivariate feature selection and supervised learning algorithms for regression and classification, as well as methods for the proper validation of multivariate biomarkers and of the predictive models implementing them. This is an invaluable resource for scientists and students interested in bioinformatics, data science, and related areas.
    Note: Title from publisher's bibliographic system (viewed on 30 May 2024).
    Table of contents: Cover -- Half-title -- Title page -- Imprints page -- Dedication -- Table of Contents -- Preface -- Acknowledgments -- Part I Framework for Multivariate Biomarker Discovery -- 1 Introduction -- 1.1 Biomarkers and Multivariate Biomarkers -- 1.2 Biomarkers and Personalized Medicine -- 1.3 Biomarker Studies -- 1.4 Basic Terms and Concepts -- 2 Multivariate Analytics Based on High-Dimensional Data: Concepts and Misconceptions -- 2.1 Introduction -- 2.2 High-Dimensional Data and the Curse of Dimensionality -- 2.3 Multivariate and Univariate Approaches -- 2.4 Supervised and Unsupervised Approaches -- 3 Predictive Modeling for Biomarker Discovery -- 3.1 Regression versus Classification -- 3.2 Parametric and Nonparametric Methods -- 3.3 Predictive Modeling for Multivariate Biomarker Discovery -- 3.3.1 Additional Data Evaluation and Preparation -- 3.3.2 Training and Test Sets -- 3.3.3 Parallel Multivariate Feature Selection Experiments -- 3.3.4 Selecting an Optimal Multivariate Biomarker -- 3.3.4.1 One-Standard-Error Method -- 3.3.4.2 Tolerance Method -- 3.3.4.3 Simultaneous Evaluation of Several Performance Metrics -- 3.3.5 Hyperparameters -- 3.3.6 Building, Tuning, and Validating Predictive Models Based on the Optimal Multivariate Biomarker -- 3.3.7 Testing Predictive Models on the Test Data -- 3.4 Bias-Variance Tradeoff -- 3.4.1 Notes on Overparameterization -- 3.5 Screening Biomarkers, Segmentation Models, and Biomarkers for Personalized Medicine -- 3.6 Committees of Predictive Models -- 3.6.1 Committees of Regression Models -- 3.6.2 Committees of Classification Models -- 4 Evaluation of Predictive Models -- 4.1 Methods of Model Evaluation -- 4.1.1 Testing the Final Predictive Model -- 4.1.1.1 Independent Test Data Set -- 4.1.1.2 Holdout Set -- 4.1.2 Evaluating Intermediate Models -- 4.1.2.1 Internal Validation (Improper) -- 4.1.2.2 Proper, that is, External Validation -- 4.1.3 Resampling Techniques -- 4.1.3.1 Bootstrapping -- 4.1.3.2 Cross-Validation -- 4.2 Evaluating Regression Models -- 4.2.1 Metrics of Fit to the Training Data -- 4.2.2 Metrics of Performance on Test Data -- 4.3 Evaluating Classifiers Differentiating between Two Classes -- 4.3.1 Confusion Matrix -- 4.3.2 Basic Performance Metrics for Binary Classifiers -- 4.3.3 Proper and Improper Interpretation of Sensitivity and Specificity -- 4.3.4 Other Important Performance Metrics for Binary Classifiers -- 4.3.5 ROC Curves and the Area Under the ROC Curve -- 4.3.6 A Few More Metrics of Performance -- 4.3.6.1 Balanced Accuracy -- 4.3.6.2 F1-Measure -- 4.3.6.3 Kappa -- 4.4 Evaluating Multiclass Classifiers -- 4.4.1 Classifying All Classes Simultaneously -- 4.4.2 Multistage Approach to Multiclass Classification -- 4.4.3 One-Versus-One Approach -- 4.4.4 One-Versus-Rest Approach -- 4.5 More on Incorporating Misclassification Costs -- 4.5.1 Cost Matrix -- 4.5.2 Cost-Sensitive Classification -- 5 Multivariate Feature Selection -- 5.1 Introduction -- 5.2 General Characteristics of Feature Selection Methods -- 5.2.1 Feature Selection Search Models -- 5.2.2 Feature Selection Search Strategies -- 5.3 Multiple Feature Selection Experiments -- 5.3.1 Design of a Feature Selection Experiment -- 5.3.2 Feature Selection with Recursive Feature Elimination -- 5.3.3 Feature Selection with Stepwise Hybrid Search -- 5.3.3.1 Basic T²-Driven Hybrid Feature Selection -- 5.3.3.2 Extensions to the Basic T²-Driven Hybrid Feature Selection -- 5.3.3.3 Calculating T² -- 5.4 Some Other Feature Selection Algorithms -- 5.4.1 Feature Selection with Simulated Annealing -- 5.4.2 Feature Selection with Genetic Algorithms -- 5.4.3 Feature Selection with Particle Swarm Optimization -- Part II Regression Methods for Estimation -- 6 Basic Regression Methods -- 6.1 Multiple Regression -- 6.1.1 Some Other Considerations for Multiple Regression -- 6.1.2 Issues with Multiple Regression -- 6.2 Partial Least Squares Regression -- 6.2.1 PLS1 Algorithm -- 6.2.2 PLS2 Approaches -- 6.2.3 A Note on Principal Component Regression -- 7 Regularized Regression Methods -- 7.1 Introduction -- 7.2 Ridge Regression -- 7.3 The Lasso (Least Absolute Shrinkage and Selection Operator) -- 7.3.1 Least Angle Regression for the Lasso -- 7.4 The Elastic Net -- 7.5 Notes on Lq-Penalized Least Squares Estimates -- 8 Regression with Random Forests -- 8.1 Introduction -- 8.2 Random Forests Algorithm for Regression -- 8.2.1 Splitting a Node -- 8.2.2 Modeling and Evaluation -- 8.3 Variable Importance Measures -- 9 Support Vector Regression -- 9.1 Introduction -- 9.1.1 A Snapshot of Support Vector Machines for Classification -- 9.1.2 ε-Insensitive Error Functions -- 9.2 Linear Support Vector Regression -- 9.2.1 The Primal Optimization Problem for Linear SVR -- 9.2.2 The Dual Optimization Problem for Linear SVR -- 9.2.3 Linear Support Vector Regression Machine -- 9.3 Nonlinear Support Vector Regression -- Part III Classification Methods -- 10 Classification with Random Forests -- 10.1 Introduction -- 10.2 Random Forests Algorithm for Classification -- 10.2.1 Node-Splitting Criteria -- 10.2.2 Classification -- 10.3 Variable Importance -- 10.3.1 Permutation-Based Variable Importance -- 10.3.2 Impurity-Based Variable Importance -- 10.4 Notes on Feature Selection -- 11 Classification with Support Vector Machines -- 11.1 Introduction -- 11.2 Linear Support Vector Classification -- 11.2.1 The Primal Optimization Problem for Linear SVM -- 11.2.2 The Dual Optimization Problem for Linear SVM -- 11.2.3 Linear Support Vector Classification Machine -- 11.3 Nonlinear Support Vector Classification -- 11.4 Hyperparameters -- 11.5 Variable Importance -- 11.6 Cost-Sensitive SVMs -- 12 Discriminant Analysis -- 12.1 Introduction -- 12.2 Fisher's Discriminant Analysis -- 12.3 Gaussian Discriminant Analysis -- 12.4 Partial Least Squares Discriminant Analysis -- 13 Neural Networks and Deep Learning -- 13.1 Introduction -- 13.2 Network Topology -- 13.2.1 Adding the Bias -- 13.2.2 Nonlinear Activation -- 13.3 Backpropagation -- 13.3.1 Backpropagation through the Output Layer -- 13.3.2 Backpropagation through Any Number of Hidden Layers -- 13.3.3 Derivatives of Nonlinear Activation Functions -- 13.4 Classification of Medical Images: Deep Convolutional Networks -- 13.5 Deep Learning: Overfitting and Regularization -- 13.5.1 Regularization of the Loss Function -- 13.5.2 Dropout -- 13.5.3 Augmentation of the Training Data -- Part IV Biomarker Discovery via Multistage Signal Enhancement and Identification of Essential Patterns -- 14 Multistage Signal Enhancement -- 14.1 Introduction -- 14.2 Removing Variables Representing Experimental Noise -- 14.3 Removing Variables with Unreliable Measurements -- 14.4 Determining an Optimal Size of the Biomarker and Removing Variables with No Multivariate Importance for Prediction -- 14.4.1 Determining the Optimal Size of a Multivariate Biomarker -- 14.4.2 Removing Variables with No Multivariate Importance for Prediction -- 14.4.3 The Pool of Potentially Important Variables -- 15 Essential Patterns, Essential Variables, and Interpretable Biomarkers -- 15.1 Introduction -- 15.2 Groups of Variables with Similar Patterns -- 15.3 Essential Patterns -- 15.4 Essential Variables and Interpretable Biomarkers -- 15.5 Ancillary Information -- 15.5.1 Hierarchical Clustering: A Snapshot -- 15.5.2 Self-Organizing Maps: A Snapshot -- Part V Multivariate Biomarker Discovery Studies -- 16 Biomarker Discovery Study 1: Searching for Essential Gene Expression Patterns and Multivariate Biomarkers That Are Common for Multiple Types of Cancers -- 16.1 Introduction -- 16.2 Data -- 16.3 Determining an Optimal Size of Multivariate Biomarker -- 16.4 Identifying the Pool of Potentially Important Variables -- 16.5 Identifying Essential Patterns -- 16.6 Essential Variables of the Essential Patterns -- 16.7 Building and Testing the Final Multivariate Biomarker -- 16.8 Ancillary Information -- 17 Biomarker Discovery Study 2: Multivariate Biomarkers for Liver Cancer -- 17.1 Introduction -- 17.2 Data -- 17.2.1 RNA-Seq Liver Cancer Data from The Cancer Genome Atlas -- 17.2.2 Training and Test Sets -- 17.2.3 Removing Variables Representing Experimental Noise -- 17.2.4 Low-Dimensional Visualization of the Training Data -- 17.3 Feature Selection Experiments with Recursive Feature Elimination and Random Forests -- 17.3.1 Resampling without Balancing (Substudy 1) -- 17.3.2 Resampling with Rebalancing Class Proportions (Substudy 2) -- 17.3.3 Performing Feature Selection Experiments -- 17.3.4 Results of Feature Selection Experiments -- 17.3.5 Selecting Optimal Multivariate Biomarkers -- 17.3.6 Notes on Misclassification Costs -- 17.4 Feature Selection Experiments with Recursive Feature Elimination and Support Vector Machines (Substudy 3) -- 17.5 Summary of the Three Biomarker Discovery Substudies -- 17.6 Testing the Three Final Multivariate Biomarkers and Classifiers -- References -- Index.
    Other edition: ISBN 1-316-51870-1
    Language: English