UID:
almahu_9948025543202882
Format:
1 online resource (454 pages)
ISBN:
0-12-811789-3, 0-12-811788-5
Content:
"A thorough and accessible presentation of machine learning techniques that can be employed by space weather professionals. Additionally, it presents an overview of real-world applications in space science to the machine learning community, offering a bridge between the fields. As this volume demonstrates, real advances in space weather can be gained using nontraditional approaches that take into account nonlinear and complex dynamics, including information theory, nonlinear auto-regression models, neural networks and clustering algorithms. Offering practical techniques for translating the huge amount of information hidden in data into useful knowledge that allows for better prediction, this book is a unique and important resource for space physicists, space weather professionals and computer scientists in related fields"--Page 4 of cover.
Note:
Front Cover -- Machine Learning Techniques for Space Weather -- Copyright -- Contents -- Contributors
Introduction -- Machine Learning and Space Weather -- Scope and Structure of the Book -- Acknowledgments -- References
Part I: Space Weather
Chapter 1: Societal and Economic Importance of Space Weather -- 1 What is Space Weather? -- 2 Why Now? -- 3 Impacts -- 3.1 Geomagnetically Induced Currents -- 3.2 Global Navigation Satellite Systems -- 3.3 Single-Event Effects -- 3.4 Other Radio Systems -- 3.5 Satellite Drag -- 4 Looking to the Future -- 5 Summary and Conclusions -- Acknowledgments -- References
Chapter 2: Data Availability and Forecast Products for Space Weather -- 1 Introduction -- 2 Data and Models Based on Machine Learning Approaches -- 3 Space Weather Agencies -- 3.1 Government Agencies -- 3.1.1 NOAA's Data and Products -- 3.1.2 NASA -- 3.1.3 European Space Agency -- 3.1.4 The US Air Force Weather Wing -- 3.2 Academic Institutions -- 3.2.1 Kyoto University, Japan -- 3.2.2 Rice University, USA -- 3.2.3 Laboratory for Atmospheric and Space Physics, USA -- 3.3 Commercial Providers -- 3.4 Other Nonprofit, Corporate Research Agencies -- 3.4.1 USGS -- 3.4.2 JHU Applied Physics Lab -- 3.4.3 US Naval Research Lab -- 3.4.4 Other International Service Providers -- 4 Summary -- References
Part II: Machine Learning
Chapter 3: An Information-Theoretical Approach to Space Weather -- 1 Introduction -- 2 Complex Systems Framework -- 3 State Variables -- 4 Dependency, Correlations, and Information -- 4.1 Mutual Information as a Measure of Nonlinear Dependence -- 4.2 Cumulant-Based Cost as a Measure of Nonlinear Dependence -- 4.3 Causal Dependence -- 4.4 Transfer Entropy and Redundancy as Measures of Causal Relations -- 4.5 Conditional Redundancy -- 4.6 Significance of Discriminating Statistics -- 4.7 Mutual Information and Information Flow -- 5 Examples From Magnetospheric Dynamics -- 6 Significance as an Indicator of Changes in Underlying Dynamics -- 6.1 Detecting Dynamics in a Noisy System -- 6.2 Cumulant-Based Information Flow -- 7 Discussion -- 8 Summary -- Acknowledgments -- References
Chapter 4: Regression -- 1 What is Regression? -- 2 Learning From Noisy Data -- 2.1 Prediction Errors -- 2.2 A Probabilistic Set-Up -- 2.3 The Least Squares Method for Linear Regression -- 2.3.1 The Least Squares Method and the Best Linear Predictor -- 2.3.2 The Least Squares Method and the Maximum Likelihood Principle -- 2.3.3 A More General Approach and Higher-Order Predictors -- 2.4 Overfitting -- 2.4.1 The Order Selection Problem -- Error Decomposition: The Bias Versus Variance Trade-Off -- Some Popular Order Selection Criteria -- 2.4.2 Regularization -- 2.5 From Point Predictors to Interval Predictors -- 2.5.1 Distribution-Free Interval Predictors -- 2.6 Probability Density Estimation -- 3 Predictions Without Probabilities -- 3.1 Approximation Theory -- Dense Sets -- Best Approximator -- 3.1.1 Neural Networks -- The Backpropagation Algorithm: High-Level Idea -- Multiple Layers Networks (Deep Networks) -- 4 Probabilities Everywhere: Bayesian Regression -- 4.1 Gaussian Process Regression -- 5 Learning in the Presence of Time: Identification of Dynamical Systems -- 5.1 Linear Time-Invariant Systems -- 5.2 Nonlinear Systems -- References
Chapter 5: Supervised Classification: Quite a Brief Overview -- 1 Introduction -- 1.1 Learning, Not Modeling -- 1.2 An Outline -- 2 Classifiers -- 2.1 Preliminaries -- 2.2 The Bayes Classifier -- 2.3 Generative Probabilistic Classifiers -- 2.4 Discriminative Probabilistic Classifiers -- 2.5 Losses and Hypothesis Spaces -- 2.5.1 0-1 Loss -- 2.5.2 Convex Surrogate Losses -- 2.5.3 Particular Surrogate Losses -- 2.6 Neural Networks -- 2.7 Neighbors, Trees, Ensembles, and All That -- 2.7.1 k Nearest Neighbors -- 2.7.2 Decision Trees -- 2.7.3 Multiple Classifier Systems -- 3 Representations and Classifier Complexity -- 3.1 Feature Transformations -- 3.1.1 The Kernel Trick -- 3.2 Dissimilarity Representation -- 3.3 Feature Curves and the Curse of Dimensionality -- 3.4 Feature Extraction and Selection -- 4 Evaluation -- 4.1 Apparent Error and Holdout Set -- 4.2 Resampling Techniques -- 4.2.1 Leave-One-Out and k-Fold Cross-Validation -- 4.2.2 Bootstrap Estimators -- 4.2.3 Tests of Significance -- 4.3 Learning Curves and the Single Best Classifier -- 4.4 Some Words About More Realistic Scenarios -- 5 Regularization -- 6 Variations on Standard Classification -- 6.1 Multiple Instance Learning -- 6.2 One-Class Classification, Outliers, and Reject Options -- 6.3 Contextual Classification -- 6.4 Missing Data and Semisupervised Learning -- 6.5 Transfer Learning and Domain Adaptation -- 6.6 Active Learning -- Acknowledgments -- References
Part III: Applications
Chapter 6: Untangling the Solar Wind Drivers of the Radiation Belt: An Information Theoretical Approach -- 1 Introduction -- 2 Data Set -- 3 Mutual Information, Conditional Mutual Information, and Transfer Entropy -- 4 Applying Information Theory to Radiation Belt MeV Electron Data -- 4.1 Radiation Belt MeV Electron Flux Versus Vsw -- 4.2 Radiation Belt MeV Electron Flux Versus nsw -- 4.3 Anticorrelation of Vsw and nsw and Its Effect on Radiation Belt -- 4.4 Ranking of Solar Wind Parameters Based on Information Transfer to Radiation Belt Electrons -- 4.5 Detecting Changes in the System Dynamics -- 5 Discussion -- 5.1 Geo-Effectiveness of Solar Wind Velocity -- 5.2 nsw and Vsw Anticorrelation -- 5.3 Geo-Effectiveness of Solar Wind Density -- 5.4 Revisiting the Triangle Distribution -- 5.5 Improving Models With Information Theory -- 5.5.1 Selecting Input Parameters -- 5.5.2 Detecting Nonstationarity in System Dynamics -- 5.5.3 Prediction Horizon -- 6 Summary -- Acknowledgments -- References
Chapter 7: Emergence of Dynamical Complexity in the Earth's Magnetosphere -- 1 Introduction -- 2 On Complexity and Dynamical Complexity -- 3 Coherence and Intermittent Features in Time Series Geomagnetic Indices -- 4 Scale-Invariance and Self-Similarity in Geomagnetic Indices -- 5 Near-Criticality Dynamics -- 6 Multifractional Features and Dynamical Phase Transitions -- 7 Summary -- Acknowledgments -- References
Chapter 8: Applications of NARMAX in Space Weather -- 1 Introduction -- 2 NARMAX Methodology -- 2.1 Forward Regression Orthogonal Least Square -- 2.2 The Noise Model -- 2.3 Model Validation -- 2.4 Summary -- 3 NARMAX and Space Weather Forecasting -- 3.1 Geomagnetic Indices -- 3.1.1 SISO Dst Index -- 3.1.2 Continuous Time Dst Model -- 3.1.3 MISO Dst -- 3.1.4 Kp Index -- 3.2 Radiation Belt Electron Fluxes -- 3.2.1 GOES High Energy -- 3.2.2 SNB3GEO Comparison With NOAA REFM -- 3.2.3 GOES Low Energy -- 3.3 Summary of NARMAX Models -- 4 NARMAX and Insight Into the Physics -- 4.1 NARMAX Deduced Solar Wind-Magnetosphere Coupling Function -- 4.2 Identification of Radiation Belt Control Parameters -- 4.2.1 Solar Wind Density Relationship With Relativistic Electrons at GEO -- 4.2.2 Geostationary Local Quasilinear Diffusion vs. Radial Diffusion -- 4.3 Frequency Domain Analysis of the Dst Index -- 5 Discussions and Conclusion -- References
Chapter 9: Probabilistic Forecasting of Geomagnetic Indices Using Gaussian Process Models -- 1 Geomagnetic Time Series and Forecasting -- 2 Dst Forecasting -- 2.1 Models and Algorithms -- 2.2 Probabilistic Forecasting -- 3 Gaussian Processes -- 3.1 Gaussian Process Regression: Formulation -- 3.2 Gaussian Process Regression: Inference -- 4 One-Hour Ahead Dst Prediction -- 4.1 Data Source: OMNI -- 4.2 Gaussian Process Dst Model -- 4.3 Gaussian Process Auto-Regressive (GP-AR) -- 4.4 GP-AR With eXogenous Inputs (GP-ARX) -- 5 One-Hour Ahead Dst Prediction: Model Design -- 5.1 Choice of Mean Function -- 5.2 Choice of Kernel -- 5.3 Model Selection: Hyperparameters -- 5.3.1 Grid Search -- 5.3.2 Coupled Simulated Annealing -- 5.3.3 Maximum Likelihood -- 5.4 Model Selection: Auto-Regressive Order -- 6 GP-AR and GP-ARX: Workflow Summary -- 7 Practical Issues: Software -- 8 Experiments and Results -- 8.1 Model Selection and Validation Performance -- 8.2 Comparison of Hyperparameter Selection Algorithms -- 8.3 Final Evaluation -- 8.4 Sample Predictions With Error Bars -- 9 Conclusion -- References
Chapter 10: Prediction of MeV Electron Fluxes and Forecast Verification -- 1 Relativistic Electrons in Earth's Outer Radiation Belt -- 1.1 Source, Loss, Transport, Acceleration, and Variation -- 2 Numerical Techniques in Radiation Belt Forecasting -- 3 Relativistic Electron Forecasting and Verification -- 3.1 Forecast Verification -- 3.2 Relativistic Electron Forecasting -- 4 Summary -- References
Chapter 11: Artificial Neural Networks for Determining Magnetospheric Conditions -- 1 Introduction -- 2 A Brief Review of ANNs -- 3 Methodology and Application -- 3.1 The DEN2D Model -- 4 Advanced Applications -- 4.1 The DEN3D Model -- 4.2 The Chorus and Hiss Wave Models -- 4.3 Radiation Belt Flux Modeling -- 5 Summary and Discussion -- Acknowledgments -- References
Chapter 12: Reconstruction of Plasma Electron Density From Satellite Measurements Via Artificial Neural Networks -- 1 Overview.
Language:
English