Online Resource
    Published: London, England: Academic Press
    UID: almahu_9949697871302882
    Format: 1 online resource (466 pages)
    Edition: Third edition.
    ISBN: 0-323-95577-0
    Content: Environmental Data Analysis with MATLAB, Third Edition, fundamentally expands on the original, with a broader tutorial approach, clearer organization, new crib sheets, and problem sets that provide a clear learning path for students and researchers analyzing real data sets in the environmental sciences. The work teaches the basics of the underlying theory of data analysis and then reinforces that knowledge with carefully chosen, realistic scenarios, including case studies in each chapter. The new edition is expanded to include applications in Python, an open-source software environment. Significant content is devoted to teaching how the programs can be used effectively in an environmental data analysis setting. The chapters can be used either as self-contained resources or as a step-by-step guide for students, and are supplemented with data and scripts that demonstrate relevant use cases.
    Note: Intro -- Environmental Data Analysis with MatLab: Principles, Applications, and Prospects -- Copyright -- Contents -- Preface -- Acknowledgment -- Advice on scripting for beginners
    Chapter 1: Data analysis with MATLAB or Python -- Part A: MATLAB -- 1A.1. Getting started -- 1A.2. Navigating folders -- 1A.3. Simple arithmetic and algebra -- 1A.4. Vectors and matrices -- 1A.5. Addition, subtraction and multiplication of column-vectors and matrices -- 1A.6. Element access -- 1A.7. Representing functions -- 1A.8. To loop or not to loop -- 1A.9. The matrix inverse -- 1A.10. Loading data from a file -- 1A.11. Plotting data -- 1A.12. Saving data to a file -- 1A.13. Some advice on writing scripts -- Think before you type -- Name variables consistently -- Save old scripts -- Cut and paste sparingly -- Start small -- Test your scripts -- Don't be too clever -- Comment your scripts -- Part B: Python -- 1B.1. Installation -- 1B.2. The first cell in the edapy_01 Jupyter Notebook -- 1B.3. A very simple Python script -- 1B.4. Navigating folders -- 1B.5. Simple arithmetic and algebra -- 1B.6. List-like data -- 1B.7. Creating column-vectors and matrices -- 1B.8. Addition, subtraction and multiplication of column-vectors and matrices -- 1B.9. Element access -- 1B.10. Representing functions -- 1B.11. To loop or not to loop -- 1B.12. The matrix inverse -- 1B.13. Loading data from a file -- 1B.14. Plotting data -- 1B.15. Saving data to a file -- 1B.16. Some advice on writing scripts -- Think before you type -- Name variables consistently -- Save old scripts -- Cut and paste sparingly -- Start small -- Test your scripts -- Don't be too clever -- Comment your scripts -- Problems
    Chapter 2: Systematic explorations of a new dataset -- 2.1. Case study: Black Rock Forest temperature data -- 2.2. More on graphics -- 2.3. Case study: Neuse River hydrograph used to explore rate data -- 2.4. Case study: Atlantic Rock dataset used to explore scatter plots -- 2.5. More on character strings -- Problems
    Chapter 3: Modeling observational noise with random variables -- 3.1. Random variables -- 3.2. Mean, median, and mode -- 3.3. Variance -- 3.4. Two important probability density functions -- 3.5. Functions of a random variable -- 3.6. Joint probabilities -- 3.7. Bayesian inference -- 3.8. Joint probability density functions -- 3.9. Covariance -- 3.10. The multivariate normal p.d.f. -- 3.11. Linear functions of multivariate data -- Problems -- Reference
    Chapter 4: Linear models as the foundation of data analysis -- 4.1. Quantitative models, data, and model parameters -- 4.2. The simplest of quantitative models -- 4.3. Curve fitting -- 4.4. Mixtures -- 4.5. Weighted averages -- 4.6. Examining error -- 4.7. Least squares -- 4.8. Examples -- 4.9. Covariance and the behavior of error -- Problems
    Chapter 5: Least squares with prior information -- 5.1. When least squares fails -- 5.2. Prior information -- 5.3. Bayesian inference -- 5.4. The product of Normal probability density functions -- 5.5. Generalized least squares -- 5.6. The role of the covariance of the data -- 5.7. Smoothness as prior information -- 5.8. Sparse matrices -- 5.9. Reorganizing grids of model parameters -- Problems -- References
    Chapter 6: Detecting periodicities with Fourier analysis -- 6.1. Describing sinusoidal oscillations -- 6.2. Models composed only of sinusoidal functions -- 6.3. Going complex -- 6.4. Lessons learned from the integral transform -- 6.5. Normal curve -- 6.6. Spikes -- 6.7. Area under a function -- 6.8. Time-delayed function -- 6.9. Derivative of a function -- 6.10. Integral of a function -- 6.11. Convolution -- 6.12. Nontransient signals -- Problems -- Further reading
    Chapter 7: Modeling time-dependent behavior with filters -- 7.1. Behavior sensitive to past conditions -- 7.2. Filtering as convolution -- 7.3. Solving problems with filters -- 7.4. Case study: Empirically-derived filter for Hudson River discharge -- 7.5. Predicting the future -- 7.6. A parallel between filters and polynomials -- 7.7. Filter cascades and inverse filters -- 7.8. Making use of what you know -- Problems -- Reference
    Chapter 8: Undirected data analysis using factors, empirical orthogonal functions, and clusters -- 8.1. Samples as mixtures -- 8.2. Determining the minimum number of factors -- 8.3. Application to the Atlantic Rocks dataset -- 8.4. Spiky factors -- 8.5. Weighting of elements -- 8.6. Q-mode factor analysis -- 8.7. Factors and factor loadings with natural orderings -- 8.8. Case study: EOFs of the equatorial Pacific Ocean sea surface temperature dataset -- 8.9. Clusters of data -- 8.10. K-mean cluster analysis -- 8.11. Clustering the Atlantic Rocks dataset -- Problems -- References
    Chapter 9: Detecting and understanding correlations among data -- 9.1. Correlation is covariance -- 9.2. Computing autocorrelation by hand -- 9.3. Relationship to convolution and power spectral density -- 9.4. Cross-correlation -- 9.5. Using the cross-correlation to align time series -- 9.6. Least squares estimation of filters -- 9.7. The effect of smoothing on time series -- 9.8. Band-pass filters -- 9.9. Case study: Coherence of the Reynolds Channel water quality data -- 9.10. Windowing before computing Fourier transforms -- 9.11. Optimal window functions -- Problems -- References
    Chapter 10: Interpolation, Gaussian process regression, and kriging -- 10.1. Interpolation requires prior information -- 10.2. Linear interpolation -- 10.3. Cubic interpolation -- 10.4. Gaussian process regression and kriging -- 10.5. Example of Gaussian process regression -- 10.6. Tuning of parameters in Gaussian process regression -- 10.7. Example of tuning -- 10.8. Interpolation in two dimensions -- 10.9. Fourier transforms in two dimensions -- 10.10. Using Fourier transforms to fill in missing data -- Problems -- References
    Chapter 11: Approximate methods, including linearization and artificial neural networks -- 11.1. The value of simplicity -- 11.2. Polynomial approximations and Taylor series -- 11.3. Small number approximations -- 11.4. Small number approximation applied to distance on a sphere -- 11.5. Small number approximation applied to variance -- 11.6. Taylor series in multiple dimensions -- 11.7. Small number approximation applied to covariance -- 11.8. Solving nonlinear problems with iterative least squares -- 11.9. Fitting a sinusoid of unknown frequency -- 11.10. The gradient descent method -- 11.11. Precomputation of a function and table lookups -- 11.12. Artificial neural networks -- 11.13. Information flow in a neural net -- 11.14. Training a neural net -- 11.15. Neural net for a nonlinear filter -- Problems -- References
    Chapter 12: Assessing the significance of results -- 12.1. Rejecting the Null Hypothesis -- 12.2. The distribution of the total error -- 12.3. Four important probability density functions -- 12.4. Case study with common hypothesis testing scenarios -- 12.5. Chi-squared test for generalized least squares -- 12.6. Testing improvement in fit -- 12.7. Testing the significance of a spectral peak -- 12.8. Significance of a spectral peak for a correlated time series -- 12.9. Bootstrap confidence intervals -- Problems
    Chapter 13: Notes -- 13.1. Note 1.1 On the persistence of variables -- 13.2. Note 2.1 On time -- 13.3. Note 2.2 On reading complicated text files -- 13.4. Note 3.1 On the rule for error propagation -- 13.5. Note 3.2 On the eda_draw() function -- 13.6. Note 4.1 On complex least squares -- 13.7. Note 5.1 On the derivation of generalized least squares -- 13.8. Note 5.2 MATLAB and Python functions -- 13.9. Note 5.3 On reorganizing matrices -- 13.10. Note 6.1 On the atan2() function -- 13.11. Note 6.2 On the orthonormality of the discrete Fourier data kernel -- 13.12. Note 6.3 On the expansion of a function in an orthonormal basis -- 13.13. Note 8.1 On singular value decomposition -- 13.14. Note 9.1 On coherence -- 13.15. Note 9.2 On Lagrange multipliers -- 13.16. Note 10.1 Covariance matrix corresponding to prior information of smoothness -- 13.17. Note 10.2 Issues encountered tuning Gaussian Process Regression -- 13.18. Note 11.1 On the chain rule for partial derivatives -- References -- Index
    Additional Edition: Print version: Menke, William. Environmental Data Analysis with MatLab or Python. San Diego: Elsevier Science & Technology, c2022. ISBN 9780323955768
    Language: English