  • 1
    Online Resource
    Amsterdam : Elsevier
    UID: edoccha_BV042310022
    Format: 1 online resource (xviii, 494 pages): diagrams
    Edition: Second edition
    ISBN: 978-0-12-407839-0
    Series Statement: Elsevier insights
    Note: Previous edition: Amsterdam; London: Academic, 2009. - Covering a wide range of areas of application of Markov processes, this second edition is revised to highlight the most important aspects as well as the most recent trends and applications of Markov processes
    Additional Edition: Also available as a print edition, ISBN 978-0-12-407795-9
    Language: English
    Subjects: Mathematics
    Keywords: Markov process ; Stochastic model
    URL: Full text (URL of the original publisher)
  • 2
    Online Resource
    London : Elsevier
    UID: gbv_1656133369
    Format: Online resource (xviii, 494 pages)
    Edition: Second edition
    Edition: Online edition
    ISBN: 0124077951, 9780124078390, 0124078397, 9780124077959
    Series Statement: Elsevier insights
    Content: Covering a wide range of areas of application of Markov processes, this second edition is revised to highlight the most important aspects as well as the most recent trends and applications of Markov processes
    Note: Previous edition: Amsterdam; London: Academic, 2009. - Includes bibliographic references. - Description based on print version record.
    Contents: Front Cover; Markov Processes for Stochastic Modeling; Copyright page; Contents; Acknowledgments; Preface to the Second Edition; Preface to the First Edition
    1 Basic Concepts in Probability: 1.1 Introduction; 1.1.1 Conditional Probability; 1.1.2 Independence; 1.1.3 Total Probability and the Bayes' Theorem; 1.2 Random Variables; 1.2.1 Distribution Functions; 1.2.2 Discrete Random Variables; 1.2.3 Continuous Random Variables; 1.2.4 Expectations; 1.2.5 Expectation of Nonnegative Random Variables; 1.2.6 Moments of Random Variables and the Variance; 1.3 Transform Methods; 1.3.1 The s-Transform; 1.3.2 The z-Transform; 1.4 Bivariate Random Variables; 1.4.1 Discrete Bivariate Random Variables; 1.4.2 Continuous Bivariate Random Variables; 1.4.3 Covariance and Correlation Coefficient; 1.5 Many Random Variables; 1.6 Fubini's Theorem; 1.7 Sums of Independent Random Variables; 1.8 Some Probability Distributions; 1.8.1 The Bernoulli Distribution; 1.8.2 The Binomial Distribution; 1.8.3 The Geometric Distribution; 1.8.4 The Pascal Distribution; 1.8.5 The Poisson Distribution; 1.8.6 The Exponential Distribution; 1.8.7 The Erlang Distribution; 1.8.8 Normal Distribution; 1.9 Limit Theorems; 1.9.1 Markov Inequality; 1.9.2 Chebyshev Inequality; 1.9.3 Laws of Large Numbers; 1.9.4 The Central Limit Theorem; 1.10 Problems
    2 Basic Concepts in Stochastic Processes: 2.1 Introduction; 2.2 Classification of Stochastic Processes; 2.3 Characterizing a Stochastic Process; 2.4 Mean and Autocorrelation Function of a Stochastic Process; 2.5 Stationary Stochastic Processes; 2.5.1 Strict-Sense Stationary Processes; 2.5.2 Wide-Sense Stationary Processes; 2.6 Ergodic Stochastic Processes; 2.7 Some Models of Stochastic Processes; 2.7.1 Martingales; Stopping Times; 2.7.2 Counting Processes; 2.7.3 Independent Increment Processes; 2.7.4 Stationary Increment Process; 2.7.5 Poisson Processes; Interarrival Times for the Poisson Process; Compound Poisson Process; Combinations of Independent Poisson Processes; Competing Independent Poisson Processes; Subdivision of a Poisson Process; 2.8 Problems
    3 Introduction to Markov Processes: 3.1 Introduction; 3.2 Structure of Markov Processes; 3.3 Strong Markov Property; 3.4 Applications of Discrete-Time Markov Processes; 3.4.1 Branching Processes; 3.4.2 Social Mobility; 3.4.3 Markov Decision Processes; 3.5 Applications of Continuous-Time Markov Processes; 3.5.1 Queueing Systems; 3.5.2 Continuous-Time Markov Decision Processes; 3.5.3 Stochastic Storage Systems; 3.6 Applications of Continuous-State Markov Processes; 3.6.1 Application of Diffusion Processes to Financial Options; 3.6.2 Applications of Brownian Motion; 3.7 Summary
    4 Discrete-Time Markov Chains: 4.1 Introduction; 4.2 State-Transition Probability Matrix; 4.2.1 The n-Step State-Transition Probability; 4.3 State-Transition Diagrams; 4.4 Classification of States; 4.5 Limiting-State Probabilities; 4.5.1 Doubly Stochastic Matrix; 4.6 Sojourn Time
    Machine-generated contents note (sections 1.1-4.6 repeat the listing above, then): 4.7 Transient Analysis of Discrete-Time Markov Chains; 4.8 First Passage and Recurrence Times; 4.9 Occupancy Times; 4.10 Absorbing Markov Chains and the Fundamental Matrix; 4.10.1 Time to Absorption; 4.10.2 Absorption Probabilities; 4.11 Reversible Markov Chains; 4.12 Problems
    5.1 Introduction; 5.2 Transient Analysis; 5.2.1 The s-Transform Method; 5.3 Birth and Death Processes; 5.3.1 Local Balance Equations; 5.3.2 Transient Analysis of Birth and Death Processes; 5.4 First Passage Time; 5.5 The Uniformization Method; 5.6 Reversible CTMCs; 5.7 Problems
    6.1 Introduction; 6.2 Renewal Processes; 6.2.1 The Renewal Equation; 6.2.2 Alternative Approach; 6.2.3 The Elementary Renewal Theorem; 6.2.4 Random Incidence and Residual Time; 6.2.5 Delayed Renewal Process; 6.3 Renewal-Reward Process; 6.3.1 The Reward-Renewal Theorem; 6.4 Regenerative Processes; 6.4.1 Inheritance of Regeneration; 6.4.2 Delayed Regenerative Process; 6.4.3 Regenerative Simulation; 6.5 Markov Renewal Process; 6.5.1 The Markov Renewal Function; 6.6 Semi-Markov Processes; 6.6.1 Discrete-Time SMPs; 6.6.2 Continuous-Time SMPs; 6.7 Markov Regenerative Process; 6.8 Markov Jump Processes; 6.8.1 The Homogeneous Markov Jump Process; 6.9 Problems
    7.1 Introduction; 7.2 Description of a Queueing System; 7.3 The Kendall Notation; 7.4 The Little's Formula; 7.5 The PASTA Property; 7.6 The M/M/1 Queueing System; 7.6.1 Stochastic Balance; 7.6.2 Total Time and Waiting Time Distributions of the M/M/1 Queueing System; 7.7 Examples of Other M/M Queueing Systems; 7.7.1 The M/M/c Queue: The c-Server System; 7.7.2 The M/M/1/K Queue: The Single-Server Finite-Capacity System; 7.7.3 The M/M/c/c Queue: The c-Server Loss System; 7.7.4 The M/M/1//K Queue: The Single-Server Finite Customer Population System; 7.8 M/G/1 Queue; 7.8.1 Waiting Time Distribution of the M/G/1 Queue; 7.8.2 The M/Ek/1 Queue; 7.8.3 The M/D/1 Queue; 7.8.4 The M/M/1 Queue Revisited; 7.8.5 The M/Hk/1 Queue; 7.9 G/M/1 Queue; 7.9.1 The Ek/M/1 Queue; 7.9.2 The D/M/1 Queue; 7.10 M/G/1 Queues with Priority; 7.10.1 Nonpreemptive Priority; 7.10.2 Preemptive Resume Priority; 7.10.3 Preemptive Repeat Priority; 7.11 Markovian Networks of Queues; 7.11.1 Burke's Output Theorem and Tandem Queues; 7.11.2 Jackson or Open Queueing Networks; 7.11.3 Closed Queueing Networks; 7.12 Applications of Markovian Queues; 7.13 Problems
    8.1 Introduction; 8.2 Occupancy Probability; 8.3 Random Walk as a Markov Chain; 8.4 Symmetric Random Walk as a Martingale; 8.5 Random Walk with Barriers; 8.6 Gambler's Ruin; 8.6.1 Ruin Probability; 8.6.2 Alternative Derivation of Ruin Probability; 8.6.3 Duration of a Game; 8.7 Random Walk with Stay; 8.8 First Return to the Origin; 8.9 First Passage Times for Symmetric Random Walk; 8.9.1 First Passage Time via the Generating Function; 8.9.2 First Passage Time via the Reflection Principle; 8.9.3 Hitting Time and the Reflection Principle; 8.10 The Ballot Problem and the Reflection Principle; 8.10.1 The Conditional Probability Method; 8.11 Returns to the Origin and the Arc-Sine Law; 8.12 Maximum of a Random Walk; 8.13 Random Walk on a Graph; 8.13.1 Random Walk on a Weighted Graph; 8.14 Correlated Random Walk; 8.15 Continuous-Time Random Walk; 8.15.1 The Master Equation; 8.16 Self-Avoiding Random Walk; 8.17 Nonreversing Random Walk; 8.18 Applications of Random Walk; 8.18.1 Web Search; 8.18.2 Insurance Risk; 8.18.3 Content of a Dam; 8.18.4 Cash Management; 8.18.5 Mobility Models in Mobile Networks; 8.19 Summary; 8.20 Problems
    9.1 Introduction; 9.2 Mathematical Description; 9.3 Brownian Motion with Drift; 9.4 Brownian Motion as a Markov Process; 9.5 Brownian Motion as a Martingale; 9.6 First Passage Time of a Brownian Motion; 9.7 Maximum of a Brownian Motion; 9.8 First Passage Time in an Interval; 9.9 The Brownian Bridge; 9.10 Geometric Brownian Motion; 9.11 Introduction to Stochastic Calculus; 9.11.1 Stochastic Differential Equation and the Ito Process; 9.11.2 The Ito Integral; 9.11.3 The Ito's Formula; 9.12 Solution of Stochastic Differential Equations; 9.13 Solution of the Geometric Brownian Motion; 9.14 The Ornstein-Uhlenbeck Process; 9.14.1 Solution of the OU SDE; 9.14.2 First Alternative Solution Method; 9.14.3 Second Alternative Solution Method; 9.15 Mean-Reverting OU Process; 9.16 Fractional Brownian Motion; 9.16.1 Self-Similar Processes; 9.16.2 Long-Range Dependence; 9.16.3 Self-Similarity and Long-Range Dependence; 9.16.4 FBM Revisited; 9.17 Fractional Gaussian Noise; 9.18 Multifractional Brownian Motion; 9.19 Problems
    10.1 Introduction; 10.2 Mathematical Preliminaries; 10.3 Models of Diffusion; 10.3.1 Diffusion as a Limit of Random Walk: The Fokker-Planck Equation; 10.3.2 The Langevin Equation; 10.3.3 The Fick's Equations; 10.4 Examples of Diffusion Processes; 10.4.1 Brownian Motion; 10.4.2 Brownian Motion with Drift; 10.5 Correlated Random Walk and the Telegraph Equation; 10.6 Introduction to Fractional Calculus; 10.6.1 Gamma Function; 10.6.2 Mittag-Leffler Functions; 10.6.3 Laplace Transform; 10.6.4 Fractional Derivatives; 10.6.5 Fractional Integrals; 10.6.6 Defini…
    Additional Edition: ISBN 9780124077959
    Additional Edition: ISBN 0124077951
    Additional Edition: Also available as a print edition: Ibe, Oliver C. (Oliver Chukwudi), 1947-, author. Markov processes for stochastic modeling
    Language: English
    Subjects: Mathematics
    Keywords: Markov process ; Stochastic model ; Electronic books ; Textbook
    URL: Full text (license required)
  • 3
    Online Resource
    Amsterdam, Netherlands : Elsevier | London : Elsevier
    UID: edoccha_9959238935602883
    Format: 1 online resource (xviii, 494 pages): illustrations
    Edition: 2nd ed.
    ISBN: 0-12-407839-7
    Series Statement: Elsevier insights Markov processes for stochastic modeling
    Content: Markov processes are processes that have limited memory. In particular, their dependence on the past is only through the previous state (a brief illustrative simulation of this property appears after the result list below). They are used to model the behavior of many systems including communications systems, transportation networks, image segmentation and analysis, biological systems and DNA sequence analysis, random atomic motion and diffusion in physics, social mobility, population studies, epidemiology, animal and insect migration, queueing systems, resource management, dams, financial engineering, actuarial science, and decision systems. Covering a wide range of
    Note: Previous edition: Amsterdam; London: Academic, 2009. - Contents listing (front matter and chapters 1-4) as in record 2 above.
    Additional Edition: ISBN 0-12-407795-1
    Language: English
  • 4
    Online Resource
    Amsterdam, Netherlands : Elsevier | London : Elsevier
    UID: edocfu_9959238935602883
    Format: 1 online resource (xviii, 494 pages): illustrations
    Edition: 2nd ed.
    ISBN: 0-12-407839-7
    Series Statement: Elsevier insights Markov processes for stochastic modeling
    Content: Markov processes are processes that have limited memory. In particular, their dependence on the past is only through the previous state. They are used to model the behavior of many systems including communications systems, transportation networks, image segmentation and analysis, biological systems and DNA sequence analysis, random atomic motion and diffusion in physics, social mobility, population studies, epidemiology, animal and insect migration, queueing systems, resource management, dams, financial engineering, actuarial science, and decision systems. Covering a wide range of
    Note: Previous edition: Amsterdam; London: Academic, 2009. - Contents listing (front matter and chapters 1-4) as in record 2 above.
    Additional Edition: ISBN 0-12-407795-1
    Language: English
  • 5
    Online Resource
    Amsterdam, Netherlands : Elsevier | London : Elsevier
    UID: almahu_9948026563602882
    Format: 1 online resource (xviii, 494 pages): illustrations
    Edition: 2nd ed.
    ISBN: 0-12-407839-7
    Series Statement: Elsevier insights Markov processes for stochastic modeling
    Content: Markov processes are processes that have limited memory. In particular, their dependence on the past is only through the previous state. They are used to model the behavior of many systems including communications systems, transportation networks, image segmentation and analysis, biological systems and DNA sequence analysis, random atomic motion and diffusion in physics, social mobility, population studies, epidemiology, animal and insect migration, queueing systems, resource management, dams, financial engineering, actuarial science, and decision systems. Covering a wide range of
    Note: Previous edition: Amsterdam; London: Academic, 2009. - Contents listing (front matter and chapters 1-4) as in record 2 above.
    Additional Edition: ISBN 0-12-407795-1
    Language: English
  • 6
    Online Resource
    Amsterdam : Elsevier
    UID: edocfu_BV042310022
    Format: 1 online resource (xviii, 494 pages): diagrams
    Edition: Second edition
    ISBN: 978-0-12-407839-0
    Series Statement: Elsevier insights
    Note: Previous edition: Amsterdam; London: Academic, 2009. - Covering a wide range of areas of application of Markov processes, this second edition is revised to highlight the most important aspects as well as the most recent trends and applications of Markov processes
    Additional Edition: Also available as a print edition, ISBN 978-0-12-407795-9
    Language: English
    Subjects: Mathematics
    Keywords: Markov process ; Stochastic model
    URL: Full text (URL of the original publisher)
  • 7
    Online Resource
    Amsterdam : Elsevier
    UID: almafu_BV042310022
    Format: 1 online resource (xviii, 494 pages): diagrams
    Edition: Second edition
    ISBN: 978-0-12-407839-0
    Series Statement: Elsevier insights
    Note: Previous edition: Amsterdam; London: Academic, 2009. - Covering a wide range of areas of application of Markov processes, this second edition is revised to highlight the most important aspects as well as the most recent trends and applications of Markov processes
    Additional Edition: Also available as a print edition, ISBN 978-0-12-407795-9
    Language: English
    Subjects: Mathematics
    Keywords: Markov process ; Stochastic model
    URL: Full text (URL of the original publisher)
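
As a purely illustrative aside, not part of any catalog record above: the content summaries in records 3-5 describe the defining memoryless property of a Markov process, namely that the next state depends only on the current state. The short Python sketch below simulates a two-state discrete-time Markov chain under that assumption; the transition matrix P and all names are hypothetical examples, not code taken from the book.

```python
# Minimal sketch (assumed example, not from the book): a two-state
# discrete-time Markov chain in which the next state is sampled from the
# current state alone, with no dependence on earlier history.
import random

# Hypothetical transition matrix: P[i][j] = probability of moving from state i to state j.
P = [
    [0.9, 0.1],  # transitions out of state 0
    [0.5, 0.5],  # transitions out of state 1
]

def step(state: int) -> int:
    """Sample the next state using only the current state (the Markov property)."""
    return 0 if random.random() < P[state][0] else 1

def simulate(n_steps: int, start: int = 0) -> list[int]:
    """Return a sample path of length n_steps + 1 starting from `start`."""
    path = [start]
    for _ in range(n_steps):
        path.append(step(path[-1]))  # uses path[-1] only, never the earlier states
    return path

if __name__ == "__main__":
    print(simulate(20))  # one short sample path
    long_path = simulate(100_000)
    # Long-run fraction of time spent in state 0 approximates the stationary probability.
    print(sum(1 for s in long_path if s == 0) / len(long_path))
```

Running the sketch prints one short sample path and the long-run fraction of time spent in state 0, which for this particular P settles near 5/6 (about 0.83), the chain's stationary probability of state 0.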