UID:
edoccha_9958071904402883
Extent:
1 online resource (x, 452 pages) : illustrations.
ISBN:
1-283-52576-3
9786613838216
0-08-095528-2
Series:
Mathematics in science and engineering
Original title:
Osnovy teorii optimal'nykh avtomaticheskikh sistem.
Summary:
In this book, we study theoretical and practical aspects of computing methods for mathematical modelling of nonlinear systems. A number of computing techniques are considered, such as methods of operator approximation with any given accuracy; operator interpolation techniques, including a non-Lagrange interpolation; methods of system representation subject to constraints associated with concepts of causality, memory and stationarity; methods of system representation with an accuracy that is the best within a given class of models; methods of covariance matrix estimation; methods for low-rank
Note:
Front Cover; Optimal Control Systems; Copyright Page; Contents; Foreword; Chapter I. Problem of the Optimal System; 1. Significance of the Theory of Optimal Systems; 2. Classification of Optimal Systems; 3. Optimality Criteria; 4. The Information Input into the Controller about the Controlled Object; 5. Statement of Problems in the Theory of Optimal Systems; Chapter II. Mathematical Methods Applicable to the Theory of Optimal Systems; 1. Probability Theory; 2. Variational Methods; 3. Dynamic Programming; 4. The Maximum Principle; Chapter III. Optimal Systems with Complete Information about the Controlled Object; 1. The Problem of Maximal Speed of Response; the Phase Space Method; 2. Application of Classical Variational Methods; 3. Application of the Method of Dynamic Programming; 4. Application of the Maximum Principle; Chapter IV. Optimal Systems with Maximal Partial Information about the Controlled Object; 1. Continuous Systems with Maximal Information about the Object; 2. Discrete-Continuous and Purely Discrete Systems with Maximal Information about the Object; Chapter V. Optimal Systems with Independent (Passive) Storage of Information about the Object; 1. Fundamental Problems in the Theory of Optimal Systems with Independent Information Storage; 2. Theory of Two-Alternative Decisions; 3. Elements of General Statistical Decision Theory; 4. Statistical Decision Theory for Application to Automatic Control Systems; Chapter VI. Optimal Systems with Active Information Storage; 1. Statement of the Simplest Problem for an Optimal Dual Control System; 2. Solution of the Problem and Very Simple Examples; 3. Examples of Irreducible Systems; 4. Generalization to the Problem with Inertial Objects; 5. Generalization to the Problem with Markov Objects; 6. On Block Diagrams of Optimal Controllers; Conclusion; Bibliography; Author Index; Subject Index
English
Other edition:
ISBN 0-12-251950-7
Language:
English