On the minimax feedback control of uncertain dynamic systems. In this paper the problem of optimal feedback control of uncertain discrete-time dynamic systems is considered, where the uncertain quantities do not have a stochastic description but instead are known to belong to given sets. The problem is converted to a sequential minimax problem, and dynamic programming is suggested as a general method for its solution. The notion of a sufficiently informative function, which parallels the notion of a sufficient statistic in stochastic optimal control, is introduced, and conditions under which the optimal controller decomposes into an estimator and an actuator are identified.
Document ID
19720040140
Acquisition Source
Legacy CDMS
Document Type
Conference Proceedings
Authors
Bertsekas, D. P. (Stanford University, Stanford, Calif., United States)
Rhodes, I. B. (Washington University, St. Louis, Mo., United States)
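The sequential minimax recursion mentioned in the abstract can be sketched as follows. This is a minimal illustration assuming finite state, control, and disturbance sets with perfect state information; the function names, dynamics, and costs below are hypothetical examples, not taken from the paper.

```python
def minimax_dp(states, controls, disturbances, f, g, gN, N):
    """Backward minimax dynamic programming over a finite horizon.

    Computes J_N(x) = gN(x) and, for k = N-1, ..., 0,
        J_k(x) = min_u max_w [ g(x, u, w) + J_{k+1}(f(x, u, w)) ],
    where the disturbance w is only known to lie in a given set.
    Returns the stage-0 value function and a feedback policy.
    """
    J = {x: gN(x) for x in states}          # terminal cost
    policy = [dict() for _ in range(N)]
    for k in range(N - 1, -1, -1):
        Jk = {}
        for x in states:
            best_u, best_val = None, float("inf")
            for u in controls:
                # worst-case (set-membership) cost over disturbances
                worst = max(g(x, u, w) + J[f(x, u, w)] for w in disturbances)
                if worst < best_val:
                    best_u, best_val = u, worst
            Jk[x] = best_val
            policy[k][x] = best_u
        J = Jk
    return J, policy
```

For example, with a scalar system `x_{k+1} = clamp(x_k + u_k + w_k)` on states `{-2, ..., 2}`, quadratic stage cost, and disturbances `w ∈ {-1, 1}`, the recursion returns the guaranteed (worst-case) cost-to-go and a minimax feedback law.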