Abstract: The control of a linear system with random coefficients is studied here. The cost function is quadratic, and the random coefficients are assumed to be only partially observable by the controller. By means of the stochastic Bellman equation, the optimal control of stochastic dynamic models with partially observable coefficients is derived. The optimal control is shown to be a linear function of the observable states and a nonlinear function of the random parameters. The theory is applied to the optimal control design of an aircraft landing in wind gusts.
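
As a minimal sketch of the setting described above (the symbols $x_t$, $u_t$, $\theta$, $w_t$, $Q$, $R$, and $K_t$ are generic placeholders, not the paper's own notation), the problem can be written in the standard linear-quadratic form with a random, partially observable coefficient vector $\theta$:

\[
x_{t+1} = A(\theta)\,x_t + B(\theta)\,u_t + w_t, \qquad
J = \mathbb{E}\!\left[\sum_{t=0}^{N-1}\bigl(x_t^\top Q\,x_t + u_t^\top R\,u_t\bigr) + x_N^\top Q_N\,x_N\right],
\]

and the structural result stated in the abstract then takes the form

\[
u_t^{*} = -K_t\!\bigl(\hat{\theta}_t\bigr)\,x_t,
\]

where $\hat{\theta}_t$ denotes the controller's estimate of the partially observable coefficients given its observations up to time $t$, and the gain $K_t(\cdot)$, obtained from the stochastic Bellman equation, is linear in the observable state but depends nonlinearly on that parameter estimate.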