Optimal control of semilinear stochastic evolution equations

Borkar, V. S.; Govindan, T. E. (1994) Optimal control of semilinear stochastic evolution equations. Nonlinear Analysis, Theory, Methods & Applications, 23 (1), pp. 15-35. ISSN 0362-546X

Full text not available from this repository.

Official URL: http://eprints.iisc.ernet.in/8160/

Related URL: http://dx.doi.org/10.1016/0362-546X(94)9

Abstract

The aim of this paper is to initiate a semigroup theory-based approach to the characterization of optimal Markov controls for controlled semilinear stochastic evolution equations. (Recall that Markov controls are those that depend only on the current state at each time.) For finite-dimensional controlled stochastic differential equations with a nondegenerate diffusion matrix, this task is traditionally achieved through the Hamilton-Jacobi-Bellman (HJB) equation of dynamic programming associated with the problem and an accompanying verification theorem. The latter states that an optimal Markov control can be obtained explicitly by pointwise minimization of a Hamiltonian derived from the solution of the HJB equation. Moreover, any optimal Markov control is obtainable in this manner.
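To make the dynamic programming machinery mentioned in the abstract concrete, the following is a minimal sketch, not taken from the paper, of the classical finite-dimensional HJB equation and the verification step. It assumes a discounted infinite-horizon cost with discount rate α > 0, drift b, uncontrolled nondegenerate diffusion σ, running cost c, control set U, and value function V; all of these symbols are illustrative assumptions (amsmath/amssymb assumed).

% Controlled SDE and discounted cost (illustrative notation, not from the paper)
\[
dX_t = b(X_t, u_t)\,dt + \sigma(X_t)\,dW_t, \qquad
J(x, u) = \mathbb{E}_x\!\left[\int_0^\infty e^{-\alpha t}\, c(X_t, u_t)\,dt\right],
\qquad V(x) = \inf_{u}\, J(x, u).
\]
% HJB equation: pointwise minimization of the Hamiltonian over the control set U
\[
\alpha V(x) = \min_{u \in U}\Big\{\, b(x,u)\cdot \nabla V(x)
  + \tfrac{1}{2}\,\mathrm{tr}\big(\sigma(x)\sigma(x)^{\top}\nabla^2 V(x)\big) + c(x,u) \,\Big\}.
\]
% Verification step (sketch): a measurable selector v attaining the minimum
% defines a Markov control u_t = v(X_t) that is optimal; conversely, optimal
% Markov controls arise as such pointwise minimizers.  The trace term is
% dropped from the argmin below because sigma does not depend on u here.
\[
v(x) \in \operatorname*{arg\,min}_{u \in U}\Big\{\, b(x,u)\cdot \nabla V(x) + c(x,u) \,\Big\}.
\]

In the infinite-dimensional setting of the paper this classical template is not used directly; the keywords indicate that the characterization proceeds instead via the Nisio semigroup and a Markov selection argument, so the display above is only the finite-dimensional analogue it generalizes.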

Item Type: Article
Source: Copyright of this article belongs to Elsevier Science.
Keywords: Markov Control; Markov Selection; Nisio Semigroup; Optimal Control; Stochastic Evolution Equation; Verification Theorem
ID Code: 5287
Deposited On: 18 Oct 2010 08:29
Last Modified: 20 May 2011 09:34