Bellman equations for scalar linear convex stochastic control problems
A family of discrete-time stochastic control problems with linear dynamics and convex cost functionals is studied. For a scalar control in such a model, explicit solutions of suitable Bellman equations are described for additive cost functionals (finite time horizon, discounted, and average cost per unit time) as well as multiplicative (exponential) cost functionals (finite time horizon, discounted, and long-run average). In the particular case of a linear-quadratic control problem, a general continuous-time problem is also described. The form of the optimal strategy for each of these control problems is characterized.
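For orientation, the discounted additive-cost case leads to a Bellman equation of the following generic form (a sketch in standard notation, which is illustrative and not necessarily the paper's own symbols): with scalar linear dynamics $x_{t+1} = a x_t + b u_t + W_t$, a convex running cost $c(x,u)$, and discount factor $\beta \in (0,1)$,

\[
V(x) \;=\; \min_{u \in \mathbb{R}} \Big\{\, c(x,u) \;+\; \beta\, \mathbb{E}\big[\, V(a x + b u + W) \,\big] \Big\},
\]

where $W$ is the noise variable. Convexity of $c$ and linearity of the dynamics preserve convexity of $V$ under the minimization, which is what makes explicit solutions tractable in this setting.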