Approximate Dynamic Programming with Applications
This thesis studies approximate optimal control of nonlinear systems. Particular attention is given to global solutions and to the computation of approximately optimal feedback controllers. The solution to an optimal control problem is characterized by the optimal value function. For a large class of problems, the optimal value function must satisfy a Hamilton-Jacobi-Bellman-type equation. Two comm
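As a point of reference, one generic instance of such an equation is the stationary Hamilton-Jacobi-Bellman equation for an infinite-horizon, continuous-time problem with dynamics \(\dot{x} = f(x,u)\) and stage cost \(\ell(x,u)\); this particular form is an illustrative assumption and not necessarily the exact problem class treated in the thesis:
\[
0 \;=\; \min_{u}\Big\{ \ell(x,u) \;+\; \nabla V^{\star}(x)^{\top} f(x,u) \Big\},
\qquad V^{\star}(0) = 0,
\]
where \(V^{\star}\) denotes the optimal value function and the minimizing \(u\) at each state \(x\) defines an optimal feedback controller.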
