Abstract
This note presents a very simple method for deriving the necessary optimality conditions for the optimal control of jump (point) processes. By means of Bellman's principle of optimality, the original stochastic control problem is reduced to a simple static optimization problem. The resulting derivation is considerably simpler than existing derivations in the literature.
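The reduction described in the abstract can be sketched in a generic form. The following display shows a standard Bellman (dynamic-programming) condition for a controlled jump process; the symbols and setup here are illustrative assumptions for a stationary discounted problem, not necessarily the note's own notation or model.

```latex
% Illustrative setup (assumed, not taken from the note):
% the state x jumps at controlled intensity \lambda(x,u); upon a jump the
% new state is drawn from Q(dy \mid x,u); f(x,u) is the running reward and
% r > 0 the discount rate. Bellman's principle of optimality then yields
\[
  r\,V(x) \;=\; \max_{u \in U}
    \Bigl\{\, f(x,u) \;+\; \lambda(x,u)
      \int \bigl[\, V(y) - V(x) \,\bigr]\, Q(dy \mid x,u) \Bigr\}.
\]
% At each state x the right-hand side is a static optimization problem in
% the control u, so necessary optimality conditions follow from the
% first-order conditions of this inner maximization.
```

This is the sense in which dynamic programming turns the stochastic control problem into a pointwise (state-by-state) optimization.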
| Original language | English |
|---|---|
| Pages (from-to) | 765-774 |
| Number of pages | 10 |
| Journal | Economic Theory |
| Volume | 3 |
| Issue number | 4 |
| DOIs | |
| Publication status | Published - Dec 1993 |
| Externally published | Yes |