Dynamic Programming and Inventory Control

Editors: Bensoussan, A.
Publication date:
# of pages: 384
Cover: Hardcover
ISBN print: 978-1-60750-769-7
ISBN online: 978-1-60750-770-3

Description

This book presents a unified theory of dynamic programming and Markov decision processes and its application to a major field of operations research and operations management: inventory control. Models are developed in discrete time as well as in continuous time. For continuous time, the book concentrates on models relevant to inventory control; for discrete time, the focus is mainly on infinite horizon models.
The book also covers the difference between impulse control and continuous control. Ergodic control is considered in the context of impulse control, and some simple rules currently used in practice are justified. Chapter 2 introduces some of the classical static problems which are preliminary to the dynamic models of interest in inventory control.
This book is not a general text on control theory and dynamic programming, in that the system dynamics are mostly limited to inventory models. For these models, however, it seeks to be as comprehensive as possible, although finite horizon models in discrete time are not developed, since they are largely described in the existing literature. On the other hand, the ergodic control problem is considered in detail, and both probabilistic and analytical proofs are provided.
The techniques developed in this work can be extended to more complex models, covering additional aspects of inventory control.
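To give a concrete flavor of dynamic programming applied to inventory control, the sketch below runs value iteration on a small discounted, infinite-horizon, discrete-time inventory problem with lost sales and a fixed ordering cost. This is an illustrative example only, not a model from the book; all parameters (capacity, costs, demand distribution) are hypothetical choices made for the demonstration.

```python
# Illustrative sketch (not from the book): value iteration for a
# discounted, infinite-horizon inventory problem with lost sales.
# All parameters below are hypothetical choices for the example.
import numpy as np

MAX_INV = 20          # storage capacity (states 0..MAX_INV)
DISCOUNT = 0.95       # discount factor per period
ORDER_COST = 2.0      # fixed cost per order placed
UNIT_COST = 1.0       # per-unit purchase cost
HOLD_COST = 0.5       # per-unit holding cost
LOST_SALE = 4.0       # penalty per unit of unmet demand
DEMAND = np.array([0.1, 0.3, 0.4, 0.2])  # P(demand = 0, 1, 2, 3)

def value_iteration(tol=1e-8):
    """Solve the Bellman equation by successive approximation."""
    n = MAX_INV + 1
    V = np.zeros(n)
    while True:
        Q = np.full((n, n), np.inf)        # Q[x, q] = cost of ordering q in state x
        for x in range(n):                 # current stock level
            for q in range(n - x):         # feasible order quantities
                y = x + q                  # stock after replenishment
                cost = (ORDER_COST if q > 0 else 0.0) + UNIT_COST * q
                for d, p in enumerate(DEMAND):
                    sold = min(y, d)
                    nxt = y - sold         # next state (lost sales: no backlog)
                    cost += p * (HOLD_COST * nxt
                                 + LOST_SALE * (d - sold)
                                 + DISCOUNT * V[nxt])
                Q[x, q] = cost
        V_new = Q.min(axis=1)
        if np.max(np.abs(V_new - V)) < tol:
            return V_new, Q.argmin(axis=1)  # value function and greedy policy
        V = V_new

V, policy = value_iteration()
```

Because the discounted Bellman operator is a contraction, the iteration converges to the unique fixed point, and the greedy policy it returns typically has the threshold ("order when stock is low, up to a target level") structure that the book's theory justifies rigorously.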