# Bertsekas

### From Wikimization


## Revision as of 14:50, 1 August 2008

# DIMITRI P. BERTSEKAS

## Biographical Sketch

Dimitri P. Bertsekas received a combined B.S.E.E. and B.S.M.E. from the National Technical University of Athens, Greece, an M.S.E.E. from George Washington University, and a Ph.D. in system science from the Massachusetts Institute of Technology.

Dr. Bertsekas has held faculty positions with the Engineering-Economic Systems Dept. at Stanford University (1971-1974) and the Electrical Engineering Dept. of the University of Illinois, Urbana (1974-1979).

Since 1979 he has been teaching at the Electrical Engineering and Computer Science Department of the Massachusetts Institute of Technology (M.I.T.), where he is currently McAfee Professor of Engineering.

He consults regularly with private industry and has held editorial positions in several journals.

His research at M.I.T. spans several fields, including optimization, control, large-scale computation, and data communication networks, and is closely tied to his teaching and book authoring activities.

He has written numerous research papers and thirteen books, several of which are used as textbooks in MIT classes.

Professor Bertsekas was awarded the INFORMS 1997 Prize for Research Excellence in the Interface Between Operations Research and Computer Science for his book *Neuro-Dynamic Programming* (co-authored with John Tsitsiklis), the 2000 Greek National Award for Operations Research, and the 2001 ACC John R. Ragazzini Education Award.

In 2001, he was elected to the United States National Academy of Engineering.

Dr. Bertsekas' recent books are *Introduction to Probability* (2002), *Convex Analysis and Optimization* (2003), and *Dynamic Programming and Optimal Control: 3rd Edition* (2007), all published by Athena Scientific.

He is writing a new book on Convex Optimization Theory (to appear in 2008):

## *Convex Optimization Theory*, Dimitri P. Bertsekas, Athena Scientific, forthcoming

Excerpt from the Preface:

This textbook aims to provide a simple, intuitive, and mathematically rigorous introduction to convexity theory and its connections to optimization. The book evolved from an earlier (2003) book of the author on the subject (written in collaboration with Angelia Nedic and Asuman Ozdaglar), but has different objectives and substantially different content. A major focus of the 2003 book was to bridge the gap between convex and nonconvex optimization, by placing special emphasis on the convex analysis topics that bear a connection to both. By contrast, the present book provides a simpler, more focused exposition, by concentrating exclusively on convexity, and developing in greater depth core subjects such as polyhedral convexity, subdifferential theory, and conjugate function theory. The methodology and choice of topics are in the spirit and tradition of Fenchel's 1951 lecture notes and Rockafellar's 1970 text on convex analysis: the material and the ideas are similar, but the exposition, while on occasion less comprehensive, is more accessible, and includes much theory and many algorithms developed in the long period since the appearance of Fenchel's and Rockafellar's seminal works.

Despite the differences, the styles and levels of mathematical sophistication of the present book and the 2003 book are similar. In particular, both books place primary emphasis on theory, rather than on algorithms and applications. Both books also use geometric visualization as a principal tool for exposition, and use end-of-chapter exercises to extend the range of exposition with substantial theoretical and algorithmic results. Detailed solutions of all the exercises are posted on the book's web page.

Another common stylistic element of the two books is the close link between the theoretical treatment of convexity and its application to optimization. For example, simultaneously with the development of some of the basic facts about convexity in Chapters 1 and 2, we discuss elementary optimality conditions and the question of existence of optimal solutions; in Chapter 3, after presenting the theory of hyperplane separation, we develop some of its applications to duality and saddle point theory; in Chapter 4, after the discussion of polyhedral convexity, we discuss its application in linear, integer, and convex programming; and in Chapter 6, after the discussion of subgradients, we discuss their use in optimality conditions and minimization algorithms. We follow this style in the remaining chapters, although, since most of the needed convexity theory has been developed in Chapters 1-6, the discussion in Chapters 7-8 is more heavily weighted towards optimization.

Some of the novel expository features of the 2003 book have been maintained in the present version as well. In particular, minimax theory and constrained optimization duality have been developed in a unified way, as special cases of the duality between two simple but fundamental geometrical problems: the min common point problem and the max crossing point problem. The min common/max crossing framework is essentially a geometric version of convex conjugate function theory, but is simpler and unencumbered by functional descriptions. In several contexts, including duality, it provides a powerful and insightful analytical machinery, which is far more convenient than conjugate function theory. The min common/max crossing framework is also useful in the analysis and interpretation of conjugate functions, theorems of the alternative, and subdifferential theory.
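As a sketch of the geometry behind this framework (the notation here is the editor's rendering, not a quotation from the book): given a set $M \subset \mathbb{R}^{n+1}$ of points $(u, w)$, the min common point problem looks for the lowest point of $M$ on the vertical axis, while the max crossing point problem looks for the highest intercept of a nonvertical hyperplane lying below $M$:

```latex
w^{*} \;=\; \inf_{(0,\,w)\,\in\, M} w ,
\qquad
q^{*} \;=\; \sup_{\mu \in \mathbb{R}^{n}} q(\mu),
\quad \text{where } q(\mu) \;=\; \inf_{(u,\,w)\,\in\, M} \bigl\{ w + \mu' u \bigr\}.
```

Weak duality, $q^{*} \le w^{*}$, holds for any $M$; the theory then gives conditions under which the two values coincide, which, when $M$ is built from a constrained optimization problem, recovers the familiar duality results.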

Another feature shared with the 2003 book is the unified approach for developing conditions for existence of solutions of convex optimization problems, conditions for the minimax equality to hold, and conditions for the absence of a duality gap in constrained optimization. This unification is traced to a simple and fundamental issue: the question whether a nested family of closed sets has a nonempty intersection.
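A minimal illustration of why this intersection question is delicate (the example is the editor's, not from the text): a nested family of nonempty closed sets can still have empty intersection when the sets are unbounded,

```latex
S_{k} = [\,k, \infty) \subset \mathbb{R},
\qquad
S_{k+1} \subset S_{k},
\qquad
\bigcap_{k \ge 1} S_{k} = \emptyset .
```

If instead some $S_{k}$ is compact, the nested intersection is nonempty by the finite intersection property; the book develops sharper sufficient conditions, applicable to unbounded sets, which then translate into existence, minimax, and no-duality-gap results.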

To provide an alternative, more accessible path to convexity, the most advanced sections have been marked with a star, and can be skipped at first reading. The principle followed here is that "non-starred" sections do not make use of material in "starred" sections.

An instructor who wishes to teach a course from the book can choose among several plans:

- Cover the entire book. This requires a full semester course at a fast pace.

- Cover primarily the first six chapters, emphasizing convexity theory, yet maintaining substantial optimization content.

- Cover the entire book, but omit the starred sections (or just summarize them). This is the option followed by the author in his one-semester MIT class (course slides available on the internet).

Other plans may include some applications or some additional theoretical topics of the instructor's choice.