Decisions often need to be made before all the facts are in. A facility must be built to withstand storms, floods, or earthquakes of magnitudes
that can only be guessed at. A portfolio must be purchased in the face of only statistical knowledge, at best, about how markets will perform.
In optimization, this implies that constraints may need to be envisioned in terms of safety margins instead of exact requirements. But what does
that really mean in model formulation, and what are the consequences for
optimization structure and computation?
Common approaches to a risky environment include worst-case analysis
and the reduction of uncertain quantities to their expectations. Another
idea with traditional appeal is that of a chance constraint, where for
instance one gives up on requiring a barrier along a river to hold back
the water absolutely and settles for 99.9% assurance. Unfortunately,
such strategies can lead to many technical difficulties, including
nonconvexity or even constraint infeasibility. Newer techniques are now
available, however, that avoid these troubles and arguably capture the
essence of the situation much better, while greatly improving what can
be accomplished numerically.
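To make the contrast concrete, here is a minimal sketch of the three approaches just mentioned, written for a generic requirement that a quantity g(x,\omega), depending on the decision x and an uncertain outcome \omega ranging over a set \Omega, should not exceed zero; the symbols g, x, \omega, \Omega and the reliability level \alpha are illustrative choices rather than notation from the text.
\[
\text{worst case:}\quad \max_{\omega \in \Omega} g(x,\omega) \le 0,
\qquad
\text{expectation:}\quad \mathbb{E}\bigl[g(x,\omega)\bigr] \le 0,
\]
\[
\text{chance constraint:}\quad \operatorname{prob}\bigl\{\,g(x,\omega) \le 0\,\bigr\} \ge \alpha,
\quad \text{e.g. } \alpha = 0.999 \text{ for the river barrier.}
\]
Even when g(x,\omega) is convex in x for each outcome, the set of decisions x satisfying the chance constraint need not be convex, and the worst-case requirement may admit no feasible x at all; these are the kinds of difficulties alluded to above.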