Dear Friends,

I have an optimization problem in which some of the equality constraints contain an uncertain parameter (f(x) = h, where h is uncertain). My model is nonlinear. How can I solve this model? Could you also recommend a reference?

asked 28 Dec '12, 07:39

Reza Salehiz...

If your model is convex (and then some -- there are additional requirements), look into conic programming. Otherwise, look into multiscenario optimization.

(28 Dec '12, 11:59) Gilead ♦

Usually, the term "robust optimization" refers to constructing a model that minimizes the potential damage from non-ideal values of the uncertain parameters. That might mean maximizing over your decision variables (assuming you want to maximize) the minimum value of the objective over the parameter \(h\), or it might mean including a term in the objective to be minimized that represents the variance or the downside risk associated with \(h\).
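For concreteness, one way to write the worst-case (max-min) version -- with \(\mathcal{H}\) denoting the set of values \(h\) can take, which is my notation rather than anything from the question -- is $$ \max_{x}\ \min_{h \in \mathcal{H}}\ \phi(x,h) \quad \text{s.t. } f(x,h) = 0,\quad g(x,h) \leq 0 \quad \forall h \in \mathcal{H}. $$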

If you have a two-stage or multistage problem, where you make a decision then have a chance to respond to the outcome, you can construct a scenario-based model (as @Gilead suggests).

Once you have the model, you can apply the appropriate methods.


answered 28 Dec '12, 12:35

Matthew Salt... ♦

Here's my crude way of dealing with an uncertain parameter in a nonconvex optimization problem. It may not be the best way (I'd be very interested in better approaches -- please drop me a comment if you know of any!).

Suppose I have a problem that looks like this: $$ \begin{align} & \min \phi(x)\\ \text{s.t. } & f(x,h) = 0\\ & g(x,h) \leq 0\\ & h \in [a,b] \end{align} $$

I discretize the uncertain parameter by breaking it up into \(n+1\) knots: $$ \begin{align} \Delta h = (b-a)/n\\ h_i = a + i\cdot \Delta h, \quad i=0,\ldots,n \end{align} $$ If there is more than one uncertain parameter, each is discretized, and the resulting scenario index set is simply the Cartesian product of the individual parameters' index sets.
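(As a small sketch of that last point, not from the original answer: with two uncertain parameters the scenarios are all pairs of knots. The names `h_knots`/`k_knots` below are mine.)

```python
import itertools
import numpy as np

h_knots = np.linspace(0.5, 1.5, 4)   # 4 knots for the first uncertain parameter
k_knots = np.linspace(2.0, 3.0, 3)   # 3 knots for the second
# scenario index set = Cartesian product of the individual knot sets: 4 * 3 = 12 pairs
scenarios = list(itertools.product(h_knots, k_knots))
```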

I then write the following multi-scenario problem (where \(h_i\) are constants): $$ \begin{align} & \min \phi(x)\\ \text{s.t. } & f_i(x,h_i) = 0, \quad i=0,\ldots,n\\ & g_i(x,h_i) \leq 0, \quad i=0,\ldots,n \end{align} $$

Depending on the value of \(n\), the problem can explode into a huge one. But the Jacobian is sparse and has a special block structure that can be exploited.
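Here is a minimal runnable sketch of this multi-scenario construction using SciPy. The toy objective and constraint functions are invented for illustration (they are not from the question), and in this toy the uncertainty enters only the inequality constraint; an uncertain equality would be replicated the same way, but then a single \(x\) may be infeasible across all scenarios (see the comments below).

```python
import numpy as np
from scipy.optimize import minimize

a, b, n = 0.5, 1.5, 10                        # uncertainty interval [a, b], n intervals
h_knots = a + (b - a) / n * np.arange(n + 1)  # h_i = a + i * dh, i = 0, ..., n

def phi(x):
    # toy objective: minimize ||x||^2
    return x[0] ** 2 + x[1] ** 2

def g(x, h):
    # toy uncertain inequality g(x, h) <= 0, i.e. x[0] >= h
    return h - x[0]

def f(x):
    # toy equality f(x) = 0, kept h-free here so the replicated problem stays feasible
    return x[0] - 2.0 * x[1]

constraints = [{"type": "eq", "fun": f}]
for h in h_knots:
    # SciPy's "ineq" convention is fun(x) >= 0, so g(x, h) <= 0 becomes -g(x, h) >= 0.
    # One replicated constraint per knot h_i, all sharing the same decision vector x.
    constraints.append({"type": "ineq", "fun": lambda x, h=h: -g(x, h)})

res = minimize(phi, np.array([1.0, 1.0]), method="SLSQP", constraints=constraints)
print(res.x)   # the robust x must satisfy x[0] >= max_i h_i = b, so roughly [1.5, 0.75]
```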

Although I use this approach, it is somewhat unsatisfactory. The problem is that it is difficult to know how fine the discretization needs to be to get a correct answer, and how to balance that with the problem size.


answered 28 Dec '12, 16:59

Gilead ♦


One approach might be to discretize roughly at first, then make intervals smaller near the optimum and resolve. Starting by extending the current solution to the new grid might help speed the process. That sort of follows the idea of multigrid methods for PDEs.

(28 Dec '12, 18:42) Matthew Salt... ♦

Thank you. But the resulting problem may be infeasible when all scenarios are considered simultaneously.

(28 Dec '12, 20:30) Reza Salehiz...

@Reza: consider the replication of constraints \(f\) and \(g\); this means each set of constraints is independent apart from the linking variables \(x\). Essentially we're searching for a single \(x\) that is robust to all realizations of \(h\).

The reasoning is: if the problem is indeed infeasible, then we can deduce that there does not exist a solution that is robust (discounting the discretization) to the given uncertainty interval of \(h\). That is to say, you may be able to find \(x\) that are feasible for certain values of \(h\), but not for all values of \(h\).

(28 Dec '12, 21:02) Gilead ♦

But I should hasten to add that this is only one approach. You may want a solution that is robust to some worst-case scenario (resulting in a max-min problem), or to some probability distribution of the uncertain parameter.

(28 Dec '12, 21:14) Gilead ♦

@Matthew: Thanks, that's a good idea. I need to think about this a bit more, because the exact value of \(h\) (which we assume to be exogenous to the problem) doesn't actually appear in the objective: we are searching for a single \(x\) that attains the optimum and is feasible for all values (or knots) of \(h\) (this gives a necessarily conservative \(x\)). Another application of your idea might be to make the intervals smaller for values of \(h\) with the highest probability. For a normally distributed \(h\), we might put more knots around the mean, for instance.
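(A quick sketch of that last idea, assuming \(h \sim \text{Normal}(\mu, \sigma)\) restricted to \([a,b]\); the numbers are made up: placing knots at equally spaced quantiles concentrates them where the probability mass is.)

```python
import numpy as np
from scipy.stats import norm

mu, sigma, a, b, n = 1.0, 0.2, 0.5, 1.5, 10
q = np.linspace(norm.cdf(a, mu, sigma), norm.cdf(b, mu, sigma), n + 1)
h_knots = norm.ppf(q, mu, sigma)   # denser near the mean, sparser in the tails
```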

(28 Dec '12, 21:52) Gilead ♦

@Matthew: I re-read your answer, and it's becoming clearer to me that perhaps what I need to be doing is solving the minimax problem to find the worst-case \(h\) through this discretization-and-refinement method (assuming a priori that the problem is feasible for all realizations of \(h\)). I'll think about this a bit more. Thanks!

(28 Dec '12, 22:08) Gilead ♦