# Optimization of a function that is neither convex nor concave

Hi all, I have a nonlinearly constrained optimization problem. I solve it with MATLAB's fmincon, and it finds a solution. I need to determine, and prove, whether MATLAB's solution is a global optimum or not. I tried to show that the nonlinear constraints are convex or concave using principal minors and the Hessian, and found that the Hessians of all the constraints are indefinite, so the functions are neither convex nor concave. For a function that is neither convex nor concave, is there only a global optimum, or can there be local ones as well? How can I prove this? Thank you.

asked 30 Sep '16, 13:32 by GKGKGK

In general, there can be local optima that are not global. Examples are easy to come by. Try plotting $f(x)=\frac{\sin(\pi x)}{\pi x}$ over the domain [1, 10]. There is a local maximum or minimum near 1.5, 2.5, ..., 9.5.

answered 30 Sep '16, 15:19 by Paul Rubin

Just out of curiosity, your first version used "relative maximum or minimum" instead of "local maximum or minimum". Is "relative" rather than "local" used in some communities, such as at business schools? Or is it maybe an old-fashioned term? I don't recall seeing it, even in old books and papers, when I started in optimization in 1980. Update: I just googled "relative minimum" and got about 124,000 results. Is it perhaps popular in high school and lower-division, non-hard-core college courses? (30 Sep '16, 18:39) Mark L Stone

You and I seem to have started in optimization around the same time. Honestly, I can't recall where I picked up the phrase "relative maximum/minimum"; it's just stuck in my head (along with other phrases, such as "good enough for government work" to mean a satisficing solution). I think it came from upper-division/graduate courses -- definitely not high school. I also can't recall the precise definition or usage, which is why, when I noticed I'd used it, I switched to the better-known/less ambiguous (?) "local". (30 Sep '16, 19:39) Paul Rubin
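Paul's example can be checked numerically. Here is a short Python/SciPy sketch (Python rather than MATLAB, purely for illustration): running the same local minimizer from two different starting points in [1, 10] lands in two different local minima, so the first "solution" found is clearly not global.

```python
import numpy as np
from scipy.optimize import minimize

def f(x):
    # f(x) = sin(pi*x) / (pi*x), evaluated at the first component of x
    return np.sin(np.pi * x[0]) / (np.pi * x[0])

# Same solver, same problem, two different starting points.
res_a = minimize(f, x0=[1.2], bounds=[(1.0, 10.0)])
res_b = minimize(f, x0=[3.2], bounds=[(1.0, 10.0)])

print(res_a.x, res_b.x)  # two distinct local minima, near 1.43 and near 3.47
```

The exact extrema of this function sit where $\tan(\pi x)=\pi x$, slightly to the left of the half-integers, which is why the solver settles near 1.43 and 3.47 rather than exactly at 1.5 and 3.5.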
This answer is based on your problem being a minimization problem, which is the form used by FMINCON. It is meant to be complementary to the one provided by @Paul Rubin, so read that first.

You might find some of the material in this post shocking. If so, don't complain to me, complain to The MathWorks. My own optimizers don't do business in the rather appalling way FMINCON does. I can only speak definitively about the version of FMINCON in MATLAB R2014a; I don't know whether this behavior has changed in newer versions.

There can also be saddle points. FMINCON might terminate at a saddle point, and it may or may not declare it a local optimum. To distinguish a local minimum from a saddle point, you need to check the second-order optimality conditions. Specifically, look at the eigenvalues of the Hessian of the Lagrangian projected into the nullspace of the Jacobian of the active constraints. If the minimum eigenvalue is < 0 and the maximum eigenvalue is > 0, your "solution" is a saddle point. If the minimum eigenvalue is > 0, your solution is a local minimum, which may or may not be a global minimum.

FMINCON might declare a local optimum and terminate at what is actually a local or global maximum (see the example below). If FMINCON terminates at the starting point, which it might do if the projected gradient there is zero, it can declare a local optimum when the point is actually a local or global maximum. If FMINCON doesn't terminate at the starting point, I think it is very unlikely to declare a local optimum at what is actually a (local or global) maximum.

Provided you specify a long (enough) form of the return arguments, FMINCON will give you the Lagrange multipliers you need to form the Lagrangian. You can use MATLAB's null to get a matrix Z whose columns span the nullspace of the Jacobian of the active constraints. The Hessian of the Lagrangian projected into that nullspace is then Z' * (Hessian of Lagrangian) * Z.
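The check described above is straightforward to code. The following is my own Python/NumPy sketch (not part of Mark's answer or of MATLAB; the function name is mine), where `H` stands for the Hessian of the Lagrangian and `A` for the Jacobian of the active constraints at the candidate point:

```python
import numpy as np
from scipy.linalg import null_space

def classify_stationary_point(H, A, tol=1e-8):
    """Second-order classification of a KKT point.

    Projects H (Hessian of the Lagrangian) onto the nullspace of
    A (Jacobian of the active constraints) and inspects the eigenvalues:
    all positive -> local minimum; all negative -> local maximum;
    both signs present -> saddle point.
    """
    Z = null_space(np.atleast_2d(A))          # basis for the nullspace of A
    if Z.size == 0:                           # no feasible directions remain
        return "no feasible directions"
    w = np.linalg.eigvalsh(Z.T @ H @ Z)       # symmetric, so eigenvalues are real
    if w.min() > tol:
        return "local minimum"
    if w.max() < -tol:
        return "local maximum"
    if w.min() < -tol and w.max() > tol:
        return "saddle point"
    return "inconclusive"                     # near-zero eigenvalue: higher-order test needed

# Toy check: Lagrangian Hessian diag(2, -2, -2) with active constraint z = 0.
# In the feasible plane the eigenvalues are {2, -2}, of both signs.
print(classify_stationary_point(np.diag([2.0, -2.0, -2.0]),
                                np.array([[0.0, 0.0, 1.0]])))  # -> saddle point
```

The "inconclusive" branch matters in practice: a projected eigenvalue numerically near zero means the second-order test cannot decide, and tightening tolerances or examining higher-order behavior is needed.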
I don't feel like writing out how to compute the Jacobian of the active constraints, but it is straightforward.

Here is an example in which FMINCON declares a local optimum that is actually a global maximum. For convenience and ease of exposition, I will use YALMIP as a front end to FMINCON. The problem is min -x^2, with no constraints and starting point x = 0. Note that, contrary to FMINCON's claim, x = 0 is the global maximum, and is not a local minimum.

    >> x = sdpvar % the scalar x is an optimization variable
    >> assign(x,0) % use starting point x = 0
    >> optimize([],-x^2,sdpsettings('solver','fmincon','usex0',1)) % solve the problem

    Iter F-count f(x) Feasibility optimality Norm of step
    0 1 0.000000e+00 0.000e+00 0.000e+00

    Initial point is a local minimum that satisfies the constraints.

    Optimization completed because at the initial point, the objective function is
    non-decreasing in feasible directions to within the selected value of the function
    tolerance, and constraints are satisfied to within the selected value of the
    constraint tolerance.

    Optimization completed: The final point is the initial point.
    The first-order optimality measure, 0.000000e+00, is less than options.TolFun =
    1.000000e-06, and the maximum constraint violation, 0.000000e+00, is less than
    options.TolCon = 1.000000e-06.

    Optimization Metric                     Options
    first-order optimality = 0.00e+00       TolFun = 1e-06 (selected)
    max(constraint violation) = 0.00e+00    TolCon = 1e-06 (selected)

answered 30 Sep '16, 17:10 by Mark L Stone

Thank you for both of your answers. I checked the eigenvalues and realized that my solution is a saddle point. For optimization problems that are neither convex nor concave, can the saddle point be the global optimum? What does a saddle point mean in this situation? (02 Oct '16, 10:24) GKGKGK

A saddle point can't be a global optimum. What did you check the eigenvalues of?
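This failure mode is not specific to FMINCON: any first-order method can stall at x = 0 on min -x^2, because the gradient there is exactly zero. A Python sketch with scipy.optimize.minimize (my own stand-in for the fmincon session above, purely for illustration) reproduces the same behavior:

```python
import numpy as np
from scipy.optimize import minimize

# min -x^2 starting at x = 0: the gradient -2x vanishes at the start point,
# so the solver declares convergence immediately -- at the global MAXIMUM.
res = minimize(lambda x: -x[0] ** 2,
               x0=[0.0],
               jac=lambda x: np.array([-2.0 * x[0]]))

print(res.x, res.success)  # stays at x = 0 and reports success
```

Restarting from any nonzero x0 sends the iterates off toward larger |x| instead (the problem is unbounded below), which is a second warning sign that x = 0 was no minimum.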
Per my answer, you have to look at the eigenvalues of the Hessian of the Lagrangian projected into the nullspace of the Jacobian of the active constraints. It's possible for the Hessian of the objective function to have a negative minimum eigenvalue while the projected Hessian of the Lagrangian is positive semidefinite. Too bad you're not using my optimizer -- I have an optional second-order optimality check that distinguishes between a local minimum, a saddle point, and a local maximum. (02 Oct '16, 11:19) Mark L Stone

A saddle point means the point is a local minimum in at least one direction and a local maximum in at least one direction. So it cannot be a global optimum, or even a local optimum. Because I don't know your "situation", I cannot elaborate any further. If you provide more information on your situation, perhaps someone will be able to assess what it means for you. Using FMINCON without really understanding optimization is a recipe for disaster; if you do really understand optimization, you may not be satisfied with FMINCON. (02 Oct '16, 11:22) Mark L Stone

BTW, you are allowed, and even encouraged, to upvote and/or accept answers. You can upvote more than one answer. (02 Oct '16, 11:25) Mark L Stone

Can you show your optimization problem? Show your call to FMINCON, and include all your user functions, such as fun and nonlcon. Also show at least the beginning and end of the output from FMINCON, with 'Display' set to a value that shows a lot. FMINCON has several different algorithm and Hessian options which can greatly affect how it works. The starting value x0 can have a huge effect on where FMINCON goes and ends up; a different x0 might lead to a different "solution", and a different algorithm, Hessian choice, or option might cause FMINCON to end at a different point. (02 Oct '16, 11:31) Mark L Stone

Yes, you are right, I have just checked the eigenvalues of the Hessian.
By the way, I also used the genetic algorithm tool ga in MATLAB and obtained the same results. (04 Oct '16, 06:16) GKGKGK
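Mark's distinction between the two Hessians can be seen on a toy problem of my own (not the poster's actual problem): for min x^2 - y^2 subject to y = 0, the Hessian of the objective is indefinite, so checking it alone suggests a saddle point, yet the projected Hessian is positive definite and (0, 0) really is a constrained minimum. A quick NumPy check:

```python
import numpy as np
from scipy.linalg import null_space

H = np.diag([2.0, -2.0])    # Hessian of x^2 - y^2; the constraint y = 0 is linear,
                            # so this is also the Hessian of the Lagrangian
A = np.array([[0.0, 1.0]])  # Jacobian of the active constraint y = 0

print(np.linalg.eigvalsh(H))            # eigenvalues -2 and 2: indefinite
Z = null_space(A)                       # feasible directions: the span of (1, 0)
print(np.linalg.eigvalsh(Z.T @ H @ Z))  # eigenvalue 2: positive, a constrained minimum
```

This is why checking only the eigenvalues of the objective's Hessian, as in the comment above, can misclassify a genuine constrained minimum as a saddle point.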