I am trying to solve a problem of the form:

$$
\begin{array}{ll}
\min & f(x)^2 + \max(0, g(x))^2 \\
\mbox{s.t.} & \mbox{linear constraints},
\end{array}
$$

where both f and g are linear functions of x. This is a convex problem, but the dimension of x is quite large, and the nonlinear solvers I have tried (SNOPT, KNITRO) have been quite slow. However, the problem almost looks like a quadratic program, except for the "max" function that appears in the objective. So my first question is whether there is a way to reformulate this problem as a standard quadratic program. If not, are there other ways I could exploit the structure of the objective to rewrite the problem as something that might be solved more quickly? asked 19 Oct '15, 12:54 evencoil 
There's a standard technique for transforming these problems. Add an auxiliary variable t, along with the constraints t >= 0 and t >= g(x). Then minimize f(x)^2 + t^2. Since t^2 is increasing for t >= 0, the optimal t is pushed down to max(0, g(x)), so the two problems have the same optimal value. answered 19 Oct '15, 14:15 Brian Borchers Thanks. Is there a good reference for these kinds of standard techniques? I often have to formulate and solve optimization problems, but I have never been formally trained in OR, so these techniques seem like very clever tricks to me and are not at all obvious. Yet I haven't been able to find a good textbook treatment that catalogs the more common ones.
(19 Oct '15, 14:56)
evencoil
I'd strongly recommend the textbook Convex Optimization by Boyd and Vandenberghe.
(19 Oct '15, 20:30)
Brian Borchers
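Brian's trick can be sanity-checked numerically on a toy instance. In the sketch below, the coefficients c, a, b and the single linear constraint are my own invented data, and scipy's SLSQP stands in for a proper QP solver; it minimizes the original objective f(x)^2 + max(0, g(x))^2 and the lifted reformulation, and the optimal values agree:

```python
import numpy as np
from scipy.optimize import minimize

# Toy data (invented for illustration): f(x) = c @ x, g(x) = a @ x - b,
# with one equality x1 + x2 = 2 standing in for "linear constraints".
c = np.array([1.0, -2.0])
a = np.array([3.0, 1.0])
b = 1.0
lin_eq = {"type": "eq", "fun": lambda z: z[0] + z[1] - 2.0}

# Original objective: f(x)^2 + max(0, g(x))^2 (convex and C^1, but not a QP).
def original(x):
    return (c @ x) ** 2 + max(0.0, a @ x - b) ** 2

res_orig = minimize(original, x0=[0.0, 2.0], method="SLSQP",
                    constraints=[lin_eq])

# Reformulation over z = (x1, x2, t): minimize f(x)^2 + t^2
# subject to t >= 0, t >= g(x), and the same linear constraint.
def lifted(z):
    return (c @ z[:2]) ** 2 + z[2] ** 2

cons = [
    {"type": "eq",   "fun": lambda z: z[0] + z[1] - 2.0},
    {"type": "ineq", "fun": lambda z: z[2]},                    # t >= 0
    {"type": "ineq", "fun": lambda z: z[2] - (a @ z[:2] - b)},  # t >= g(x)
]
res_lift = minimize(lifted, x0=[0.0, 2.0, 1.0], method="SLSQP",
                    constraints=cons)

print(res_orig.fun, res_lift.fun)  # both approach 121/13 ~ 9.3077
```

At the optimum, t is pushed down to max(0, g(x)), which is why the two formulations report the same value.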

You seem to have a convex optimization problem, yet you are using general nonlinear solvers. That could be the reason for your trouble. I suggest you reformulate the problem as follows:
$$
\begin{array}{ll}
\min & s^2+t^2 \\
\mbox{s.t.} & s=f(x), \\
& t \geq g(x), \\
& \mbox{linear constraints}, \\
& t \geq 0 \\
\end{array}
$$

You should be able to solve that with any decent convex QP optimizer, such as MOSEK (http://mosek.com). answered 19 Oct '15, 14:19 Erling_MOSEK It seems Brian came before me. Note that Brian's Hessian of the objective might be very dense, whereas mine is not.
(19 Oct '15, 14:22)
Erling_MOSEK
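In standard QP form, Erling's problem reads min (1/2) z^T Q z over z = (x, s, t), with one equality row for s = f(x) and two inequality rows. Here is a sketch of assembling that data with numpy; the instance (c, a, b, n) is invented for illustration, and the point is that Q stays diagonal no matter how large n gets:

```python
import numpy as np

# Invented instance: f(x) = c @ x, g(x) = a @ x - b, with n = 4 variables.
n = 4
c = np.array([1.0, 2.0, 3.0, 4.0])
a = np.ones(n)
b = 1.0

# Decision vector z = (x_1..x_n, s, t), length n + 2.
# Objective 1/2 z^T Q z: only s and t appear quadratically.
Q = np.zeros((n + 2, n + 2))
Q[n, n] = 2.0
Q[n + 1, n + 1] = 2.0

# Equality row:  s = f(x)  <=>  -c @ x + s = 0.
A_eq = np.concatenate([-c, [1.0, 0.0]]).reshape(1, -1)
b_eq = np.array([0.0])

# Inequality rows (A_in @ z >= b_in):
#   t >= g(x)  <=>  -a @ x + t >= -b
#   t >= 0
A_in = np.vstack([
    np.concatenate([-a, [0.0, 1.0]]),
    np.concatenate([np.zeros(n), [0.0, 1.0]]),
])
b_in = np.array([-b, 0.0])

print(np.count_nonzero(Q))  # 2: the Hessian is diagonal
```

This is exactly the sparse data a QP solver wants; eliminating s instead would fold the rank-one term 2*c*c^T into the Hessian.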
Could you elaborate on the issue of the dense Hessian? It seems that your formulation only differs from Brian's in that you have added the variable s and the equality constraint s = f(x). How/why would this make the problem easier to solve? Wouldn't AMPL (for example) just substitute f(x) for s everywhere before sending the problem to a solver?
(19 Oct '15, 14:58)
evencoil
I cannot say for sure what AMPL is doing, but I think all optimizers that require an explicit Hessian prefer my form. That is true for MOSEK and its most obvious competitors. Btw, f(x) = sum_j x_j is an example: substituting s away makes the Hessian of f(x)^2 completely dense, whereas in my form only s and t appear quadratically, so the Hessian is diagonal.
(20 Oct '15, 00:59)
Erling_MOSEK
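The density issue is easiest to see on Erling's example f(x) = sum_j x_j. A small numpy sketch (the dimension n = 1000 is an arbitrary choice) comparing the Hessian after substituting s away with the Hessian of the lifted objective:

```python
import numpy as np

# Erling's example: f(x) = sum_j x_j, i.e. c = (1, ..., 1).
n = 1000
c = np.ones(n)

# If s = f(x) is substituted away, the objective contains f(x)^2 = (c @ x)^2,
# whose Hessian is the rank-one matrix 2 c c^T -- completely dense.
H_subst = 2 * np.outer(c, c)

# In the lifted form the objective is s^2 + t^2 over z = (x, s, t):
# the Hessian is diagonal with exactly two nonzeros, regardless of n.
H_lift = np.zeros((n + 2, n + 2))
H_lift[n, n] = H_lift[n + 1, n + 1] = 2.0

print(np.count_nonzero(H_subst), np.count_nonzero(H_lift))  # 1000000 2
```

So even if a modeling layer substituted s out before calling the solver, it would hand over a dense n-by-n Hessian, while the lifted form keeps it trivially sparse.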
