Hi all,

JuMP is a domain-specific modeling language for mathematical programming embedded in Julia. It currently supports a number of open-source and commercial solvers (CPLEX, COIN Clp, COIN Cbc, ECOS, GLPK, Gurobi, Ipopt, KNITRO, MOSEK, and NLopt) for a variety of problem classes, including linear programming, (mixed) integer programming, second-order conic programming, and nonlinear programming.

Today (2014-12-30) we're announcing the release of version 0.7.

JuMP makes it easy to specify and solve optimization problems without expert knowledge, yet at the same time allows experts to implement advanced algorithmic techniques such as exploiting efficient hot starts in linear programming or using callbacks to interact with branch-and-bound solvers. JuMP is also fast: benchmarking has shown that it can create problems at speeds similar to special-purpose commercial tools such as AMPL, while maintaining the expressiveness of a generic high-level programming language. JuMP can be easily embedded in complex workflows, including simulations and web servers.

The main new features in 0.7 are:

  • Support for mixed-integer nonlinear programming (MINLP); KNITRO is the only supported solver at this point.
  • Support for second-order conic programming (SOCP) via the ECOS solver.
  • Improvements to variable creation with @defVar, including better syntax for column-wise model generation, initial values, and fixed variables.
  • Numerous improvements to nonlinear programming, including faster model generation, better modeling syntax, and re-solving.
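For readers who haven't seen JuMP before, a small model looks roughly like the following sketch. It uses the 0.7-era macro names (@defVar, @setObjective, @addConstraint) mentioned above; the numbers are made up for illustration, and the solver is whatever default LP solver you have installed.

```julia
using JuMP

m = Model()  # uses the default installed solver

# Bounded continuous variables, created with the @defVar macro
@defVar(m, 0 <= x <= 2)
@defVar(m, 0 <= y <= 30)

# A linear objective and a single linear constraint
@setObjective(m, Max, 5x + 3y)
@addConstraint(m, x + 5y <= 3.0)

status = solve(m)  # :Optimal if the solve succeeded
println("x = ", getValue(x), ", y = ", getValue(y))
```

Note how the algebraic syntax (5x + 3y) is plain Julia code, which is what lets JuMP stay both fast and expressive.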

Features we support include:

  • Modeling:
      • Objectives: linear, quadratic, and nonlinear (convex and nonconvex)
      • Constraints: linear, convex quadratic, SOC, and nonlinear (convex and nonconvex)
      • Variable types: continuous, integer, semi-continuous, semi-integer
  • Solver-independent warm and hot starts. If your underlying choice of solver supports it, modifications to an LP will hot start from the previous solution where possible (if your solver doesn't, the problem is solved from scratch; all of this is hidden from the user). For IPs we build the model incrementally (if supported by the solver), so you don't have to wait for the full model build time again.
  • Efficient column-wise model construction, e.g. for column generation.
  • Solver-independent callbacks, including user-provided cuts, heuristics, and lazy constraints, with support on a per-solver basis (currently Gurobi, GLPK, and CPLEX). Julia interacts very efficiently with C code, so there is no major speed penalty here. See the blog post "Using JuMP to Solve a TSP with Lazy Constraints" for an application, or check out the documentation.
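As a sketch of how the solver-independent lazy-constraint callbacks work: you register a Julia function that the solver calls at integer-feasible nodes, and add a violated constraint only when needed. The function names below follow the early callback interface and may differ in your JuMP version; the constraint itself is a made-up illustration.

```julia
using JuMP

m = Model()  # assumes a callback-capable solver (e.g. Gurobi, GLPK, CPLEX)
@defVar(m, 0 <= x <= 2, Int)
@defVar(m, 0 <= y <= 2, Int)
@setObjective(m, Max, x + 2y)

# Called by the solver whenever it finds an integer-feasible solution.
# We add the cut only if the incumbent violates it, so the full
# constraint set never has to be written down up front.
function lazycut(cb)
    if getValue(x) + getValue(y) > 3 + 1e-6
        addLazyConstraint(cb, x + y <= 3)
    end
end

setLazyCallback(m, lazycut)
solve(m)
```

Because the callback is ordinary Julia code crossing into C with little overhead, this pattern is practical even for cuts generated thousands of times per solve.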

You can read more in the documentation.

We are still extending and polishing JuMP in many ways and have an active group of users. We'd love to hear about your experiences if you try it - get in touch!

Cheers, Iain Dunning, Miles Lubin, Joey Huchette

(edited 2014/12/30 - for release of v0.7)

asked 03 Oct '13, 13:19

Iain Dunning

edited 29 Dec '14, 17:07

I've been following this project for some time (when it was known as MathProg), and it's remarkable how much progress it has made over the past few months, including support for Gurobi among other nice-to-haves.

I had previously set up some of the optimization libraries for Python (a fairly painful experience) before settling on CPLEX's Python API, both for a paper on routing optimization last year and for setting up models to compare against a recent C++ library for robust optimization. The process of setting up Julia and JuMP, on the other hand, was a breeze.

There is a lot of interest in building a framework for robust optimization (similar to ROME, among others) on top of a nice generic solver-independent interface (OpenOpt being another nice alternative), and I think this library (and the language) hits the sweet spot. Unfortunately I'll only be available after December to work on it, and I haven't had the time to pick up enough Julia, much less clean up my own code to put on GitHub.

May I ask if there will be support for user-defined cuts and lazy constraints? Or do you foresee difficulties in allowing these lower-level interventions in solver behavior?


answered 09 Oct '13, 05:52

yeesian


I'm working on a robust optimization framework that will be similar to ROME, but with some features and generalizations to tackle a wider class of problems that I don't think MATLAB is really capable of handling. I'm also hoping that JuMP's speed for MILP carries over to robust problems. We have lazy constraints, but not yet in the version in the package manager; hopefully tidied up, documented, and coming soon!

(09 Oct '13, 11:49) Iain Dunning

Just to follow up on what Iain said: over the past few days we've pretty painlessly gotten callbacks working with the low-level Gurobi interface, and we have a mock-up of the JuMP syntax for lazy constraints. We'd like to do this in a solver-independent way, which will require some thought, but it's definitely in the pipeline.

(09 Oct '13, 12:32) Miles

@Iain @Miles: Rumors [aka tweets] say, CPLEX and Mosek interfaces "might" be in the pipeline, too - nice! (P.S. @Miles: Congrats on your COIN-OR Cup win, btw.)

(09 Oct '13, 14:22) fbahr ♦

Update: user-defined lazy constraints are in!

(16 Dec '13, 15:48) Iain Dunning

Awesome! It is really well-documented too. For a lower-level interface, there's also MathProgBase, and for an overview of other solvers supported (in Julia), see JuliaOpt.

(16 Dec '13, 20:26) yeesian