Let's say that I have a Mixed Integer Problem to solve with a large number of variables and constraints. I would probably need as much CPU time as is available, but what about RAM? Is there a rule for approximately estimating the amount needed, or does it depend on many factors, such as the efficiency of the solver implementation?

asked 06 Aug '12, 07:13 by Emer
(edited 06 Aug '12, 07:28 by fbahr ♦)

Look for the keyword sizing, a term often used for estimating the computer resources needed to run a given piece of software.

(07 Aug '12, 02:51) Geoffrey De ... ♦

Copying from IBM TechNote » Guidelines for estimating CPLEX memory requirements based on problem size:

Memory usage [...] can vary dramatically depending on the type of problem, the parameter settings used to solve it, and the number of threads available [...] to use during the optimization.

Still, the technote provides some basic rules of thumb for estimating RAM requirements that you can apply when using the CPLEX solver.
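The exact coefficients in the technote vary by CPLEX version, so as an illustration only, here is a sketch of that style of estimate in Python. The per-nonzero and per-row/column byte counts below are made-up placeholder numbers, not figures from IBM; substitute the ones from the technote for your version:

```python
# Crude estimate of the memory needed for a model's matrix data.
# All byte counts are illustrative assumptions, NOT IBM's numbers:
# actual usage depends on the solver, parameter settings, and threads.

def estimate_lp_memory_mb(num_rows, num_cols, num_nonzeros,
                          bytes_per_nonzero=12,   # assumed: value + index
                          bytes_per_row=100,      # assumed row overhead
                          bytes_per_col=100):     # assumed column overhead
    """Rough lower-bound estimate of matrix storage, in megabytes."""
    total_bytes = (num_nonzeros * bytes_per_nonzero
                   + num_rows * bytes_per_row
                   + num_cols * bytes_per_col)
    return total_bytes / 1e6

# Example: 1M constraints, 2M variables, 10M nonzeros -> 420.0 MB
print(estimate_lp_memory_mb(1_000_000, 2_000_000, 10_000_000))
```

Note this only covers the static problem data; branch-and-bound adds a node tree on top, which is usually the part that blows up.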


answered 06 Aug '12, 07:46 by fbahr ♦
(edited 06 Aug '12, 11:43)

You can bump into a memory constraint in more than one way. If the model is sufficiently big, you may run out of memory while building it. Assuming you reach the point of solving it, memory is consumed by the (initially growing) node tree -- but there you may be able to exploit compression or virtual memory (writing parts of the tree to disk). CPLEX does both of these, and I assume other solvers do as well. So the constraint may not be just RAM, but rather RAM plus disk space.
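For reference, CPLEX exposes this behavior through parameters you can set in the interactive optimizer; the commands below are a sketch from memory (parameter paths and defaults vary across versions, so check the documentation for yours):

```
CPLEX> set workmem 2048
CPLEX> set mip strategy file 3
CPLEX> set mip limits treememory 20000
```

Here `workmem` is the in-memory tree size (in MB) at which node files kick in, and `mip strategy file` selects what happens then: 0 keeps everything in RAM, 1 compresses the tree in memory, and 2/3 write node files to disk (3 with compression), which is the RAM-plus-disk trade-off described above.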


answered 07 Aug '12, 17:23 by Paul Rubin ♦♦

I concur: virtual memory means that performance degrades gracefully when memory consumption starts to exceed the available physical memory. There is one hard memory limit, though, which is the addressable space. If you're using a 32-bit executable, the size of the models you can solve hits a hard limit that virtual memory cannot overcome. The limit comes from the fact that you cannot have arrays longer than about 2 billion (2^31 − 1) elements. It means, for instance, that you cannot have a model with more than 2 billion nonzero coefficients. The practical limit is lower still, since during presolve the matrix can be expanded a bit before shrinking again.

This is why there are 64-bit versions of CPLEX now. With those, memory is no longer what limits the ability to solve models.
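To put a number on that limit, here is a back-of-the-envelope calculation; the 12-byte-per-nonzero layout (8-byte double plus 4-byte index) is an illustrative assumption, not CPLEX's actual internal representation:

```python
# A 32-bit signed index caps any single array at 2**31 - 1 entries.
MAX_INT32_INDEX = 2**31 - 1          # 2,147,483,647 elements

# Assume (illustratively) each nonzero needs an 8-byte double value
# plus a 4-byte row index. At the index limit the matrix alone would
# need ~24 GiB -- far beyond the 2-4 GiB address space of a 32-bit
# process, so in practice you hit the address-space wall first.
bytes_per_nonzero = 8 + 4
max_matrix_bytes = MAX_INT32_INDEX * bytes_per_nonzero
print(max_matrix_bytes // 2**30)     # -> 23 (i.e. just under 24 GiB)
```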


answered 08 Aug '12, 22:07 by jfpuget


Seen: 2,147 times

Last updated: 08 Aug '12, 22:07

OR-Exchange! Your site for questions, answers, and announcements about operations research.