Let's say that I have a Mixed Integer Problem to solve with a large number of variables and constraints. I would probably want as much CPU time as is available, but what about RAM? Is there a rule for approximately estimating the amount needed, or does it depend on many factors, such as the efficiency of the solver implementation?

From the IBM TechNote "Guidelines for estimating CPLEX memory requirements based on problem size":
In short, the technote provides some basic rules of thumb for estimating RAM requirements that can be applied when using the CPLEX solver.
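As a rough illustration of that kind of rule of thumb (the constants below are my own back-of-envelope assumptions, not the technote's actual figures or CPLEX's internals), memory for the constraint matrix alone scales mainly with the number of nonzeros: each nonzero needs roughly a double plus an index, with some bookkeeping per row and per column.

```python
def estimate_lp_matrix_bytes(rows, cols, nonzeros):
    """Back-of-envelope RAM estimate for storing a sparse LP matrix.

    Assumes compressed sparse storage: an 8-byte double plus a 4-byte
    index per nonzero, and ~8 bytes of bookkeeping (pointers, bounds,
    objective coefficients) per row and per column. These constants are
    illustrative, not CPLEX's actual data layout.
    """
    per_nonzero = 8 + 4            # value + index
    per_row_col = 8                # per-row / per-column overhead
    return nonzeros * per_nonzero + (rows + cols) * per_row_col

# A model with 1M rows, 1M columns and 10M nonzeros needs on the order of:
gib = estimate_lp_matrix_bytes(10**6, 10**6, 10**7) / 2**30
print(f"~{gib:.2f} GiB for the matrix alone")  # solver working memory comes on top
```

Note this estimates only the model data; the solver's working memory (basis factorizations, presolve copies, the branch-and-bound tree) can dominate it by a large factor.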
answered by fbahr ♦

You can bump into a memory constraint in more than one way. If the model is sufficiently big, you may run out of memory while building it. Assuming you reach the point of solving it, memory is consumed by the (initially growing) node tree -- but there you may be able to exploit compression or virtual memory (writing parts of the tree to disk). CPLEX does both of these, and I assume other solvers do as well. So the constraint may not be just RAM, but rather RAM plus disk space.
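In CPLEX this behavior is controlled by a couple of parameters; a minimal configuration sketch via the CPLEX Python API (treat the parameter names and values as assumptions to check against your version's documentation):

```python
# Configuration sketch: spill the branch-and-bound tree to disk once it
# exceeds a working-memory budget. Requires the `cplex` package.
import cplex

c = cplex.Cplex()
# Working-memory budget (MB) for the tree before node files kick in.
c.parameters.workmem.set(2048)
# Node-file setting: 1 = in memory, compressed; 2 = on disk;
# 3 = on disk, compressed.
c.parameters.mip.strategy.file.set(3)
# Directory for the node files -- point it at a disk with free space.
c.parameters.workdir.set("/tmp")
```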
answered by Paul Rubin ♦♦

I concur; virtualization means that performance degrades gracefully when memory consumption starts to exceed available physical memory. There is one hard memory limit, though: the addressable space. If you're using a 32-bit executable, the size of the models you can solve hits a hard limit that virtualization cannot overcome. The limit comes from the fact that you cannot have arrays longer than about 2 billion elements, which means, for instance, that you cannot have a model with more than 2 billion nonzero coefficients. The practical limit is lower, since during presolve the matrix can be expanded a bit before shrinking again. This is why there are 64-bit versions of CPLEX now; with those, memory is no longer what limits the ability to solve models.
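The "2 billion" figure is just 2^31, the largest count representable with a signed 32-bit array index. A quick sanity check of what that cap implies:

```python
# The hard cap mentioned above: signed 32-bit array indices.
MAX_ELEMENTS = 2**31           # 2,147,483,648 -- the "2 billion" limit

# Even if you could allocate that many 8-byte doubles, the single array
# would need 16 GiB -- well beyond a 32-bit process's 4 GiB address space,
# so in practice a 32-bit solver runs out of address space much earlier.
bytes_needed = MAX_ELEMENTS * 8
print(bytes_needed / 2**30)    # GiB for one full-size array of doubles
```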
answered by jfpuget

Look for the keyword "sizing", a term often used for estimating the computer resources needed to run software X.