Hi,

A book I often return to (and like) and use as a source of knowledge when building a new model is "Model Building in Mathematical Programming" by Williams. In a section on decomposition, Williams writes:

"Decomposition algorithms are also of considerable computational interest, as they offer a possibility of avoiding the large amount of time needed to solve large models should those models be structured. Unfortunately, decomposition has only met with limited computational success. The most widely used algorithm is the Dantzig–Wolfe algorithm that applies to block angular structured models. There are many circumstances, however, in which it is more efficient to solve the total model rather than to use decomposition. The main advice to someone building a structured model should not be to attempt decomposition on other than an experimental basis."

As someone working with a large MILP who was considering using decomposition, I became very hesitant about continuing down that path as a tool to reduce solution time after reading the last sentence above. I have some experience with Dantzig–Wolfe decomposition (and branch-and-price) for solving large MIPs, but only with partial success: it required a large implementation effort, with only a limited reduction in solution time.

Though I know there are quite a few articles out there that show positive results from utilizing decomposition, I wonder whether the community at large agrees with Williams' statement that "The main advice to someone building a structured model should not be to attempt decomposition on other than an experimental basis", and/or whether you know of real-world "success stories" utilizing decomposition for MILPs.

asked 20 Jan '16, 10:08 ErlendT
The answer is a clear "it depends!". I am used to performing decompositions, so I see "column generation based models" everywhere I look. Still, the effort of implementing your own code can be huge. Not the basic version; for that, there are good frameworks out there. But to actually make the code fly you may still need a lot of knowledge about your problem (good heuristics, both for the whole problem and for pricing) and about the algorithm (stabilization, cutting planes, branching rules, all tailored to your problem).

My experience is: many people try (the basic version of) a branch-and-price code and see that it does not work, i.e., it is neither faster nor able to handle larger instances. For me this is like the line of argumentation many heuristics people use (who do not know how to model properly with MILPs): "We tried MILP, it did not work, so we have to use something different." My standard response is: well, how hard did you try? The same applies to DW.

We found this situation frustrating, so there may be help. We have a framework that tries to do the decomposition for you. It can produce nice pictures of the structure of your MILP (and I know several users who learned something about their problem from them). It may work. It may not work. But it may give you a hint on whether it pays to embark on your own implementation.

answered 21 Jan '16, 05:52 Marco Luebbecke ♦

For another decomposition framework, see DIP at https://projects.coinor.org/Dip

(22 Jan '16, 12:26) Matthew Salt... ♦
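To make the column generation idea above concrete, here is a minimal, illustrative sketch for the LP relaxation of the classic cutting-stock problem, the textbook Dantzig–Wolfe setting. It uses scipy's `linprog` for the restricted master problem and a small dynamic-programming knapsack as the pricing problem. The instance data are invented for illustration, and this is only the "basic version": a real branch-and-price code would add the stabilization, branching rules, and heuristics mentioned above.

```python
# Column generation sketch for the cutting-stock LP relaxation.
# Restricted master: min number of rolls s.t. demand coverage;
# pricing: unbounded knapsack over the duals. Data are illustrative.
from scipy.optimize import linprog

W = 100                      # roll width
sizes = [45, 36, 31, 14]     # item widths
demand = [97, 610, 395, 211]
n = len(sizes)

# Initial columns: one trivial pattern per item (cut only that item).
cols = [[(W // sizes[i]) if j == i else 0 for j in range(n)] for i in range(n)]

def solve_master():
    """Restricted master LP: min sum(x) s.t. pattern coverage >= demand."""
    A_ub = [[-col[i] for col in cols] for i in range(n)]   # flip >= to <=
    b_ub = [-d for d in demand]
    return linprog([1.0] * len(cols), A_ub=A_ub, b_ub=b_ub, method="highs")

def price(duals):
    """Unbounded knapsack DP: most valuable cutting pattern under the duals."""
    best = [0.0] * (W + 1)
    pat = [[0] * n for _ in range(W + 1)]
    for c in range(1, W + 1):
        best[c], pat[c] = best[c - 1], pat[c - 1][:]
        for i, w in enumerate(sizes):
            if w <= c and best[c - w] + duals[i] > best[c] + 1e-12:
                best[c] = best[c - w] + duals[i]
                pat[c] = pat[c - w][:]
                pat[c][i] += 1
    return best[W], pat[W]

while True:
    res = solve_master()
    duals = [-m for m in res.ineqlin.marginals]   # duals of the >= rows
    value, pattern = price(duals)
    if value <= 1.0 + 1e-9:    # no pattern with negative reduced cost
        break
    cols.append(pattern)

lp_bound = res.fun
print(f"LP bound: {lp_bound:.2f} rolls using {len(cols)} columns")
```

A pattern prices out (reduced cost 1 - sum of duals picked up) only while the master duals still leave room for improvement; the loop stops at the LP bound of the full, exponentially-sized master.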
I've had fairly good (but not uniformly good) luck with Benders decomposition for certain MILPs. I would agree with Williams's statement if it is interpreted as "don't bet the farm on decomposition until you've tested it on a selection of realistic instances". I would also agree with that statement if you swapped out decomposition and swapped in any other modeling (or solving) technique. MILPs are contrary beasts, and there is typically a high degree of variability in solution times, even for the same algorithm on multiple instances of the same problem. answered 20 Jan '16, 15:47 Paul Rubin ♦♦ 
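To make the Benders idea concrete, here is a minimal, illustrative loop on a toy facility-location instance (all data invented). The master MILP, solved with scipy's `milp`, chooses which facilities to open; the subproblem is a transportation LP whose duals yield optimality cuts. An emergency supply at penalty M keeps the subproblem feasible, so no feasibility cuts are needed. This is a sketch under those assumptions, not a production implementation.

```python
# Benders decomposition sketch: toy capacitated facility location.
# Master: pick open facilities y and a cost estimate theta;
# subproblem: transportation LP for fixed y, duals give cuts.
import numpy as np
from scipy.optimize import linprog, milp, LinearConstraint, Bounds

f = [50.0, 60.0]            # facility fixed costs (illustrative data)
cap = [80.0, 70.0]          # facility capacities
d = [20.0, 25.0, 30.0]      # customer demands
c = [[4.0, 5.0, 6.0],       # c[j][i]: unit shipping cost, facility j -> customer i
     [5.0, 3.0, 2.0]]
M = 1000.0                  # penalty for unmet demand (keeps subproblem feasible)
nf, nc = len(f), len(d)

def subproblem(y):
    """Transportation LP for fixed y; returns value and demand/capacity duals."""
    # variables: x[j][i] flattened, then u[i] (emergency supply)
    cost = [c[j][i] for j in range(nf) for i in range(nc)] + [M] * nc
    A, b = [], []
    for i in range(nc):      # -(sum_j x_ji + u_i) <= -d_i
        row = [0.0] * (nf * nc + nc)
        for j in range(nf):
            row[j * nc + i] = -1.0
        row[nf * nc + i] = -1.0
        A.append(row); b.append(-d[i])
    for j in range(nf):      # sum_i x_ji <= cap_j * y_j
        row = [0.0] * (nf * nc + nc)
        for i in range(nc):
            row[j * nc + i] = 1.0
        A.append(row); b.append(cap[j] * y[j])
    res = linprog(cost, A_ub=A, b_ub=b, method="highs")
    marg = res.ineqlin.marginals
    pi = [-marg[i] for i in range(nc)]        # demand duals (>= 0)
    mu = [marg[nc + j] for j in range(nf)]    # capacity duals (<= 0)
    return res.fun, pi, mu

# master variables: [y_0, y_1, theta]
obj = np.array(f + [1.0])
cuts = [LinearConstraint([[0.0] * nf + [1.0]], 0.0, np.inf)]  # theta >= 0
best = None
for _ in range(50):
    res = milp(c=obj, constraints=cuts, integrality=[1] * nf + [0],
               bounds=Bounds([0.0] * nf + [0.0], [1.0] * nf + [np.inf]))
    y, theta = res.x[:nf], res.x[nf]
    v, pi, mu = subproblem(y)
    if v <= theta + 1e-6:     # subproblem agrees with the estimate: optimal
        best = res.fun
        break
    # optimality cut: theta >= sum_i pi_i d_i + sum_j mu_j cap_j y_j
    row = [-mu[j] * cap[j] for j in range(nf)] + [1.0]
    cuts.append(LinearConstraint([row],
                                 sum(p * di for p, di in zip(pi, d)), np.inf))

print(f"open = {[round(v) for v in y]}, total cost = {best:.1f}")
```

The cut is valid for every y because the subproblem's dual feasible region does not depend on y; only the right-hand side does, which is exactly what the cut's linear term captures.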
For another decomposition framework (a commercial one), see SAS/DECOMP at http://support.sas.com/documentation/cdl/en/ormpug/68156/HTML/default/viewer.htm#ormpug_decomp_toc.htm

We encountered several client cases where DW outperformed standard MILP techniques; this was the motivation for adding DECOMP to our solver suite.

answered 22 Jan '16, 15:37 mgalati
Decomposition is one of the model structures that matter. It is worth thinking about how your problem can be decomposed before writing it down, whether or not you use a decomposition solver at deployment time. What is (usually) not worth it is implementing a decomposition framework yourself, especially as the existing decomposition frameworks have improved. It is very easy to experiment with a decomposition solver now; even in the 2000s, this was not true.

Caveat emptor: I have access to a decomposition framework embedded in the modeling language. I might give you a different answer if I had to add, distribute, and support an additional library, or even implement the textbook version of the decomposition in a modeling language.

answered 26 Jan '16, 03:25 Leo
In the real world, Conway's law applies more often than not: software, including solvers, is structured the way the company is structured. For example, the train car scheduling department is separate from the train crew scheduling department, so their solvers are separate too (even though unifying them could be more efficient). So there is a business-structure-based decomposition instead of a mathematically sensible one (Benders, Dantzig–Wolfe, ...).