The algorithms described in textbooks are designed to solve "any" instance of a problem; that is, the algorithm solves the problem exactly, for all of its instances. In practice, however, my observation is that the "instance" (the data, the input, all the assumptions) itself becomes the problem. Clients do not care whether you can solve all instances; they care only about their own unique "instance".

The curriculum at my university is quite old and does not cover how to handle this kind of situation, as Barry Render pointed out in a previous post: http://www.or-exchange.com/questions/1149/operations-research-qualification-exam-topics We can analyze the input data and overfit a complex solution method to a specific client, but then for another, similar client we have to start the whole procedure over again. And what if the input changes over time? I think this notion is related to the overfitting/underfitting and model-complexity problems in regression (statistics).

asked 12 Feb '11, 11:28

Arman
accept rate: 0%

edited 12 Feb '11, 11:40


I'd suggest calling this "overengineering" rather than "overfitting", as the latter has a fairly specific connotation in statistics. I also have to confess to a tendency to overengineer solutions sometimes.

Regarding the instance becoming the problem, that's valid for a one-off instance (we're going to solve this once and then never see it again). You have to be careful, though, because what looks like a one-off can sometimes find itself being repeated down the road. So maybe there's reason to build some flexibility into the model (and solution process) just to be safe.

I'd say there are at least two questions to ask when building a model/solution mechanism that is more general or more flexible than the immediate instance requires:

  1. are you making the problem too expensive to solve? and
  2. are you making the setup, solution process, or output too confusing for the user to handle?

Whether you build the ultimate flexible model or a quick solves-this-one-instance model, an important consideration is to document the assumptions. If, down the road, the user encounters what they think is another instance (which may or may not actually be the same problem -- remember that some people think Excel is a word processor), you want them to be able, in essence, to run down a checklist of things that need to be true for your approach to work. (And hopefully it will be a short checklist.)


answered 12 Feb '11, 16:07

Paul Rubin ♦♦
accept rate: 19%

Excellent answer. I too was confused by the word "overfitting" (because it does have a mathematical meaning incongruent with what is being discussed), but "overengineering" is definitely something I can relate to, especially when writing software. Do I write a method that just works for one instance, or do I make it a bit more general? Well, I like to take the iterative approach where I first start by solving the specific case, and as the need arises, I start generalizing the problem one step at a time. For this, it helps to have tools that allow one to tear things down and refactor easily.

(12 Feb '11, 23:59) Gilead ♦

This is an important consideration in practice: as your level of "overfitting" increases, your code complexity and maintenance costs go up as you scramble to handle changing input. The solution approach chosen should be scalable, flexible, and sustainable, commensurate with the relative importance of the decisions that come out of it. Mission-critical solutions with keenly reviewed output require relatively more robust methods, while at the other end, for "give me something better, I don't care" problems or use-and-throw analyses, textbook approaches may suffice.

Unlike econometrics and statistics, which are driven by data, most if not all traditional OR textbooks rarely focus on data aspects. This is sad, because the OR pioneers who gave us these great methods spent a lot of time neck-deep in data, using slide rules and punch-card machines to produce those elegant results. My expectation is that this will change thanks to the current 'analytics' trend, which is so data-driven.

In general, as market conditions change, one can expect the business rules, inputs, and constraints to change accordingly, so choosing an inflexible framework begins to hurt. Before we choose a particular approach, we must understand, and even anticipate, its limitations and how well it approximates reality over time. A semi-serious look at which optimization methods tend to work relatively better in practice over a period of time: http://dualnoise.blogspot.com/2009/10/whats-your-favorite-optimization-method.html


answered 12 Feb '11, 13:56

shiva 4
accept rate: 11%

I've experienced overfitting as very costly: not just because the data or the constraints change, but also because my algorithms continuously improve, and overfitted examples can perform worse with better algorithms. It falls under one of those general programming guidelines:

Premature optimization is the root of all evil. (Donald Knuth)

It's like using StringBuffer to concatenate non-looped Strings in Java 1.0: that's actually slower than just using += in Java 1.6. And it's painting the bikeshed: the relevant performance/scalability problems are in the design, not in a single line of code. Keep your code clean and simple so you can recognize and refactor your design.
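To illustrate the point with a minimal sketch (the class and method names here are mine, not from any real codebase): since Java 5, the compiler already rewrites a chain of + / += on Strings into a single StringBuilder, so hand-rolling a builder for a short, non-looped concatenation adds code without adding speed.

```java
// Hypothetical micro-example: both methods produce the same result, and on
// a modern javac they compile to essentially the same bytecode, because the
// compiler rewrites String concatenation into a StringBuilder chain.
public class ConcatDemo {
    static String withOperator(String a, String b, String c) {
        return a + b + c; // compiled to a single StringBuilder under the hood
    }

    static String withBuilder(String a, String b, String c) {
        // the "hand-optimized" version: more code, no benefit outside loops
        return new StringBuilder().append(a).append(b).append(c).toString();
    }

    public static void main(String[] args) {
        System.out.println(withOperator("foo", "bar", "baz")
                .equals(withBuilder("foo", "bar", "baz")));
    }
}
```

A StringBuilder does still pay off inside a loop, where += would allocate a fresh String on every iteration.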

The same goes for overfitting: to get a slightly higher score today, you're making your code more complex, which makes it harder for you to refactor your algorithms (and test them correctly), even though such refactoring yields far better score improvements tomorrow.

I originally threw away my first simulated annealing implementation as useless, due to a bug in some complex code (which didn't affect my tabu search implementation). Only after I simplified and cleaned up that code did my next iterative attempt at simulated annealing become a success.
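For readers unfamiliar with the method mentioned above, the core of simulated annealing fits in a few lines. This is a generic sketch of the standard Metropolis acceptance rule, not Geoffrey's actual implementation: improving moves are always accepted, while worsening moves are accepted with a probability that shrinks as the temperature cools.

```java
import java.util.Random;

// Generic sketch of the simulated annealing acceptance rule (Metropolis
// criterion). "delta" is the score change of a candidate move: a value
// <= 0 means the move improves (or preserves) the current solution.
public class AnnealingAcceptance {
    static boolean accept(double delta, double temperature, Random rng) {
        if (delta <= 0) {
            return true; // improving or equal moves are always accepted
        }
        // worsening moves are accepted with probability exp(-delta / T),
        // which approaches 0 as the temperature cools toward 0
        return rng.nextDouble() < Math.exp(-delta / temperature);
    }

    public static void main(String[] args) {
        Random rng = new Random(0);
        System.out.println(accept(-1.0, 1.0, rng));  // improving move
        System.out.println(accept(50.0, 1e-9, rng)); // bad move, cold system
    }
}
```

The point of the anecdote stands: this rule is simple, and the bugs tend to hide in the complex bookkeeping around it, which is exactly the code worth keeping clean.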


answered 13 Feb '11, 10:05

Geoffrey De Smet ♦
accept rate: 6%



Seen: 2,117 times

Last updated: 13 Feb '11, 10:05

OR-Exchange! Your site for questions, answers, and announcements about operations research.