Hi all, I have a problem with the following substructure \begin{align} &\sum_{i\in I_j}y_i\geq 1,\quad \forall j\in J, \\ &\sum_{i\in I}y_i=K, \\ &y_i\in\{0,1\}, \end{align} where \(I\) is the index set of the variables, \( \bigcup_{j\in J} I_j =I\), and the sets \(I_j\) may overlap: in general we do not have \(I_j\cap I_l=\emptyset\) for \(j\neq l\).

It seems to me that there must be some clever combinatorial arguments I can use to fix variables to zero or one based on the inequalities/equation. I am, however, not able to find literature dealing with this (which might be due to me giving the problem the wrong name). Can anyone help me here? Thanks in advance.
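For concreteness, here is a minimal sketch of the substructure in Python using PuLP (the data and the value of \(K\) are made up purely for illustration, and no objective is included since the question is only about the constraints):

```python
# Minimal sketch of the substructure (illustrative data only; PuLP assumed available).
from pulp import LpProblem, LpVariable, LpMinimize, LpBinary, lpSum

I = range(5)                                        # index set of the variables
I_j = {"a": [0, 1], "b": [1, 2, 3], "c": [3, 4]}    # covering sets, possibly overlapping
K = 2                                               # required number of variables set to 1

prob = LpProblem("substructure", LpMinimize)        # objective omitted on purpose
y = LpVariable.dicts("y", I, cat=LpBinary)

for j, members in I_j.items():
    prob += lpSum(y[i] for i in members) >= 1, "hit_%s" % j
prob += lpSum(y[i] for i in I) == K, "cardinality"
```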

asked 20 Jun '14, 06:56 by Sune

To me, your formulation more closely resembles the hitting set problem. However, hitting set is essentially set cover phrased differently. The main difference is that the \(k\)-HITTING SET problem asks whether there are at most \(k\) elements from \(I\) that hit all sets, while the \(k\)-SET COVER problem asks whether there are at most \(k\) sets that cover the universe \(I\). (Note: in the literature you will also find \(k\) used for the size of the largest set or for the number of sets an element appears in.)
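To make the correspondence concrete, here is a small sketch in plain Python (the function name and data layout are my own, just for illustration) that re-reads a hitting-set instance as the equivalent set-cover instance by swapping the roles of elements and sets:

```python
# Hitting set over (I, {I_j}) viewed as set cover over the universe J:
# element i "covers" exactly the constraints j with i in I_j.
def to_set_cover(I_j):
    """I_j: dict mapping each j in J to the collection of elements in I_j."""
    covers = {}
    for j, members in I_j.items():
        for i in members:
            covers.setdefault(i, set()).add(j)
    return covers  # covers[i] = set of constraints hit by element i

# Choosing k elements that hit every I_j is the same as choosing k of the
# sets covers[i] whose union is all of J.
```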

The problem of determining whether that 0-1 program is feasible is called \(k\)-HITTING SET. The special case \(k\)-DOMINATING SET (for \(k\ge 3\)) can be solved in time \(O(n^{k+o(1)})\), and there is reason to believe that no better is possible. So, for general problems of this type, you cannot expect to be able to fix variables, or rather, the computational effort needed to guarantee a variable fixing will probably be prohibitive. Namely, suppose you could always fix at least one variable in time \(O(n^{k-1})\); then repeated application would yield an algorithm running in time \(O(n^{k})\), that is, faster than the currently known algorithms.

However, for many instances encountered in practice, the following reduction rules might apply.

  1. If \(I_1 \subseteq I_2\) for two of the sets, then the constraint for \(I_2\) is implied and you can remove it.
  2. If a set \(I_j\) is a singleton containing only \(i\), then you can fix the variable \(y_i\) to 1.
  3. If you are solving a minimization problem (with no other constraints) and some element \(i\) hits all the sets that another element \(p\) hits (and possibly more), then you can fix \(y_p=0\).

Rules 1 and 2 apply to any problem. Rule 3 can rule out feasible solutions and hence may not apply to every problem. However, the repeated application of rule 3 may allow you to apply rules 1 and 2. I imagine, though, that any good IP solver will do this automatically.
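As a rough illustration only (not how a solver would do it), here is a plain-Python sketch that applies rules 1-3 until none fires; it assumes a pure minimization with no side constraints, and the data layout and function name are made up:

```python
# Rough sketch of rules 1-3 applied repeatedly (assumes a pure minimization
# with no side constraints; data layout is made up for illustration).
def reduce_instance(sets):
    """sets: dict mapping each j to the set of elements I_j.
    Returns the reduced sets plus the variables fixed to 1 and to 0."""
    fixed_one, fixed_zero = set(), set()
    changed = True
    while changed:
        changed = False

        # Rule 2: a singleton {i} forces y_i = 1; every constraint containing i
        # is then satisfied and can be dropped.
        for j, members in sets.items():
            if len(members) == 1:
                (i,) = members
                fixed_one.add(i)
                sets = {l: m for l, m in sets.items() if i not in m}
                changed = True
                break
        if changed:
            continue

        # Rule 1: if some other set is contained in I_j, the constraint for I_j is implied.
        for j, members in sets.items():
            if any(l != j and other <= members for l, other in sets.items()):
                sets = {l: m for l, m in sets.items() if l != j}
                changed = True
                break
        if changed:
            continue

        # Rule 3 (minimization only): if element i hits every set that p hits,
        # p is dominated and y_p can be fixed to 0.
        elements = set().union(*sets.values()) if sets else set()
        hit_by = {i: {j for j, m in sets.items() if i in m} for i in elements}
        for p in elements:
            if any(i != p and hit_by[p] <= hit_by[i] for i in elements):
                fixed_zero.add(p)
                sets = {j: m - {p} for j, m in sets.items()}
                changed = True
                break

    return sets, fixed_one, fixed_zero
```

Each pass either fixes a variable or drops a constraint, so the loop terminates after at most \(|I|+|J|\) iterations.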

You can probably find more by searching for "reduction rules hitting set". In the parameterized complexity literature, the relevant term is kernelization.


answered 20 Jun '14, 15:01 by Austin Buchanan

Nice but also a kind of depressing answer :-)

(22 Jun '14, 10:21) Sune