As with many sequence enumeration problems, this one yields to the standard lexicographic enumeration algorithm (quoted from Algorithm for "consolidating" N items into K):
- Start with the lexicographically smallest possible sequence
- While possible:
  a. Scan backwards to find the last element which could be increased. ("Could be" means that increasing that element would still result in the prefix of some valid sequence.)
  b. Increment that element to the next largest possible value
  c. Fill in the rest of the sequence with the smallest possible suffix.
If there are no other restrictions, the length-j sequence p = p[0], p[1], …, p[j−1] could be a prefix of a length-k combination of n things iff k − j < n − p[j−1]. We can rewrite that as p[j−1] < n − k + j.
So the following simple function would accomplish steps a and b of the above generic algorithm for an unrestricted k, n combination:
    /* Finds the (lexicographically) next prefix of a k-combination from n
     * values, and returns the length of that prefix.
     * If there is no possible next prefix, returns 0.
     * Note: k is implicit in the length of the combination.
     */
    size_t nextPrefix(std::vector<size_t>& comb, size_t n) {
        size_t k = comb.size();
        for (size_t j = k; j;) {
            --j;
            if (comb[j] < n - k + j) {
                ++comb[j];
                return j + 1;
            }
        }
        return 0;
    }
To move on to the actual problem, we note that any restriction of the form "the combination includes kᵢ of the first nᵢ values" will end up being exactly the same test, since that restriction could be rephrased as "the kᵢ-length prefix is a combination of nᵢ values."
So we could replace the n parameter in the above function with a vector of pairs (where the last pair is precisely <k, n>):
    size_t nextPrefix(std::vector<size_t>& comb,
                      std::vector<std::pair<size_t, size_t>> const& restrictions) {
        for (size_t j = comb.size(); j;) {
            --j;
            /* Test all applicable conditions */
            if (std::all_of(restrictions.begin(), restrictions.end(),
                            [&j, &comb](std::pair<size_t, size_t> const& rest) {
                                return j >= rest.first
                                    or comb[j] < rest.second - rest.first + j; })) {
                ++comb[j];
                return j + 1;
            }
        }
        return 0;
    }
(This is not optimal, obviously. Instead of checking all the restrictions on every element, we could just compute a simple vector of maximum values, and then test the element against that maximum.)
To actually generate the sequences, we need to be able to fill in the suffix (the last step of the generic algorithm) and construct the loop:
    void firstSuffix(std::vector<size_t>& comb, size_t pfx_length) {
        size_t val = pfx_length ? comb[pfx_length - 1] + 1 : 0;
        for (auto it = comb.begin() + pfx_length; it != comb.end(); ++it)
            *it = val++;
    }
And then we can write the loop:
    #include <algorithm>
    #include <iomanip>
    #include <iostream>
    #include <utility>
    #include <vector>

    int main() {
        std::vector<std::pair<size_t, size_t>> restrictions = {{1, 2}, {2, 4}, {3, 7}};
        size_t k = std::max_element(restrictions.begin(), restrictions.end())->first;
        if (k == 0) return 0; /* Empty set */
        std::vector<size_t> comb(k);
        size_t pfx = 0;
        do {
            firstSuffix(comb, pfx);
            for (auto const& val : comb) std::cout << std::setw(3) << val;
            std::cout << '\n';
        } while ((pfx = nextPrefix(comb, restrictions)));
        return 0;
    }
(live on coliru)