Creation date: 2020-04-26

This is part 20 of an ongoing series about how to build a constraint solver. If you're new, you might want to check out the earlier posts, starting with the first post.

Two years ago I didn't only code a Sudoku solver in Python, I also extended it to solve Str8ts. Basically I'm now retrying a lot of what I did back then, but this time I'm more structured, and Julia feels like a much better language for it. To solve Str8ts we need the table constraint, and in this post I'll show you the basics of how that constraint works and how you can add your own constraint to the solver. In the next post I'll show you how to use the constraint to solve Str8ts, and I also want to use it to solve small Eternity puzzles.

Normally I have a puzzle in mind which I want to solve and build the constraint around it. This time I looked at which constraints might be useful, and I wanted to do something about scheduling problems, as constraint solvers are the most used tool for those.

Unfortunately I wasn't able to find a good strategy for having interval variables inside JuMP. I will think about that again later. The next thing I found was the table constraint, which is a really powerful constraint.

Currently we can limit the search space of a single variable easily by just setting bounds like:

```
@variable(m, 1 <= x[1:5] <= 7, Int)
# or
@variable(m, x[1:5], CS.Integers([1,2,3,4,5,6,7]))
```

With the table constraint you can define a set of possibilities for several variables combined. It can also be used to emulate other constraints by simply enumerating all possibilities, e.g. all combinations which fulfill the all different constraint.

A small example constraint:

```
table = [
    1 2 3 1 1;
    1 3 3 2 1;
    1 1 3 2 1;
    1 1 1 2 4;
    1 8 3 1 2;
    4 5 5 3 4;
]
@constraint(model, x in CS.TableSet(table))
```

This concrete set of options can't easily be transformed into `==`, `<=` or all different constraints.

You'll see in the next posts how this constraint can be used for solving other puzzles.

We always need a way to check feasibility and to prune values when we gain new knowledge. A very naive way would be to run over all possibilities and check whether the current search space is feasible.
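As a rough sketch (with hypothetical names, not the actual solver code), such a naive check could look like this:

```julia
# Naive feasibility check for a table constraint (illustrative sketch only).
# `search_space[i]` holds the set of still possible values for variable i.
function naive_is_feasible(table, search_space)
    for r in 1:size(table, 1)
        # feasible as soon as one row fits completely into the search space
        if all(table[r, c] in search_space[c] for c in 1:size(table, 2))
            return true
        end
    end
    return false
end
```

This is linear in the number of rows on every single call, which is exactly the cost the approach described below tries to avoid.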

Let's have a look at the table again:

```
table = [
    1 2 3 1 1;
    1 3 3 2 1;
    1 1 3 2 1;
    1 1 1 2 4;
    1 8 3 1 2;
    4 5 5 3 4;
]
```

For pruning: if `x[1]` gets fixed to 1, we can remove the last row and then need to check which values are only supported by that row, i.e. `x[2] = 5` is not used by any other row, therefore we can remove it from the search space of `x[2]`.

The goal is always to prune as much as possible, and if we have millions of rows it takes a long time to do these checks and to keep track of which rows are still possible. We could implement a way to skip rows which aren't possible anymore and so on, but in general it feels like there is probably a better way to do it.

After thinking about sparse matrices and some other ideas for not too long, I decided to just read the paper I had found beforehand.

I've found out about the table constraint on minicp.org which I mentioned here before. They've linked to the following paper: Compact-Table: Efficiently Filtering Table Constraints with Reversible Sparse Bit-Sets.

I recommend you to read that paper if you want to know all the details but I will also explain the rough idea:

We'll do quite a few things when the constraint is added to the model to make the feasibility and pruning steps faster:

- Remove rows which can not be fulfilled because of the initial variable constraints
- Compute a so-called support for each variable and value combination

In the above example:

```
@variable(m, 1 <= x[1:5] <= 7, Int)
table = [
    1 2 3 1 1;
    1 3 3 2 1;
    1 1 3 2 1;
    1 1 1 2 4;
    1 8 3 1 2;
    4 5 5 3 4;
]
@constraint(model, x in CS.TableSet(table))
```

We will first iterate over all rows and check whether each variable (column) can actually take the value in that row. This will remove `1 8 3 1 2`, as `x[2]` is bounded above by 7.
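As a standalone sketch (simplified, not the package internals), this initial filtering could look like:

```julia
# Remove rows whose values violate the initial bounds 1 <= x[i] <= 7.
table = [
    1 2 3 1 1;
    1 3 3 2 1;
    1 1 3 2 1;
    1 1 1 2 4;
    1 8 3 1 2;
    4 5 5 3 4;
]
lb, ub = 1, 7
keep = [all(lb .<= table[r, :] .<= ub) for r in 1:size(table, 1)]
reduced = table[keep, :]  # the row `1 8 3 1 2` is filtered out
```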

You might wonder why a user would provide such a table in the first place, but it's quite handy if you want to define your model easily and let the solver do the work.

I've tested my implementation on some small examples like the one above first and then used it to solve sudokus. What else? :D

There I defined the table once with:

`table = permutedims(reduce(hcat, collect(permutations(1:9))))`

⚠ Note: `permutations` gives a vector of vectors and we need a 2d array.

I used it for all 27 constraints and let the solver remove the rows which aren't possible anymore based on the start setting of the sudoku.
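For a smaller alphabet the same construction looks like this (using `permutations` from the Combinatorics.jl package):

```julia
using Combinatorics  # provides `permutations`

# `permutations` gives a vector of vectors; `reduce(hcat, ...)` glues them
# into a matrix column by column, and `permutedims` turns that into one row
# per permutation, i.e. a 2d array usable as a table.
table = permutedims(reduce(hcat, collect(permutations(1:3))))
# table is a 6x3 matrix: one row for each of the 3! permutations
```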

Next we define the support on the remaining rows:

```
[
    1 2 3 1 1;
    1 3 3 2 1;
    1 1 3 2 1;
    1 1 1 2 4;
    4 5 5 3 4;
]
```

in the following way: We list all 5 variables and their values in one column:

| var = val | support |
| --- | --- |
| `x[1] = 1` | 1 1 1 1 0 |
| `x[1] = 2` | 0 0 0 0 0 |
| `x[1] = 3` | 0 0 0 0 0 |
| `x[1] = 4` | 0 0 0 0 1 |
| `x[1] = 5` | 0 0 0 0 0 |
| `x[1] = 6` | 0 0 0 0 0 |
| `x[1] = 7` | 0 0 0 0 0 |
| `x[2] = 1` | 0 0 1 1 0 |
| `x[2] = 2` | 1 0 0 0 0 |
| `x[2] = 3` | 0 1 0 0 0 |
| `x[2] = 4` | 0 0 0 0 0 |
| `x[2] = 5` | 0 0 0 0 1 |
| `x[2] = 6` | 0 0 0 0 0 |
| `x[2] = 7` | 0 0 0 0 0 |

... the same for the other three variables.

For each variable-value pair the support shows in which rows of the table this pair is set, i.e. `x[2] = 5` is only true in the last row, which is represented by `0 0 0 0 1`, whereas `x[1] = 1` is true in all but the last row => `1 1 1 1 0`. This means that `x[1] = 1, x[2] = 5` is never possible.

You can see that there are quite a lot of supports which are `0 0 0 0 0`, i.e. no support at all, because that setting is never possible. These pairs can be removed immediately.
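A boolean sketch of how these supports could be computed (the real implementation packs them into bits right away, see below):

```julia
# Reduced table after the init step
table = [
    1 2 3 1 1;
    1 3 3 2 1;
    1 1 3 2 1;
    1 1 1 2 4;
    4 5 5 3 4;
]
nrows, nvars = size(table)
# support[(var, val)][r] is true iff row r assigns `val` to variable `var`
support = Dict{Tuple{Int,Int},BitVector}()
for var in 1:nvars, val in 1:7
    support[(var, val)] = BitVector([table[r, var] == val for r in 1:nrows])
end
# pairs with an all-zero support (like x[1] = 2) can be removed immediately
```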

Now those supports are not actually stored in a boolean vector as you might imagine, but in a vector of `UInt64` where we use the individual bits. This later has the advantage that we can quickly check whether a whole chunk of 64 rows is invalid.
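To illustrate the packing, here is a small sketch (assuming row 1 maps to the most significant bit; the actual layout in the package may differ):

```julia
# Pack a boolean row mask into a vector of UInt64 words.
function pack(bits::AbstractVector{Bool})
    nwords = cld(length(bits), 64)       # number of 64-bit chunks needed
    words = zeros(UInt64, nwords)
    for (i, b) in enumerate(bits)
        b || continue
        w = cld(i, 64)                   # which word row i lands in
        off = 63 - ((i - 1) % 64)        # bit offset inside that word
        words[w] |= UInt64(1) << off
    end
    return words
end
```

Checking a whole chunk of 64 rows at once then becomes a single integer test: `words[w] == 0` means all 64 rows in that chunk are invalid.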

If we want to check whether a setting `x[1] ∈ 1:3, x[2] = 2` is valid, we can apply `|` over the supports of `x[1] = {1,2,3}` and then `&` the result with the support of `x[2] = 2`.

In our case for `x[1]`:

```
_ 1 1 1 1 0
| 0 0 0 0 0
| 0 0 0 0 0
= 1 1 1 1 0
```

for `x[2]` the support is `1 0 0 0 0`, and then we `&`:

```
_ 1 1 1 1 0
& 1 0 0 0 0
= 1 0 0 0 0
!= 0 0 0 0 0
=> feasible
```
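With the boolean supports from the support table above, this check can be sketched as:

```julia
# Supports of x[1] = 1, 2, 3 (x[1] = 2 and x[1] = 3 have no support)
s_x1 = [BitVector([1, 1, 1, 1, 0]), falses(5), falses(5)]
# Support of x[2] = 2
s_x2 = BitVector([1, 0, 0, 0, 0])

mask = reduce((a, b) -> a .| b, s_x1)  # `|` over the supports of x[1]
mask = mask .& s_x2                    # `&` with the support of x[2] = 2
feasible = any(mask)                   # at least one row remains => feasible
```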

This assumes that we currently haven't removed any row of the initial table (which here is represented as a column). If we had already removed the first row, we would instead compute:

```
_ 0 1 1 1 1
& 1 1 1 1 0
& 1 0 0 0 0
= 0 0 0 0 0
=> infeasible
```

Which brings us to some things we store inside our constraint at all times and update when something changes:

- the currently possible rows, represented by a `UInt64` vector called `words` - it has some specialties which I explain after the pruning part
- a `mask` (a `UInt64` vector of the same length) which stores the intermediate results

So the above example more precisely:

```
mask___: 1 1 1 1 1
______ & 1 1 1 1 0
______ & 1 0 0 0 0
words_ & 0 1 1 1 1
words' = 0 0 0 0 0
=> infeasible
```

In pruning we want to remove values from the search space based on new information about the search space from other constraints or from trial and error in the backtracking phase.

First we:

- check which variables have changed since last time, by checking whether the size of their set of possible values has changed
- update the stored domain size for next time
- check which variables are not fixed

For the changed variables we compute the mask for all possible values to check feasibility as described before.

The next step is to filter the domains based on the new knowledge, which uses all unfixed variables. For each variable-value pair we check feasibility; if it isn't feasible we remove the value from the search space.

Now this can be slow which brings us to the next point.

We kind of assume/hope that we remove quite a lot of rows in the first few backtracking steps, ideally in the root node. Let's say we have 1 million rows in our table. If we can't remove any of them in `init`, the `UInt64` vector has about 16,000 elements. In the first `prune` run it might be the case that we can remove 90% of the rows. Then it would be quite stupid to iterate over all 16,000 elements every time and do some `&` operation even when `words[idx] == 0`. Instead we can use a similar trick as with our general variables, where we have values and indices to store which values are still possible. Here we also have `words` and `indices`, and `indices` simply shows which indices of `words` are non-zero.

A small example:

We start with (`1` in `words` simply indicates that it is not zero):

```
words__: [1,1,1,1,1,1]
indices: [1,2,3,4,5,6]
```

and then we remove some rows such that some words (64 rows in the same block) get removed/0.

```
words__: [1,1,0,1,1,0]
indices: [1,2,4,5,3,6]
```

Now we iterate over the indices with `for idx in indices` and then use `words[idx]`. Additionally we save how many words are non-zero so we can stop early.
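The bookkeeping behind this can be sketched with a swap trick (hypothetical helper, names assumed, not the package internals):

```julia
# Keep `indices[1:nactive]` pointing at the non-zero entries of `words`.
# `positions[idx]` stores where word index `idx` currently sits in `indices`.
function clear_word!(words, indices, positions, nactive, idx)
    words[idx] = zero(UInt64)
    p = positions[idx]
    # swap the cleared index behind the active range
    indices[p], indices[nactive] = indices[nactive], indices[p]
    positions[indices[p]] = p
    positions[indices[nactive]] = nactive
    return nactive - 1                    # one fewer active word
end
```

Iterating `for i in 1:nactive` over `indices[i]` then never touches the zero words, and reverting on backtracking only needs the old `nactive`.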

Another thing we can speed up is the check whether a value is still feasible/possible, i.e. whether there exists at least one still feasible row which contains the variable/value pair we are currently checking.

For this, a naive approach is to compute `words & support[var, val]` and check whether it is the zero vector. But if it is non-zero only because of the last possible index, we would iterate over a lot of zero words every time we do that check. If we forget about reverse pruning for now, we will never change `words` in a way that a word before the last non-zero one becomes non-zero again. That means we can save the position of the last non-zero word, always start the check from there, and update that position when it changes.

If you want to have all the details I suggest that you read the paper as it's not that hard to understand.

I sure made a lot of bugs while implementing it, and piled fixes on fixes, but that's just me :D

If you want to add your own constraint to the package, this is the right section for you. I'll also add some docs to the official documentation, but this is a rough overview of how I think the workflow should look at the moment.

This will also give you a better overview of how the constraint solver works internally if you want to add some code to it.

Currently new constraints are created in the form `x in Set(y)`, where `x` is a variable or a collection of variables, `Set` is (more or less) the name of the constraint, and `y` can hold some other needed information. One such `Set` is the `AllDifferentSet` and now also the `TableSet`.

Different constraints need different information; for the table constraint we need `words`, `mask`, ... Therefore we create a new `ConstraintType` in `src/types.jl`:

```
mutable struct TableConstraint <: Constraint
    std::ConstraintInternals
    current::RSparseBitSet
    ...
end
```

The `std::ConstraintInternals` field has to exist, as it stores information which every constraint needs, like `indices`, to know which constraints must be called when a variable changes.

Additionally you need to define the `set`, which is the user interface. In this case we need `TableSet(table)`, whereas in the `AllDifferentSet` case we don't have any parameters. Nevertheless both need a `dimension` field, which was the reason why you previously had to write `x[1:9] in AllDifferentSet(9)` instead of `x[1:9] in AllDifferentSet()`.

This means we as developers need to define two types:

```
struct TableSetInternal <: MOI.AbstractVectorSet
    dimension :: Int
    table :: Array{Int, 2}
end

struct TableSet <: JuMP.AbstractVectorSet
    table :: Array{Int, 2}
end

JuMP.moi_set(ts::TableSet, dim) = TableSetInternal(dim, ts.table)
```

The internal type is, as the name suggests, for the internal structure, which additionally holds the `dimension`. `JuMP.moi_set(ts::TableSet, dim) = TableSetInternal(dim, ts.table)` is used to add that dimension.

All constraints live in `src/MOI_wrapper/constraints.jl`, where you need to implement at least two functions.

First you need to declare that this is a new constraint which the solver supports:

```
MOI.supports_constraint(
    ::Optimizer,
    ::Type{MOI.VectorOfVariables},
    ::Type{TableSetInternal},
) = true
```

As mentioned before, you only need `TableSetInternal` here. Then you need to define the function `add_constraint`:

```
function MOI.add_constraint(
    model::Optimizer,
    vars::MOI.VectorOfVariables,
    set::TableSetInternal,
)
    com = model.inner
    internals = ConstraintInternals(
        length(com.constraints) + 1,
        vars,
        set,
        Int[v.value for v in vars.variables]
    )
    constraint = TableConstraint(
        internals,
        ... # define the initials of all other fields
    )
    push!(com.constraints, constraint)
    for ind in constraint.std.indices
        push!(com.subscription[ind], constraint.std.idx)
    end
    # this counts the number of constraints of each type.
    # Go ahead and add yours in `types.jl`; see: NumberConstraintTypes
    com.info.n_constraint_types.table += 1
    return MOI.ConstraintIndex{MOI.VectorOfVariables,TableSetInternal}(length(com.constraints))
end
```

The `internals` should be defined as mentioned here. Important is that the index is just one more than that of the previous constraint, and that vars, set and indices are defined. `push!(com.constraints, constraint)` pushes the new constraint to the list of constraints. Currently one has to fill in the subscription fields manually... I should change that ;) In the end you have to return a `MOI.ConstraintIndex`.

Next up you have to define a `still_feasible` method in a new file in `src/constraints/` with the following arguments:

```
function still_feasible(
    com::CoM,
    constraint::TableConstraint,
    fct::MOI.VectorOfVariables,
    set::TableSetInternal,
    value::Int,
    index::Int,
)
```

and return `true` if setting the variable with the global index `index` to `value` can still lead to a feasible solution, and `false` otherwise.
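For the table constraint a completely naive version of this check (ignoring the bit sets) would look like the following, just to make the contract clear — this is a hypothetical sketch, not the package implementation:

```julia
# Return true if fixing variable `vidx` to `value` still leaves at least
# one table row that is compatible with the current search space.
function naive_still_feasible(table, search_space, vidx, value)
    for r in 1:size(table, 1)
        table[r, vidx] == value || continue
        if all(table[r, c] in search_space[c] for c in 1:size(table, 2))
            return true
        end
    end
    return false
end
```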

That check is often quite easy. Harder is to prune the constraint:

```
function prune_constraint!(
    com::CS.CoM,
    constraint::TableConstraint,
    fct::MOI.VectorOfVariables,
    set::TableSetInternal;
    logs = true,
)
```

Here you also have to return `true` if the constraint is still feasible and `false` if not, but the idea is to prune as much as possible, as long as it doesn't take tooo much time :)

You can remove a value from the search space with:

```
if !rm!(com, com.search_space[vidx], value)
    feasible = false
end
```

This removes `value` from `variables[vidx]`. If this makes the problem infeasible, `feasible` is set to false. You can either `return false` immediately or do some bookkeeping before you return, if your data structure requires it.

This can already be enough for basic constraints as it was for everything besides the table constraint.

There are a number of other methods which might need to be implemented by your constraint.

```
function init_constraint!(
    com::CS.CoM,
    constraint::TableConstraint,
    fct::MOI.VectorOfVariables,
    set::TableSetInternal,
)
    # return true/false (feasibility)
end
```

Will be called before the first pruning.

```
function finished_pruning_constraint!(
    com::CS.CoM,
    constraint::TableConstraint,
    fct::MOI.VectorOfVariables,
    set::TableSetInternal,
)
```

This function will be called after all pruning steps are done and before the next backtracking part starts. Here you can save the state at the end of a node which can be handy for backtracking.

```
function update_best_bound_constraint!(
    com::CS.CoM,
    constraint::TableConstraint,
    fct::MOI.VectorOfVariables,
    set::TableSetInternal,
    var_idx::Int,
    lb::Int,
    ub::Int,
)
```

It might be quite handy to provide some extra information to the LP solver if one exists, e.g. the range of the sum of the variables.

This function gets one variable index and the lower and upper bound it will be set to. The goal here is to change an LP variable (which can be created in `init`) to update the bounds of that variable (which might represent the sum) to give some hints to the LP solver.

You'll find more about this in the documentation (once I'm done with it :D). At the moment you can check out the source code if you want to add something like this.

Currently there are three more functions, which are needed if you save information in the constraint which depends on the node and might need to be reversed when we backtrack.

```
function single_reverse_pruning_constraint!(
    com::CoM,
    constraint::TableConstraint,
    fct::MOI.VectorOfVariables,
    set::TableSetInternal,
    var_idx::Int,
    changes::Vector{Tuple{Symbol,Int,Int,Int}}
)
```

It will be called when reversing a node, with the `var_idx` and a list of changes.

Each tuple holds the following information about what your constraint did in the pruning step.

```
# [1] the kind of change: :fix, :rm, :rm_below, :rm_above
# [2] to which value it got fixed, which value was removed, or which value is the new upper/lower bound
# [3] only used for :fix: saves the last ptr to revert the changes, otherwise 0
# [4] how many values got removed (0 for :fix)
```

i.e. `(:rm, 2, 0, 1)` means that the value 2 was removed. In this function you need to reverse this (or you can use the next function).

A more general function which gets called after `single_reverse_pruning_constraint!`:

```
function reverse_pruning_constraint!(
    com::CoM,
    constraint::TableConstraint,
    fct::MOI.VectorOfVariables,
    set::TableSetInternal,
    backtrack_id::Int
)
```

There you can revert the whole `backtrack_id` in one go. If you are interested in the parent `backtrack_id` you can use:

`parent_id = com.backtrack_vec[backtrack_id].parent_idx`

And the last function:

```
function restore_pruning_constraint!(
    com::CoM,
    constraint::TableConstraint,
    fct::MOI.VectorOfVariables,
    set::TableSetInternal,
    prune_steps::Union{Int, Vector{Int}}
)
```

This gets called if we need to apply pruning steps on nodes which we already pruned before. This is the case when we jump from one part of the tree to another, probably because a node there has a better bound.

`prune_steps` contains a list of all `backtrack_ids` on the path down to the node we want to reach. If you saved all the information you need in `finished_pruning_constraint!`, you only need to care about the last element of `prune_steps`, which you can get with `last(prune_steps)`; this also works if `prune_steps` is an integer.

I hope you learned something in this post. In the next one I'll explain how this constraint can be used to solve Str8ts.

**Thanks for reading and special thanks to my five patrons!**

List of patrons paying more than 4$ per month:

Site Wang

Gurvesh Sanghera

Szymon Bęczkowski

Currently I get 18$ per Month using Patreon and PayPal.

For a donation of a single dollar per month you get early access to the posts. Try it out at the start of a month and if you don't enjoy it just cancel your subscription before pay day (end of month).

Any questions, comments or improvements? Please comment here and/or write me an e-mail o.kroeger@opensourc.es and for code feel free to open a PR/issue on GitHub ConstraintSolver.jl

I'll keep you updated on Twitter OpenSourcES as well as my more personal one: Twitter Wikunia_de

Want to be updated? Consider subscribing and receiving a mail whenever a new post comes out.

© Ole Kröger. Last modified: June 10, 2020. Website built with Franklin.jl.