Creation date: 2021-11-07

First of all, I can't really believe that I'm doing this for like the fifth time now, but maybe I'm getting better at it 😄. I'm talking about building some kind of branch and bound system. This time I don't really have much code to begin with, so I can start building a general branch and bound solver.

Anyhow, welcome back to my blog 😉. Ready for the second part of the TSP series? If you haven't already, please check out the first part, where I introduce the problem a bit and work on both an upper bound and a lower bound.

Branch and bound is an idea that is often used in mathematical optimization, for example in my constraint solver or in the MINLP solver Juniper.

In general we have some kind of problem that we can't solve directly. Instead we can only provide some bounds on the solution. Then we make a choice, like setting a boolean variable to true in one branch and to false in another. This way we now have two problems instead of one, but they are both a bit simpler as they are restricted. Important is, however, that we never remove an actual solution by splitting up the problem; we only remove the current relaxed solution. Maybe that boolean variable was set to `0.7` before, which can never be the case in the solution we are looking for.

This way we can basically try all the different options to reach the solution. Of course we don't really wanna try out everything. Instead we also do bounding, which means the following.

Let's assume we have found a solution in one of those branches maybe without even fixing all the variables. That way we have an upper bound for the best solution. (I assume we have a minimization problem here.)

If in some other branch the lower bound is bigger than this upper bound we don't have to investigate that branch any further and can cut that branch from the tree.
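To make the pruning rule concrete, here is a minimal sketch in Julia; the `can_prune` helper and the numbers are made up for illustration. For a minimization problem, any branch whose lower bound already meets or exceeds the incumbent can be discarded.

```julia
# A branch can be pruned ("bounded") when its lower bound is not
# better than the best solution found so far (the incumbent).
# This assumes a minimization problem.
can_prune(node_lb::Float64, incumbent::Float64) = node_lb >= incumbent

incumbent = 10.0                      # best solution found so far
open_lower_bounds = [7.5, 10.2, 9.9, 12.0]

# only nodes with a lower bound below the incumbent are worth exploring
remaining = filter(lb -> !can_prune(lb, incumbent), open_lower_bounds)
```

Here the branches with lower bounds `10.2` and `12.0` are cut from the tree without ever being explored further.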

There are a bunch of different ways to go about this problem which I have detailed in the constraint solver backtracking post.

- Creating some kind of recursive structure which first calls the left branch and then the right branch
- An actual tree structure where every node has a parent and normally two, but maybe more, children
- A more abstract structure which is basically just a vector of nodes
- A balancing act between the second and third option

Before we talk about the last approach, which is the one I'll take, we should probably get at least one negative point about the other variants 😄

The recursive structure often makes things very straightforward to code at first, until you get your first recursion bug that you can't wrap your head around. In general, however, this approach isn't very scalable as you'll hit the recursion limit pretty soon.

The tree option sounds pretty nice as the structure is very much a tree. However, iterating through a tree is sometimes not that nice: if we want to pick the next node to branch on, we have to somehow iterate over all of them to pick the next good one.

Note

Maybe a small blue box about terminology 😉

A node is a state in which we can evaluate both a lower and an upper bound. To branch on a node means to split the node up into, normally, two children. In our case we will later either fix one connection between two points or disallow it.

Normally there are two main important questions in branch and bound:

- Which node do we check next?
- How do we branch on it, i.e. which edge do we either fix or disallow next?

Alright, on to the next point: what BranchAndBound.jl uses. Wait a minute, there is already a package for it...? Yeah that's right, but as is so often the case I'm just not quite satisfied with it... though I might end up with a crappier solution 😄. In that package you have a list of `nodes` and a list of `processed` nodes. The problem I see there is that it stores the nodes in a sorted vector instead of a priority queue; a priority queue saved me a lot of time in my ConstraintSolver. You can read about priority queues in that series.
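As a small illustration of why a priority queue is convenient here, this sketch (assuming the DataStructures.jl package is available) keeps open nodes keyed by their id and prioritized by their lower bound, so the most promising node pops out first without sorting anything by hand:

```julia
using DataStructures

# open nodes: node id => lower bound; the queue keeps them ordered
# so that the node with the smallest lower bound is dequeued first
nodes = PriorityQueue{Int,Float64}()
enqueue!(nodes, 1 => 4.2)
enqueue!(nodes, 2 => 3.1)
enqueue!(nodes, 3 => 5.0)

best_id = dequeue!(nodes)   # node 2 has the lowest lower bound
```

Both inserting and removing the best node are cheap (logarithmic), whereas a sorted vector pays for every insertion.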

Therefore, my plan is to create a general branch and bound structure which holds:

- the current best solution/incumbent, i.e. how it is achieved and the objective value
- the nodes as a priority queue
- the lower bound (I'll stick with minimization; if you want to use it for maximization simply multiply your values by `-1` 😉)
- maybe some kind of info structure which holds information about the number of nodes created and bounded

The reasons above made me create yet another repository, Bonobo, which aims to be a very general branch and bound framework in Julia. The support and interest in the project, only about a week after I created it, took me by surprise. It seems like there has been a need for something like it for a while. I'll now explain how I currently try to fill this gap.

Bonobo consists of a general tree structure:

```
mutable struct BnBTree{Node<:AbstractNode,Root,Value,Solution<:AbstractSolution{Node,Value}}
    incumbent::Float64
    lb::Float64
    solutions::Vector{Solution}
    nodes::PriorityQueue{Int,Node}
    root::Root
    branching_indices::Vector{Int}
    num_nodes::Int
    sense::Symbol
    options::Options
end
```

It consists of the value for the current best objective (`incumbent`), the current lower bound `lb`, a vector of solutions, and the open nodes in a priority queue as discussed above. Then there is a `root` field which holds the information needed by the program that builds on top of this general framework; as an example it can hold a JuMP model. In some cases not all of the variables have to be discrete, therefore `branching_indices` stores a list of variables that should be discrete or, in general, should be branched on. In our TSP solver this has nothing to do with being discrete, hence the change in name. (It was called `discrete_indices` before.)

`num_nodes` is just a counter of how many nodes were created. The `sense` is either `:Min` or `:Max` and defines whether we want to minimize or maximize the objective. In the case of `:Max` we still internally act as if we were minimizing, but the values are flipped when receiving the information from solving a node and flipped again when we return the solution.
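The sign-flip trick can be sketched like this; the `flip` helper is my own illustration, not Bonobo's actual API:

```julia
# Internally the tree always minimizes; for a :Max problem the values
# are negated on the way in and negated again on the way out.
flip(sense::Symbol, value::Float64) = sense == :Max ? -value : value

internal = flip(:Max, 8.0)          # stored as -8.0 inside the minimizing tree
reported = flip(:Max, internal)     # flipped back to 8.0 for the user
```

Because the flip is its own inverse, the rest of the framework never has to branch on the sense.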

The last field, `options`, holds all kinds of options; currently only the branching strategy and the traverse strategy, though the latter is fixed at the moment 😄.

Now we have to initialize this struct which is done with the following method (here I only show the arguments):

```
function initialize(;
    traverse_strategy = BFS,
    branch_strategy = FIRST,
    atol = 1e-6,
    rtol = 1e-6,
    Node = DefaultNode,
    Value = Vector{Float64},
    Solution = DefaultSolution{Node,Value},
    root = nothing,
    sense = :Min,
)
```

Feel free to click on the "See on GitHub" link above to see the whole definition. Everything has a default value, but at least the `root` should probably be overwritten.

In the general structure above you have probably also seen the various types I used:

`mutable struct BnBTree{Node<:AbstractNode,Root,Value,Solution<:AbstractSolution{Node,Value}}`

Those can be specialized in the `initialize` function, as you can see. You often want to have your own type `Node` to hold the information you need in each node. In some cases it also makes sense to store the solution of a node not as a vector of floats, like in our TSP solver where it would probably be a list of edges.

Let me walk you through some more functions before I show you a concrete example. We now have initialized the struct, or at least know the function to do so. The next step is to set the initial node. This is the root node which holds the information of the initial solve of the problem. Remember, the branch and bound part mostly consists of splitting up the problem into easier ones and doing some tricks to avoid the need to evaluate all of them. It does, however, not solve the smaller problems. Its goal is simply to maintain the structure and information, but the evaluation of the nodes is done by someone else. It is kind of the managing system, and it starts with information about the root node, which will be evaluated first.

This is done by the `set_root!` function:

```
function set_root!(tree::BnBTree, node_info::NamedTuple)
    add_node!(tree, node_info)
end
```

It does nothing else than add a node to the structure by inserting it into our priority queue.

After that we are ready to start the branch and bound process with the `optimize!` function. Its core functionality is also fairly simple, as you can see here:

```
function optimize!(tree::BnBTree)
    while !terminated(tree)
        node = get_next_node(tree, tree.options.traverse_strategy)
        lb, ub = evaluate_node!(tree, node)
        # if the problem was infeasible we simply close the node and continue
        if isnan(lb) && isnan(ub)
            close_node!(tree, node)
            continue
        end
        set_node_bound!(tree.sense, node, lb, ub)
        updated = update_best_solution!(tree, node)
        updated && bound!(tree, node.id)
        close_node!(tree, node)
        branch!(tree, node)
    end
end
```

We check whether we are done, which normally is only the case when there are no more nodes to evaluate. Then we pick the node to work on next. Afterwards we evaluate the node itself and again obtain a lower and an upper bound. This can update the general tree if we obtained a new upper bound (a solution which might or might not be optimal).

Then we close the node itself and create new nodes with the `branch!` call. Our main functions to check are:

- `get_next_node`
- `evaluate_node!`
- `branch!`

The first one is rather simple, as we specified in the initialize function that we want to use BFS as our traverse strategy. This means we obtain the node with the lowest lower bound in each round. The next function, `evaluate_node!`, needs to be implemented by the library or code that uses Bonobo, as Bonobo doesn't really know anything about how to obtain the lower and upper bounds of your problem. Those are the things we had a look at in the last post of this series.

Probably the most interesting one is `branch!`:

```
function branch!(tree, node)
    variable_idx = get_branching_variable(tree, tree.options.branch_strategy, node)
    # no branching variable selected => return
    variable_idx == -1 && return
    nodes_info = get_branching_nodes_info(tree, node, variable_idx)
    for node_info in nodes_info
        add_node!(tree, node_info)
    end
end
```

It consists of two things:

- We need to obtain the variable we want to branch on, which could be a standard variable in JuMP or an edge in our TSP problem
- We need to actually construct those new nodes, for example setting a binary variable to 0 in one node and to 1 in the other
The first step can be handled by Bonobo by calling some more functions; Bonobo itself can hold the logic of which branching variable might be a wise choice. In general it mostly uses the function `get_relaxed_values`, which should return the current values of the node. In our case that is the current edges set in our lower bound.

`get_branching_nodes_info`, however, needs to be implemented outside of Bonobo again. There you decide how to split up the problem. Those new nodes are then added to the tree structure.

As the final step for this blog post I want to show you an example of how to use Bonobo to solve a general mixed integer problem using a linear solver.

A basic JuMP model with the Cbc solver:

```
m = Model(Cbc.Optimizer)
set_optimizer_attribute(m, "logLevel", 0)
@variable(m, x[1:3] >= 0)
@constraint(m, 0.5x[1]+3.1x[2]+4.2x[3] <= 6.1)
@constraint(m, 1.9x[1]+0.7x[2]+0.2x[3] <= 8.1)
@constraint(m, 2.9x[1]-2.3x[2]+4.2x[3] <= 10.5)
@objective(m, Max, x[1]+1.2x[2]+3.2x[3])
```

Here I specified the variables \(x_1, x_2, x_3\) to be positive but not that they should be discrete. (Yes Cbc can solve MIPs directly but we want to showcase Bonobo here 😄)

Then we initialize Bonobo:

```
bnb_model = BB.initialize(;
    Node = MIPNode,
    root = m,
    sense = objective_sense(m) == MOI.MAX_SENSE ? :Max : :Min
)
```

(I used `const BB = Bonobo`.) As you can see, we want a specific node structure to save the information necessary for us to evaluate the subproblems. This information is stored in `MIPNode`, which I'll show in a second. The `root` is the model itself; it stores the problem and we can use it to dispatch on. The last argument just uses JuMP to check whether it's a minimization or maximization problem. In our case we could simply set it to `:Max` directly.

We define `MIPNode` like this:

```
mutable struct MIPNode <: AbstractNode
    std::BnBNode
    lbs::Vector{Float64}
    ubs::Vector{Float64}
    status::MOI.TerminationStatusCode
end
```

It needs to be a subtype of `AbstractNode` and has to have a field named `std` of type `BnBNode`, as Bonobo itself also stores some information in each node. Then for our problem we need to store the lower and upper bounds of each variable and the status code of our node.

We specify our initial bounds for the root node:

```
BB.set_root!(bnb_model, (
    lbs = zeros(length(x)),
    ubs = fill(Inf, length(x)),
    status = MOI.OPTIMIZE_NOT_CALLED
))
```

as well as setting the status to indicate that this node hasn't been evaluated yet. I've set the lower bounds to `0` here, as in the model, and the upper bound for each variable to `Inf`.

Then we can call:

`BB.optimize!(bnb_model)`

However, this will fail as we haven't implemented a bunch of other functions yet; for example, how do we evaluate a node?

```
function BB.evaluate_node!(tree::BnBTree{MIPNode, JuMP.Model}, node::MIPNode)
    m = tree.root
    vids = MOI.get(m, MOI.ListOfVariableIndices())
    vars = VariableRef.(m, vids)
    JuMP.set_lower_bound.(vars, node.lbs)
    JuMP.set_upper_bound.(vars, node.ubs)
    optimize!(m)
    status = termination_status(m)
    node.status = status
    if status != MOI.OPTIMAL
        return NaN, NaN
    end
    obj_val = objective_value(m)
    if all(BB.is_approx_feasible.(tree, value.(vars)))
        node.ub = obj_val
        return obj_val, obj_val
    end
    return obj_val, NaN
end
```

This is the first Bonobo function that we dispatch on based on our structs, to avoid type piracy. It is itself a pretty basic function which sets the lower and upper bounds and then calls `optimize!`, the JuMP function to optimize the model. Then we check the status, which could be infeasible when there is no solution. In that case we return that there is neither a lower nor an upper bound.

Otherwise we obtain the objective value and also check whether we found a discrete solution. Depending on that we either return only the lower bound or both a lower and an upper one.
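The discreteness check can be sketched as follows; this `is_discrete` helper is my own illustration and only mirrors the spirit of `is_approx_feasible`, reusing the `atol` default from `initialize`:

```julia
# A relaxed value counts as discrete if it is within a tolerance
# of the nearest integer. The tolerance default is an assumption
# mirroring the `atol` keyword of `initialize`.
is_discrete(x::Float64; atol = 1e-6) = isapprox(x, round(x); atol = atol)

# the node yields an upper bound only if every variable is discrete
all_discrete(xs) = all(is_discrete, xs)

all_discrete([1.0, 2.0000000001, 3.0])   # close enough to integers
all_discrete([1.5, 2.0])                 # 1.5 is fractional
```

Comparing against the rounded value instead of demanding exact integers is important because LP solvers return values like `1.9999999998`.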

Then for the branching part we need the following two functions to get the branching variable:

```
function BB.get_relaxed_values(tree::BnBTree{MIPNode, JuMP.Model}, node)
    vids = MOI.get(tree.root, MOI.ListOfVariableIndices())
    vars = VariableRef.(tree.root, vids)
    return JuMP.value.(vars)
end

function BB.get_branching_indices(model::JuMP.Model)
    # every variable should be discrete
    vis = MOI.get(model, MOI.ListOfVariableIndices())
    return 1:length(vis)
end
```

The latter is actually only called once, in `initialize`, but it is necessary for this step. There we specify that every variable should have an integer value in the end. The `get_relaxed_values` function simply uses `JuMP.value` to get the current value of each variable and returns them.

The very last step is the `get_branching_nodes_info` function, which returns two nodes: one where we set the upper bound of the variable with index `vidx` (chosen by the branching strategy) and one where we set the lower bound.

I'll include the full function for completeness even though it's not especially interesting.

```
function BB.get_branching_nodes_info(tree::BnBTree{MIPNode, JuMP.Model}, node::MIPNode, vidx::Int)
    m = tree.root
    node_info = NamedTuple[]
    var = VariableRef(m, MOI.VariableIndex(vidx))
    # first variable which is not discrete
    lbs = copy(node.lbs)
    ubs = copy(node.ubs)
    val = JuMP.value(var)
    # left child: set upper bound
    ubs[vidx] = floor(Int, val)
    push!(node_info, (
        lbs = copy(node.lbs),
        ubs = ubs,
        status = MOI.OPTIMIZE_NOT_CALLED,
    ))
    # right child: set lower bound
    lbs[vidx] = ceil(Int, val)
    push!(node_info, (
        lbs = lbs,
        ubs = copy(node.ubs),
        status = MOI.OPTIMIZE_NOT_CALLED,
    ))
    return node_info
end
```

The important part is that it returns a vector of `NamedTuple`s which includes all the information needed to create our `MIPNode`, besides the `std` field.

With this you can solve a MIP without the need for a MIP solver; you just need any kind of linear solver. It will not be fast, given our currently limited branching strategies, but more on that another time.

This post is already quite long enough, I think. Even though I might have promised at some stage to show you our very first TSP solver in this post, as that was the plan 😄, I hope you agree that this branch and bound framework deserved its own post. I'll implement and then write about the actual usage of this for our TSP problem in the next post.

As always: Thanks for reading and see you soon. If you don't want to miss any posts and show me your interest please subscribe to the free email newsletter at the end.

I want to especially thank Mathieu Besançon this time as he helped me with Bonobo quite a bit by reviewing my PRs. Sometimes you don't need to provide code or even documentation to contribute to open source. Reviewing the PRs of people on projects with a single maintainer is very valuable as well. He has some more insight into the topic from another perspective and I'm very grateful for his ideas and support.

I finally managed to combine Bonobo and my TSPSolver and you can read about how you can use Bonobo to help solving TSPs in this post.

**Special thanks to my 12 patrons!**

Special special thanks to my >4$ patrons. The ones I thought couldn't be found 😄

Anonymous

Kangpyo

Gurvesh Sanghera

Szymon Bęczkowski

Colin Phillips

Jérémie Knuesel

For a donation of a **single** dollar per month you get early access to these posts. Your support will increase the time I can spend on working on this blog.

There is also a special tier if you want to get some help for your own project. You can checkout my mentoring post if you're interested in that and feel free to write me an E-mail if you have questions: o.kroeger <at> opensourc.es

I'll keep you updated on Twitter opensourcesblog.


© Ole Kröger. Last modified: May 28, 2022. Website built with Franklin.jl.