Building a COIN-OR project - DisCO

23rd June 2016


Last year I started working on a project with Ted Ralphs while he was visiting ZIB. This project involved using the solver DisCO, recently developed by his PhD student Aykut Bulut. Now, I was not trying to do anything fancy, I just wanted to use it. But I just could not get my head around what I should be doing to build the software. I have since worked out what is going on, and I think that the build system has many benefits. I thought that my experiences could entertain or aid someone else in my position.

What is a DisCO?

Now, in the field of computational mathematical programming, I feel like the title of best acronym creator has to go to Ted Ralphs. I am not in a position to hand out such awards, but Ted has come up with some interesting names for his solvers. The first of his solvers, with probably one of the more reserved names, was SYMPHONY. Some of the more interesting names are BiCePS, CHiPPS, ALPS and MibS. I won't go through them all, but you get the idea.

So DisCO is not a style of dance and music from the 70's but stands for Discrete Conic Optimisation. As the name suggests, DisCO solves discrete conic problems, in particular mixed integer second order cone optimisation problems. The interest for me was the parallelisation part of the solver, where DisCO has been developed to run on multiple processors using MPI. I wanted to see the performance of DisCO as a parallel solver for mixed integer programs.

Building a DisCO

I have been a user of SCIP for a long time now, and for a little while a developer, so I am quite familiar with the SCIP build system. Building SCIP is not always the easiest task either, but with many years of experience I have become accustomed to the common issues that arise. I am also aware that things do not always work as the original developer intended. This mainly has to do with the specifics of your machine rather than errors on the developer's part.

I needed to use DisCO, but I was not so familiar with the build system. However, the README indicated it should not be a difficult task to build this package. In fact, it should only take three lines:

git clone --branch=stable/0.8 https://github.com/coin-or-tools/BuildTools
bash BuildTools/get.dependencies.sh fetch > /dev/null
bash BuildTools/get.dependencies.sh build

The first line downloads, or clones, the BuildTools. This is a collection of tools used by COIN-OR packages to manage the downloads of dependencies and third-party software. The second line downloads all of the dependencies, which can take some time. The third builds all of the dependencies and the complete package.

Needless to say, these three lines did not work for me.

First of all, I wanted to use the parallelisation of the solver. How to do this is clearly documented in the README and it involves just changing the third line above to

./configure --disable-shared --enable-static --with-mpi-lib=/usr/lib/libmpich.so.3.2 --with-mpi-incdir=/usr/lib/mpich/include MPICC=mpicc.mpich MPICXX=mpic++.mpich

Once again, that didn't work. There were actually three problems here. The first is that you should not be calling ./configure directly but BuildTools/get.dependencies.sh build instead. The second is that if you have previously called BuildTools/get.dependencies.sh build, then you should not change the parameters; you will need to use a different build directory (more on this later). Finally, there is an issue with different MPI installations. My machine had an installation of Open MPI, so I needed to change all of the references to mpich to openmpi and also modify some of the paths.
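If you are not sure where your MPI installation lives, the compiler wrapper can usually tell you. A sketch for Open MPI on a Debian-like system; the wrapper names (mpicc.openmpi) and search paths here are assumptions and may differ on your machine:

```shell
# Ask the Open MPI compiler wrapper which flags it passes to the underlying
# compiler; the -I and -L entries reveal the include and library directories.
mpicc.openmpi -show

# Alternatively, search for the shared library and the headers directly.
find /usr/lib -name 'libmpi.so*' 2>/dev/null
find /usr/include /usr/lib -name 'mpi.h' 2>/dev/null
```

The paths that come back are what you would pass to --with-mpi-lib and --with-mpi-incdir.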

Next, I wanted to solve MIPs, not second order cone problems, so I needed a good LP solver, like CPLEX. That required some additional parameters for the build command. These parameters are:

--disable-cplex-libcheck --with-cplex-lib="-L/optimi/usr/sw/cplex/lib/x86-64_linux/static_pic -lcplex -lm -lpthread" --with-cplex-incdir="/optimi/usr/sw/cplex/include/ilcplex/"

Notice here that I have used --disable-cplex-libcheck. This was particularly important, but I can't remember why. So if you are having issues using CPLEX with DisCO, then try that flag.

There was also an issue with using some GNU packages, so I needed the additional flag --enable-gnu-packages. Finally, I wanted the build process itself to run in parallel, which required --parallel-jobs=8.

So my final build command was

BuildTools/get.dependencies.sh build --enable-gnu-packages --parallel-jobs=8 --disable-cplex-libcheck --with-cplex-lib="-L/optimi/usr/sw/cplex/lib/x86-64_linux/static_pic -lcplex -lm -lpthread" --with-cplex-incdir="/optimi/usr/sw/cplex/include/ilcplex/" --disable-shared --enable-static --with-mpi-lib=/usr/lib/libmpi.so.12.0.2 --with-mpi-incdir=/usr/lib/openmpi/include/ MPICC=mpicc.openmpi MPICXX=mpic++.openmpi

Compare this to SCIP for the same thing (here there are two packages that you need to build: SCIP and UG). Assuming that you have the directories for SCIP and UG in your home directory, the build commands are:

cd scip
make LPS=cpx PARASCIP=true -j8
cd ../ug
make LPS=cpx COMM=mpi -j8

Now this is not completely fair. When you use the parameter LPS=cpx you are asked for the paths to the include and lib directories, so the commands for SCIP and DisCO are somewhat comparable.
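For completeness, that interactive step amounts to creating softlinks in SCIP's lib directory. A sketch only: the CPLEX paths are the ones from my machine above, and the exact link names (cpxinc, libcplex.linux.x86_64.gnu.a) depend on your SCIP version, operating system, architecture and compiler, so check what the Makefile actually prompts for:

```shell
# SCIP's Makefile asks for softlinks pointing at the CPLEX headers and the
# static CPLEX library; creating them by hand skips the interactive prompts.
cd scip
ln -s /optimi/usr/sw/cplex/include/ilcplex lib/cpxinc
ln -s /optimi/usr/sw/cplex/lib/x86-64_linux/static_pic/libcplex.a \
      lib/libcplex.linux.x86_64.gnu.a
```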

Things to note

One of the biggest things that I failed to understand was how to build with different parameters. This is something that I tried to do a couple of times, and when I did it wrong once, it all seemed to fail. So what you must remember is that you can only build with one set of parameters in any one build directory. Now, here is the great thing about this process: you can build with different parameters, you just need to use a different build directory. This part is fairly easy once you know that is what you are supposed to do. When you try to build with a different set of parameters, you are given a warning and then the option to use a new build directory.
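To make this concrete, maintaining two differently-configured builds side by side looks roughly like the following. This is a sketch under the assumption that get.dependencies.sh accepts a --build-dir option (the directory names build-serial and build-mpi are mine); consult BuildTools/get.dependencies.sh --help for the options your version supports:

```shell
# A plain serial build in one directory...
bash BuildTools/get.dependencies.sh build --build-dir=build-serial

# ...and the parallel MPI build in another. The two never interfere,
# because each build directory keeps its own configure parameters.
bash BuildTools/get.dependencies.sh build --build-dir=build-mpi \
    --with-mpi-lib=/usr/lib/libmpi.so.12.0.2 \
    --with-mpi-incdir=/usr/lib/openmpi/include/ \
    MPICC=mpicc.openmpi MPICXX=mpic++.openmpi
```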

My failing was that this idea was very different to what I was used to with SCIP. With SCIP, you can build multiple times in the same directory and you end up with different binaries.

In the end, I have found that the build system for DisCO is very useful and it works quite well. I think I understand what is going on with it now. So that should stop me hassling Ted for now with any issues I am having with his solvers.