Non-linear optimization with constraints and extra arguments

The content provided so far around optimization within SCILAB allowed you to solve different kinds of problems: linear, quadratic, semi-definite, non-linear least squares, discrete, multiobjective, global; leveraging different techniques such as gradient methods, simplex, Genetic Algorithms and even Simulated Annealing.

That is a pretty powerful set of capabilities, but it is not the one I will be focusing on in this tutorial. Instead, I will try to kill a myth which has subsisted since the beginning of time... since SCILAB's beginning, actually, but trust me... that's old!

SCILAB actually provides tools to solve the most generic problem of all: non-linear optimization with constraints!

Isn't that epic? Well... even more than it sounds, actually. So let me show you what some people think isn't even possible.

The problem

For this tutorial, we will take the example of Rosenbrock's Post Office problem, which is stated as follows:
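For reference, the classical formulation is to maximize the volume of a rectangular parcel whose dimensions are limited by postal regulations:

    maximize    f(x1, x2, x3) = x1 * x2 * x3
    subject to  x1 + 2*x2 + 2*x3 <= 72
                0 <= xi <= 42,   i = 1, 2, 3

The known optimum is x* = (24, 12, 12), giving a maximum volume of 3456.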

The extra argument trick

In this case, the objective function is pretty easy to set up. To simplify the constraints, we will write them as a matrix inequality:
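Assuming the classical bounds of the Post Office problem, the seven linear inequalities can be stacked into a single system A*x <= b, for instance:

    A = [ 1, 2, 2 ; eye(3,3) ; -eye(3,3) ],   b = [ 72 ; 42*ones(3,1) ; zeros(3,1) ]

so that A*x <= b encodes x1 + 2*x2 + 2*x3 <= 72, xi <= 42 and -xi <= 0 all at once.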

The associated Scilab code is as follows:
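A minimal sketch of such code is given below. The quadratic-penalty formulation and the use of fminsearch are my assumptions here; the exact solver and constraint handling may differ, but the extra-argument mechanism is the same.

```scilab
// Sketch only: penalty formulation assumed, not necessarily the original approach.
// Cost = negated volume (we maximize) + quadratic penalty on A*x <= b violations.
function f = postoffice_cost(x, A, b)
    v = max(A*x - b, 0)                  // elementwise constraint violations
    f = -x(1)*x(2)*x(3) + 1d6*sum(v.^2)  // penalized, negated volume
endfunction

// Matrix inequality A*x <= b: girth limit plus the bounds 0 <= xi <= 42
A = [1 2 2; eye(3,3); -eye(3,3)];
b = [72; 42*ones(3,1); zeros(3,1)];

// The extra arguments A and b are bundled with the cost function in a list
x0 = [10; 10; 10];
[xopt, fopt] = fminsearch(list(postoffice_cost, A, b), x0);
```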

In that case, we passed some extra arguments to the cost function. To be honest, that was not mandatory: we could obviously have hard-coded the inequalities inside the function. But in some cases, computing the extra arguments is more complex, or they come from an imported file. In such cases, it is preferable to use this syntax rather than global variables. The last step is then to use a list object to bundle the cost function with the extra arguments you would like to use.

This trick can be used in pretty much every optimization function available within SCILAB.
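For instance, the same list syntax works with optim, whose cost function must also return a gradient. The names below are hypothetical, chosen only for illustration:

```scilab
// Hypothetical example: a quadratic cost whose target point p is
// supplied as an extra argument through the list syntax.
function [f, g, ind] = quad_cost(x, ind, p)
    f = sum((x - p).^2)   // cost value
    g = 2*(x - p)         // gradient, required by optim
endfunction

p = [1; 2; 3];
// optim calls quad_cost(x, ind, p) thanks to the list object
[fopt, xopt] = optim(list(quad_cost, p), [0; 0; 0]);
// xopt converges to p
```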