
I am trying to use the SHGO algorithm implemented in SciPy, but I have trouble when the objective function takes more than one argument. If I understand the error correctly, I am not passing the additional arguments (parameters) to the objective function as expected; however, I don't see where my syntax is wrong. Can someone explain what the root cause of the error is and how to fix it?

Below is a reproducible example of the issue I am facing.
Image presenting the mathematical formulation of the problem implemented in the Python code

import numpy as np
import scipy.optimize as opt


def fobj(x, y, z):
    return (x+y+z).sum()

x0 = np.array([0.5, 0.5, 0.5, 0.5])
y = np.array([1, 3, 5, 7])
z = np.array([10, 20, 30, 40])
bnds = list(zip([0, 1, 2, 3], [2, 3, 4, 5]))
cons = {'type': 'eq', 'fun': lambda x: x.sum() - 14}
min_kwargs = {'method': 'SLSQP', 'options': {'maxiter': 100, 'disp': True}}
ret = opt.shgo(func=fobj, bounds=bnds, args=(y, z), constraints=cons, minimizer_kwargs=min_kwargs, options={'disp': True})

When run, the following traceback is shown:

Splitting first generation
Traceback (most recent call last):
  File "C:\path\lib\site-packages\scipy\optimize\_shgo_lib\triangulation.py", line 630, in __getitem__
    return self.cache[x]
KeyError: (0, 0, 0, 0)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "C:\path\lib\site-packages\scipy\optimize\_shgo.py", line 420, in shgo
    shc.construct_complex()
  File "C:\path\lib\site-packages\scipy\optimize\_shgo.py", line 733, in construct_complex
    self.iterate()
  File "C:\path\lib\site-packages\scipy\optimize\_shgo.py", line 876, in iterate
    self.iterate_complex()
  File "C:\path\lib\site-packages\scipy\optimize\_shgo.py", line 895, in iterate_hypercube
    self.HC = Complex(self.dim, self.func, self.args,
  File "C:\path\lib\site-packages\scipy\optimize\_shgo_lib\triangulation.py", line 25, in __init__
    self.n_cube(dim, symmetry=symmetry)
  File "C:\path\lib\site-packages\scipy\optimize\_shgo_lib\triangulation.py", line 76, in n_cube
    self.C0.add_vertex(self.V[origintuple])
  File "C:\path\lib\site-packages\scipy\optimize\_shgo_lib\triangulation.py", line 634, in __getitem__
    xval = Vertex(x, bounds=self.bounds,
  File "C:\path\lib\site-packages\scipy\optimize\_shgo_lib\triangulation.py", line 557, in __init__
    self.f = func(x_a, *func_args)
  File "C:\path\lib\site-packages\scipy\optimize\_optimize.py", line 466, in function_wrapper
    fx = function(np.copy(x), *(wrapper_args + args))
TypeError: fobj() takes 3 positional arguments but 5 were given

I don't get why the TypeError is raised. It says that 5 positional arguments rather than 3 were passed to the objective function `fobj`, but as far as I can tell I am only passing `(y, z)` in addition to `x`, so I can't see where the other two arguments come from!
Note that I also tried to rewrite the local minimizer dictionary as `min_kwargs = {'method': 'SLSQP', 'args': (x0), 'options': {'maxiter': 100, 'disp': True}}`, but I kept getting the same error. I am sure I am passing the arguments incorrectly, but I can't figure out how to do it right. Any help will be greatly appreciated.


I am using: Python 3.10.5, Numpy 1.22.4, and SciPy 1.8.1.
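Counting arguments from the last wrapper frame of the traceback (`fx = function(np.copy(x), *(wrapper_args + args))`) suggests that `(y, z)` is somehow being forwarded twice: `x` plus two copies of the two extra arguments would give exactly 5. The following sketch reproduces the count (this mimics, not reuses, the SciPy internals):

```python
import numpy as np


def fobj(x, y, z):
    return (x + y + z).sum()


x = np.zeros(4)
y = np.array([1, 3, 5, 7])
z = np.array([10, 20, 30, 40])
args = (y, z)

# If the extra arguments are appended both by shgo itself and by the
# optimize wrapper, the effective call carries 1 + 2 + 2 = 5 positionals:
try:
    fobj(x, *(args + args))
except TypeError as e:
    print(e)  # fobj() takes 3 positional arguments but 5 were given
```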
AB8
  • If the missing parentheses after `sum` is a typo that was created when you created this question, you should edit the question and fix the code, so that the *actual* problem that you have is reproducible with the code in the question. – Warren Weckesser Jun 29 '22 at 17:41
  • I corrected the typo (added the missing parentheses after `.sum` in the `return` line of the function `fobj`). I didn't think it mattered as another user had already caught and mentioned it in an answer below. Sorry for the confusion. In any case, it seems this is an actual bug in the current implementation of SHGO (as of SciPy v1.8.1), as reported in this [GitHub issue](https://github.com/scipy/scipy/issues/14589). Any idea of a quick fix? – AB8 Jun 29 '22 at 20:48

2 Answers


As you discovered, the problem is a bug in shgo.

You can avoid the bug in `shgo` by avoiding the use of the `args` parameter. Instead of using `args`, use a wrapper of `fobj` that captures `y` and `z` in a closure. A simple way to do that is with a lambda expression: change the first argument of `shgo` from `func=fobj` to `func=lambda x, y=y, z=z: fobj(x, y, z)`, and remove the `args` parameter from the call.
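Applied to the question's example, the workaround might look like this (a sketch with the `disp` output omitted):

```python
import numpy as np
import scipy.optimize as opt


def fobj(x, y, z):
    return (x + y + z).sum()


y = np.array([1, 3, 5, 7])
z = np.array([10, 20, 30, 40])
bnds = list(zip([0, 1, 2, 3], [2, 3, 4, 5]))
cons = {'type': 'eq', 'fun': lambda x: x.sum() - 14}
min_kwargs = {'method': 'SLSQP', 'options': {'maxiter': 100}}

# y and z are captured as the lambda's default values, so no args
# parameter is needed and the buggy forwarding path is never exercised.
ret = opt.shgo(func=lambda x, y=y, z=z: fobj(x, y, z),
               bounds=bnds, constraints=cons,
               minimizer_kwargs=min_kwargs)
print(ret.x)
```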

Warren Weckesser
  • This wrapper approach works! Thanks. I guess this is the only viable approach until the bug is fixed. @Bob mentioned in a comment to his answer that he submitted a PR, so if/when it is merged (I had the impression someone proposed the same fix months ago and it wasn't well received), the syntax I used should work, right? – AB8 Jun 30 '22 at 17:01
  • Not sure if this is the right place to post this comment, but when I run `shgo`, the console shows `bounds in kwarg: [[0.0, 1.0], [1.0, 2.0], [2.0, 3.0], [3.0, 4.0]]`. **What does it mean?** The bounds I intended to set are [0, 2], [1, 3], [2, 4], and [3, 5] for _x1_, _x2_, _x3_, and _x4_, respectively. – AB8 Jun 30 '22 at 17:18

The problem is that `sum` is a method of the array, not a property.

It should work if you make your objective function

def fobj(x, y, z):
    return (x+y+z).sum()

Additionally, there is a bug in the development version (updates on this soon).

The method does not seem very robust either: for sum(x) = 14 there is only one feasible point, which could be a problem because the valid search space is 0-dimensional; but even if I relax the constraint to something like sum(x) = 12, the method still can't find the solution.
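A quick sanity check with the question's original bounds shows why sum(x) = 14 admits exactly one feasible point:

```python
# The equality constraint sum(x) = c is feasible only if c lies between
# the sum of the lower bounds and the sum of the upper bounds.
lower = [0, 1, 2, 3]
upper = [2, 3, 4, 5]
print(sum(lower), sum(upper))  # 6 14 -> x.sum() can only range over [6, 14]
# c = 14 is attainable only at the single upper corner (2, 3, 4, 5),
# so the feasible set is 0-dimensional.
```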

Here you have a more complete working example

import numpy as np
import scipy.optimize as opt
def fobj(x, y, z):
    return (x+y+z).sum()

x0 = np.array([0.5, 0.5, 0.5, 0.5])
y = np.array([1, 3, 5, 7])
z = np.array([10, 20, 30, 40])
bnds = list(zip([0, 1, 2, 3], [3, 4, 5, 6]))
cons = {'type': 'eq', 'fun': lambda x: x.sum() - 9}
min_kwargs = {'method': 'SLSQP', 'options': {'maxiter': 10000, 'disp': True}}
ret = opt.shgo(func=lambda x: fobj(x, y, z), bounds=bnds, args=(), constraints=cons, 
               minimizer_kwargs=min_kwargs, options={'disp': True})
Bob
  • Hi Bob, thank you for the response. Could you please clarify it? I tried to correct the objective function `fobj` the way you suggested (replacing `.sum` with `.sum()`), but I keep having the same error (`TypeError: fobj() takes 3 positional arguments but 5 were given`). Does it work for you? Have you done other changes to my posted code? – AB8 Jun 29 '22 at 17:10
  • Submitted a fix https://github.com/scipy/scipy/pull/16506 – Bob Jun 30 '22 at 16:13
  • Incidentally, the objective function is minimized within the bounds for _x_ when _x_ is at its lower bound (i.e., _x_ = (0, 1, 2, 3)). At that value of _x_, _f(x, y, z)_ = 122. Note that the sum of the elements of _x_ at that point is 6. Now, if we rewrite the constraint as @Bob proposed (sum of the elements of _x_ = 9), the solution returned by `shgo` is _x_ = (0.75, 1.75, 2.75, 3.75) and _f_ = 125, but we know this is wrong! If we rewrite the constraint so that the sum of the elements of _x_ is 6 (as it should be), then `shgo` returns the expected solution (0, 1, 2, 3) and _f_ = 122. Why is that? – AB8 Jun 30 '22 at 17:12
  • Do you have any strong reason for using shgo instead any of the other [available global optimization methods](https://docs.scipy.org/doc/scipy/reference/optimize.html#global-optimization)? – Bob Jul 01 '22 at 05:54
  • I initially used `basinhopping`, but I wasn't fully satisfied because it returns solutions that violate the bounds and the constraint for the actual problem at hand (I haven't tried it on the simplified example here), even though I use custom `take_step` and `accept_step` classes. Is there any other global optimizer in SciPy that allows for bounds and constraints? – AB8 Jul 01 '22 at 13:55
  • So you found bugs in basinhopping as well? Have you tried posting to https://scicomp.stackexchange.com/? Maybe there is some mathematical treatment that would make the problem better suited to these numerical methods. Depending on your constraints, it could be worth converting the problem to an unconstrained optimization using Lagrangian multipliers. But these areas are beyond the scope of Stack Overflow. – Bob Jul 03 '22 at 05:58