
I am trying to find the global minimum of the Sum of Squared Differences, which contains five different parameters x[0], x[1], x[2], x[3], x[4] coming from an affine transformation of data. Since I have a lot of data to compare, the differential_evolution approach was sped up with numba, as you can see in the code below. (More details on the code can be found here: How to speed up differential_evolution to find the minimum of Sum of Squared Errors with five variables)

My problem now is that I get different values for my parameters when running the same code several times. This problem is also described in Differential evolution algorithm different results for different runs, but I use differential_evolution from the scipy.optimize package instead of mystic. I tried to use mystic instead, but then I get the following error message:

TypeError: CPUDispatcher(<function tempsum3 at 0x000002AA043999D0>) is not a Python function

My question now is: is there any approach that reliably gives me back the global minimum of my function? An approach using the Powell method with the minimize function gives back worse values for my parameters than differential_evolution (it gets stuck in local minima even faster). And if there is no alternative to differential_evolution, how can I be sure that I really have the global minimum at some point? And in case the mystic approach is expedient, how do I get it going? I would be thankful for any kind of advice.
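For reference, with differential_evolution from scipy.optimize I can at least make the runs reproducible by fixing the seed keyword (a minimal sketch using the tempsum3 and bounds defined in the code below; the other keyword values are just illustrative). A fixed seed only removes the run-to-run variation, it does not guarantee that the result is the global minimum:

from scipy.optimize import differential_evolution

#tempsum3 and bounds are defined in the code below
result_scipy = differential_evolution(tempsum3, bounds, seed=1, popsize=15,
                                      tol=1e-11, mutation=0.8, recombination=0.9,
                                      maxiter=20000, polish=True)
print(result_scipy.x, result_scipy.fun)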

Here's my function to minimize, using numba to speed it up, together with the mystic package, which gives me back the error message above:

from math import floor
import matplotlib.pyplot as plt
import numpy as np
import mystic as my
from mystic.solvers import diffev
import numba as nb

w1=5 #width of A1
h1=3 #height of A1
w2=8 #width of A2
h2=5 #height of A2
hw1=w1*h1
hw2=w2*h2
A1=np.ones((h1,w1))
A2=np.ones((h2,w2))

for n1 in np.arange(2,4):
    A1[1][n1]=1000

for n2 in np.arange(5,7):
    A2[3][n2]=1000
    A2[4][n2]=1000

#Norm the raw data
maxA1=np.max(A1)
A1=1/maxA1*A1
#Norm the raw data
maxA2=np.max(A2)
A2=1/maxA2*A2
   
fig, axes = plt.subplots(1, 2)
axes[0].imshow(A2)
axes[0].set_title('original-A2')
axes[1].imshow(A1)
axes[1].set_title('shifted-A1')
plt.show()

@nb.njit
def xy_to_pixelvalue(x,y,dataArray): #look up the pixel value at the normed coordinates (x, y)
    height=dataArray.shape[0]
    width=dataArray.shape[1]
    xcoord=floor((x+1)/2*(width-1))
    ycoord=floor((y+1)/2*(height-1))
    if -1 <= x <= 1 and -1 <= y <= 1:
        return dataArray[ycoord][xcoord]
    else:
        return 0
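#Example of the mapping above: for an array with 5 columns (like A1), x = -1 maps to
#column 0, x = 0 to column 2 and x = 1 to column 4; coordinates outside [-1, 1] return 0.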

#norming pixel coordinates
A1x=np.linspace(-1,1,w1)
A1y=np.linspace(-1,1,h1)

A2x=np.linspace(-1,1,w2)
A2y=np.linspace(-1,1,h2)

#normed coordinates of A2 in a matrix
Ap2=np.zeros((h2,w2,2))
for i in np.arange(0,h2):
    for j in np.arange(0,w2):
        Ap2[i][j]=(A2x[j],A2y[i])
#defining a vector with the coordinates of A2
d=[]
cdata=Ap2[:,:,0:2]
for i in np.arange(0,h2):
    for j in np.arange(0,w2):
        d.append((cdata[i][j][0],cdata[i][j][1]))
d=np.asarray(d)
coora2=d.transpose()
coora2=np.array(coora2, dtype=np.float64)
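#coora2 has shape (2, hw2): coora2[0] holds the x-coordinates, coora2[1] the y-coordinates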

@nb.njit('(float64[::1],)')
def tempsum3(x):
    #x[0], x[1]: scale in x and y, x[2]: rotation angle in degrees, x[3], x[4]: shift in x and y
    c = np.cos(np.radians(x[2]))
    s = np.sin(np.radians(x[2]))
    tempsum = 0
    for l in np.arange(0,hw2):
        xt = c*x[0]*coora2[0][l] - s*x[1]*coora2[1][l] + x[3]
        yt = s*x[0]*coora2[0][l] + c*x[1]*coora2[1][l] + x[4]
        tempsum += (xy_to_pixelvalue(xt, yt, A1) - xy_to_pixelvalue(coora2[0][l], coora2[1][l], A2))**2
    return tempsum

x0 = np.array([1,1,-0.5,0,0])
bounds = [(0.1,5),(0.1,5),(-1,2),(-4,4),(-4,4)]
result = diffev(tempsum3, x0, npop = 5*15, bounds = bounds, ftol = 1e-11, gtol = 3500, maxiter = 1024**3, maxfun = 1024**3, full_output=True, scale =  0.8) 
print(result)
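One idea I have not been able to test properly yet (so this is only a sketch, and tempsum3_py is just a name I made up): hand mystic a plain Python wrapper instead of the jitted function itself, so the solver receives an ordinary Python function object while numba still does the work inside tempsum3:

def tempsum3_py(x): #plain Python wrapper for mystic, delegating to the jitted tempsum3
    return tempsum3(np.ascontiguousarray(x, dtype=np.float64))

result = diffev(tempsum3_py, x0, npop = 5*15, bounds = bounds, ftol = 1e-11, gtol = 3500, maxiter = 1024**3, maxfun = 1024**3, full_output=True, scale = 0.8)
print(result)

I do not know yet whether this actually avoids the CPUDispatcher error, so I would still be glad about any hints.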

  • I'm the `mystic` author. The error you are seeing is not coming from `mystic`. Also, if you see my answer in the SO question you linked to (that is similar to this question) -- not finding the global minimum is normal. You have to tune the optimizer settings a bit, in general, to find the minimum. – Mike McKerns May 28 '22 at 09:14
  • Note that the code is not complete enough so we can actually reproduce it. What is `coorlbic` for example? Many other variables are missing. Please update the code so it can be complete and reproducible (while being relatively small). – Jérôme Richard May 28 '22 at 11:40
  • @MikeMcKerns Thanks for your answer. That was my idea too at the beginning, but what's weird is that the code as shown above works fine with differential_evolution from the scipy.optimize package, while I get an error message with the mystic package. Since the code also works without the numba decorator above the tempsum3 function (but then takes far too long), I guess the error message has to do with the combination of numba and mystic as in my code. As soon as I get it going, I will try to tune the optimizer settings. Any ideas how to combine numba and mystic the way I did? – van25tam May 30 '22 at 06:14
  • @JérômeRichard Thanks for that hint, I edited the code, so it should now be complete and reproducible. – van25tam May 30 '22 at 06:15
  • `mystic` uses class-based optimizers, and has different optimizer settings and components than the `scipy.optimize` functions. The code is also not the same (`mystic` DE code is a decade older than the one in `scipy.optimize`). It's on my list of things to try to integrate `numba` with `mystic` -- but I don't see what's giving your error message. I'll have to try it out. – Mike McKerns May 31 '22 at 01:02
  • @MikeMcKerns I guess the error message comes from the tempsum3 object in my code, which changes with the use of numba. I would be thankful to hear what you find out... Since I am quite new to Python, I am not sure about another question: is it the same principle with scipy.optimize, i.e. if I tune the settings, do I get back the global minimum? I would then open a separate question about how to choose those parameters, since the mystic package was intuitively a little more comprehensible for me... – van25tam May 31 '22 at 04:54
  • There's no guarantee from either `mystic` or `scipy.optimize` or any other optimizer that you will get back the global minimum. In theory, given an infinite amount of time, DE will give you the global minimum... however, the termination conditions often stop the optimization before it is found. Even if an optimizer is repeatedly converging to the same point, there's no guarantee that it is finding the global minimum. You can only build a confidence that you have found it. `mystic` has some unique tools that try to help ensure you have the global minimum, but there's still no guarantee. – Mike McKerns May 31 '22 at 09:15
  • Thanks a lot @MikeMcKerns, that's very helpful for me! So I guess my main problem now is to combine the mystic package with some way of speeding up the cost function other than numba, so that I can use those unique tools as a benefit of mystic. Do you have any recommendations for this problem? – van25tam Jun 01 '22 at 08:12
  • `mystic` can leverage parallel resources like multithreading and multiprocessing... but I think your most recent question is hard to answer in this format. As to why you are experiencing the issue with a jitted function, it might be helpful to open a GitHub issue which would enable more exploration in that regard. – Mike McKerns Jun 01 '22 at 09:33
  • Thanks a lot for your help @MikeMcKerns. I am not sure how to get a multithreading or multiprocessing approach going with the mystic package; any advice on implementing this in my code? In any case, you helped me a lot to understand how differential_evolution works, thanks. – van25tam Jun 02 '22 at 04:54
  • With the differential evolution solver, you just have to pass it a parallel `map` function, and it should run in parallel. Check the solver docs for the keyword `map`. The ensemble solvers also run in parallel when a map is provided, and are an alternative to differential evolution. – Mike McKerns Jun 02 '22 at 16:15
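Following up on the last comment, here is a rough, untested sketch of what passing a parallel map to mystic's population-based DE solver might look like, using x0, bounds and tempsum3 from the code above. The class-based DifferentialEvolutionSolver2, the pathos process pool and the SetMapper/SetStrictRanges/SetInitialPoints/Solve calls are based on my reading of mystic's examples and should be checked against the solver docs (the relevant keyword is documented under map); VTR is only an illustrative termination condition:

from pathos.pools import ProcessPool
from mystic.solvers import DifferentialEvolutionSolver2
from mystic.termination import VTR

ndim = 5      #number of parameters
npop = 5*15   #population size, as above
solver = DifferentialEvolutionSolver2(ndim, npop)
solver.SetMapper(ProcessPool().map) #evaluate the population in parallel
solver.SetStrictRanges([b[0] for b in bounds], [b[1] for b in bounds])
solver.SetInitialPoints(x0)
solver.Solve(tempsum3, termination=VTR(1e-11))
print(solver.bestSolution, solver.bestEnergy)

If the jitted tempsum3 still triggers the TypeError here, the plain Python wrapper sketched above could be passed to Solve instead.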
