I have a discrete optimization problem with a complicated objective function: it returns a float computed from parameters that are only available in discrete steps (here, ints).
Is there a standard steepest-descent tool for bounded discrete optimization?
Python is ideal, since I already have code that brute-forces the whole problem in that language.
I've investigated scipy.optimize.brute, but I'd like to avoid brute-forcing the whole problem every time, since I am varying the objective function and this type of problem recurs throughout my research.
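For reference, the brute-force version I'm trying to move away from looks roughly like this. The slice steps match my parameter grid, finish=None keeps the result on the grid instead of polishing it with a continuous minimizer, and objfun is my objective (sketched just below):

from scipy.optimize import brute

# Full grid search over the discrete steps: three parameters in
# 5-unit steps on [60, 135], one in 50-unit steps on [600, 1500].
ranges = (slice(60, 136, 5),
          slice(60, 136, 5),
          slice(60, 136, 5),
          slice(600, 1501, 50))
best_params = brute(lambda x: objfun(list(x), quiet=True), ranges, finish=None)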
My objective function here is the RMS error of a set of values (tabulated using the discretized parameters) versus a reference set of values.
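Concretely, the objective has roughly this shape; tabulate() and reference are hypothetical stand-ins for my tabulated model values and the reference set:

import numpy as np

def objfun(params, quiet=False):
    # RMS error between the values tabulated at the discrete
    # parameters and a fixed reference set. tabulate() and
    # reference are placeholders for my actual data.
    model = np.asarray(tabulate(params))
    err = np.sqrt(np.mean((model - reference) ** 2))
    if not quiet:
        print(params, err)
    return err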
EDIT:
I'm examining a simulated annealing package (https://github.com/perrygeo/python-simulated-annealing) per the recommendation in Discrete optimization in python. This still isn't ideal, as the surface shouldn't be that complex (here 4-5-dimensional, with weak dependence on the parameters).
EDIT2:
It still seems like there should be a better way to do this, but the simulated-annealing approach with a finite time scale seems passable. My code is below:
from random import randint
from copy import deepcopy

def move(params):
    # Step one randomly chosen parameter up or down by its grid
    # spacing: 5 for the first three parameters, 50 for the last.
    p = deepcopy(params)
    a = randint(0, 3)              # which parameter to perturb
    step = 50 if a == 3 else 5
    if randint(0, 1) == 1:
        p[a] = params[a] + step
    else:
        p[a] = params[a] - step
    return p
def isvalid(params, temp):
    # Boundary conditions plus the temperature constraint: a move is
    # only valid if it stays inside the box and its objective value
    # is below the fixed threshold temp.
    for i in range(3):
        if params[i] < 60 or params[i] > 135:
            return False
    if params[-1] < 600 or params[-1] > 1500:
        return False
    return objfun(params, quiet=True) < temp
params = [75, 85, 70, 1400]
temp = 0.075                       # fixed acceptance threshold
best = deepcopy(params)
bestval = objfun(best)
runlength = 1000

for i in range(runlength):
    newp = move(params)
    if isvalid(newp, temp):
        params = newp
        en = objfun(params)
        if en < bestval:           # keep the best point seen so far
            best = deepcopy(params)
            bestval = en

print(best, bestval)
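For comparison, a textbook annealing schedule would replace my fixed threshold with the Metropolis acceptance rule and a decaying temperature. A minimal sketch, reusing move() and the same box constraints; the starting temperature and cooling rate are tuning guesses:

from math import exp
from random import random

def in_bounds(p):
    # Same box constraints as isvalid(), without the threshold test.
    return all(60 <= p[i] <= 135 for i in range(3)) and 600 <= p[3] <= 1500

params = [75, 85, 70, 1400]
energy = objfun(params, quiet=True)
best, bestval = list(params), energy
T, cooling = 1.0, 0.995            # initial temperature and geometric decay
for step in range(1000):
    cand = move(params)
    if not in_bounds(cand):
        continue
    e_new = objfun(cand, quiet=True)
    # Metropolis rule: always accept improvements; accept uphill
    # moves with probability exp(-delta / T), which shrinks as T decays.
    if e_new < energy or random() < exp(-(e_new - energy) / T):
        params, energy = cand, e_new
        if energy < bestval:
            best, bestval = list(params), energy
    T *= cooling
print(best, bestval)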