
I use `minimize` from SciPy on Python 3.4, specifically:

resultats = minimize(margin_rate, iniprices, method='SLSQP',
                     jac=margin_rate_deriv, bounds=pricebounds,
                     options={'disp': True, 'maxiter': 2000},
                     callback=iter_report_margin_rate)

The maximum number of iterations can be set (as above), but is there a way to tell `minimize` to stop searching for a solution after a given elapsed time? I looked at the general options of `minimize` as well as the specific options of the SLSQP solver, but could not work it out.

Thanks

Charles
  • Usually you would limit search time indirectly, using the `maxiter` argument. Is there some reason why you can't just reduce `maxiter` to achieve a reasonable maximum execution time? – ali_m Jul 18 '14 at 13:48
  • Thanks Ali. That's because I have a problem where each iteration can be very very long. So I would like to be able to stop the very first iteration if it exceeds a given time. Also depending on the data fed to the solver, the duration of individual iterations will vary a lot - so indirectly controlling execution time through the number of iterations is not practical. – Charles Jul 18 '14 at 13:51
  • Umh I briefly saw an answer which I think suggested the use of the callback function to stop the solver according to the time elapsed. That sounded promising but that answer was apparently removed. Any idea why? – Charles Jul 18 '14 at 13:59

2 Answers


You can use the callback argument to raise a warning or exception if the execution time exceeds some threshold:

import numpy as np
from scipy.optimize import minimize, rosen
import time
import warnings

class TookTooLong(Warning):
    pass

class MinimizeStopper(object):
    def __init__(self, max_sec=60):
        self.max_sec = max_sec
        self.start = time.time()
    def __call__(self, xk=None):
        elapsed = time.time() - self.start
        if elapsed > self.max_sec:
            warnings.warn("Terminating optimization: time limit reached",
                          TookTooLong)
        else:
            # you might want to report other stuff here
            print("Elapsed: %.3f sec" % elapsed)

# example usage
x0 = [1.3, 0.7, 0.8, 1.9, 1.2]
res = minimize(rosen, x0, method='Nelder-Mead', callback=MinimizeStopper(1E-3))
ali_m
  • Thanks a lot Ali. That would work, and seems relatively simple to implement. – Charles Jul 18 '14 at 14:25
  • Very nice answer indeed. Easy to personalise also on `differential_evolution` – AlessioX Feb 02 '20 at 12:45
  • Hey this is from over a year ago now - apologies - but what happens if it does take too long and `pass` is called ? Does minimize just send back the `OptimisationResult` it got too ? – jolene Apr 19 '21 at 12:54
  • I had to use def __call__(self, xk=None, convergence = 0) in Python 3.11.1 for DE – craigB Aug 03 '23 at 02:58
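As the comments point out, a warning alone does not stop the solver; the optimization keeps running and eventually returns its `OptimizeResult` as usual. One way to actually abort from the callback is to raise an exception and catch it around the `minimize` call. This is a sketch of that idea, not part of the original answer: `TimeoutException` and `TimedStopper` are illustrative names, and the final `OptimizeResult` is lost, so the callback stores the last iterate itself.

```python
import time
from scipy.optimize import minimize, rosen

class TimeoutException(Exception):
    pass

class TimedStopper(object):
    def __init__(self, max_sec=60):
        self.max_sec = max_sec
        self.start = time.time()
        self.last_x = None
    def __call__(self, xk=None):
        self.last_x = xk  # remember the most recent iterate
        if time.time() - self.start >= self.max_sec:
            raise TimeoutException("time limit reached")

stopper = TimedStopper(max_sec=0.0)  # 0 s: abort at the very first callback
x0 = [1.3, 0.7, 0.8, 1.9, 1.2]
try:
    res = minimize(rosen, x0, method='Nelder-Mead', callback=stopper)
except TimeoutException:
    res = None
    print("stopped early; best x so far:", stopper.last_x)
```

Note that recent SciPy versions (1.11+) also treat a `StopIteration` raised from the callback as a clean stop request for several solvers, in which case `minimize` returns normally instead of propagating the exception.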

No. What you can do is start the optimizer in a separate process, keep track of how long it has been running, and terminate it if necessary:

from __future__ import print_function  # must be the first statement in the file
from multiprocessing import Process, Queue
import random
import time

def f(param, queue):
    # do the minimization and put the result on the queue
    # res = minimize(param)
    # queue.put(res)

    # to make this a working example I'll just sleep for
    # a random amount of time
    sleep_amount = random.randint(1, 10)
    time.sleep(sleep_amount)
    res = param*sleep_amount
    queue.put(res)

q = Queue()
p = Process(target=f, args=(2.2, q))
max_time = 3
t0 = time.time()

p.start()
while time.time() - t0 < max_time:
    p.join(timeout=1)
    if not p.is_alive():
        break

if p.is_alive():
    #process didn't finish in time so we terminate it
    p.terminate()
    result = None
else:
    result = q.get()
print(result)
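The polling loop above can also be collapsed, since `Process.join` itself accepts a timeout in seconds. A sketch of the same idea, with a hypothetical `slow_target` standing in for the real minimization:

```python
from multiprocessing import Process, Queue
import time

def slow_target(param, queue):
    time.sleep(5)            # stands in for a long-running minimize call
    queue.put(param * 2)

q = Queue()
p = Process(target=slow_target, args=(2.2, q))
max_time = 0.5

p.start()
p.join(timeout=max_time)     # block for at most max_time seconds

if p.is_alive():
    p.terminate()            # didn't finish in time
    p.join()                 # reap the terminated process
    result = None
else:
    result = q.get()
print(result)                # None if the time limit was hit
```

One caveat with either variant: `terminate()` kills the worker abruptly, so any partial progress inside the optimizer is lost unless the worker periodically puts intermediate results on the queue.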
numentar
  • That's great. A bit more general than the alternative approach above, because it would work on the first iteration. Thanks numentar. – Charles Jul 18 '14 at 14:25
  • No problem! The callback solution is also interesting but unfortunately many other scipy.optimize methods don't have the option of providing a callback function (e.g., curve_fit). This approach can be applied to any function whose execution time has to be limited. I use a similar approach to run thousands of jobs in parallel while ensuring that no single jobs gets stuck for too long. – numentar Jul 18 '14 at 14:52