I am trying to find use cases that differentiate the two optimization techniques, Nevergrad's PSO and SciPy's SHGO, or to determine which one is more useful. Please help me with this.
- It looks like shgo is a method implemented in scipy (https://docs.scipy.org/doc/scipy/reference/generated/scipy.optimize.shgo.html), while Nevergrad is a standalone library (https://github.com/facebookresearch/nevergrad). It looks like neither of them uses gradients, so they may be similar in this respect. You can test both, or specify your goals (if you already use scipy, I would recommend using its module). – Yaroslav Nikitenko May 26 '22 at 11:27
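  Following up on the "test both" suggestion, here is a minimal sketch of how you could run the two on the same bounded problem and compare results. It assumes recent scipy and nevergrad releases, uses the 2D Rastrigin function as a stand-in objective, and picks an arbitrary evaluation budget for PSO; swap in your own objective and bounds.

  ```python
  import numpy as np
  from scipy.optimize import shgo
  import nevergrad as ng

  # Toy multimodal objective (2D Rastrigin); replace with your own function.
  def rastrigin(x):
      x = np.asarray(x)
      return 10 * x.size + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

  bounds = [(-5.12, 5.12), (-5.12, 5.12)]

  # SciPy SHGO: deterministic global optimizer over box bounds.
  res = shgo(rastrigin, bounds)
  print("SHGO:", res.x, res.fun)

  # Nevergrad PSO: stochastic, driven by an evaluation budget.
  param = ng.p.Array(shape=(2,)).set_bounds(-5.12, 5.12)
  optimizer = ng.optimizers.PSO(parametrization=param, budget=2000)
  recommendation = optimizer.minimize(rastrigin)
  print("PSO:", recommendation.value, rastrigin(recommendation.value))
  ```

  Comparing the final objective values and wall-clock time on your actual problem should make the trade-off clearer than any general rule. – (sketch, not part of the original comment)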
- Please edit the question to limit it to a specific problem with enough detail to identify an adequate answer. – Community May 26 '22 at 12:07