
One way to do gradient descent in Python is to code it myself. However, given how popular a concept it is in machine learning, I was wondering if there is a Python library that I can import that gives me a gradient descent method (preferably mini-batch gradient descent since it's generally better than batch and stochastic gradient descent, but correct me if I'm wrong).

I checked NumPy and SciPy but couldn't find anything. I have no experience with TensorFlow, but I looked through their online API. I found tf.train.GradientDescentOptimizer, but there is no parameter that lets me choose a batch size, so I'm rather fuzzy on what kind of gradient descent it actually performs.

Sorry if I sound naive. I'm self-learning a lot of this stuff.

Kevin Trinh
  • Look at this example: https://github.com/tensorflow/tensorflow/blob/master/tensorflow/examples/udacity/4_convolutions.ipynb – koPytok May 08 '18 at 04:59
  • In the graph, X is a placeholder sized by batch_size, and during training the full data set is divided into batches at every step. – koPytok May 08 '18 at 05:04
  • First, I advise you to choose the Adam optimizer instead of GradientDescent (stochastic gradient descent can be "hand-wavily" seen as mini-batch gradient descent with a mini-batch size of `1`). As for TensorFlow, even if you cannot choose the mini-batch size, you can still code that part yourself and use the GradientDescentOptimizer on these mini-batches. – picnix_ May 08 '18 at 06:49
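
To illustrate the placeholder-and-feed approach described in the comments above, here is a rough sketch (assuming TensorFlow 1.x; the data, model, and hyperparameters are made up for a small linear-regression example). The optimizer itself never sees a batch size; the mini-batching happens in the training loop that feeds the graph:

```python
import numpy as np
import tensorflow as tf

# Made-up data for a linear-regression example.
data = np.random.randn(1000, 3).astype(np.float32)
labels = data.dot(np.array([[1.0], [-2.0], [0.5]], dtype=np.float32))

batch_size = 32

# Placeholders: the first dimension is left open so any batch size can be fed.
x = tf.placeholder(tf.float32, shape=[None, 3])
y = tf.placeholder(tf.float32, shape=[None, 1])

w = tf.Variable(tf.zeros([3, 1]))
loss = tf.reduce_mean(tf.square(tf.matmul(x, w) - y))

# The optimizer has no notion of batch size;
# mini-batching happens in the feed_dict below.
train_op = tf.train.GradientDescentOptimizer(learning_rate=0.1).minimize(loss)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for step in range(500):
        idx = np.random.choice(len(data), batch_size)  # sample a mini-batch
        sess.run(train_op, feed_dict={x: data[idx], y: labels[idx]})
    print(sess.run(w))
```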

1 Answer


To state the obvious, gradient descent optimizes a function. When you use some implementation of gradient descent from a library, you need to specify the function using that library's constructs. For example, functions are represented as computation graphs in TensorFlow. You cannot just take some pure Python function and ask TensorFlow's gradient descent optimizer to optimize it.

If your use case allows you to use TensorFlow computation graphs (and all the associated machinery for running the function, computing its gradient, and so on), tf.train.*Optimizer is the obvious choice. Otherwise, it is of no use to you.
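
As a minimal sketch of that point (assuming TensorFlow 1.x), the objective has to be built from TensorFlow ops on TensorFlow variables before the optimizer can minimize it:

```python
import tensorflow as tf

# The function to optimize, f(w) = (w - 3)^2, expressed as a computation graph,
# not as a plain Python function.
w = tf.Variable(0.0)
f = tf.square(w - 3.0)

# One gradient-descent step on f; no data or batching is involved here at all.
step = tf.train.GradientDescentOptimizer(learning_rate=0.1).minimize(f)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for _ in range(100):
        sess.run(step)
    print(sess.run(w))  # converges towards 3.0
```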

If you need something lightweight, https://github.com/HIPS/autograd is probably the best option among the popular libraries. Its optimizers can be found here: https://github.com/HIPS/autograd/blob/master/autograd/misc/optimizers.py
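
For illustration, a small sketch of mini-batch optimization with autograd (the data and hyperparameters are made up): autograd.grad differentiates a plain NumPy-style function, and the optimizers in that module call the gradient function with the current parameters and the iteration number, which is the natural place to hook in mini-batching:

```python
import autograd.numpy as np
from autograd import grad
from autograd.misc.optimizers import adam

# Toy least-squares problem.
X = np.random.randn(200, 3)
y = np.dot(X, np.array([1.0, -2.0, 0.5])) + 0.1 * np.random.randn(200)

batch_size = 20
num_batches = X.shape[0] // batch_size

def loss(w, i):
    # The optimizer passes the iteration number i, which selects the mini-batch.
    b = i % num_batches
    Xb = X[b * batch_size:(b + 1) * batch_size]
    yb = y[b * batch_size:(b + 1) * batch_size]
    return np.mean((np.dot(Xb, w) - yb) ** 2)

# adam repeatedly evaluates grad(loss)(w, i), so this is mini-batch optimization;
# the same module also provides sgd and rmsprop.
w_opt = adam(grad(loss), np.zeros(3), step_size=0.05, num_iters=1000)
print(w_opt)
```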

iga