I'm trying to use a gradient method to find a minimizer of a function f, using the update rule x(k+1) = x(k) - α∇f(x(k)) with α = 0.1. The function is f(x) = 2*(x^3+1)^(-1/2), and the starting point is x(0) = 1.
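Since the update rule needs the gradient, it helps to write the derivative out explicitly (by the chain rule):

∇f(x) = 2 · (-1/2) · (x^3+1)^(-3/2) · 3x^2 = -3x^2 · (x^3+1)^(-3/2)

Note that this is ≤ 0 everywhere on the domain x > -1, so f is monotonically decreasing there and each gradient step moves x to the right.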
Here is my PyTorch attempt:
import torch
import torch.nn as nn
from torch import optim

alpha = 0.1

class MyModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.x = 1

    def forward(self, x):
        return 2*pow((pow(x,3)+1),(-1/2))

model = MyModel()
optimizer = optim.SGD(model.parameters(), lr=alpha)

terminationCond = False
while not terminationCond:
    f = model.forward(x)
    f.backward()
    optimizer.step()
    if x.grad < 0.001:
        terminationCond = True
    optimizer.zero_grad()
But I cannot output the correct value of x. How should I modify my code so that it finds a minimizer of f?
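For reference, here is a minimal working sketch of plain gradient descent on a scalar in PyTorch. It sidesteps nn.Module and makes x a leaf tensor with requires_grad=True, so that backward() actually fills in x.grad and SGD has a parameter to update; the iteration cap of 200,000 is my own safety limit, not part of the original question.

```python
import torch

alpha = 0.1

# x must be a leaf tensor with requires_grad=True (or an nn.Parameter),
# not a plain Python number, or the optimizer has nothing to update
x = torch.tensor(1.0, requires_grad=True)
optimizer = torch.optim.SGD([x], lr=alpha)

for step in range(200_000):          # safety cap on iterations (my assumption)
    optimizer.zero_grad()            # clear the old gradient before each pass
    f = 2 * (x**3 + 1) ** (-0.5)     # recompute f from the current x
    f.backward()                     # populates x.grad with df/dx
    if abs(x.grad.item()) < 0.001:   # terminate on small gradient magnitude
        break
    optimizer.step()                 # x <- x - alpha * x.grad

print(x.item(), f.item())
```

One caveat: this particular f has no finite minimizer, since f decreases monotonically toward 0 as x → ∞, so the loop terminates only because the gradient magnitude eventually drops below the 0.001 tolerance (with x having drifted to roughly 25 by then).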