I am very new to Python. I was studying some code from Qiskit - https://qiskit.org/textbook/ch-machine-learning/machine-learning-qiskit-pytorch.html. I have a lot of doubts about the code below. After studying neural networks a bit, I can interpret that the code below is for backpropagation. But above the function backward, @staticmethod is written. I did some reading and learned that this is a decorator, which changes how the function behaves when it is defined on the class. I am not sure what exactly it does here. The classical neural network code I studied does not use it. I would be really grateful if someone could explain its use.
    @staticmethod
    def backward(ctx, grad_output):
        """ Backward pass computation """
        input, expectation_z = ctx.saved_tensors
        input_list = np.array(input.tolist())

        # shift each parameter up and down by ctx.shift
        shift_right = input_list + np.ones(input_list.shape) * ctx.shift
        shift_left = input_list - np.ones(input_list.shape) * ctx.shift

        gradients = []
        for i in range(len(input_list)):
            # run the quantum circuit at the shifted parameter values
            expectation_right = ctx.quantum_circuit.run(shift_right[i])
            expectation_left = ctx.quantum_circuit.run(shift_left[i])
            # parameter-shift rule: gradient from the difference of expectations
            gradient = torch.tensor([expectation_right]) - torch.tensor([expectation_left])
            gradients.append(gradient)
        gradients = np.array([gradients]).T

        return torch.tensor([gradients]).float() * grad_output.float(), None, None
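To see what the decorator does on its own, here is a minimal sketch I put together, separate from the Qiskit code (the class name and values are made up just for illustration):

```python
class Greeter:
    # A static method belongs to the class itself: it receives no
    # implicit `self` argument, so it can be called directly on the
    # class without ever creating an instance.
    @staticmethod
    def shout(text):
        return text.upper() + "!"

    # An ordinary method receives the instance as `self`.
    def greet(self, name):
        return "Hello, " + name


# The static method works without any instance:
print(Greeter.shout("hi"))  # prints "HI!"

# The instance method needs an object:
g = Greeter()
print(g.greet("world"))  # prints "Hello, world"
```

My understanding (please correct me if wrong) is that in a torch.autograd.Function subclass, forward and backward are declared static because PyTorch invokes them on the class itself (through .apply), never on an instance, so there is no self to pass; all state is carried through the ctx object instead.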