Implementing a custom optimizer in PyTorch

optimizer = MySOTAOptimizer(my_model.parameters(), lr=0.001)
for epoch in range(num_epochs):
	for batch, true_values in dataloader:
		optimizer.zero_grad()  # clear gradients left over from the previous step
		outputs = my_model(batch)
		loss = loss_fn(outputs, true_values)
		loss.backward()
		optimizer.step()

Here is what the above code is doing:
1. We have a model (my_model), which is a subclass of nn.Module.
2. We create an optimizer, which is a subclass of torch.optim.Optimizer, passing it the model's parameters and a learning rate.
3. We iterate over our data in epochs.
4. For each batch in an epoch, we:
a. Compute the output of the model on the batch.
b. Compute the loss against the true values.
c. Backpropagate the loss, which populates each parameter's gradient.
d. Step the optimizer, which updates the parameters using those gradients.
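The loop above assumes the optimizer already exists; MySOTAOptimizer itself is a placeholder. A minimal sketch of what such a custom optimizer could look like, subclassing torch.optim.Optimizer and applying a plain SGD update (the class name MySGD and the update rule are assumptions for illustration):

```python
import torch
from torch.optim import Optimizer

class MySGD(Optimizer):
    """Minimal custom optimizer: p <- p - lr * p.grad."""

    def __init__(self, params, lr=0.001):
        if lr <= 0.0:
            raise ValueError(f"Invalid learning rate: {lr}")
        # Per-group defaults; Optimizer stores them in self.param_groups.
        defaults = dict(lr=lr)
        super().__init__(params, defaults)

    @torch.no_grad()  # parameter updates must not be tracked by autograd
    def step(self, closure=None):
        loss = None
        if closure is not None:
            with torch.enable_grad():
                loss = closure()
        for group in self.param_groups:
            for p in group["params"]:
                if p.grad is None:
                    continue  # parameter was not used in the loss
                # In-place update: p = p - lr * grad
                p.add_(p.grad, alpha=-group["lr"])
        return loss
```

Once defined, MySGD drops into the training loop above exactly like a built-in optimizer. Per-parameter state (e.g., momentum buffers for a fancier update rule) would be kept in self.state[p].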