```python
# Add L2 regularization to the optimizer by passing a weight_decay argument
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4, weight_decay=1e-5)
```
Here is what the above code does:
1. We pass `model.parameters()` so the optimizer knows which weights to update (this assumes a `model` object already exists).
2. We create an Adam optimizer with a learning rate of `1e-4`.
3. We set `weight_decay=1e-5`, which applies L2 regularization by shrinking the weights a little at every update step.
Now, let’s train the model.
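A minimal training loop might look like the sketch below. The tiny linear model, the MSE loss, and the random dummy data are placeholders for illustration; swap in your own model, loss, and dataset.

```python
import torch

# Hypothetical tiny model; any nn.Module works the same way.
model = torch.nn.Linear(10, 1)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4, weight_decay=1e-5)
loss_fn = torch.nn.MSELoss()

# Dummy data standing in for a real dataset.
x = torch.randn(32, 10)
y = torch.randn(32, 1)

for epoch in range(5):
    optimizer.zero_grad()        # clear gradients from the previous step
    loss = loss_fn(model(x), y)  # forward pass
    loss.backward()              # backpropagate
    optimizer.step()             # Adam update; the L2 penalty is applied here via weight_decay
```

Note that `weight_decay` is applied inside `optimizer.step()`, so you never add the penalty term to the loss yourself.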