
Gradient Clipping in PyTorch

Things on this page are fragmentary and immature notes/thoughts of the author. Please read with your own judgement!

Gradients are populated by `loss.backward()`, so clipping must happen after `backward()` and before `optimizer.step()`, which applies the (now clipped) gradients:

```python
import torch
import torch.nn.functional as F

optimizer.zero_grad()
output = model(data)
loss = F.nll_loss(output, target)
loss.backward()
# Rescale gradients so that their total norm is at most args.clip.
torch.nn.utils.clip_grad_norm_(model.parameters(), args.clip)
optimizer.step()
```
  1. Use `torch.nn.utils.clip_grad_norm_` instead of `torch.nn.utils.clip_grad_norm`, which has been deprecated. Both clip gradients in place; the trailing underscore follows PyTorch's naming convention for in-place operations. The function returns the total norm of the gradients computed before clipping, as shown in the sketch below.
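A minimal sketch of this behavior (the parameter and gradient values here are made up for illustration): `clip_grad_norm_` rescales the gradient in place and returns the pre-clipping total norm.

```python
import torch

# A single parameter with a hand-set gradient of L2 norm 5.
p = torch.nn.Parameter(torch.zeros(3))
p.grad = torch.tensor([3.0, 4.0, 0.0])

# Clip the total gradient norm to 1.0 (in place).
total_norm = torch.nn.utils.clip_grad_norm_([p], max_norm=1.0)

print(total_norm)  # tensor(5.) -- the norm before clipping
print(p.grad)      # tensor([0.6000, 0.8000, 0.0000]) -- rescaled to norm 1
```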

References

- https://discuss.pytorch.org/t/proper-way-to-do-gradient-clipping/191
- About torch.nn.utils.clip_grad_norm
- How to do gradient clipping in pytorch?
