Gradient Clipping

Correct way to do gradient clipping

  • clip after loss.backward() and before optimizer.step()
clip_value = 1.0

optimizer.zero_grad()
loss, hidden = model(data, hidden, targets)
loss.backward()

# clip_value is the maximum allowed total L2 norm of the gradients
torch.nn.utils.clip_grad_norm_(model.parameters(), clip_value)
optimizer.step()
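To make it concrete, the math behind norm clipping can be sketched in plain Python (no PyTorch): compute the total L2 norm over all gradients, and if it exceeds the threshold, scale every gradient down by the same factor. The function name and list-of-lists gradient layout here are illustrative, not PyTorch's API.

```python
import math

def clip_grad_norm(grads, max_norm):
    # Total L2 norm over all gradient tensors (represented here as flat lists).
    total_norm = math.sqrt(sum(g * g for grad in grads for g in grad))
    # Uniform scaling factor; the small epsilon avoids division by zero,
    # mirroring what torch.nn.utils.clip_grad_norm_ does internally.
    clip_coef = max_norm / (total_norm + 1e-6)
    if clip_coef < 1.0:
        grads = [[g * clip_coef for g in grad] for grad in grads]
    return grads, total_norm

grads = [[3.0, 4.0]]  # total L2 norm = 5.0
clipped, norm = clip_grad_norm(grads, max_norm=1.0)
# clipped now has norm ~1.0; gradients below the threshold are left untouched
```

Note that the scaling preserves the gradient's direction; only its magnitude is capped, which is why norm clipping is usually preferred over element-wise value clipping for RNN training.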