AdamW and Super-convergence is now the fastest way to train neural nets

fast.ai · July 2, 2018
Note from Jeremy: Welcome to fast.ai’s first scholar-in-residence, Sylvain Gugger. What better way to introduce him than to publish the results of his first research project at fast.ai? We’ll be using the results of this research to change how we train models in the next version of our course and in our fastai library, so that students and practitioners can reliably train their models far faster than with previous approaches.

The Adam roller-coaster

The journey of the Adam ...