Review of 'Google AI Blog: Reformer: The Efficient Transformer'

Shane Mulligan · Jan. 19, 2020, 11 a.m.
Summary
Original article: Google AI Blog: Reformer: The Efficient Transformer

Reformer is a Transformer variant designed to handle context windows of up to 1 million words on a single accelerator using only 16 GB of memory. It combines two crucial techniques, locality-sensitive-hashing (LSH) attention and reversible residual layers, to solve the attention and memory-allocation problems that limit the Transformer's application to long context windows....
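To make the first of those techniques concrete, here is a minimal, hedged sketch of the LSH-attention idea: hash shared query/key vectors into buckets with random projections, then let each token attend only within its bucket. This is a toy single-round illustration, not Reformer's actual implementation; all function names (`lsh_buckets`, `lsh_attention`) and the bucket count are my own assumptions for illustration.

```python
import numpy as np

def lsh_buckets(vectors, n_buckets, rng):
    """Angular LSH: bucket = argmax over a random rotation and its negation.

    Hypothetical helper, not Reformer's code; it only illustrates the idea
    that nearby vectors tend to land in the same bucket.
    """
    d = vectors.shape[-1]
    R = rng.standard_normal((d, n_buckets // 2))
    rotated = vectors @ R
    return np.argmax(np.concatenate([rotated, -rotated], axis=-1), axis=-1)

def lsh_attention(qk, v, n_buckets=8, seed=0):
    """Toy single-round LSH attention: tokens attend only within their bucket.

    `qk` plays the role of Reformer's shared query/key vectors, which keeps
    a query and its most similar keys likely to hash together.
    """
    rng = np.random.default_rng(seed)
    buckets = lsh_buckets(qk, n_buckets, rng)
    out = np.zeros_like(v)
    for b in np.unique(buckets):
        idx = np.where(buckets == b)[0]
        # Full softmax attention, but only over the tokens in this bucket.
        scores = qk[idx] @ qk[idx].T / np.sqrt(qk.shape[-1])
        weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
        weights /= weights.sum(axis=-1, keepdims=True)
        out[idx] = weights @ v[idx]
    return out

qk = np.random.default_rng(1).standard_normal((16, 4))
v = np.random.default_rng(2).standard_normal((16, 4))
out = lsh_attention(qk, v)
print(out.shape)  # (16, 4)
```

Because each token attends only to its bucket instead of the full sequence, the cost drops from quadratic in sequence length to roughly linear times the bucket size, which is what lets attention scale to very long contexts.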