General and Scalable Parallelization for Neural Networks

Google AI Research · December 8, 2021
Posted by Yuanzhong Xu and Yanping Huang, Software Engineers, Google Research, Brain Team

Scaling neural networks, whether it be the amount of training data used, the model size, or the computation being utilized, has been critical for improving model quality in many real-world machine learning applications, such as computer vision, language understanding, and neural machine translation. This, in turn, has motivated recent studies to scrutinize the factors that play a critical role in the success...