Table of Contents
Reference: An Overview of Distributed Training of Deep Neural Networks: A Comprehensive Summary of Common Methods and Techniques (学界 | 深度神经网络的分布式训练概述:常用方法和技巧全面总结)
Reference: New Advances in Distributed Deep Learning: Making "Distributed" and "Deep Learning" Truly Deeply Integrated (分布式深度学习新进展:让“分布式”和“深度学习”真正深度融合)
Asynchronous Stochastic Gradient Descent with Delay Compensation, ICML 2017
Actually, this one is already mentioned in tensorRS.
Ensemble-Compression: A New Method for Parallel Training of Deep Neural Networks, ECML 2017
Convergence Analysis of Distributed Stochastic Gradient Descent with Shuffling