NVIDIA Corporation
DISTRIBUTED WEIGHT UPDATE FOR BACKPROPAGATION OF A NEURAL NETWORK
Last updated:
Abstract:
Speed of training a neural network is improved by updating the weights of the neural network in parallel. In at least one embodiment, after backpropagation, gradients are distributed to a plurality of processors, each of which calculates a portion of the updated weights of the neural network.
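The abstract describes sharding the weight update across processors: each processor receives the gradients and applies the update only to its assigned portion of the weights. The patent names no specific optimizer or communication primitive, so the following is a minimal sketch under assumed conditions: plain SGD, processors simulated as array shards, and the final concatenation standing in for an all-gather step. The function name `sharded_sgd_update` and learning rate are illustrative assumptions, not from the source.

```python
import numpy as np

def sharded_sgd_update(weights, grads, num_procs, lr=0.1):
    """Illustrative sketch (not the patented method itself):
    split the flat weight and gradient vectors across num_procs
    simulated processors; each processor computes the update for
    only its own shard, and the shards are then concatenated
    (a real distributed system would use an all-gather here)."""
    w_shards = np.array_split(weights, num_procs)
    g_shards = np.array_split(grads, num_procs)
    # Each "processor" updates only its portion of the weights.
    updated = [w - lr * g for w, g in zip(w_shards, g_shards)]
    return np.concatenate(updated)

weights = np.arange(8, dtype=float)
grads = np.ones(8)
new_w = sharded_sgd_update(weights, grads, num_procs=4)
```

Because each shard's update is independent, the result matches a single-processor SGD step while the per-processor work shrinks roughly in proportion to the number of processors.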
Status:
Application
Type:
Utility
Filing date:
5 Nov 2019
Issue date:
6 May 2021