Microsoft Corporation
DYNAMIC GRADIENT AGGREGATION FOR TRAINING NEURAL NETWORKS

Last updated:

Abstract:

The disclosure herein describes training a global model based on a plurality of data sets. The global model is applied to each data set of the plurality of data sets and a plurality of gradients is generated based on that application. At least one gradient quality metric is determined for each gradient of the plurality of gradients. Based on the determined gradient quality metrics of the plurality of gradients, a plurality of weight factors is calculated. The plurality of gradients is transformed into a plurality of weighted gradients based on the calculated plurality of weight factors and a global gradient is generated based on the plurality of weighted gradients. The global model is updated based on the global gradient, wherein the updated global model, when applied to a data set, performs a task based on the data set and provides model output based on performing the task.
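
The abstract above outlines the aggregation loop at a high level. The Python sketch below illustrates one round of such dynamic gradient aggregation on synthetic data; it is a minimal sketch, not the claimed method. The abstract does not specify the gradient quality metric or the weighting rule, so the negative gradient norm as the metric, the softmax weighting, the linear-model stand-in for a neural network, and the function names `local_gradient` and `dynamic_gradient_aggregation` are all illustrative assumptions.

```python
import numpy as np

def local_gradient(w, X, y):
    """Gradient of mean squared error for a linear model X @ w on one
    data set (stands in for back-propagation on a neural network)."""
    return 2.0 * X.T @ (X @ w - y) / len(y)

def dynamic_gradient_aggregation(w, data_sets, lr=0.1, temperature=1.0):
    """One aggregation round. Assumed choices (not stated in the abstract):
    quality metric = negative gradient norm, weight factors = softmax over
    the quality metrics."""
    # 1. Apply the global model to each data set -> one gradient per data set.
    grads = [local_gradient(w, X, y) for X, y in data_sets]

    # 2. Determine a gradient quality metric for each gradient (assumed metric).
    quality = np.array([-np.linalg.norm(g) for g in grads])

    # 3. Calculate weight factors from the quality metrics (assumed softmax weighting).
    scores = np.exp(quality / temperature)
    weights = scores / scores.sum()

    # 4. Transform the gradients into weighted gradients and combine them
    #    into a single global gradient.
    global_grad = sum(wt * g for wt, g in zip(weights, grads))

    # 5. Update the global model based on the global gradient.
    return w - lr * global_grad

# Usage: three synthetic data sets, one with much noisier labels.
rng = np.random.default_rng(0)
true_w = np.array([1.0, -2.0])
data_sets = []
for noise in (0.1, 0.1, 2.0):
    X = rng.normal(size=(50, 2))
    y = X @ true_w + noise * rng.normal(size=50)
    data_sets.append((X, y))

w = np.zeros(2)
for _ in range(100):
    w = dynamic_gradient_aggregation(w, data_sets)
print(w)  # approaches true_w; the noisier data set receives a lower weight factor
```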

Status: Application
Type: Utility

Filing date: 31 Jul 2020

Issue date: 3 Feb 2022