International Business Machines Corporation
Pre-training of neural network by parameter decomposition
Abstract:
A technique for training a neural network that includes an input layer, one or more hidden layers, and an output layer; the trained neural network can be used to perform a task such as speech recognition. In the technique, a base neural network having at least one pre-trained hidden layer is prepared. A parameter set associated with one pre-trained hidden layer of the neural network is decomposed into a plurality of new parameter sets. The number of hidden layers in the neural network is increased by using the plurality of new parameter sets. Pre-training of the deepened neural network is then performed.
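The abstract does not disclose the exact factorization used, but one common way to split a single layer's parameter set into several is a truncated SVD of the weight matrix, which yields two stacked layers that initially compute (approximately) the same function as the original layer. The sketch below is a minimal, hypothetical illustration of that idea in NumPy; the function name `decompose_layer` and all shapes are assumptions, not the patented method.

```python
import numpy as np

def decompose_layer(W, b, rank=None):
    """Hypothetical sketch: split one pre-trained layer (W, b) into two
    stacked layers via truncated SVD, so the network gains a hidden layer
    while initially preserving the original mapping.

    W: (out_dim, in_dim) weight matrix of the pre-trained hidden layer.
    b: (out_dim,) bias vector of that layer.
    rank: width of the newly inserted hidden layer (defaults to full rank).
    """
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    if rank is not None:
        U, s, Vt = U[:, :rank], s[:rank], Vt[:rank, :]
    sqrt_s = np.sqrt(s)

    # New lower layer: maps the original input to the inserted hidden layer.
    W_lower = sqrt_s[:, None] * Vt           # (rank, in_dim)
    b_lower = np.zeros(W_lower.shape[0])

    # New upper layer: maps the inserted hidden layer to the original output.
    W_upper = U * sqrt_s[None, :]            # (out_dim, rank)
    b_upper = b                              # keep the original bias on top

    # With a linear pass-through between them, W_upper @ W_lower ~= W, so the
    # deepened network starts from the pre-trained solution and can then be
    # pre-trained further.
    return (W_lower, b_lower), (W_upper, b_upper)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    W = rng.standard_normal((256, 512))      # pre-trained layer weights
    b = rng.standard_normal(256)
    (W1, b1), (W2, b2) = decompose_layer(W, b, rank=256)
    print(np.allclose(W2 @ W1, W))           # True: same function, one more layer
```

Choosing a smaller `rank` would make the inserted layer narrower at the cost of an approximate reconstruction, which the subsequent pre-training step could then compensate for.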
Status:
Grant
Type:
Utility
Filing date:
5 Jul 2017
Issue date:
31 Aug 2021