Alibaba Group Holding Limited
ARTIFICIAL NEURAL NETWORK WITH SPARSE WEIGHTS
Abstract:
The accuracy of multiple stages within an artificial neural network is substantially improved, while using approximately the same number of floating-point operations per second (FLOPS) as prior-art neural network stages, by filtering the input with large sparse weight matrices and large sparse weight arrays.
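A minimal sketch of the filtering step described in the abstract, assuming a single stage that multiplies an input activation vector by a large, mostly-zero weight matrix. The dimensions, density, and variable names below are illustrative assumptions, not values from the patent.

```python
# Sketch (not from the patent): filtering an input with a large sparse
# weight matrix using scipy.sparse. Shapes and sparsity are assumed.
import numpy as np
from scipy import sparse

rng = np.random.default_rng(0)

# Hypothetical stage: 4096 inputs -> 4096 outputs, ~1% non-zero weights.
in_dim, out_dim, density = 4096, 4096, 0.01
W_sparse = sparse.random(out_dim, in_dim, density=density,
                         format="csr", random_state=0)

x = rng.standard_normal(in_dim)   # input activation vector
y = W_sparse @ x                  # sparse matrix-vector filtering step
print(y.shape)                    # (4096,)
```

Because only the non-zero weights participate in the multiply, a stage of this form can use a much larger matrix than a dense layer at a comparable operation count.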
Status:
Application
Type:
Utility
Filing date:
29 Jun 2020
Issue date:
30 Dec 2021