Illumina, Inc.
Knowledge Distillation and Gradient Pruning-Based Compression of Artificial Intelligence-Based Base Caller
Last updated:
Abstract:
The technology disclosed compresses a larger, teacher base caller into a smaller, student base caller. The student base caller has fewer processing modules and parameters than the teacher base caller. The teacher base caller is trained using hard labels (e.g., one-hot encodings). The trained teacher base caller then generates soft labels, i.e., its output probabilities, during the inference phase, and these soft labels are used to train the student base caller.
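As an illustration of the teacher-student training described in the abstract, the following is a minimal knowledge-distillation sketch in PyTorch. The model architecture, the four-class A/C/G/T output, and the `temperature`/`alpha` hyperparameters are assumptions for illustration only and are not taken from the patent itself.

```python
# Minimal knowledge-distillation sketch (illustrative; the toy architecture,
# 4-class A/C/G/T output, and temperature/alpha values are assumptions).
import torch
import torch.nn as nn
import torch.nn.functional as F

NUM_BASES = 4  # A, C, G, T

class BaseCaller(nn.Module):
    """Toy convolutional base caller; `width`/`depth` control model size."""
    def __init__(self, in_channels=1, width=64, depth=4):
        super().__init__()
        layers, ch = [], in_channels
        for _ in range(depth):
            layers += [nn.Conv1d(ch, width, kernel_size=3, padding=1), nn.ReLU()]
            ch = width
        self.features = nn.Sequential(*layers)
        self.head = nn.Conv1d(ch, NUM_BASES, kernel_size=1)

    def forward(self, x):
        return self.head(self.features(x))  # per-position logits over A/C/G/T

# Larger teacher, smaller student (fewer processing modules and parameters).
teacher = BaseCaller(width=128, depth=8)
student = BaseCaller(width=32, depth=3)

def distillation_loss(student_logits, teacher_logits, hard_labels,
                      temperature=3.0, alpha=0.5):
    """Blend the soft-label (teacher) and hard-label (one-hot) objectives."""
    # Soft labels: teacher output probabilities, softened by the temperature.
    soft_targets = F.softmax(teacher_logits / temperature, dim=1)
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=1),
        soft_targets, reduction="batchmean") * (temperature ** 2)
    # Hard labels: ground-truth base identities as class indices.
    hard_loss = F.cross_entropy(student_logits, hard_labels)
    return alpha * soft_loss + (1 - alpha) * hard_loss

# One training step: the teacher runs in inference mode to produce soft
# labels, which (together with the hard labels) supervise the student.
optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)
signals = torch.randn(8, 1, 100)                     # dummy sequencing signal
hard_labels = torch.randint(0, NUM_BASES, (8, 100))  # dummy ground-truth bases

teacher.eval()
with torch.no_grad():
    teacher_logits = teacher(signals)
student_logits = student(signals)
loss = distillation_loss(student_logits, teacher_logits, hard_labels)
loss.backward()
optimizer.step()
```

In this sketch the student is trained on a weighted combination of the teacher's temperature-softened output probabilities and the original one-hot labels, which is the standard way a distilled, smaller model retains the larger model's accuracy.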
Status:
Application
Type:
Utility
Filing date:
15 Feb 2021
Issue date:
26 Aug 2021