Snap Inc.
Generative neural network distillation

Abstract:

A compact generative neural network can be distilled from a teacher generative neural network using a training network. The compact network can be trained on the input data and output data of the teacher network. The training network trains the student network using a discrimination layer and one or more types of losses, such as a perception loss and an adversarial loss.
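The training scheme in the abstract can be illustrated with a toy sketch. This is not Snap's implementation: the teacher, student, and discriminator below are stand-in one-dimensional models, and the perception loss is approximated as a squared error between teacher and student outputs. It only shows how a distillation loop can combine a perception-style loss with an adversarial loss from a discrimination layer.

```python
import math
import random

# Hedged toy sketch (hypothetical models, not the patented system):
# distill a frozen "teacher" generator into a smaller "student" using
#   - a perception-style loss (squared error between outputs), and
#   - an adversarial loss from a tiny logistic discriminator.

random.seed(0)

def teacher(x):
    # Frozen teacher generator: y = 2x + 1.
    return 2.0 * x + 1.0

# Student generator: y = a*x + b, initialised away from the teacher.
a, b = 0.0, 0.0
# Discriminator ("discrimination layer"): p(real) = sigmoid(w*y + c).
w, c = 0.1, 0.0

def sigmoid(z):
    z = max(min(z, 60.0), -60.0)  # clamp to avoid overflow
    return 1.0 / (1.0 + math.exp(-z))

lr_g, lr_d, adv_weight = 0.05, 0.05, 0.1

for _ in range(2000):
    x = random.uniform(-1.0, 1.0)
    t = teacher(x)        # real sample (teacher output)
    s = a * x + b         # fake sample (student output)

    # --- discriminator step: maximise log D(t) + log(1 - D(s)) ---
    p_t = sigmoid(w * t + c)
    p_s = sigmoid(w * s + c)
    w += lr_d * ((1.0 - p_t) * t - p_s * s)
    c += lr_d * ((1.0 - p_t) - p_s)

    # --- student step: perception loss + adversarial loss ---
    p_s = sigmoid(w * s + c)
    # d/ds of (s - t)^2  plus  d/ds of -log D(s), weighted
    grad_s = 2.0 * (s - t) + adv_weight * (-(1.0 - p_s) * w)
    a -= lr_g * grad_s * x
    b -= lr_g * grad_s
```

After training, the student parameters approach the teacher's (a near 2, b near 1), with the adversarial term contributing a small additional pressure toward outputs the discriminator rates as real.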

Status:
Grant

Type:
Utility

Filing date:
31 Aug 2018

Issue date:
30 Mar 2021