Adobe Inc.
Utilizing a gated self-attention memory network model for predicting a candidate answer match to a query

Last updated:

Abstract:

The present disclosure relates to systems, methods, and non-transitory computer-readable media that can determine an answer to a query based on matching probabilities for combinations of respective candidate answers. For example, the disclosed systems can utilize a gated self-attention mechanism (GSAM) to interpret inputs that include contextual information, a query, and candidate answers. The disclosed systems can also utilize a memory network in tandem with the GSAM to form a gated self-attention memory network (GSAMN) that refines outputs or predictions over multiple reasoning hops. Further, the disclosed systems can utilize transfer learning of the GSAM/GSAMN from an initial training dataset to a target training dataset.
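The abstract describes a gated self-attention update that is applied repeatedly over reasoning hops to refine a prediction. The following is a minimal NumPy sketch of that general idea, not the patented method itself: the function name `gsamn_hop`, the specific gating formula, and the final scoring step are all illustrative assumptions.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gsamn_hop(memory, state):
    """One illustrative reasoning hop of a gated self-attention memory update.

    memory: (n, d) vectors representing context, query, and candidate-answer tokens
    state:  (d,) controller vector refined across hops
    """
    # Attend from the controller state over the memory cells.
    scores = memory @ state            # (n,) similarity scores
    attn = softmax(scores)             # attention distribution
    attended = attn @ memory           # (d,) weighted memory summary
    # A sigmoid gate (hypothetical form) mixes new evidence with the old state.
    gate = sigmoid(state * attended)
    return gate * attended + (1.0 - gate) * state

rng = np.random.default_rng(0)
memory = rng.normal(size=(6, 4))       # toy "context + query + candidate" memory
state = rng.normal(size=4)
for _ in range(3):                     # multiple reasoning hops refine the state
    state = gsamn_hop(memory, state)
# Squash a summary of the final state into a (0, 1) matching probability.
prob = sigmoid(state.sum())
print(0.0 < prob < 1.0)
```

In this sketch the same memory is re-read on every hop while the controller state changes, which is the multi-hop refinement pattern the abstract attributes to the GSAMN.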

Status:
Grant
Type:

Utility

Filing date:

12 Sep 2019

Issue date:

7 Sep 2021