SAP SE
CONTRASTIVE SELF-SUPERVISED MACHINE LEARNING FOR COMMONSENSE REASONING

Last updated:

Abstract:

In an example embodiment, a self-supervised learning task is used to train commonsense-aware representations in a minimally supervised fashion, and a pair-level mutual-exclusive loss is used to enforce commonsense knowledge during representation learning. This exploits the mutually exclusive nature of the training samples in commonsense reasoning corpora. Given two pieces of input whose only difference is a trigger piece of data, it may be postulated that the pairwise pronoun disambiguation is mutually exclusive. This idea is formulated as a contrastive loss, which is then used to update the language model.
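The following is a minimal sketch, not the filed claims, of how such a pair-level mutual-exclusive contrastive loss could look. It assumes a language model that produces a score for each of two answer candidates in each sentence of a twin pair (the two inputs differing only by the trigger), and it rewards assignments in which the preferred candidate flips between the twins. All names here (mutual_exclusive_loss, logits_s1, logits_s2) are illustrative assumptions, not terms from the application.

```python
import torch
import torch.nn.functional as F


def mutual_exclusive_loss(logits_s1: torch.Tensor,
                          logits_s2: torch.Tensor) -> torch.Tensor:
    """Pair-level mutual-exclusive contrastive loss over a twin pair.

    logits_s1, logits_s2: shape (batch, 2), the model's scores for the two
    answer candidates in a sentence and in its trigger-word twin. No
    ground-truth labels are needed; the loss only enforces that the pair
    is resolved in a mutually exclusive way.
    """
    p1 = F.softmax(logits_s1, dim=-1)  # P(candidate | sentence 1)
    p2 = F.softmax(logits_s2, dim=-1)  # P(candidate | sentence 2)
    # Probability of a mutually exclusive resolution: candidate 0 in
    # sentence 1 and candidate 1 in sentence 2, or the reverse.
    p_exclusive = p1[:, 0] * p2[:, 1] + p1[:, 1] * p2[:, 0]
    return -torch.log(p_exclusive + 1e-8).mean()


# Usage: in practice the logits would come from a language model scoring
# each candidate in place of the pronoun; random tensors stand in here.
if __name__ == "__main__":
    s1 = torch.randn(4, 2, requires_grad=True)
    s2 = torch.randn(4, 2, requires_grad=True)
    loss = mutual_exclusive_loss(s1, s2)
    loss.backward()
    print(float(loss))
```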

Status:

Application

Type:

Utility

Filing date:

25 Jun 2020

Issue date:

30 Dec 2021