International Business Machines Corporation
FREQUENTLY ASKED QUESTIONS AND DOCUMENT RETRIEVAL USING BIDIRECTIONAL ENCODER REPRESENTATIONS FROM TRANSFORMERS (BERT) MODEL TRAINED ON GENERATED PARAPHRASES
Last updated:
Abstract:
An example system includes a processor to receive a query. The processor can retrieve ranked candidates from an index based on the query. The processor can re-rank the ranked candidates using a Bidirectional Encoder Representations from Transformers (BERT) query-question (Q-q) model trained to match queries to questions of a frequently asked question (FAQ) dataset, wherein the BERT Q-q model is fine-tuned using paraphrases generated for the questions in the FAQ dataset. The processor can return the re-ranked candidates in response to the query.
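The abstract describes a two-stage pipeline: a first-stage retriever pulls ranked candidates from an index, and a BERT query-question (Q-q) model re-ranks them. The sketch below illustrates that flow only; the first-stage scorer and `bert_qq_score` are simple token-overlap stand-ins (assumptions, not the patent's method), where a real system would use an index such as BM25 and a fine-tuned BERT cross-encoder.

```python
def retrieve_candidates(query, faq_questions, top_k=3):
    """First stage: rank FAQ questions by token overlap with the query
    (stand-in for a real retrieval index)."""
    q_tokens = set(query.lower().split())
    scored = [
        (len(q_tokens & set(question.lower().split())), question)
        for question in faq_questions
    ]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [question for score, question in scored[:top_k] if score > 0]

def bert_qq_score(query, question):
    """Stand-in for the fine-tuned BERT Q-q model (assumption: a real
    implementation would return a relevance score from the model)."""
    q_tokens = set(query.lower().split())
    return len(q_tokens & set(question.lower().split())) / max(len(q_tokens), 1)

def answer_query(query, faq_questions):
    """Retrieve candidates, re-rank them with the (mocked) BERT Q-q
    scorer, and return the re-ranked list."""
    candidates = retrieve_candidates(query, faq_questions)
    return sorted(candidates, key=lambda q: bert_qq_score(query, q), reverse=True)

faq = [
    "How do I reset my password?",
    "How do I change my billing address?",
    "What payment methods are accepted?",
]
print(answer_query("reset password", faq))
```

In the claimed system, the second-stage model is additionally fine-tuned on paraphrases generated for the FAQ questions, which the stand-in scorer above does not model.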
Status: Application
Type: Utility
Filing date: 10 Jun 2020
Issue date: 16 Dec 2021