Sentence Embedding Using Transformer Encoder for Retrieving Answers with Higher Accuracy to User Queries

Authors

  • Godavarthi Deepthi, A. Mary Sowjanya

DOI:

https://doi.org/10.17762/msea.v71i4.794

Abstract

Word embeddings are used for several Natural Language Processing (NLP) tasks, but they are not effective for obtaining embeddings of whole sentences. Sentence embeddings address this issue: they provide vector representations of sentences together with semantic information, giving the machine a clearer understanding of the context. In this work, a Question Answering system using the Universal Sentence Encoder (USE) with its transformer encoder variant, USETrans, is developed to extract from the context the sentence containing the correct answer to a user query. If the sentence containing the correct answer is identified efficiently, it becomes much easier to retrieve the exact answer from that sentence, so the developed model helps to provide exact answers to user queries. The model is evaluated on the SQuAD 2.0 dataset, and USETrans is observed to yield better accuracy than USEDAN.
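For illustration, the sketch below shows the kind of sentence-retrieval step the abstract describes: embed the query and the candidate context sentences with a transformer-based Universal Sentence Encoder and pick the sentence most similar to the query. It assumes the publicly available TensorFlow Hub model universal-sentence-encoder-large/5 and cosine-similarity ranking; the model URL, helper function, and example context are illustrative assumptions, not the authors' exact implementation.

```python
# Illustrative sketch only: assumes TensorFlow Hub's transformer-based
# Universal Sentence Encoder (universal-sentence-encoder-large/5); this is
# not the paper's exact implementation.
import numpy as np
import tensorflow_hub as hub

# Load the transformer-based USE variant from TF Hub.
embed = hub.load("https://tfhub.dev/google/universal-sentence-encoder-large/5")

def best_answer_sentence(question, context_sentences):
    """Return the context sentence most similar to the question."""
    # Embed the question and all candidate sentences in one batch.
    vectors = embed([question] + context_sentences).numpy()
    q_vec, sent_vecs = vectors[0], vectors[1:]

    # Cosine similarity between the question and each candidate sentence.
    q_vec = q_vec / np.linalg.norm(q_vec)
    sent_vecs = sent_vecs / np.linalg.norm(sent_vecs, axis=1, keepdims=True)
    scores = sent_vecs @ q_vec

    best = int(np.argmax(scores))
    return context_sentences[best], float(scores[best])

# Hypothetical example context and query (not taken from SQuAD 2.0).
context = [
    "The Amazon rainforest covers much of the Amazon basin in South America.",
    "It spans an area of about 5.5 million square kilometres.",
    "The majority of the forest is contained within Brazil.",
]
sentence, score = best_answer_sentence(
    "Which country contains most of the Amazon rainforest?", context
)
print(sentence, score)
```

Once the answer-bearing sentence is retrieved this way, a downstream extraction step only has to locate the exact answer span within that single sentence rather than the whole context.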

Published

2022-09-12

How to Cite

Godavarthi Deepthi, A. Mary Sowjanya. (2022). Sentence Embedding Using Transformer Encoder for Retrieving Answers with Higher Accuracy to User Queries. Mathematical Statistician and Engineering Applications, 71(4), 2424–. https://doi.org/10.17762/msea.v71i4.794

Issue

Vol. 71 No. 4 (2022)

Section

Articles