Publisher DOI: 10.3390/e23111422
Title: Language representation models: an overview
Language: English
Authors: Schomacker, Thorben 
Tropmann-Frick, Marina  
Keywords: Attention-based models; Deep learning; Embeddings; Multi-task learning; Natural language processing; Neural networks; Transformer
Issue Date: 28-Oct-2021
Publisher: MDPI
Source: article number: 1422
Journal or Series Name: Entropy 
Volume: 23
Issue: 11
Abstract: 
In the last few decades, text mining has been used to extract knowledge from free texts. Applying neural networks and deep learning to natural language processing (NLP) tasks has led to many accomplishments for real-world language problems over the years. The developments of the last five years have resulted in techniques that have allowed for the practical application of transfer learning in NLP....
URI: http://hdl.handle.net/20.500.12738/12332
ISSN: 1099-4300
Institute: Department Informatik 
Fakultät Technik und Informatik 
Type: Article
Appears in Collections: Publications without full text

This item is licensed under a Creative Commons License.