Fulltext available Open Access
Title: Application of Transformer-based Methods to Latin Text Analysis
Language: English
Authors: Schomacker, Thorben 
Keywords: NLP; Extractive Summarization; Transformer; BERT; BertSum; Transfer Learning
Issue Date: 15-May-2024
Abstract: 
Text summarization is an established problem in the field of NLP. The rapidly growing success of deep learning algorithms led to the development of the attention mechanism, which in turn forms the basis of the Transformer architecture. The Transformer architecture is a transfer learning approach to solving NLP problems. BERT, a pre-trained Transformer model, has achieved outstanding results in...

Text summarization is an established problem in the field of NLP. The rapidly growing success of deep learning algorithms in solving NLP problems has led to the attention mechanism, which is the foundation for the Transformer architecture, a transfer learning approach for NLP tasks. BERT, a pre-trained Transformer model, has performed exceptionally well on various NLP tasks. This thesis applies Be...
URI: http://hdl.handle.net/20.500.12738/15731
Institute: Department Informatik 
Fakultät Technik und Informatik 
Type: Thesis
Thesis type: Bachelor Thesis
Advisor: Tropmann-Frick, Marina  
Referee: Zukunft, Olaf 
Appears in Collections:Theses

Items in REPOSIT are protected by copyright, with all rights reserved, unless otherwise indicated.