
Title: | Application of Transformer-based Methods to Latin Text Analysis |
Language: | English |
Authors: | Schomacker, Thorben |
Keywords: | NLP; Extractive Summarization; Transformer; BERT; BertSum; Transfer Learning |
Issue Date: | 15-May-2024 |
Abstract: | Text summarization is an established problem in the field of NLP. The rapidly growing success of deep learning algorithms in solving NLP problems led to the attention mechanism, which forms the foundation of the Transformer architecture, a transfer learning approach to NLP tasks. BERT, a pre-trained Transformer model, has performed exceptionally well on various NLP tasks. This thesis applies Be... |
URI: | http://hdl.handle.net/20.500.12738/15731 |
Institute: | Department Informatik Fakultät Technik und Informatik |
Type: | Thesis |
Thesis type: | Bachelor Thesis |
Advisor: | Tropmann-Frick, Marina |
Referee: | Zukunft, Olaf |
Appears in Collections: | Theses |
Files in This Item:
| File | Description | Size | Format | |
|---|---|---|---|---|
| BA_Application of Transformer-based Methods_geschwärzt.pdf | | 1.49 MB | Adobe PDF | View/Open |
Items in REPOSIT are protected by copyright, with all rights reserved, unless otherwise indicated.