Please use this identifier to cite or link to this item: https://doi.org/10.48441/4427.1962
Publisher DOI: 10.1007/s44244-024-00020-y
Title: Masked autoencoder: influence of self-supervised pretraining on object segmentation in industrial images
Language: English
Authors: Witte, Anja 
Lange, Sascha 
Lins, Christian  
Keywords: Masked autoencoder; Self-supervised pretraining; Semantic segmentation; UNETR; Label-efficiency; Log-yard cranes
Issue Date: 23-Aug-2024
Publisher: Springer
Journal or Series Name: Industrial Artificial Intelligence
Volume: 2
Issue: 1
Abstract: 
The amount of labelled data in industrial use cases is limited because the annotation process is time-consuming and costly. Since self-supervised pretraining methods such as the masked autoencoder (MAE) have enabled segmentation models to be trained with fewer labels in research, this is also an interesting direction for industry. The reduction in required labels is achieved by pretraining on large amounts of unlabelled images, which aims to learn general image features. This paper analyses the influence of MAE pretraining on the label-efficiency of semantic segmentation with UNETR, investigated for the use case of log-yard cranes. Additionally, two transfer learning cases, with respect to crane type and perspective, are considered in the context of label-efficiency. The results show that MAE is successfully applicable to the use case. With respect to the segmentation, an IoU improvement of 3.26% is reached while using 2000 labels. The strongest positive influence is found at lower label amounts across all experiments. The highest effect is achieved with transfer learning regarding cranes, where IoU and Recall increase by about 4.31% and 8.58%, respectively. Further analyses show that improvements result from a better distinction between the background and the segmented crane objects.
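
To illustrate the pretraining idea summarised in the abstract, the following is a minimal PyTorch sketch of MAE-style masked-patch reconstruction on unlabelled images. The patch size, model widths, masking ratio and the dummy batch are illustrative placeholders and do not reflect the authors' configuration; in the paper's setting, the pretrained encoder would subsequently serve as the transformer backbone of a UNETR segmentation model and be fine-tuned on the available labels.

    # Minimal MAE-style pretraining sketch (assumed setup, not the authors' code).
    import torch
    import torch.nn as nn

    class TinyMAE(nn.Module):
        def __init__(self, img_size=224, patch=16, dim=256, depth=4, mask_ratio=0.75):
            super().__init__()
            self.patch = patch
            self.num_patches = (img_size // patch) ** 2
            self.mask_ratio = mask_ratio
            self.patch_embed = nn.Linear(3 * patch * patch, dim)
            self.pos_embed = nn.Parameter(torch.zeros(1, self.num_patches, dim))
            enc_layer = nn.TransformerEncoderLayer(dim, nhead=8, batch_first=True)
            self.encoder = nn.TransformerEncoder(enc_layer, num_layers=depth)
            self.mask_token = nn.Parameter(torch.zeros(1, 1, dim))
            dec_layer = nn.TransformerEncoderLayer(dim, nhead=8, batch_first=True)
            self.decoder = nn.TransformerEncoder(dec_layer, num_layers=2)
            self.head = nn.Linear(dim, 3 * patch * patch)  # reconstruct pixel patches

        def patchify(self, x):
            # (B, 3, H, W) -> (B, N, 3*patch*patch)
            p = self.patch
            B, C, H, W = x.shape
            x = x.reshape(B, C, H // p, p, W // p, p)
            return x.permute(0, 2, 4, 1, 3, 5).reshape(B, self.num_patches, C * p * p)

        def forward(self, imgs):
            patches = self.patchify(imgs)
            tokens = self.patch_embed(patches) + self.pos_embed
            B, N, D = tokens.shape
            # Keep a random subset of patches; the rest are hidden from the encoder.
            n_keep = int(N * (1 - self.mask_ratio))
            keep_ids = torch.rand(B, N, device=imgs.device).argsort(dim=1)[:, :n_keep]
            idx = keep_ids.unsqueeze(-1).expand(-1, -1, D)
            encoded = self.encoder(torch.gather(tokens, 1, idx))
            # Re-insert mask tokens at the dropped positions before decoding.
            full = self.mask_token.expand(B, N, D).clone()
            full.scatter_(1, idx, encoded)
            recon = self.head(self.decoder(full + self.pos_embed))
            # Reconstruction loss only on the masked patches, as in MAE.
            mask = torch.ones(B, N, device=imgs.device)
            mask.scatter_(1, keep_ids, 0.0)
            return (((recon - patches) ** 2).mean(dim=-1) * mask).sum() / mask.sum()

    model = TinyMAE()
    optim = torch.optim.AdamW(model.parameters(), lr=1.5e-4)
    dummy_batch = torch.rand(2, 3, 224, 224)  # stand-in for unlabelled crane images
    loss = model(dummy_batch)
    loss.backward()
    optim.step()

After pretraining, the decoder and reconstruction head are discarded and only the encoder weights are transferred to the downstream segmentation network before supervised fine-tuning.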
URI: https://hdl.handle.net/20.500.12738/16387
DOI: 10.48441/4427.1962
ISSN: 2731-667X
Review status: This version was peer reviewed.
Institute: Department Informatik 
Fakultät Technik und Informatik 
Type: Article
Additional note: Witte, A., Lange, S. & Lins, C. Masked autoencoder: influence of self-supervised pretraining on object segmentation in industrial images. Industrial Artificial Intelligence 2, 7 (2024). https://doi.org/10.1007/s44244-024-00020-y
Appears in Collections:Publications with full text

Files in This Item:
File: 2024_Witte_MaskedAutoencoder.pdf
Size: 4.54 MB
Format: Adobe PDF

This item is licensed under a Creative Commons License.