Issue Report Classification Using Pre-Trained Language Models

Feb 1, 2023
Giuseppe Colavito, Filippo Lanubile, Nicole Novielli
Abstract
This paper describes our participation in the tool competition organized as part of the 1st International Workshop on Natural Language-based Software Engineering. We propose a supervised approach relying on fine-tuned BERT-based language models for the automatic classification of GitHub issues. We experimented with different pre-trained models, achieving the best performance with fine-tuned RoBERTa (F1 = .8591).
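To make the reported metric concrete, the sketch below shows how an F1 score can be computed for a multi-class issue classifier from scratch. The label set (bug, feature, question) and the toy predictions are illustrative assumptions for this example only, not data or code from the paper.

```python
def per_class_f1(y_true, y_pred, label):
    """F1 for a single class, computed from precision and recall."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == p == label)
    fp = sum(1 for t, p in zip(y_true, y_pred) if p == label and t != label)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == label and p != label)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return 2 * precision * recall / (precision + recall) if precision + recall else 0.0

def micro_f1(y_true, y_pred):
    """Micro-averaged F1; for single-label multi-class data it equals accuracy."""
    correct = sum(1 for t, p in zip(y_true, y_pred) if t == p)
    return correct / len(y_true)

# Hypothetical gold labels and classifier predictions for five GitHub issues.
y_true = ["bug", "bug", "feature", "question", "feature"]
y_pred = ["bug", "feature", "feature", "question", "feature"]

print(micro_f1(y_true, y_pred))                      # 0.8
print(per_class_f1(y_true, y_pred, "feature"))       # 0.8
```

A fine-tuned classifier such as the RoBERTa model described in the abstract would supply `y_pred` for the competition test set; the scoring logic itself is independent of the model.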
Type
Publication
Proceedings of the 1st International Workshop on Natural Language-based Software Engineering