Deep Transformer-based Representation for Text Chunking
Authors: Parsa Kavehzadeh, Mohammad Mahdi Abdollah Pour, Saeedeh Momtazi
Source: Journal of Information Systems and Telecommunication - 2023 - Volume: 11 - Issue: 3 - Pages: 176-184
Abstract: Text chunking is one of the basic tasks in natural language processing. Most models proposed in recent years address chunking and other sequence labeling tasks simultaneously, and they are largely based on recurrent neural networks (RNN) and conditional random fields (CRF). In this article, we use state-of-the-art transformer-based models in combination with a CRF, a long short-term memory (LSTM)-CRF, and a simple dense layer to study the impact of different pre-trained models on the overall performance in text chunking. To this aim, we evaluate BERT, RoBERTa, Funnel Transformer, XLM, XLM-RoBERTa, BART, and GPT-2 as candidate contextualized models. Our experiments show that all transformer-based models except GPT-2 achieve close and high scores on text chunking. Due to its unidirectional architecture, GPT-2 performs relatively poorly on text chunking in comparison to the bidirectional transformer-based architectures. Our experiments also reveal that adding an LSTM layer to transformer-based models does not significantly improve the results, since the LSTM does not contribute additional features that help the model extract more information from the input beyond what the deep contextualized representations already provide.
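As a minimal illustrative sketch (not the authors' code), the dense-layer variant described in the abstract can be approximated with the Hugging Face transformers library: a pre-trained bidirectional encoder feeds its contextualized token representations into a token classification head that predicts one chunk tag per word. The model name, BIO-style chunk label set, and example sentence below are assumptions made for illustration only.

# Sketch of transformer + dense layer for chunk tagging (assumed setup, not the paper's code).
import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification

# Assumed BIO-style chunk tag set (CoNLL-2000-like); the paper's exact label set may differ.
labels = ["O", "B-NP", "I-NP", "B-VP", "I-VP", "B-PP", "I-PP"]

tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")
model = AutoModelForTokenClassification.from_pretrained(
    "bert-base-cased", num_labels=len(labels)
)  # classification head is randomly initialized until fine-tuned on chunking data

words = "He reckons the current account deficit will narrow .".split()
encoding = tokenizer(words, is_split_into_words=True, return_tensors="pt")

with torch.no_grad():
    logits = model(**encoding).logits            # shape: (1, seq_len, num_labels)
predictions = logits.argmax(dim=-1).squeeze(0)   # one chunk tag id per subword token

# Map subword-level predictions back to words (first-subword strategy).
word_ids = encoding.word_ids(batch_index=0)
seen = set()
for idx, word_id in enumerate(word_ids):
    if word_id is None or word_id in seen:
        continue
    seen.add(word_id)
    print(words[word_id], labels[predictions[idx].item()])

The CRF and LSTM-CRF variants mentioned in the abstract would replace this per-token argmax with a CRF decoding layer (optionally preceded by an LSTM) on top of the same transformer outputs.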
Keywords: Text Chunking; Sequence Labeling; Contextualized Word Representation; Deep Learning; Transformers
Affiliations: Amirkabir University of Technology, Computer Engineering Department, Iran; Amirkabir University of Technology, Computer Engineering Department, Iran; Amirkabir University of Technology, Computer Engineering Department, Iran
Email: momtazi@aut.ac.ir
 
     
   
  
 
 
