Show simple item record

dc.contributor.author	Shi, J	en_NZ
dc.contributor.author	Li, W	en_NZ
dc.contributor.author	Bai, Q	en_NZ
dc.contributor.author	Ito, T	en_NZ
dc.date.accessioned	2022-05-27T02:54:54Z
dc.date.available	2022-05-27T02:54:54Z
dc.identifier.citation	The Journal of Supercomputing (2022). https://doi.org/10.1007/s11227-022-04579-0
dc.identifier.issn	0920-8542	en_NZ
dc.identifier.issn	1573-0484	en_NZ
dc.identifier.uri	http://hdl.handle.net/10292/15173
dc.description.abstract	Aspect terms are the opinion targets through which people express and understand opinions in reviews. Aspect term extraction is an essential subtask of aspect-level sentiment analysis. To extract aspect terms from a sentence, existing methods mainly focus on context features generated by pre-trained models. However, these models either neglect crucial implicit linguistic features, e.g., part-of-speech tags, heads, and head dependencies, or fail to explore sufficiently valuable features for aspect term extraction, which leads to deficiencies in the aspect term extraction task. To address these challenges, in this paper we propose a novel and effective framework for aspect term extraction that integrates both contextual and linguistic features with an artificial bee colony-based feature selection method. First, a novel variant of the artificial bee colony algorithm is designed to identify the most valuable linguistic features, reducing the high sparsity and dimensionality of the raw dataset. Next, the selected features and context embeddings are integrated to improve the performance of aspect extraction. Finally, extensive experiments are conducted on real-world datasets, and the results show that the proposed framework outperforms the competitive baselines. Compared with the latest baselines, the proposed framework achieves higher F1 scores of 80.7%, 84.7%, 72.2%, and 74.8% on the four groups of datasets. Furthermore, an ablation study shows that the proposed method with the designed feature selection module significantly outperforms the method with the original artificial bee colony, with improvements in F1 score of 4.15%, 4.4%, 4.4%, and 3.2% on the four datasets, respectively.	en_NZ
dc.language	en	en_NZ
dc.publisher	Springer Science and Business Media LLC	en_NZ
dc.relation.uri	https://link.springer.com/article/10.1007/s11227-022-04579-0
dc.rights	© 2022 Springer Nature Switzerland AG. Part of Springer Nature. This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/
dc.subject	Linguistic feature; Feature selection; Artificial bee colony; Aspect term extraction
dc.title	BeeAE: Effective Aspect Term Extraction With Artificial Bee Colony	en_NZ
dc.type	Journal Article
dc.rights.accessrights	OpenAccess	en_NZ
dc.identifier.doi	10.1007/s11227-022-04579-0	en_NZ
pubs.elements-id	454997
aut.relation.journal	The Journal of Supercomputing	en_NZ
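The abstract describes an artificial bee colony (ABC) variant used to select the most valuable linguistic features. The sketch below is an illustrative, minimal binary-mask ABC search only, not the authors' published variant: the `fitness` function, neighbourhood move, and all parameter values here are assumptions for demonstration.

```python
import random

def abc_feature_select(n_features, fitness, n_bees=10, limit=5, n_iter=30, seed=0):
    """Minimal artificial-bee-colony search over binary feature masks.

    `fitness` maps a mask (tuple of 0/1 flags, one per feature) to a
    score to maximise. Illustrative sketch only.
    """
    rng = random.Random(seed)

    def rand_mask():
        return tuple(rng.randint(0, 1) for _ in range(n_features))

    def neighbour(mask):
        # flip one randomly chosen bit: a simple neighbourhood move
        i = rng.randrange(n_features)
        return mask[:i] + (1 - mask[i],) + mask[i + 1:]

    food = [rand_mask() for _ in range(n_bees)]   # one food source per bee
    trials = [0] * n_bees                          # stagnation counters
    best = max(food, key=fitness)

    for _ in range(n_iter):
        # employed-bee phase: greedy local search around each food source
        for i in range(n_bees):
            cand = neighbour(food[i])
            if fitness(cand) > fitness(food[i]):
                food[i], trials[i] = cand, 0
            else:
                trials[i] += 1
        # onlooker phase: extra moves biased toward the fitter sources
        ranked = sorted(range(n_bees), key=lambda i: fitness(food[i]), reverse=True)
        for i in ranked[: n_bees // 2]:
            cand = neighbour(food[i])
            if fitness(cand) > fitness(food[i]):
                food[i], trials[i] = cand, 0
        # scout phase: abandon sources that stagnated for too long
        for i in range(n_bees):
            if trials[i] >= limit:
                food[i], trials[i] = rand_mask(), 0
        best = max(food + [best], key=fitness)
    return best

# toy fitness: reward masks that match a hidden "informative" subset
target = (1, 0, 1, 1, 0, 0, 1, 0)
score = lambda m: sum(a == b for a, b in zip(m, target))
print(abc_feature_select(8, score))
```

In a feature-selection setting the fitness would instead score a mask by, e.g., validation performance of a model trained on the selected linguistic features, which is what makes the search reduce sparsity and dimensionality.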

