Simple item record

Author: ElJundi, Obeida
Author: Antoun, Wissam
Author: El Droubi, Nour
Author: Hajj, Hazem
Author: El-Hajj, Wassim
Author: Shaban, Khaled
Date available: 2022-12-21T10:01:47Z
Publication date: 2019
Publication name: ACL 2019 - 4th Arabic Natural Language Processing Workshop, WANLP 2019 - Proceedings of the Workshop
Source: Scopus
URI: http://dx.doi.org/10.18653/v1/W19-4608
URI: http://hdl.handle.net/10576/37510
Abstract: Arabic is a complex language with limited resources, which makes it challenging to achieve accurate results on text classification tasks such as sentiment analysis. Transfer learning (TL) has recently shown promising results for advancing the accuracy of text classification in English. TL models are pre-trained on large corpora and then fine-tuned on task-specific datasets. In particular, universal language models (ULMs), such as the recently developed BERT, have achieved state-of-the-art results in various NLP tasks in English. In this paper, we hypothesize that similar success can be achieved for Arabic. The work aims at supporting this hypothesis by developing the first Universal Language Model in Arabic (hULMonA - حلمنا, meaning our dream), demonstrating its use for Arabic classification tasks, and demonstrating how a pre-trained multi-lingual BERT can also be used for Arabic. We then conduct a benchmark study to evaluate both ULMs on Arabic sentiment analysis. Experimental results show that the developed hULMonA and the multi-lingual ULM generalize well to multiple Arabic datasets and achieve new state-of-the-art results in Arabic sentiment analysis on some of the tested sets. (A minimal fine-tuning sketch illustrating the pre-train/fine-tune recipe described here follows the record fields below.)
Language: en
Publisher: Association for Computational Linguistics (ACL)
Subject: Classification (of information)
Computational linguistics
Large dataset
Benchmark study
Classification tasks
Data set
Language model
Large corpora
Learning models
Sentiment analysis
State of the art
Text classification
Transfer learning
Title: hULMonA (حلمنا): The Universal Language Model in Arabic
Type: Conference Paper
Pages: 68-77
Access type: Abstract Only
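
The abstract above describes the standard transfer-learning recipe behind hULMonA and the multi-lingual BERT baseline: pre-train a universal language model on large corpora, then fine-tune it on a small task-specific dataset such as an Arabic sentiment corpus. The sketch below is not the authors' code; it is a minimal illustration of that recipe, fine-tuning a pre-trained multilingual BERT for binary Arabic sentiment classification. The Hugging Face model name, the toy sentences, the labels, and the hyperparameters are all illustrative assumptions.

```python
# Minimal sketch of the pre-train/fine-tune recipe described in the abstract:
# load a BERT model pre-trained on large multilingual corpora, then fine-tune
# it on a (toy) task-specific Arabic sentiment dataset. Not the authors' code.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_name = "bert-base-multilingual-cased"  # assumed pre-trained checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

# Toy task-specific data: Arabic reviews with sentiment labels (1 = positive, 0 = negative).
texts = ["الخدمة ممتازة جدا", "التجربة كانت سيئة"]
labels = torch.tensor([1, 0])

batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

model.train()
for _ in range(3):  # a few gradient steps stand in for full fine-tuning
    optimizer.zero_grad()
    loss = model(**batch, labels=labels).loss  # cross-entropy over the 2 classes
    loss.backward()
    optimizer.step()
```

In practice, fine-tuning as benchmarked in the paper would run over a full labeled sentiment dataset with a held-out evaluation split; the loop above only shows the mechanics of adapting a pre-trained encoder to the downstream classification task.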


