From: AraXLNet: pre-trained language model for sentiment analysis of Arabic
Model | Method | Accuracy (%) | F1-score on SQuAD1.1 (%) |
---|---|---|---|
BERT | Bidirectional transformer pre-trained with masked language modeling (MLM) and next-sentence prediction (NSP) | 72.0 | 90.9 |
RoBERTa | BERT pre-training without the NSP objective, using dynamic masking, larger batches, and more data | 83.2 | 94.6 |
XLNet | Autoregressive transformer pre-trained with permutation language modeling | 85.4 | 95.1 |
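To make the pre-training objectives in the table concrete, the following minimal Python sketch (illustrative only, not from the paper; function names and the toy sentence are invented for this example) contrasts BERT-style MLM, which replaces tokens with a `[MASK]` symbol and predicts them from bidirectional context, with XLNet-style permutation language modeling, which samples a random factorization order and predicts each token from the tokens that precede it in that order, without introducing mask symbols:

```python
import random

def mlm_mask(tokens, mask_prob=0.15, seed=0):
    """BERT-style MLM: replace a random subset of tokens with [MASK];
    the model must recover the originals from bidirectional context.
    (Real BERT also sometimes keeps or randomizes the token; omitted here.)"""
    rng = random.Random(seed)
    masked = list(tokens)
    targets = {}
    for i, tok in enumerate(tokens):
        if rng.random() < mask_prob:
            targets[i] = tok      # position -> original token to predict
            masked[i] = "[MASK]"
    return masked, targets

def permutation_contexts(n, seed=0):
    """XLNet-style permutation LM: sample one factorization order over
    n positions and record, for each position, the set of positions it
    may attend to (those earlier in the sampled order). No [MASK] tokens."""
    rng = random.Random(seed)
    order = list(range(n))
    rng.shuffle(order)
    context = {}
    seen = set()
    for pos in order:
        context[pos] = set(seen)  # positions already visible in this order
        seen.add(pos)
    return order, context

tokens = ["the", "movie", "was", "surprisingly", "good"]
masked, targets = mlm_mask(tokens, mask_prob=0.3, seed=1)
print(masked, targets)

order, ctx = permutation_contexts(len(tokens), seed=1)
print(order)
# The first position in the sampled order is predicted with no context,
# the last with full context over all other positions.
```

The sketch shows why XLNet avoids BERT's pretrain/finetune mismatch: `[MASK]` never appears at fine-tuning time, whereas permutation ordering uses only real tokens while still exposing each position to bidirectional context across sampled orders.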