Table 4 The pre-trained models

From: Survey of transformers and towards ensemble learning using transformers for natural language processing

Model      Pretrained model
BERT       bert-base-uncased
XLNet      xlnet-base-cased
GPT2       gpt2
RoBERTa    roberta-base
ALBERT     albert-base-v2
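
The pretrained-model names in the second column correspond to checkpoint identifiers of the kind published on the Hugging Face model hub. As an illustration only (this is a minimal sketch assuming the Hugging Face transformers library is used to obtain these checkpoints, not the authors' own code), the models in Table 4 could be loaded as follows:

```python
# Minimal sketch: loading the Table 4 checkpoints with Hugging Face transformers.
# Assumption: the identifiers below are Hugging Face model hub names.
from transformers import AutoTokenizer, AutoModel

PRETRAINED_MODELS = {
    "BERT": "bert-base-uncased",
    "XLNet": "xlnet-base-cased",
    "GPT2": "gpt2",
    "RoBERTa": "roberta-base",
    "ALBERT": "albert-base-v2",
}

def load(model_key: str):
    """Return the tokenizer and encoder for one of the surveyed models."""
    name = PRETRAINED_MODELS[model_key]
    tokenizer = AutoTokenizer.from_pretrained(name)
    model = AutoModel.from_pretrained(name)
    return tokenizer, model

if __name__ == "__main__":
    tokenizer, model = load("BERT")
    inputs = tokenizer("Transformers for NLP.", return_tensors="pt")
    outputs = model(**inputs)
    print(outputs.last_hidden_state.shape)  # hidden states from the loaded encoder
```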