Tag: bert
3 verified claims carrying this tag. Each has 2+ primary sources and an HMAC-SHA256 signature.
BERT (Bidirectional Encoder Representations from Transformers) introduced in: Devlin et al., 2018, "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding".
4c1ee70007dc89c1 · 2 sources · 100% confidence
RoBERTa introduced in: Liu et al., 2019, "RoBERTa: A Robustly Optimized BERT Pretraining Approach".
d4fecb26a4c9cdca · 2 sources · 100% confidence
DistilBERT introduced in: Sanh et al., 2019, "DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter" (a smaller, faster, cheaper BERT obtained via knowledge distillation).
245af747a3d21061 · 2 sources · 100% confidence
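As noted above, each claim carries an HMAC-SHA256 signature. The sketch below is a minimal, hypothetical illustration of computing and verifying such a signature over a claim's text: the function names, the claim serialization, and the key handling are assumptions, not the registry's documented scheme, and the truncated hex IDs shown with each claim are not assumed to come from this exact computation.

```python
import hmac
import hashlib


def sign_claim(claim_text: str, secret_key: bytes) -> str:
    """Compute an HMAC-SHA256 signature (hex) over a claim's canonical text.

    Hypothetical sketch: the registry's real serialization and key
    management are not documented here.
    """
    digest = hmac.new(secret_key, claim_text.encode("utf-8"), hashlib.sha256)
    return digest.hexdigest()


def verify_claim(claim_text: str, secret_key: bytes, expected_hex: str) -> bool:
    """Recompute the HMAC and compare against the stored value in constant time."""
    return hmac.compare_digest(sign_claim(claim_text, secret_key), expected_hex)


if __name__ == "__main__":
    key = b"example-secret-key"  # placeholder key for illustration only
    claim = (
        "BERT (Bidirectional Encoder Representations from Transformers) "
        "introduced in Devlin et al., 2018."
    )
    sig = sign_claim(claim, key)
    print(sig[:16])                       # truncated hex for display
    print(verify_claim(claim, key, sig))  # True if the signature matches
```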