Year hub · 8 claims
AI/ML claims from 2019
Hand-verified research claims with primary sources dated 2019. Each claim has ≥2 primary sources and an HMAC-SHA256 signature.
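A minimal sketch of how such a signature could be verified, assuming the 16-hex-character claim IDs below are truncated HMAC-SHA256 digests of the claim text. The key, text encoding, and truncation length are hypothetical; this page does not document the actual scheme.

```python
import hmac
import hashlib

def claim_signature(claim_text: str, key: bytes) -> str:
    # Compute HMAC-SHA256 over the UTF-8 claim text and truncate to
    # 16 hex characters, matching the length of the IDs listed here.
    digest = hmac.new(key, claim_text.encode("utf-8"), hashlib.sha256)
    return digest.hexdigest()[:16]

# Hypothetical verification: recompute and compare to a stored ID.
sig = claim_signature(
    "GPT-2 introduced in paper: Language Models are Unsupervised "
    "Multitask Learners (Radford et al., 2019).",
    b"example-signing-key",  # assumed key, not the site's real key
)
```

Because HMAC is keyed, a signature can only be reproduced by a holder of the signing key, which is what makes the IDs tamper-evident rather than plain checksums.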
BART introduced in paper: BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension (Lewis et al., 2019).
f5b422e3255fd7c0 · 2 sources · 100% confidence
C4 (Colossal Clean Crawled Corpus) introduced in paper: Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer (Raffel et al., 2019).
0d24c97977ebd744 · 2 sources · 100% confidence
DistilBERT introduced in paper: DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter (Sanh et al., 2019).
245af747a3d21061 · 2 sources · 100% confidence
GPT-2 introduced in paper: Language Models are Unsupervised Multitask Learners (Radford et al., 2019).
859551dc078c46f8 · 2 sources · 100% confidence
Milvus vector database publicly released on: 2019-10-15 by Zilliz.
aa5b818bcc5124d7 · 2 sources · 100% confidence
RoBERTa introduced in paper: RoBERTa: A Robustly Optimized BERT Pretraining Approach (Liu et al., 2019).
d4fecb26a4c9cdca · 2 sources · 100% confidence
SuperGLUE benchmark introduced in paper: SuperGLUE: A Stickier Benchmark for General-Purpose Language Understanding Systems (Wang et al., 2019).
1a1e87145608c91a · 2 sources · 100% confidence
T5 (Text-to-Text Transfer Transformer) introduced in paper: Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer (Raffel et al., 2019).
ef28341c3b308737 · 2 sources · 100% confidence
Foundational papers · 2024-2025 releases · All claims · All topic hubs