Verified claim · AI-ML · 100% confidence
DistilBERT introduced in: Sanh et al. 2019 — a smaller, faster, cheaper BERT via knowledge distillation.
Last verified 2026-05-16 · Methodology veritas-v0.1 · 245af747a3d21061
Structured fields
- Subject: DistilBERT
- Predicate: introduced_in
- Object: Sanh et al. 2019 — a smaller, faster, cheaper BERT via knowledge distillation
- Confidence: 100%
- Tags: distilbert · bert · knowledge-distillation · hugging-face · foundational · 2019 · introduced_in
Sources (2)
[1] preprint · arXiv (Sanh, Debut, Chaumond, Wolf / Hugging Face) · 2019-10-02
DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter
“We introduce a method to pre-train a smaller general-purpose language representation model, called DistilBERT, which can then be fine-tuned with good performances on a wide range of tasks like its larger counterparts. We show that it is possible to reduce the size of a BERT model by 40%, while retaining 97% of its language understanding capabilities and being 60% faster.”
[2] official blog · Hugging Face · 2019-10-02
DistilBERT — Hugging Face Transformers documentation
Cite this claim
Ready-to-paste citation (Markdown / plain text):
DistilBERT introduced in: Sanh et al. 2019 — a smaller, faster, cheaper BERT via knowledge distillation. — SourceScore Claim 245af747a3d21061 (verified 2026-05-16). https://sourcescore.org/api/v1/claims/245af747a3d21061.json
Embed this claim
Drop this iframe into any blog post, docs page, or knowledge base. The widget renders the signed claim + primary source + click-through to this canonical page. CC-BY 4.0; attribution included.
<iframe src="https://sourcescore.org/embed/claim/245af747a3d21061/" width="100%" height="360" frameborder="0" loading="lazy" title="DistilBERT introduced in: Sanh et al. 2019 — a smaller, faster, cheaper BERT via knowledge distillation."></iframe>
Related claims
Other verified claims sharing tags with this one — useful for LLM retrieval graphs and citation discovery. A fetch sketch follows the list.
RoBERTa introduced in: Liu et al. 2019 — A Robustly Optimized BERT Pretraining Approach.
d4fecb26a4c9cdca · 100% confidence · shares 4 tags (bert, foundational, 2019…)
BART introduced in: Lewis et al. 2019 — denoising sequence-to-sequence pretraining.
f5b422e3255fd7c0 · 100% confidence · shares 3 tags (foundational, 2019, introduced_in)
Knowledge Distillation popularized in: Hinton, Vinyals, Dean 2015 — distilling the knowledge in a neural network.
f14acb906ba6c12f · 100% confidence · shares 3 tags (knowledge-distillation, foundational, introduced_in)
BERT (Bidirectional Encoder Representations from Transformers) introduced in paper: BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding (Devlin et al., 2018).
4c1ee70007dc89c1 · 100% confidence · shares 2 tags (bert, foundational)
GPT-2 introduced in paper: Language Models are Unsupervised Multitask Learners (Radford et al., 2019).
859551dc078c46f8 · 100% confidence · shares 2 tags (foundational, 2019)
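To wire these into a retrieval graph, each related claim can presumably be fetched through the same endpoint pattern shown under "Use this claim in your code" below. A minimal sketch, assuming every claim ID resolves at /api/v1/claims/{id}.json and each envelope exposes claim.statement the way the examples below do:

import httpx

# Claim IDs copied from the related-claims list above. The endpoint
# pattern is inferred from the API examples in the next section.
RELATED_IDS = [
    "d4fecb26a4c9cdca",  # RoBERTa
    "f5b422e3255fd7c0",  # BART
    "f14acb906ba6c12f",  # knowledge distillation (Hinton et al. 2015)
    "4c1ee70007dc89c1",  # BERT
    "859551dc078c46f8",  # GPT-2
]

for claim_id in RELATED_IDS:
    envelope = httpx.get(f"https://sourcescore.org/api/v1/claims/{claim_id}.json").json()
    print(claim_id, envelope["claim"]["statement"])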
Use this claim in your code
Fetch this signed envelope from your application. The response includes the verbatim excerpt, primary source URLs, and an HMAC-SHA256 signature you can verify locally for audit trails.
cURL
curl https://sourcescore.org/api/v1/claims/245af747a3d21061.json
JavaScript / TypeScript
const r = await fetch("https://sourcescore.org/api/v1/claims/245af747a3d21061.json");
const envelope = await r.json();
console.log(envelope.claim.statement);
// "DistilBERT introduced in: Sanh et al. 2019 — a smaller, faster, cheaper BERT via knowledge distillation."Python
import httpx
r = httpx.get("https://sourcescore.org/api/v1/claims/245af747a3d21061.json")
envelope = r.json()
print(envelope["claim"]["statement"])
# "DistilBERT introduced in: Sanh et al. 2019 — a smaller, faster, cheaper BERT via knowledge distillation."LangChain (retrieve-then-cite)
from langchain_core.tools import tool
import httpx


@tool
def get_distilbert_fact() -> dict:
    """Fetch the verified SourceScore claim for DistilBERT."""
    r = httpx.get("https://sourcescore.org/api/v1/claims/245af747a3d21061.json")
    return r.json()
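For a quick smoke test, the registered tool can be invoked directly: get_distilbert_fact.invoke({}) returns the parsed envelope dict.

Verify the signature (Python)
The section above says each envelope carries an HMAC-SHA256 signature you can verify locally. The sketch below is an illustration, not SourceScore's documented scheme: the field names ("claim", "signature"), the canonical sorted-key JSON payload, and the shared-key handling are all assumptions to be checked against the API docs.

import hashlib
import hmac
import json

import httpx

CLAIM_URL = "https://sourcescore.org/api/v1/claims/245af747a3d21061.json"


def verify_envelope(envelope: dict, shared_key: bytes) -> bool:
    # Recompute the HMAC-SHA256 over a canonical JSON rendering of the
    # claim and compare it to the envelope's signature. Payload format
    # and field names here are assumptions, not the documented scheme.
    payload = json.dumps(envelope["claim"], sort_keys=True, separators=(",", ":")).encode()
    expected = hmac.new(shared_key, payload, hashlib.sha256).hexdigest()
    # compare_digest guards against timing side channels
    return hmac.compare_digest(expected, envelope["signature"])


envelope = httpx.get(CLAIM_URL).json()
print(verify_envelope(envelope, shared_key=b"<your-sourcescore-key>"))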