SourceScore

Verified claim · AI-ML · 100% confidence

RoBERTa introduced in: Liu et al. 2019 — A Robustly Optimized BERT Pretraining Approach.

Last verified 2026-05-16 · Methodology veritas-v0.1 · d4fecb26a4c9cdca

Structured fields

Subject
RoBERTa
Predicate
introduced_in
Object
Liu et al. 2019 — A Robustly Optimized BERT Pretraining Approach
Confidence
100%
Tags
roberta · bert · facebook-ai · pretraining · foundational · 2019 · introduced_in

Sources (2)

  1. [1] preprint · arXiv (Liu, Ott, Goyal, Du, Joshi, Chen, Levy, Lewis, Zettlemoyer, Stoyanov / Facebook AI) · 2019-07-26

    RoBERTa: A Robustly Optimized BERT Pretraining Approach
    We present a replication study of BERT pretraining (Devlin et al., 2019) that carefully measures the impact of many key hyperparameters and training data size. We find that BERT was significantly undertrained, and can match or exceed the performance of every model published after it.
  2. [2] official documentation · Hugging Face · 2019-07-26

    RoBERTa — Hugging Face Transformers documentation

Cite this claim

Ready-to-paste citation (Markdown / plain text):

RoBERTa introduced in: Liu et al. 2019 — A Robustly Optimized BERT Pretraining Approach. — SourceScore Claim d4fecb26a4c9cdca (verified 2026-05-16). https://sourcescore.org/api/v1/claims/d4fecb26a4c9cdca.json

Embed this claim

Drop this iframe into any blog post, docs page, or knowledge base. The widget renders the signed claim, the primary source, and a click-through to this canonical page. CC-BY 4.0; attribution included.

<iframe src="https://sourcescore.org/embed/claim/d4fecb26a4c9cdca/" width="100%" height="360" frameborder="0" loading="lazy" title="RoBERTa introduced in: Liu et al. 2019 — A Robustly Optimized BERT Pretraining Approach."></iframe>


Related claims

Other verified claims sharing tags with this one — useful for LLM retrieval graphs and citation discovery.

Use this claim in your code

Fetch this signed envelope from your application. The response includes the verbatim excerpt, primary source URLs, and an HMAC-SHA256 signature you can verify locally for audit trails.
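The local signature check described above can be sketched in Python. This is a minimal illustration under stated assumptions, not the documented verification routine: the envelope field names (`claim`, `signature`), the canonicalization (compact JSON with sorted keys), and the shared key are all hypothetical, and the demo signs its own envelope so it runs offline.

```python
import hashlib
import hmac
import json

def verify_envelope(envelope: dict, key: bytes) -> bool:
    # Recompute HMAC-SHA256 over the canonical claim JSON and compare it
    # with the envelope's signature (field names are assumptions).
    payload = json.dumps(
        envelope["claim"], sort_keys=True, separators=(",", ":")
    ).encode()
    expected = hmac.new(key, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, envelope["signature"])

# Offline demo: sign a claim locally, then verify it.
key = b"demo-shared-key"
claim = {"statement": "RoBERTa introduced in: Liu et al. 2019"}
signature = hmac.new(
    key,
    json.dumps(claim, sort_keys=True, separators=(",", ":")).encode(),
    hashlib.sha256,
).hexdigest()
envelope = {"claim": claim, "signature": signature}
print(verify_envelope(envelope, key))  # True
```

`hmac.compare_digest` is used instead of `==` to avoid timing side channels when comparing signatures.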

cURL

curl https://sourcescore.org/api/v1/claims/d4fecb26a4c9cdca.json

JavaScript / TypeScript

const r = await fetch("https://sourcescore.org/api/v1/claims/d4fecb26a4c9cdca.json");
const envelope = await r.json();
console.log(envelope.claim.statement);
// "RoBERTa introduced in: Liu et al. 2019 — A Robustly Optimized BERT Pretraining Approach."

Python

import httpx

r = httpx.get("https://sourcescore.org/api/v1/claims/d4fecb26a4c9cdca.json")
envelope = r.json()
print(envelope["claim"]["statement"])
# "RoBERTa introduced in: Liu et al. 2019 — A Robustly Optimized BERT Pretraining Approach."

LangChain (retrieve-then-cite)

from langchain_core.tools import tool
import httpx

@tool
def get_roberta_fact() -> dict:
    """Fetch the verified SourceScore claim for RoBERTa."""
    r = httpx.get("https://sourcescore.org/api/v1/claims/d4fecb26a4c9cdca.json")
    return r.json()