Verified claim · AI-ML · 100% confidence
Switch Transformer introduced in paper: Switch Transformers: Scaling to Trillion Parameter Models with Simple and Efficient Sparsity (Fedus et al., 2021).
Last verified 2026-05-16 · Methodology veritas-v0.1 · 3d9c14b9379038c9
Structured fields
- Subject
- Switch Transformer
- Predicate
- introduced_in_paper
- Object
- Switch Transformers: Scaling to Trillion Parameter Models with Simple and Efficient Sparsity (Fedus et al., 2021)
- Confidence
- 100%
- Tags
- switch-transformer · moe · foundational · fedus · 2021 · google
Sources (2)
[1] preprint · arXiv (Fedus, Zoph, Shazeer) · 2021-01-11
Switch Transformers: Scaling to Trillion Parameter Models with Simple and Efficient Sparsity
“We simplify the MoE routing algorithm and design intuitive improved models with reduced communication and computational costs.”
[2] peer-reviewed · Journal of Machine Learning Research · 2022-12-01
Switch Transformers (JMLR 2022)
Cite this claim
Ready-to-paste citation (Markdown / plain text):
Switch Transformer introduced in paper: Switch Transformers: Scaling to Trillion Parameter Models with Simple and Efficient Sparsity (Fedus et al., 2021). — SourceScore Claim 3d9c14b9379038c9 (verified 2026-05-16). https://sourcescore.org/api/v1/claims/3d9c14b9379038c9.json
Embed this claim
Drop this iframe into any blog post, docs page, or knowledge base. The widget renders the signed claim + primary source + click-through to this canonical page. CC-BY 4.0; attribution included.
<iframe src="https://sourcescore.org/embed/claim/3d9c14b9379038c9/" width="100%" height="360" frameborder="0" loading="lazy" title="Switch Transformer introduced in paper: Switch Transformers: Scaling to Trillion Parameter Models with Simple and Efficient Sparsity (Fedus et al., 2021)."></iframe>
Related claims
Other verified claims sharing tags with this one — useful for LLM retrieval graphs and citation discovery.
Sparsely-Gated Mixture-of-Experts (MoE) introduced in paper: Outrageously Large Neural Networks: The Sparsely-Gated Mixture-of-Experts Layer (Shazeer et al., 2017).
2d6d7f61f1db6493 · 100% confidence · shares 3 tags (moe, foundational, google)
Mixture of Experts (MoE) revival popularized in: Shazeer et al. 2017 — outrageously large neural networks via sparse gating.
f068236101568ad7 · 100% confidence · shares 3 tags (moe, google, foundational)
Low-Rank Adaptation (LoRA) introduced in paper: LoRA: Low-Rank Adaptation of Large Language Models (Hu et al., 2021).
d7b97d1b93d8d8bc · 100% confidence · shares 2 tags (foundational, 2021)
BERT (Bidirectional Encoder Representations from Transformers) introduced in paper: BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding (Devlin et al., 2018).
4c1ee70007dc89c1 · 100% confidence · shares 2 tags (foundational, google)
T5 (Text-to-Text Transfer Transformer) introduced in paper: Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer (Raffel et al., 2019).
ef28341c3b308737 · 100% confidence · shares 2 tags (foundational, google)
Use this claim in your code
Fetch this signed envelope from your application. The response includes the verbatim excerpt, primary source URLs, and an HMAC-SHA256 signature you can verify locally for audit trails.
cURL
curl https://sourcescore.org/api/v1/claims/3d9c14b9379038c9.json
JavaScript / TypeScript
const r = await fetch("https://sourcescore.org/api/v1/claims/3d9c14b9379038c9.json");
const envelope = await r.json();
console.log(envelope.claim.statement);
// "Switch Transformer introduced in paper: Switch Transformers: Scaling to Trillion Parameter Models with Simple and Efficient Sparsity (Fedus et al., 2021)."Python
import httpx
r = httpx.get("https://sourcescore.org/api/v1/claims/3d9c14b9379038c9.json")
envelope = r.json()
print(envelope["claim"]["statement"])
# "Switch Transformer introduced in paper: Switch Transformers: Scaling to Trillion Parameter Models with Simple and Efficient Sparsity (Fedus et al., 2021)."LangChain (retrieve-then-cite)
from langchain_core.tools import tool
import httpx

@tool
def get_switch_transformer_fact() -> dict:
    """Fetch the verified SourceScore claim for Switch Transformer."""
    r = httpx.get("https://sourcescore.org/api/v1/claims/3d9c14b9379038c9.json")
    return r.json()
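For a quick smoke test, the tool can be called directly with get_switch_transformer_fact.invoke({}) before wiring it into an agent.
Verify the signature locally (Python)
The envelope's HMAC-SHA256 signature can be checked on your side for audit trails. The sketch below is a minimal illustration, not a documented contract: it assumes a hex-encoded envelope["signature"] field computed over the canonical JSON of envelope["claim"] with a shared key you already hold. Those field names, the placeholder key, and the exact signing scheme are assumptions; consult the actual envelope layout before relying on this.
# Minimal verification sketch. Assumptions (not a documented contract):
# the envelope carries a hex HMAC-SHA256 in envelope["signature"], computed
# over the canonical JSON of envelope["claim"] with a shared key you hold.
import hashlib
import hmac
import json

import httpx

r = httpx.get("https://sourcescore.org/api/v1/claims/3d9c14b9379038c9.json")
envelope = r.json()

shared_key = b"your-shared-signing-key"  # hypothetical placeholder key
message = json.dumps(envelope["claim"], sort_keys=True, separators=(",", ":")).encode()
expected = hmac.new(shared_key, message, hashlib.sha256).hexdigest()

if hmac.compare_digest(expected, envelope.get("signature", "")):
    print("signature OK")
else:
    print("signature mismatch; do not trust this envelope")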