SourceScore

Verified claim · AI-ML · 100% confidence

Sparsely-Gated Mixture-of-Experts (MoE) introduced in paper: Outrageously Large Neural Networks: The Sparsely-Gated Mixture-of-Experts Layer (Shazeer et al., 2017).

Last verified 2026-05-16 · Methodology veritas-v0.1 · 2d6d7f61f1db6493

Structured fields

Subject
Sparsely-Gated Mixture-of-Experts (MoE)
Predicate
introduced_in_paper
Object
Outrageously Large Neural Networks: The Sparsely-Gated Mixture-of-Experts Layer (Shazeer et al., 2017)
Confidence
100%
Tags
moe · foundational · shazeer · 2017 · google

Sources (1)

  1. [1] preprint · arXiv (Shazeer, Mirhoseini, Maziarz, Davis, Le, Hinton, Dean) · 2017-01-23

    Outrageously Large Neural Networks: The Sparsely-Gated Mixture-of-Experts Layer
    We introduce a Sparsely-Gated Mixture-of-Experts layer (MoE), consisting of up to thousands of feed-forward sub-networks. A trainable gating network determines a sparse combination of these experts to use for each example.
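
The gating mechanism the excerpt describes reduces, schematically, to a softmax over per-expert scores with all but the top-k masked out, so only k experts run per example. A minimal sketch in Python (the NumPy shapes, the gate weight matrix W_g, and k are illustrative assumptions, not the paper's exact noisy top-k formulation):

import numpy as np

def top_k_gating(x, W_g, k=2):
    """Sketch: score experts, keep only the top-k scores, renormalize via softmax."""
    logits = x @ W_g                            # one score per expert
    top = np.argsort(logits)[-k:]               # indices of the k highest-scoring experts
    masked = np.full_like(logits, -np.inf)      # mask all other experts to -inf
    masked[top] = logits[top]
    weights = np.exp(masked - logits[top].max())
    return weights / weights.sum()              # sparse mixture weights; zero off the top-k

x = np.random.randn(16)          # hypothetical input features
W_g = np.random.randn(16, 8)     # hypothetical gate weights for 8 experts
print(top_k_gating(x, W_g))      # only 2 nonzero entries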

Cite this claim

Ready-to-paste citation (Markdown / plain text):

Sparsely-Gated Mixture-of-Experts (MoE) introduced in paper: Outrageously Large Neural Networks: The Sparsely-Gated Mixture-of-Experts Layer (Shazeer et al., 2017). — SourceScore Claim 2d6d7f61f1db6493 (verified 2026-05-16). https://sourcescore.org/api/v1/claims/2d6d7f61f1db6493.json

Embed this claim

Drop this iframe into any blog post, docs page, or knowledge base. The widget renders the signed claim, its primary source, and a click-through link to this canonical page. CC-BY 4.0; attribution included.

<iframe src="https://sourcescore.org/embed/claim/2d6d7f61f1db6493/" width="100%" height="360" frameborder="0" loading="lazy" title="Sparsely-Gated Mixture-of-Experts (MoE) introduced in paper: Outrageously Large Neural Networks: The Sparsely-Gated Mixture-of-Experts Layer (Shazeer et al., 2017)."></iframe>

Use this claim in your code

Fetch this signed envelope from your application. The response includes the verbatim excerpt, primary source URLs, and an HMAC-SHA256 signature you can verify locally for audit trails (a verification sketch follows the Python example below).

cURL

curl https://sourcescore.org/api/v1/claims/2d6d7f61f1db6493.json
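
To pull just the statement text, pipe through jq (assuming jq is installed; the .claim.statement path matches the JavaScript example below):

curl -s https://sourcescore.org/api/v1/claims/2d6d7f61f1db6493.json | jq -r '.claim.statement'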

JavaScript / TypeScript

const r = await fetch("https://sourcescore.org/api/v1/claims/2d6d7f61f1db6493.json");
const envelope = await r.json();
console.log(envelope.claim.statement);
// "Sparsely-Gated Mixture-of-Experts (MoE) introduced in paper: Outrageously Large Neural Networks: The Sparsely-Gated Mixture-of-Experts Layer (Shazeer et al., 2017)."

Python

import httpx

r = httpx.get("https://sourcescore.org/api/v1/claims/2d6d7f61f1db6493.json")
envelope = r.json()
print(envelope["claim"]["statement"])
# "Sparsely-Gated Mixture-of-Experts (MoE) introduced in paper: Outrageously Large Neural Networks: The Sparsely-Gated Mixture-of-Experts Layer (Shazeer et al., 2017)."
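
To verify the HMAC-SHA256 signature locally, something like the following can work. This is a minimal sketch, not the documented flow: the envelope field name "signature", the canonical-JSON convention, and the shared verification key are all assumptions; check the actual envelope schema before relying on it.

import hashlib
import hmac
import json

import httpx

r = httpx.get("https://sourcescore.org/api/v1/claims/2d6d7f61f1db6493.json")
envelope = r.json()

key = b"your-verification-key"  # hypothetical shared key obtained out of band
# Assumed canonicalization: compact JSON with sorted keys over the claim object
payload = json.dumps(envelope["claim"], sort_keys=True, separators=(",", ":")).encode()
expected = hmac.new(key, payload, hashlib.sha256).hexdigest()

# compare_digest is constant-time, avoiding timing side channels
if hmac.compare_digest(expected, envelope["signature"]):  # assumed field name
    print("signature OK")
else:
    print("signature mismatch")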

LangChain (retrieve-then-cite)

from langchain_core.tools import tool
import httpx

@tool
def get_sparsely_gated_mixture_of_experts_moe_fact() -> dict:
    """Fetch the verified SourceScore claim for Sparsely-Gated Mixture-of-Experts (MoE)."""
    r = httpx.get("https://sourcescore.org/api/v1/claims/2d6d7f61f1db6493.json")
    return r.json()
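
To put the tool to work, bind it to a tool-calling chat model. A minimal sketch, assuming langchain_openai is installed (the model name is illustrative, not prescribed by SourceScore):

from langchain_openai import ChatOpenAI

llm = ChatOpenAI(model="gpt-4o-mini")  # hypothetical model choice
llm_with_tools = llm.bind_tools([get_sparsely_gated_mixture_of_experts_moe_fact])
result = llm_with_tools.invoke("When was the sparsely-gated MoE layer introduced?")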