SourceScore

Verified claim · AI-ML · 100% confidence

Knowledge Distillation popularized in: Hinton, Vinyals, Dean 2015 — Distilling the Knowledge in a Neural Network.

Last verified 2026-05-16 · Methodology veritas-v0.1 · f14acb906ba6c12f

Structured fields

Subject
Knowledge Distillation
Predicate
popularized_in
Object
Hinton, Vinyals, Dean 2015 — Distilling the Knowledge in a Neural Network
Confidence
100%
Tags
knowledge-distillation · hinton · google · compression · foundational · 2015 · introduced_in

Sources (2)

  1. [1] preprint · arXiv (Hinton, Vinyals, Dean / Google) · 2015-03-09

    Distilling the Knowledge in a Neural Network
    A very simple way to improve the performance of almost any machine learning algorithm is to train many different models on the same data and then to average their predictions. Unfortunately, making predictions using a whole ensemble of models is cumbersome and may be too computationally expensive. We show that it is possible to compress the knowledge in an ensemble into a single model which is much easier to deploy.
  2. [2] preprint · arXiv · 2015-03-09

    Knowledge Distillation — full paper PDF

Cite this claim

Ready-to-paste citation (Markdown / plain text):

Knowledge Distillation popularized in: Hinton, Vinyals, Dean 2015 — Distilling the Knowledge in a Neural Network. — SourceScore Claim f14acb906ba6c12f (verified 2026-05-16). https://sourcescore.org/api/v1/claims/f14acb906ba6c12f.json

Embed this claim

Drop this iframe into any blog post, docs page, or knowledge base. The widget renders the signed claim, its primary source, and a click-through link to this canonical page. Licensed CC-BY 4.0; attribution is included.

<iframe src="https://sourcescore.org/embed/claim/f14acb906ba6c12f/" width="100%" height="360" frameborder="0" loading="lazy" title="Knowledge Distillation popularized in: Hinton, Vinyals, Dean 2015 — Distilling the Knowledge in a Neural Network."></iframe>


Related claims

Other verified claims sharing tags with this one — useful for LLM retrieval graphs and citation discovery.

Use this claim in your code

Fetch this signed envelope from your application. The response includes the verbatim source excerpt, the primary source URLs, and an HMAC-SHA256 signature you can verify locally for audit trails (a verification sketch follows the Python example below).

cURL

curl https://sourcescore.org/api/v1/claims/f14acb906ba6c12f.json

JavaScript / TypeScript

const r = await fetch("https://sourcescore.org/api/v1/claims/f14acb906ba6c12f.json");
const envelope = await r.json();
console.log(envelope.claim.statement);
// "Knowledge Distillation popularized in: Hinton, Vinyals, Dean 2015 — Distilling the Knowledge in a Neural Network."

Python

import httpx

r = httpx.get("https://sourcescore.org/api/v1/claims/f14acb906ba6c12f.json")
envelope = r.json()
print(envelope["claim"]["statement"])
# "Knowledge Distillation popularized in: Hinton, Vinyals, Dean 2015 — Distilling the Knowledge in a Neural Network."
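
Verify the signature locally (sketch)

The exact envelope layout and signing scheme are not documented on this page, so the field name ("signature"), the canonicalization (sorted, compact JSON of the "claim" object), and the shared-secret source below are assumptions. Treat this as a minimal sketch of what an HMAC-SHA256 check could look like, not the canonical SourceScore verification routine.

import hashlib
import hmac
import json

import httpx

# Hypothetical: a signing key issued with your SourceScore account.
SHARED_SECRET = b"replace-with-your-signing-key"

envelope = httpx.get("https://sourcescore.org/api/v1/claims/f14acb906ba6c12f.json").json()

# Assumption: the signature covers the canonical JSON of the "claim" object.
payload = json.dumps(envelope["claim"], sort_keys=True, separators=(",", ":")).encode()
expected = hmac.new(SHARED_SECRET, payload, hashlib.sha256).hexdigest()

if hmac.compare_digest(expected, envelope["signature"]):  # "signature" is an assumed field name
    print("signature verified")
else:
    print("signature mismatch: do not use this envelope for audit trails")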

LangChain (retrieve-then-cite)

from langchain_core.tools import tool import httpx @tool def get_knowledge_distillation_fact() -> dict: """Fetch the verified SourceScore claim for Knowledge Distillation.""" r = httpx.get("https://sourcescore.org/api/v1/claims/f14acb906ba6c12f.json") return r.json()
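
For a quick local check before wiring the tool into an agent, a function decorated with @tool can be invoked directly (assuming a recent langchain_core; the empty dict is the argument payload, since this tool takes no arguments). This usage line is an illustration, not output captured from this page:

envelope = get_knowledge_distillation_fact.invoke({})
print(envelope["claim"]["statement"])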