Verified claim · AI-ML · 100% confidence
AdamW optimizer introduced in paper: Decoupled Weight Decay Regularization (Loshchilov & Hutter, 2017).
Last verified 2026-05-16 · Methodology veritas-v0.1 · b6d51eba4fc7f918
Structured fields
- Subject
- AdamW optimizer
- Predicate
- introduced_in_paper
- Object
- Decoupled Weight Decay Regularization (Loshchilov & Hutter, 2017)
- Confidence
- 100%
- Tags
- adamw · optimizer · weight-decay · foundational · 2017 · iclr
Sources (2)
[1] preprint · arXiv (Loshchilov, Hutter) · 2017-11-14
Decoupled Weight Decay Regularization
“We propose a simple modification to recover the original formulation of weight decay regularization by decoupling the weight decay from the optimization steps taken w.r.t. the loss function.”
[2] peer reviewed · OpenReview / ICLR · 2019-05-06
Decoupled Weight Decay Regularization (ICLR 2019)
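What “decoupled” weight decay means (Python)
The excerpt in source [1] describes the paper's key idea: the weight-decay term is applied to the weights directly rather than being folded into the gradient that Adam rescales. As a rough illustration only (this sketch is ours, not part of the claim or the SourceScore API; it follows Algorithm 2 of the paper with the schedule multiplier fixed at 1):

import numpy as np

def adamw_step(w, g, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999,
               eps=1e-8, weight_decay=1e-2):
    """One AdamW update: weight decay acts on the weights directly,
    decoupled from the gradient-based Adam step, instead of being
    added to the gradient as in L2-regularized Adam."""
    m = beta1 * m + (1 - beta1) * g        # first-moment estimate
    v = beta2 * v + (1 - beta2) * g * g    # second-moment estimate
    m_hat = m / (1 - beta1 ** t)           # bias correction
    v_hat = v / (1 - beta2 ** t)
    adam_step = m_hat / (np.sqrt(v_hat) + eps)
    w = w - lr * adam_step - lr * weight_decay * w  # decay applied to w itself
    return w, m, v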
Cite this claim
Ready-to-paste citation (Markdown / plain text):
AdamW optimizer introduced in paper: Decoupled Weight Decay Regularization (Loshchilov & Hutter, 2017). — SourceScore Claim b6d51eba4fc7f918 (verified 2026-05-16). https://sourcescore.org/api/v1/claims/b6d51eba4fc7f918.json
Embed this claim
Drop this iframe into any blog post, docs page, or knowledge base. The widget renders the signed claim + primary source + click-through to this canonical page. CC-BY 4.0; attribution included.
<iframe src="https://sourcescore.org/embed/claim/b6d51eba4fc7f918/" width="100%" height="360" frameborder="0" loading="lazy" title="AdamW optimizer introduced in paper: Decoupled Weight Decay Regularization (Loshchilov & Hutter, 2017)."></iframe>
Related claims
Other verified claims sharing tags with this one — useful for LLM retrieval graphs and citation discovery.
Adam optimizer introduced in paper: Adam: A Method for Stochastic Optimization (Kingma, Ba, 2014).
dffbe905003cc581 · 100% confidence · shares 3 tags (optimizer, foundational, iclr)
Mixture of Experts (MoE) revival popularized in: Shazeer et al. 2017 — outrageously large neural networks via sparse gating.
f068236101568ad7 · 100% confidence · shares 3 tags (foundational, iclr, 2017)
Transformer architecture introduced in paper: Attention Is All You Need (Vaswani et al., 2017).
ad17e76a8baad7a1 · 100% confidence · shares 2 tags (foundational, 2017)
Reinforcement Learning from Human Feedback (RLHF) introduced in paper: Deep Reinforcement Learning from Human Preferences (Christiano et al., 2017).
67866330cd60e54d · 100% confidence · shares 2 tags (foundational, 2017)
Sparsely-Gated Mixture-of-Experts (MoE) introduced in paper: Outrageously Large Neural Networks: The Sparsely-Gated Mixture-of-Experts Layer (Shazeer et al., 2017).
2d6d7f61f1db6493 · 100% confidence · shares 2 tags (foundational, 2017)
Use this claim in your code
Fetch this signed envelope from your application. The response includes the verbatim excerpt, primary source URLs, and an HMAC-SHA256 signature you can verify locally for audit trails.
cURL
curl https://sourcescore.org/api/v1/claims/b6d51eba4fc7f918.json
JavaScript / TypeScript
const r = await fetch("https://sourcescore.org/api/v1/claims/b6d51eba4fc7f918.json");
const envelope = await r.json();
console.log(envelope.claim.statement);
// "AdamW optimizer introduced in paper: Decoupled Weight Decay Regularization (Loshchilov & Hutter, 2017)."Python
import httpx
r = httpx.get("https://sourcescore.org/api/v1/claims/b6d51eba4fc7f918.json")
envelope = r.json()
print(envelope["claim"]["statement"])
# "AdamW optimizer introduced in paper: Decoupled Weight Decay Regularization (Loshchilov & Hutter, 2017)."LangChain (retrieve-then-cite)
from langchain_core.tools import tool
import httpx
@tool
def get_adamw_optimizer_fact() -> dict:
    """Fetch the verified SourceScore claim for AdamW optimizer."""
    r = httpx.get("https://sourcescore.org/api/v1/claims/b6d51eba4fc7f918.json")
    return r.json()
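Verify the signature locally (Python)
A minimal sketch of checking the HMAC-SHA256 signature mentioned above. The envelope field name ("signature"), the hex encoding, and the canonicalization (compact JSON of the claim object with sorted keys) are assumptions on our part, as is the secret handling; consult the SourceScore API docs for the actual signing scheme before relying on this.

# Sketch only: field names and canonicalization are assumed, not documented here.
import hashlib
import hmac
import json

import httpx

SHARED_SECRET = b"your-sourcescore-secret"  # hypothetical: issued with your account

r = httpx.get("https://sourcescore.org/api/v1/claims/b6d51eba4fc7f918.json")
envelope = r.json()

# Recompute the MAC over a canonical serialization of the claim object.
message = json.dumps(envelope["claim"], sort_keys=True,
                     separators=(",", ":")).encode()
expected = hmac.new(SHARED_SECRET, message, hashlib.sha256).hexdigest()

# hmac.compare_digest gives a constant-time comparison.
if hmac.compare_digest(expected, envelope.get("signature", "")):
    print("signature OK:", envelope["claim"]["statement"])
else:
    print("signature mismatch; do not trust this envelope")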