Verified claim · AI-ML · 100% confidence
Long Short-Term Memory (LSTM) introduced in: 1997 by Hochreiter and Schmidhuber.
Last verified 2026-05-16 · Methodology veritas-v0.1 · 97ec4d132871224b
Structured fields
- Subject: Long Short-Term Memory (LSTM)
- Predicate: introduced_in
- Object: 1997 by Hochreiter and Schmidhuber
- Confidence: 100%
- Tags: lstm · rnn · hochreiter · schmidhuber · foundational · 1997 · introduced_in
Sources (2)
[1] peer reviewed · Neural Computation 9(8) — Hochreiter & Schmidhuber · 1997-11-15
Long Short-Term Memory — “We introduce a novel, efficient, gradient-based method called long short-term memory (LSTM). Truncating the gradient where this does not do harm, LSTM can learn to bridge minimal time lags in excess of 1000 discrete time steps by enforcing constant error flow through constant error carrousels within special units.”
[2] peer reviewed · MIT Press / Neural Computation · 1997-11-15
Long Short-Term Memory — MIT Press archive
Cite this claim
Ready-to-paste citation (Markdown / plain text):
Long Short-Term Memory (LSTM) introduced in: 1997 by Hochreiter and Schmidhuber. — SourceScore Claim 97ec4d132871224b (verified 2026-05-16). https://sourcescore.org/api/v1/claims/97ec4d132871224b.json
Embed this claim
Drop this iframe into any blog post, docs page, or knowledge base. The widget renders the signed claim + primary source + click-through to this canonical page. CC-BY 4.0; attribution included.
<iframe src="https://sourcescore.org/embed/claim/97ec4d132871224b/" width="100%" height="360" frameborder="0" loading="lazy" title="Long Short-Term Memory (LSTM) introduced in: 1997 by Hochreiter and Schmidhuber."></iframe>
Related claims
Other verified claims that share tags with this one — useful for LLM retrieval graphs and citation discovery.
Sequence-to-Sequence Learning (seq2seq) introduced in paper: Sequence to Sequence Learning with Neural Networks (Sutskever, Vinyals, Le, 2014).
ff80a25ed7e83b45 · 100% confidence · shares 2 tags (lstm, foundational)
RoBERTa introduced in: Liu et al. 2019 — A Robustly Optimized BERT Pretraining Approach.
d4fecb26a4c9cdca · 100% confidence · shares 2 tags (foundational, introduced_in)
DistilBERT introduced in: Sanh et al. 2019 — a smaller, faster, cheaper BERT via knowledge distillation.
245af747a3d21061 · 100% confidence · shares 2 tags (foundational, introduced_in)
BART introduced in: Lewis et al. 2019 — denoising sequence-to-sequence pretraining.
f5b422e3255fd7c0 · 100% confidence · shares 2 tags (foundational, introduced_in)
GloVe introduced in: Pennington, Socher, Manning 2014 — global vectors for word representation.
7f9254f3c0612ed0 · 100% confidence · shares 2 tags (foundational, introduced_in)
Use this claim in your code
Fetch this signed envelope from your application. The response includes the verbatim excerpt, primary-source URLs, and an HMAC-SHA256 signature you can verify locally for audit trails.
cURL
curl https://sourcescore.org/api/v1/claims/97ec4d132871224b.json
JavaScript / TypeScript
const r = await fetch("https://sourcescore.org/api/v1/claims/97ec4d132871224b.json");
const envelope = await r.json();
console.log(envelope.claim.statement);
// "Long Short-Term Memory (LSTM) introduced in: 1997 by Hochreiter and Schmidhuber."
Python
import httpx
r = httpx.get("https://sourcescore.org/api/v1/claims/97ec4d132871224b.json")
envelope = r.json()
print(envelope["claim"]["statement"])
# "Long Short-Term Memory (LSTM) introduced in: 1997 by Hochreiter and Schmidhuber."
LangChain (retrieve-then-cite)
from langchain_core.tools import tool
import httpx

@tool
def get_long_short_term_memory_lstm_fact() -> dict:
    """Fetch the verified SourceScore claim for Long Short-Term Memory (LSTM)."""
    r = httpx.get("https://sourcescore.org/api/v1/claims/97ec4d132871224b.json")
    return r.json()
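Verify the signature (sketch)

The section above mentions an HMAC-SHA256 signature you can check locally. A minimal verification sketch follows; the envelope field names (`claim`, `signature`) and the canonicalization (compact, key-sorted JSON of the claim object) are assumptions, not documented here — check the actual envelope layout and signing convention before relying on this.

```python
import hashlib
import hmac
import json

def verify_envelope(envelope: dict, shared_key: bytes) -> bool:
    # Hypothetical layout: signature covers the canonical JSON of envelope["claim"].
    payload = json.dumps(
        envelope["claim"], sort_keys=True, separators=(",", ":")
    ).encode()
    expected = hmac.new(shared_key, payload, hashlib.sha256).hexdigest()
    # Constant-time comparison to avoid timing leaks.
    return hmac.compare_digest(expected, envelope["signature"])

# Local self-check with a demo key (no network needed):
key = b"demo-key"
claim = {"statement": "Long Short-Term Memory (LSTM) introduced in: "
                      "1997 by Hochreiter and Schmidhuber."}
sig = hmac.new(
    key,
    json.dumps(claim, sort_keys=True, separators=(",", ":")).encode(),
    hashlib.sha256,
).hexdigest()
print(verify_envelope({"claim": claim, "signature": sig}, key))  # True
```

Swap the demo key for whatever key material SourceScore actually issues; if the service signs a different byte string (e.g. the raw response body), adjust `payload` accordingly.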