Verified claim · AI-ML · 100% confidence
Low-Rank Adaptation (LoRA) introduced in paper: LoRA: Low-Rank Adaptation of Large Language Models (Hu et al., 2021).
Last verified 2026-05-16 · Methodology veritas-v0.1 · d7b97d1b93d8d8bc
Structured fields
- Subject
- Low-Rank Adaptation (LoRA)
- Predicate
- introduced_in_paper
- Object
- LoRA: Low-Rank Adaptation of Large Language Models (Hu et al., 2021)
- Confidence
- 100%
- Tags
- lora · fine-tuning · foundational · hu · 2021 · microsoft
Sources (2)
[1] preprint · arXiv (Hu, Shen, Wallis, Allen-Zhu, Li, Wang, Wang, Chen) · 2021-06-17
LoRA: Low-Rank Adaptation of Large Language Models
“We propose Low-Rank Adaptation, or LoRA, which freezes the pretrained model weights and injects trainable rank decomposition matrices into each layer of the Transformer architecture, greatly reducing the number of trainable parameters for downstream tasks.”
[2] github release · Microsoft · 2021-06-30
LoRA reference implementation
Cite this claim
Ready-to-paste citation (Markdown / plain text):
Low-Rank Adaptation (LoRA) introduced in paper: LoRA: Low-Rank Adaptation of Large Language Models (Hu et al., 2021). — SourceScore Claim d7b97d1b93d8d8bc (verified 2026-05-16). https://sourcescore.org/api/v1/claims/d7b97d1b93d8d8bc.json
Embed this claim
Drop this iframe into any blog post, docs page, or knowledge base. The widget renders the signed claim + primary source + click-through to this canonical page. CC-BY 4.0; attribution included.
<iframe src="https://sourcescore.org/embed/claim/d7b97d1b93d8d8bc/" width="100%" height="360" frameborder="0" loading="lazy" title="Low-Rank Adaptation (LoRA) introduced in paper: LoRA: Low-Rank Adaptation of Large Language Models (Hu et al., 2021)."></iframe>
Related claims
Other verified claims sharing tags with this one — useful for LLM retrieval graphs and citation discovery.
LoRA (Low-Rank Adaptation) introduced in paper: LoRA: Low-Rank Adaptation of Large Language Models (Hu et al., 2021).
f191b2876790dc6e · 100% confidence · shares 5 tags (lora, fine-tuning, foundational…)
ResNet (Residual Networks) introduced in paper: Deep Residual Learning for Image Recognition (He et al., 2015).
4f55f77c4bfb316e · 100% confidence · shares 2 tags (foundational, microsoft)
Switch Transformer introduced in paper: Switch Transformers: Scaling to Trillion Parameter Models with Simple and Efficient Sparsity (Fedus et al., 2021).
3d9c14b9379038c9 · 100% confidence · shares 2 tags (foundational, 2021)
QLoRA introduced in paper: QLoRA: Efficient Finetuning of Quantized LLMs (Dettmers et al., 2023).
767cbe41c961be1a · 100% confidence · shares 2 tags (fine-tuning, foundational)
Rotary Position Embedding (RoPE) introduced in paper: RoFormer: Enhanced Transformer with Rotary Position Embedding (Su et al., 2021).
f8d64457ba9fd35b · 100% confidence · shares 2 tags (foundational, 2021)
Use this claim in your code
Fetch this signed envelope from your application. The response includes the verbatim excerpt, primary source URLs, and an HMAC-SHA256 signature you can verify locally for audit trails.
cURL
curl https://sourcescore.org/api/v1/claims/d7b97d1b93d8d8bc.json
JavaScript / TypeScript
const r = await fetch("https://sourcescore.org/api/v1/claims/d7b97d1b93d8d8bc.json");
const envelope = await r.json();
console.log(envelope.claim.statement);
// "Low-Rank Adaptation (LoRA) introduced in paper: LoRA: Low-Rank Adaptation of Large Language Models (Hu et al., 2021)."
Python
import httpx
r = httpx.get("https://sourcescore.org/api/v1/claims/d7b97d1b93d8d8bc.json")
envelope = r.json()
print(envelope["claim"]["statement"])
# "Low-Rank Adaptation (LoRA) introduced in paper: LoRA: Low-Rank Adaptation of Large Language Models (Hu et al., 2021)."
LangChain (retrieve-then-cite)
from langchain_core.tools import tool
import httpx

@tool
def get_low_rank_adaptation_lora_fact() -> dict:
    """Fetch the verified SourceScore claim for Low-Rank Adaptation (LoRA)."""
    r = httpx.get("https://sourcescore.org/api/v1/claims/d7b97d1b93d8d8bc.json")
    return r.json()
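The API description above mentions that the envelope carries an HMAC-SHA256 signature you can verify locally for audit trails. A minimal verification sketch, assuming the envelope exposes a `claim` object and a hex-encoded `signature` field computed over the canonically serialized claim with a shared key (the field names and canonicalization are assumptions, not the documented schema):

```python
import hashlib
import hmac
import json

def verify_envelope(envelope: dict, shared_key: bytes) -> bool:
    """Recompute HMAC-SHA256 over the canonical claim payload and compare it
    against the envelope's signature in constant time."""
    # Hypothetical schema: "claim" payload dict, hex-encoded "signature" field.
    payload = json.dumps(
        envelope["claim"], sort_keys=True, separators=(",", ":")
    ).encode("utf-8")
    expected = hmac.new(shared_key, payload, hashlib.sha256).hexdigest()
    # compare_digest avoids leaking the mismatch position via timing.
    return hmac.compare_digest(expected, envelope["signature"])
```

`hmac.compare_digest` is used instead of `==` so the comparison takes the same time whether the signatures differ early or late, which matters if verification results are observable to an attacker.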