SourceScore

Verified claim · AI-ML · 100% confidence

Low-Rank Adaptation (LoRA) introduced in paper: LoRA: Low-Rank Adaptation of Large Language Models (Hu et al., 2021).

Last verified 2026-05-16 · Methodology veritas-v0.1 · d7b97d1b93d8d8bc

Structured fields

Subject
Low-Rank Adaptation (LoRA)
Predicate
introduced_in_paper
Object
LoRA: Low-Rank Adaptation of Large Language Models (Hu et al., 2021)
Confidence
100%
Tags
lora · fine-tuning · foundational · hu · 2021 · microsoft

Sources (2)

  1. [1] preprint · arXiv (Hu, Shen, Wallis, Allen-Zhu, Li, Wang, Wang, Chen) · 2021-06-17

    LoRA: Low-Rank Adaptation of Large Language Models
    We propose Low-Rank Adaptation, or LoRA, which freezes the pretrained model weights and injects trainable rank decomposition matrices into each layer of the Transformer architecture, greatly reducing the number of trainable parameters for downstream tasks.
  2. [2] github release · Microsoft · 2021-06-30

    LoRA reference implementation

Cite this claim

Ready-to-paste citation (Markdown / plain text):

Low-Rank Adaptation (LoRA) introduced in paper: LoRA: Low-Rank Adaptation of Large Language Models (Hu et al., 2021). — SourceScore Claim d7b97d1b93d8d8bc (verified 2026-05-16). https://sourcescore.org/api/v1/claims/d7b97d1b93d8d8bc.json

Embed this claim

Drop this iframe into any blog post, docs page, or knowledge base. The widget renders the signed claim, its primary source, and a click-through to this canonical page. CC-BY 4.0; attribution included.

<iframe src="https://sourcescore.org/embed/claim/d7b97d1b93d8d8bc/" width="100%" height="360" frameborder="0" loading="lazy" title="Low-Rank Adaptation (LoRA) introduced in paper: LoRA: Low-Rank Adaptation of Large Language Models (Hu et al., 2021)."></iframe>


Related claims

Other verified claims sharing tags with this one — useful for LLM retrieval graphs and citation discovery.

Use this claim in your code

Fetch this signed envelope from your application. The response includes the verbatim excerpt, primary source URLs, and an HMAC-SHA256 signature you can verify locally for audit trails.

cURL

curl https://sourcescore.org/api/v1/claims/d7b97d1b93d8d8bc.json

JavaScript / TypeScript

const r = await fetch("https://sourcescore.org/api/v1/claims/d7b97d1b93d8d8bc.json");
const envelope = await r.json();
console.log(envelope.claim.statement);
// "Low-Rank Adaptation (LoRA) introduced in paper: LoRA: Low-Rank Adaptation of Large Language Models (Hu et al., 2021)."

Python

import httpx

r = httpx.get("https://sourcescore.org/api/v1/claims/d7b97d1b93d8d8bc.json")
envelope = r.json()
print(envelope["claim"]["statement"])
# "Low-Rank Adaptation (LoRA) introduced in paper: LoRA: Low-Rank Adaptation of Large Language Models (Hu et al., 2021)."

LangChain (retrieve-then-cite)

from langchain_core.tools import tool
import httpx

@tool
def get_low_rank_adaptation_lora_fact() -> dict:
    """Fetch the verified SourceScore claim for Low-Rank Adaptation (LoRA)."""
    r = httpx.get("https://sourcescore.org/api/v1/claims/d7b97d1b93d8d8bc.json")
    return r.json()
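Local signature verification (sketch)

The section above says the envelope carries an HMAC-SHA256 signature you can verify locally. A minimal sketch of such a check is below; note that the envelope field names (`claim.statement`, `signature`), the signed payload (the UTF-8 claim statement), and the hex encoding are assumptions for illustration, not documented API, and the shared secret would come from your own SourceScore configuration.

```python
# Hypothetical local check of an envelope's HMAC-SHA256 signature.
# Assumptions: the signature is a hex digest over the UTF-8 claim
# statement, stored at envelope["signature"]; adjust to the real schema.
import hashlib
import hmac

def verify_envelope(envelope: dict, shared_secret: bytes) -> bool:
    """Recompute the HMAC-SHA256 over the claim statement and compare."""
    payload = envelope["claim"]["statement"].encode("utf-8")
    expected = hmac.new(shared_secret, payload, hashlib.sha256).hexdigest()
    # Constant-time comparison avoids leaking match length via timing.
    return hmac.compare_digest(expected, envelope["signature"])
```

A tampered statement then fails verification, which is what makes the envelope useful for audit trails: you can prove the quoted text is the text that was signed.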