SourceScore

Verified claim · AI-ML · 100% confidence

LoRA (Low-Rank Adaptation) introduced in paper: LoRA: Low-Rank Adaptation of Large Language Models (Hu et al., 2021).

Last verified 2026-05-16 · Methodology veritas-v0.1 · f191b2876790dc6e

Structured fields

Subject: LoRA (Low-Rank Adaptation)
Predicate: introduced_in_paper
Object: LoRA: Low-Rank Adaptation of Large Language Models (Hu et al., 2021)
Confidence: 100%
Tags: lora · peft · fine-tuning · foundational · 2021 · microsoft

Sources (2)

[1] preprint · arXiv (Hu, Shen, Wallis, Allen-Zhu, Li, Wang, Wang, Chen) · 2021-06-17

    LoRA: Low-Rank Adaptation of Large Language Models
    We propose Low-Rank Adaptation, or LoRA, which freezes the pre-trained model weights and injects trainable rank decomposition matrices into each layer of the Transformer architecture, greatly reducing the number of trainable parameters for downstream tasks.
[2] GitHub release · Microsoft · 2021-06-17

    microsoft/LoRA — official implementation

Cite this claim

Ready-to-paste citation (Markdown / plain text):

LoRA (Low-Rank Adaptation) introduced in paper: LoRA: Low-Rank Adaptation of Large Language Models (Hu et al., 2021). — SourceScore Claim f191b2876790dc6e (verified 2026-05-16). https://sourcescore.org/api/v1/claims/f191b2876790dc6e.json

Embed this claim

Drop this iframe into any blog post, docs page, or knowledge base. The widget renders the signed claim, its primary source, and a click-through to this canonical page. Licensed CC BY 4.0; attribution is included.

<iframe src="https://sourcescore.org/embed/claim/f191b2876790dc6e/" width="100%" height="360" frameborder="0" loading="lazy" title="LoRA (Low-Rank Adaptation) introduced in paper: LoRA: Low-Rank Adaptation of Large Language Models (Hu et al., 2021)."></iframe>

Related claims

Other verified claims sharing tags with this one — useful for LLM retrieval graphs and citation discovery.

Use this claim in your code

Fetch this signed envelope from your application. The response includes the verbatim excerpt, primary source URLs, and an HMAC-SHA256 signature you can verify locally for audit trails.

cURL

curl https://sourcescore.org/api/v1/claims/f191b2876790dc6e.json
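
To pull a single field out of the response, you can pipe the output through jq; the .claim.statement path below matches the envelope shape used in the JavaScript and Python examples that follow.

curl -s https://sourcescore.org/api/v1/claims/f191b2876790dc6e.json | jq -r '.claim.statement'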

JavaScript / TypeScript

const r = await fetch("https://sourcescore.org/api/v1/claims/f191b2876790dc6e.json");
const envelope = await r.json();
console.log(envelope.claim.statement);
// "LoRA (Low-Rank Adaptation) introduced in paper: LoRA: Low-Rank Adaptation of Large Language Models (Hu et al., 2021)."

Python

import httpx

r = httpx.get("https://sourcescore.org/api/v1/claims/f191b2876790dc6e.json")
envelope = r.json()
print(envelope["claim"]["statement"])
# "LoRA (Low-Rank Adaptation) introduced in paper: LoRA: Low-Rank Adaptation of Large Language Models (Hu et al., 2021)."
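
Verify the signature locally (Python)

The envelope's exact field names and canonicalization rules are not documented on this page, so the sketch below is illustrative only: it assumes the envelope carries a hex-encoded signature field, that the signature covers the compact, key-sorted JSON serialization of the claim object, and that you hold the shared verification key.

import hashlib
import hmac
import json

import httpx

# Hypothetical key; in practice you would obtain this from SourceScore.
VERIFY_KEY = b"your-shared-verification-key"

envelope = httpx.get("https://sourcescore.org/api/v1/claims/f191b2876790dc6e.json").json()

# Assumption: the signature covers the compact, key-sorted JSON form of the claim object.
payload = json.dumps(envelope["claim"], separators=(",", ":"), sort_keys=True).encode()
expected = hmac.new(VERIFY_KEY, payload, hashlib.sha256).hexdigest()

# Assumption: envelope["signature"] is a hex-encoded HMAC-SHA256 digest.
if hmac.compare_digest(expected, envelope["signature"]):
    print("signature OK")
else:
    print("signature mismatch; do not trust this envelope")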

LangChain (retrieve-then-cite)

import httpx

from langchain_core.tools import tool


@tool
def get_lora_low_rank_adaptation_fact() -> dict:
    """Fetch the verified SourceScore claim for LoRA (Low-Rank Adaptation)."""
    r = httpx.get("https://sourcescore.org/api/v1/claims/f191b2876790dc6e.json")
    return r.json()
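
Once defined, the tool behaves like any other LangChain runnable: bind it to an agent's tool list, or invoke it directly. A minimal direct call, assuming the envelope shape shown in the examples above:

envelope = get_lora_low_rank_adaptation_fact.invoke({})
print(envelope["claim"]["statement"])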