SourceScore

Verified claim · AI-ML · 100% confidence

GPT-4 Turbo context window tokens: 128000.

Last verified 2026-05-16 · Methodology veritas-v0.1 · 26314e3164e18b24

Structured fields

Subject
GPT-4 Turbo
Predicate
context_window_tokens
Object
128000
Confidence
100%
Tags
gpt-4-turbo · context · openai · 128k · devday

Sources (2)

  1. [1] official blog · OpenAI · 2023-11-06

    New models and developer products announced at DevDay
    GPT-4 Turbo … supports up to 128K tokens of context — equivalent to more than 300 pages of text in a single prompt.
  2. [2] docs · OpenAI

    OpenAI GPT-4 Turbo model documentation

Cite this claim

Ready-to-paste citation (Markdown / plain text):

GPT-4 Turbo context window tokens: 128000. — SourceScore Claim 26314e3164e18b24 (verified 2026-05-16). https://sourcescore.org/api/v1/claims/26314e3164e18b24.json

Embed this claim

Drop this iframe into any blog post, docs page, or knowledge base. The widget renders the signed claim + primary source + click-through to this canonical page. CC-BY 4.0; attribution included.

<iframe src="https://sourcescore.org/embed/claim/26314e3164e18b24/" width="100%" height="360" frameborder="0" loading="lazy" title="GPT-4 Turbo context window tokens: 128000."></iframe>


Related claims

Other verified claims sharing tags with this one — useful for LLM retrieval graphs and citation discovery.

Use this claim in your code

Fetch this signed envelope from your application. The response includes the verbatim excerpt, primary source URLs, and an HMAC-SHA256 signature you can verify locally for audit trails.

cURL

curl https://sourcescore.org/api/v1/claims/26314e3164e18b24.json

JavaScript / TypeScript

const r = await fetch("https://sourcescore.org/api/v1/claims/26314e3164e18b24.json");
const envelope = await r.json();
console.log(envelope.claim.statement); // "GPT-4 Turbo context window tokens: 128000."

Python

import httpx

r = httpx.get("https://sourcescore.org/api/v1/claims/26314e3164e18b24.json")
envelope = r.json()
print(envelope["claim"]["statement"])  # "GPT-4 Turbo context window tokens: 128000."
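Local signature verification

The envelope's HMAC-SHA256 signature can be recomputed and compared with the standard library. The sketch below assumes the signature lives under a "signature" key, that the signed payload is the canonical (sorted-key, compact) JSON of the "claim" object, and that you hold the shared key; none of those details are documented on this page, so treat the field names and canonicalization as assumptions to check against the real API.

```python
import hashlib
import hmac
import json

def verify_envelope(envelope: dict, shared_key: bytes) -> bool:
    """Recompute HMAC-SHA256 over the claim payload and compare in constant time.

    Assumes envelope = {"claim": {...}, "signature": "<hex digest>"} and that
    the payload is the claim serialized as compact, key-sorted JSON.
    """
    payload = json.dumps(
        envelope["claim"], sort_keys=True, separators=(",", ":")
    ).encode()
    expected = hmac.new(shared_key, payload, hashlib.sha256).hexdigest()
    # hmac.compare_digest avoids timing side channels when comparing digests.
    return hmac.compare_digest(expected, envelope["signature"])
```

Constant-time comparison matters here: a naive `==` on digests can leak how many leading characters match, which is why the sketch uses hmac.compare_digest.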

LangChain (retrieve-then-cite)

from langchain_core.tools import tool
import httpx

@tool
def get_gpt_4_turbo_fact() -> dict:
    """Fetch the verified SourceScore claim for GPT-4 Turbo."""
    r = httpx.get("https://sourcescore.org/api/v1/claims/26314e3164e18b24.json")
    return r.json()