SourceScore

Verified claim · AI-ML · 100% confidence

Triton inference server publicly released on: 2018-11 by NVIDIA — formerly TensorRT Inference Server.

Last verified 2026-05-16 · Methodology veritas-v0.1 · 78ec1ceed08a221c

Structured fields

Subject
Triton inference server
Predicate
publicly_released_on
Object
2018-11 by NVIDIA — formerly TensorRT Inference Server
Confidence
100%
Tags
triton · nvidia · inference · serving · open-source · released_on · 2018

Sources (2)

  1. [1] official blog · NVIDIA · 2018-11-15

    NVIDIA Triton Inference Server
    NVIDIA Triton Inference Server, formerly known as TensorRT Inference Server, is an open-source software that simplifies the deployment of AI models at scale in production.
  2. [2] github release · NVIDIA · 2018-11-15

    Triton Inference Server — official GitHub repository

Cite this claim

Ready-to-paste citation (Markdown / plain text):

Triton inference server publicly released on: 2018-11 by NVIDIA — formerly TensorRT Inference Server. — SourceScore Claim 78ec1ceed08a221c (verified 2026-05-16). https://sourcescore.org/api/v1/claims/78ec1ceed08a221c.json

Embed this claim

Drop this iframe into any blog post, docs page, or knowledge base. The widget renders the signed claim + primary source + click-through to this canonical page. CC-BY 4.0; attribution included.

<iframe src="https://sourcescore.org/embed/claim/78ec1ceed08a221c/" width="100%" height="360" frameborder="0" loading="lazy" title="Triton inference server publicly released on: 2018-11 by NVIDIA — formerly TensorRT Inference Server."></iframe>



Use this claim in your code

Fetch this signed envelope from your application. The response includes the verbatim excerpt, primary source URLs, and an HMAC-SHA256 signature you can verify locally for audit trails.

cURL

curl https://sourcescore.org/api/v1/claims/78ec1ceed08a221c.json

JavaScript / TypeScript

const r = await fetch("https://sourcescore.org/api/v1/claims/78ec1ceed08a221c.json");
const envelope = await r.json();
console.log(envelope.claim.statement);
// "Triton inference server publicly released on: 2018-11 by NVIDIA — formerly TensorRT Inference Server."

Python

import httpx

r = httpx.get("https://sourcescore.org/api/v1/claims/78ec1ceed08a221c.json")
envelope = r.json()
print(envelope["claim"]["statement"])
# "Triton inference server publicly released on: 2018-11 by NVIDIA — formerly TensorRT Inference Server."

LangChain (retrieve-then-cite)

from langchain_core.tools import tool
import httpx

@tool
def get_triton_inference_server_fact() -> dict:
    """Fetch the verified SourceScore claim for Triton inference server."""
    r = httpx.get("https://sourcescore.org/api/v1/claims/78ec1ceed08a221c.json")
    return r.json()
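Verify the signature

The envelope description above mentions an HMAC-SHA256 signature you can check locally for audit trails. A minimal verification sketch in Python's standard library, assuming the envelope exposes the claim payload under a "claim" key and a hex-encoded signature under "signature", and that the payload is canonicalized as compact, key-sorted JSON before signing. These field names, the canonicalization scheme, and how the shared key is distributed are assumptions, not documented SourceScore API.

```python
import hashlib
import hmac
import json

def verify_envelope(envelope: dict, key: bytes) -> bool:
    """Recompute HMAC-SHA256 over the canonical claim payload and compare it
    to the envelope's signature in constant time.

    Assumptions (not documented API): the payload lives under "claim", the
    signature is a hex digest under "signature", and canonicalization is
    compact JSON with sorted keys.
    """
    payload = json.dumps(
        envelope["claim"], sort_keys=True, separators=(",", ":")
    ).encode()
    expected = hmac.new(key, payload, hashlib.sha256).hexdigest()
    # compare_digest avoids leaking where the digests diverge via timing.
    return hmac.compare_digest(expected, envelope["signature"])
```

Constant-time comparison matters here: a naive `==` on digests can leak mismatch positions through timing, which is why the sketch uses `hmac.compare_digest`.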