VerifiedLLM

Cryptographic proof that an LLM said what it said.

On-chain attestation for every LLM API call

VerifiedLLM is a proxy that sits between you and your LLM provider. It hashes every request and response, commits those hashes to Solana, and returns a tamper-evident verification receipt.

How it works

1 Send your request

Call our proxy instead of the LLM API directly. We support OpenAI, Anthropic, xAI, and Gemini.

2 We hash everything

SHA-256 hash of the exact request bytes and the exact response bytes. Deterministic and verifiable.
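Conceptually, the hashing step is nothing more than SHA-256 over the raw bytes. A minimal sketch in Python (the sample request body is illustrative):

```python
import hashlib

def sha256_hex(data: bytes) -> str:
    """Return the lowercase hex SHA-256 digest of the exact bytes."""
    return hashlib.sha256(data).hexdigest()

# The same bytes always produce the same digest, so anyone holding the
# original request (or response) bytes can recompute and compare.
request_bytes = b'{"model": "gpt-4o", "messages": [{"role": "user", "content": "Hello"}]}'
print(sha256_hex(request_bytes))
```

Note that hashing the exact bytes matters: re-serializing the JSON (different key order, whitespace) would produce a different digest.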

3 On-chain commitment

Both hashes are committed to Solana via a memo transaction. Immutable, timestamped, public.
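A Solana memo carries an arbitrary UTF-8 string, so the commitment can be as simple as both digests packed into one payload. The layout below is a hypothetical sketch, not VerifiedLLM's documented on-chain format:

```python
import hashlib

def build_memo(request_bytes: bytes, response_bytes: bytes) -> str:
    """Pack both SHA-256 digests into a single memo string.

    The "verifiedllm:req=...:resp=..." layout is an assumption for
    illustration; the real on-chain format may differ.
    """
    req = hashlib.sha256(request_bytes).hexdigest()
    resp = hashlib.sha256(response_bytes).hexdigest()
    return f"verifiedllm:req={req}:resp={resp}"
```

Once the transaction lands, the memo (and its block timestamp) is publicly readable by anyone inspecting the chain.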

4 Get your receipt

Your LLM response comes back with a verification receipt containing both hashes, the Solana transaction signature, and a verification link.
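Checking a receipt offline amounts to recomputing both digests from the bytes you kept and comparing them to the receipt. A sketch, assuming hypothetical field names (check the actual receipt schema):

```python
import hashlib

def verify_receipt(receipt: dict, request_bytes: bytes, response_bytes: bytes) -> bool:
    """Recompute both SHA-256 digests and compare against the receipt.

    "request_hash" / "response_hash" are illustrative field names, not
    taken from VerifiedLLM's documented schema.
    """
    ok_req = hashlib.sha256(request_bytes).hexdigest() == receipt["request_hash"]
    ok_resp = hashlib.sha256(response_bytes).hexdigest() == receipt["response_hash"]
    return ok_req and ok_resp
```

If either comparison fails, the bytes you hold are not the bytes that were attested, which is exactly the tampering the receipt is designed to expose.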

Supported providers

OpenAI · Anthropic · xAI · Google Gemini

Quick start

curl -X POST https://verifiedllm.com/v1/proxy/openai/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "X-Provider-Key: sk-your-openai-key" \
  -d '{
    "model": "gpt-4o",
    "messages": [{"role": "user", "content": "Hello"}]
  }'
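The same call from Python, using only the standard library. The endpoint and headers mirror the curl example above; the key is a placeholder, and the request is only sent when you uncomment the last line:

```python
import json
import urllib.request

url = "https://verifiedllm.com/v1/proxy/openai/v1/chat/completions"
payload = {"model": "gpt-4o", "messages": [{"role": "user", "content": "Hello"}]}

# Build the request; X-Provider-Key carries your own provider key.
req = urllib.request.Request(
    url,
    data=json.dumps(payload).encode(),
    headers={
        "Content-Type": "application/json",
        "X-Provider-Key": "sk-your-openai-key",  # placeholder
    },
)

# body = urllib.request.urlopen(req).read()  # sends the request
```

The response body should be the provider's usual completion plus the verification receipt described above.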