The LLM Stats API gives you direct, read-only access to the same dataset that powers llm-stats.com — every model, every benchmark score, every pricing change.

Base URL

https://api.llm-stats.com/stats/v1

Authentication

Send your API key as a Bearer token on every request.
Authorization: Bearer YOUR_API_KEY
Request access and create keys from the developer console.
Keys starting with ze_ are LLM Stats keys. The same key works for both the Stats API and the Gateway API.
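A minimal sketch of an authenticated request using only the Python standard library. The key value here is a placeholder (real keys start with ze_ and come from the developer console), and the LLM_STATS_API_KEY environment variable name is an assumption for illustration:

```python
import os
import urllib.request

# Placeholder key; substitute your own key from the developer console.
api_key = os.environ.get("LLM_STATS_API_KEY", "ze_example_key")

req = urllib.request.Request(
    "https://api.llm-stats.com/stats/v1/models",
    headers={"Authorization": f"Bearer {api_key}"},
)

# The request is now ready to send with urllib.request.urlopen(req).
print(req.get_header("Authorization"))
```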

Endpoints at a glance

Method  Path              Description
GET     /v1/models        Catalog with metadata, pricing, and category scores
GET     /v1/models/{id}   Full model detail with every benchmark score
GET     /v1/benchmarks    All benchmarks with categories and model counts
GET     /v1/scores        Score matrix; filter across models and benchmarks
GET     /v1/rankings      TrueSkill rankings by category
GET     /v1/updates       Recently added models (1–30 day lookback)
See the Endpoints section for full request and response schemas.
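Since every endpoint is a plain JSON GET, one small helper covers them all. This is a sketch, not an official client; it assumes endpoint paths append directly to the base URL (e.g. /models rather than repeating the /v1 prefix shown in the table):

```python
import json
import urllib.request

BASE_URL = "https://api.llm-stats.com/stats/v1"

def build_request(path: str, api_key: str) -> urllib.request.Request:
    """Build an authenticated GET request for a Stats API endpoint."""
    return urllib.request.Request(
        BASE_URL + path,
        headers={"Authorization": f"Bearer {api_key}"},
    )

def get(path: str, api_key: str) -> dict:
    """Send the request and decode the JSON response body."""
    with urllib.request.urlopen(build_request(path, api_key)) as resp:
        return json.load(resp)

# e.g. get("/benchmarks", api_key) would return the benchmark list.
```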

Errors

Every error uses the same envelope, so you only need to write the handling code once.
{
  "error": {
    "code": "not_found",
    "message": "Human-readable explanation.",
    "param": "model_id"
  }
}
The code field is the contract: branch on it, never on the message text.
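Because the envelope is uniform, a single handler can cover every endpoint. A sketch that branches on code as recommended (the rate_limited code name is an assumption; only not_found appears in the example above):

```python
import json

def handle_error(body: str) -> str:
    """Branch on the stable error code, never on the human-readable message."""
    err = json.loads(body)["error"]
    code = err["code"]
    if code == "not_found":
        return f"unknown value for {err.get('param')}"
    if code == "rate_limited":  # hypothetical code name, for illustration
        return "slow down and retry"
    return f"unhandled error: {code}"

example = (
    '{"error": {"code": "not_found", '
    '"message": "Human-readable explanation.", "param": "model_id"}}'
)
print(handle_error(example))  # unknown value for model_id
```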

Rate limits

Limits are applied per API key, per endpoint:
Endpoint          Limit
/v1/models/{id}   120 / minute
/v1/rankings      120 / minute
/v1/models        60 / minute
/v1/benchmarks    60 / minute
/v1/updates       60 / minute
/v1/scores        30 / minute
Exceeding a limit returns HTTP 429 with the standard error envelope and a Retry-After header. Need higher limits? Get in touch.
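A 429 with Retry-After can be handled generically. This sketch assumes Retry-After carries a delay in seconds (it can also be an HTTP date, which this does not handle) and falls back to one second when the header is absent:

```python
import time
import urllib.error
import urllib.request

def retry_delay(headers, default: float = 1.0) -> float:
    """Parse a Retry-After value in seconds, falling back to a default."""
    try:
        return float(headers.get("Retry-After", default))
    except (TypeError, ValueError):
        return default

def get_with_retry(req: urllib.request.Request, max_attempts: int = 3):
    """Send a request, sleeping and retrying when the API answers HTTP 429."""
    for attempt in range(max_attempts):
        try:
            return urllib.request.urlopen(req)
        except urllib.error.HTTPError as err:
            if err.code != 429 or attempt == max_attempts - 1:
                raise
            time.sleep(retry_delay(err.headers))
```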