
The agent envelope

Every response from api.gridrock.ai ships with an `_agent` object that describes, in machine-readable terms, what the response means and what an agent might do with it.

```json
{
  "h3": "8861aacd1bfffff",
  "city": "Mumbai",
  ...,
  "_agent": {
    "endpoint": "/v1/intel/hex/:h3",
    "version": "1.0.0",
    "summary_for_llm": "Returns merged labels, admin polygons, transit, top POIs and environmental signals for a single H3 cell.",
    "typical_use_case": "An LLM agent answering a 'tell me about this place' question.",
    "next_endpoints": ["/v1/hex/freshness", "/v1/env/aqi", "/v1/admin/ward"],
    "cache_ttl_s": 300,
    "cost_tokens": 35
  }
}
```

Because these hints travel with every response, an LLM can decide what to call next without your system prompt having to describe the API surface.
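As a rough sketch of what that looks like in practice, the snippet below parses the envelope from the example above and uses its hints to plan follow-up calls. The `plan_next_call` helper and the token-budget check are hypothetical, not part of the API; only the `_agent` field names come from the documented envelope.

```python
# Minimal sketch: an agent loop consuming the _agent envelope.
# The response dict mirrors the documented example; plan_next_call and
# the budget logic are illustrative assumptions, not part of the API.

response = {
    "h3": "8861aacd1bfffff",
    "city": "Mumbai",
    "_agent": {
        "endpoint": "/v1/intel/hex/:h3",
        "version": "1.0.0",
        "next_endpoints": ["/v1/hex/freshness", "/v1/env/aqi", "/v1/admin/ward"],
        "cache_ttl_s": 300,
        "cost_tokens": 35,
    },
}

def plan_next_call(resp: dict, budget_tokens: int) -> list[str]:
    """Return the follow-up endpoints the agent can afford, using only
    the machine-readable hints in the _agent envelope."""
    agent = resp.get("_agent", {})
    if agent.get("cost_tokens", 0) > budget_tokens:
        return []  # this branch of the plan is too expensive; stop fanning out
    return agent.get("next_endpoints", [])

print(plan_next_call(response, budget_tokens=50))
# → ['/v1/hex/freshness', '/v1/env/aqi', '/v1/admin/ward']
print(plan_next_call(response, budget_tokens=10))
# → []
```

The same hints drive caching: `cache_ttl_s` tells the agent how long this response stays valid before the cell should be re-fetched, so the loop can skip redundant calls entirely.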