Codec
Token-native binary transport for AI APIs.
The control plane for AI inference. Codec lets gateways, routers, agents, and tool dispatchers operate on raw token IDs end-to-end — no repeated detokenize-and-reserialize between hops.
Stop paying the serialization tax
A traditional AI API call converts tokens → text → JSON → text → tokens at every hop. Codec keeps the wire in binary token form the whole way through. The result is smaller payloads, faster tool dispatch, and clean handoffs between models with different vocabularies.
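To make the tax concrete, here is a small Python comparison of the same token IDs shipped as a JSON array versus packed binary words. The IDs and the fixed 32-bit layout are illustrative only, not Codec's actual wire format:

```python
import json
import struct

# Hypothetical token IDs from a model response (illustrative values).
token_ids = [15496, 11, 995, 0, 1212, 318, 257, 1332, 13]

# Text/JSON hop: each hop ships a JSON array of integers (real APIs
# also carry the decoded text), then re-parses it on the other side.
json_payload = json.dumps(token_ids).encode("utf-8")

# Binary hop: the same IDs as fixed-width 32-bit little-endian words,
# consumable directly with no parse step on the next hop.
binary_payload = struct.pack(f"<{len(token_ids)}I", *token_ids)

print(len(json_payload), len(binary_payload))
```

Even before compression, the binary form is smaller, and the gap widens once the JSON path also carries the detokenized text.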
Three primitives
Route
Wire-native streaming of token IDs as length-prefixed binary frames. Optional gzip, brotli, or dict-zstd compression.
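A minimal sketch of what length-prefixed framing looks like, assuming a 4-byte little-endian payload length followed by 32-bit token IDs; Codec's real frame layout and its compression envelope (gzip, brotli, dict-zstd) may differ:

```python
import struct

def encode_frame(token_ids):
    """Pack token IDs into one length-prefixed binary frame.

    Layout (an illustrative guess, not Codec's actual format):
    4-byte little-endian payload length, then 32-bit token IDs.
    """
    payload = struct.pack(f"<{len(token_ids)}I", *token_ids)
    return struct.pack("<I", len(payload)) + payload

def decode_frames(buf):
    """Yield lists of token IDs from a buffer of concatenated frames."""
    offset = 0
    while offset < len(buf):
        (length,) = struct.unpack_from("<I", buf, offset)
        offset += 4
        ids = list(struct.unpack_from(f"<{length // 4}I", buf, offset))
        offset += length
        yield ids

stream = encode_frame([101, 2054, 2003]) + encode_frame([102])
print(list(decode_frames(stream)))  # → [[101, 2054, 2003], [102]]
```

Length prefixes let a receiver split frames without scanning for delimiters, which is what makes streaming dispatch cheap.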
Dispatch
MCP tool-call integration with leaf-mode bypass — token IDs forward transparently to tools that don't need to inspect them.
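The bypass idea can be sketched as a dispatcher that only decodes frames for tools that opt in to inspection. The tool names, registry shape, and decode step below are hypothetical, not Codec's or MCP's actual API:

```python
import struct

def decode_ids(frame):
    """Unpack a frame of 32-bit token IDs (illustrative layout)."""
    return list(struct.unpack(f"<{len(frame) // 4}I", frame))

TOOLS = {
    # Leaf tool: never looks inside the frame; bytes pass through untouched.
    "blob_store": {"leaf": True, "fn": lambda raw: ("stored", len(raw))},
    # Inspecting tool: receives decoded token IDs.
    "id_filter": {"leaf": False, "fn": lambda ids: [i for i in ids if i > 100]},
}

def dispatch(tool_name, frame):
    tool = TOOLS[tool_name]
    if tool["leaf"]:
        return tool["fn"](frame)          # bypass: no detokenize, no parse
    return tool["fn"](decode_ids(frame))  # decode only when the tool needs it

frame = struct.pack("<3I", 7, 512, 4096)
print(dispatch("blob_store", frame))  # → ('stored', 12)
print(dispatch("id_filter", frame))   # → [512, 4096]
```

Leaf tools stay on the fast path: the dispatcher forwards opaque bytes and pays the decode cost only where inspection is actually required.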
Translate
Cross-vocabulary agent handoffs between models with different tokenizers, without round-tripping through text.
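One way to picture the handoff is a precomputed ID-to-ID map between two vocabularies. The toy vocabularies below are invented, and real tokenizers need a byte-level fallback for tokens with no exact surface match, which this sketch omits:

```python
# Toy sketch: translate token IDs between vocabularies without
# materializing decoded text on the wire. Vocabularies are illustrative.
src_vocab = {7: b"hello", 8: b" ", 9: b"world", 10: b"!"}
tgt_vocab = {b"hello": 200, b" ": 202, b"world": 203, b"!": 204}

# Precompute exact surface-form matches once; thereafter translation
# is a pure ID-to-ID lookup with no detokenize/retokenize round-trip.
id_map = {sid: tgt_vocab[s] for sid, s in src_vocab.items() if s in tgt_vocab}

def translate(ids):
    # Assumes full coverage in this toy; real systems fall back to
    # byte-level re-tokenization for unmapped IDs.
    return [id_map[i] for i in ids]

print(translate([7, 8, 9, 10]))  # → [200, 202, 203, 204]
```

The mapping is built once per model pair, so the per-token cost of a handoff drops to a table lookup instead of a full encode/decode cycle.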
The wins are measurable
Infra operators and agent builders
Teams running AI inference at scale. Builders of gateways, routers, and multi-agent systems. Anyone whose agents talk across models and tools and can't afford the serialization round-trip.
Licensing: source-available under BSL 1.1 — free for non-production and for organizations under US $5M annual revenue. Commercial licensing above that.