Codec

Token-native binary transport for AI APIs.

The control plane for AI inference. Codec lets gateways, routers, agents, and tool dispatchers operate on raw token IDs end-to-end — no repeated detokenize-and-reserialize between hops.

Why it matters

Stop paying the serialization tax

Every traditional AI API call converts tokens → text → JSON → text → tokens at each hop. Codec keeps the wire in binary token form the whole way through. The result is smaller payloads, faster tool dispatch, and clean handoffs between models with different vocabularies.

Three primitives

Route

Wire-native streaming of token IDs as length-prefixed binary frames. Optional gzip, brotli, or dict-zstd compression.
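To make the framing concrete, here is a minimal Python sketch of length-prefixed token frames. The exact layout — 4-byte little-endian length prefix, uint32 token IDs, no compression — is an illustrative assumption, not Codec's published wire format.

```python
import struct

def encode_frame(token_ids):
    # Pack token IDs as little-endian uint32s, prefixed with a
    # 4-byte payload length (assumed layout, for illustration only).
    payload = struct.pack(f"<{len(token_ids)}I", *token_ids)
    return struct.pack("<I", len(payload)) + payload

def decode_frames(buf):
    # Walk the buffer frame by frame: read the length prefix,
    # then unpack that many bytes of token IDs.
    frames, offset = [], 0
    while offset < len(buf):
        (length,) = struct.unpack_from("<I", buf, offset)
        offset += 4
        count = length // 4
        frames.append(list(struct.unpack_from(f"<{count}I", buf, offset)))
        offset += length
    return frames
```

Because each frame carries its own length, a receiver can stream frames as they arrive without any delimiter scanning or JSON parsing.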

Dispatch

MCP tool-call integration with leaf-mode bypass — token IDs forward transparently to tools that don't need to inspect them.

Translate

Cross-vocabulary agent handoffs between models with different tokenizers, without round-tripping through text.
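One way to picture this: precompute a direct ID-to-ID table offline from the two tokenizers' vocabularies, so the runtime handoff never touches strings. The vocabularies and fallback behavior below are hypothetical, not Codec's actual translation mechanism.

```python
# Hypothetical vocabularies for two different tokenizers (illustrative only).
VOCAB_A = {0: "<s>", 1: "hello", 2: " world", 3: "!"}
VOCAB_B = {"<s>": 10, "hello": 7, " world": 3, "!": 99}

def build_id_map(vocab_a, vocab_b):
    # Built once per model pair, offline: map each ID in vocab A to the
    # ID of the identical token string in vocab B, where one exists.
    return {a_id: vocab_b[tok] for a_id, tok in vocab_a.items() if tok in vocab_b}

ID_MAP = build_id_map(VOCAB_A, VOCAB_B)

def translate(token_ids, id_map):
    # Runtime hot path: pure ID->ID lookups, no detokenization.
    # A real system would need a slow path for unmapped IDs
    # (e.g. re-tokenizing just that span); omitted here for brevity.
    return [id_map[t] for t in token_ids if t in id_map]
```

The point of the sketch: once the table exists, the agent-to-agent hop is a table lookup per token rather than a full detokenize-and-retokenize of the context.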

Numbers

The wins are measurable

1,707× smaller on the wire for 2K-token streams, same TTFB

100× faster tool-call detection vs. detokenize-and-regex

30% faster agent-to-agent handoff, with a 15.1× smaller wire footprint
Who it's for

Infra operators and agent builders

Teams running AI inference at scale. Builders of gateways, routers, and multi-agent systems. Anyone whose agents talk across models and tools and can't afford the serialization round-trip.

Licensing: source-available under BSL 1.1 — free for non-production and for organizations under US $5M annual revenue. Commercial licensing above that.