We have implemented “super-types”: strongly typed TypeScript types that are compatible with all providers (think of a superset in mathematics). Hence the name, Super SDK. The Gateway is extremely lightweight and does not wrap other SDKs. Instead, each provider's API is described in its own provider package (for example, @adaline/openai), which plugs into @adaline/gateway, where all the features live (see the usage sketch after the list below).
The fully local, production-grade Super SDK that provides a simple, unified, and powerful interface for calling more than 200 LLMs.
- Production-ready and used by enterprises.
- Fully local and NOT a proxy. You can deploy it anywhere.
- Comes with batching, retries, caching, callbacks, and OpenTelemetry support.
- Supports custom plugins for caching, logging, HTTP client, and more. You can use it like LEGOs and make it work with your infrastructure.
- Supports plug-and-play providers. You can run fully custom providers and still leverage all the benefits of Adaline Gateway.
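To make the provider-plugs-into-gateway flow concrete, here is a minimal sketch. The package names come from the description above, but the class and method names (`Gateway`, `OpenAI`, `chatModel`, `completeChat`) and the message shape are illustrative assumptions, not the confirmed API; check the package documentation for the real signatures.

```typescript
// Illustrative sketch only: class/method names and the message shape below are
// assumptions, not the confirmed @adaline/gateway API -- see the package docs.
import { Gateway } from "@adaline/gateway";
import { OpenAI } from "@adaline/openai";

const gateway = new Gateway(); // batching, retries, caching, callbacks, OTel live here
const openai = new OpenAI();   // the provider package only describes the OpenAI API

// The provider yields a strongly typed model object...
const model = openai.chatModel({
  modelName: "gpt-4o",
  apiKey: process.env.OPENAI_API_KEY as string,
});

// ...which plugs into the gateway, so the same call shape works for every provider.
const response = await gateway.completeChat({
  model,
  config: { temperature: 0.7, maxTokens: 256 },
  messages: [
    { role: "user", content: [{ modality: "text", value: "Hello!" }] },
  ],
});
console.log(response);
```

Swapping OpenAI for another provider package (e.g. an Anthropic provider, assuming one exists) would only change the two provider lines; the gateway call itself stays identical.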
## Features
- Strongly typed in TypeScript
- Isomorphic - works everywhere
- 100% local and private, NOT a proxy
- Tool calling support across all compatible LLMs
- Batching for all requests, with custom queue support
- Automatic retries with exponential backoff
- Caching with custom cache plug-in support (see the sketch after this list)
- Callbacks for fully custom instrumentation and hooks
- OpenTelemetry support to plug tracing into your existing infrastructure
- Plug-and-play custom providers for local and custom models
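As an example of the plug-in design, here is a sketch of a custom cache backed by Redis, along with how it might be wired into the gateway together with queue and retry settings. The `get`/`set` shape and the constructor option names are assumptions for illustration, not the documented plugin interface.

```typescript
// Hypothetical cache plug-in: the real plugin interface exposed by
// @adaline/gateway may differ -- treat the shape and option names as assumptions.
import Redis from "ioredis";

// A simple get/set cache backed by Redis, standing in for whatever store
// your infrastructure already runs.
class RedisCache {
  private client = new Redis(process.env.REDIS_URL as string);

  async get(key: string): Promise<string | null> {
    return this.client.get(key);
  }

  async set(key: string, value: string, ttlSeconds = 300): Promise<void> {
    await this.client.set(key, value, "EX", ttlSeconds);
  }
}

// Illustrative wiring (option names are assumed, not confirmed):
// const gateway = new Gateway({
//   cache: new RedisCache(),   // custom cache plug-in
//   queueConcurrency: 4,       // batching / queue tuning
//   maxRetries: 3,             // retries with exponential backoff
// });
```

Because the cache, queue, logging, and HTTP client are injected rather than baked in, you can point them at the Redis instance, job queue, or fetch wrapper your infrastructure already uses without touching the request code.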
* Collaborative playground with automatic versioning and multi-provider support.
* One-click evals such as context recall, llm-rubric (LLM as a judge), latency, and more, run against datasets to get a definitive score and catch regressions.
* Datasets that play well with logs and enable one-click creation of ‘good’ and ‘bad’ examples from real data.
* Logs that can enrich the analytics dashboard to track token usage, latency, cost, etc.
* Continuous evals sampled from live logs to monitor performance in real time.
## Coming soon
* One-click synthetic data to make iteration even faster.
* Auto-suggested prompt instructions, based on the eval and prompt-diff data already in the workspace.
* Multi-modality across the workspace - playground, datasets, evals, logs, history, and analytics.
* Prompt engineering → flow engineering for multi-turn, chain-of-thought (CoT), and tree-of-thought use cases.