TensorZero creates a feedback loop for optimizing LLM applications — turning production data into smarter, faster, and cheaper models.
⚙️🦀 Build portable, modular & lightweight Fullstack Agents
The AI-native proxy server for agents. Arch handles the pesky low-level work of building agentic apps, such as calling specific tools, routing prompts to the right agents, clarifying vague inputs, and unifying access to and observability of any LLM.
BionicGPT is an on-premise replacement for ChatGPT, offering the advantages of generative AI while maintaining strict data confidentiality.
AICI: Prompts as (Wasm) Programs
Stateful load balancer custom-tailored for llama.cpp 🏓🦙
Scalable, fast, and disk-friendly vector search in Postgres; the successor to pgvecto.rs.
High-scale LLM gateway, written in Rust, with OpenTelemetry-based observability included.
Simple, composable, high-performance, safe, and Web3-friendly AI agents for everyone.
Robot VLM and VLA (Vision-Language-Action) inference API that helps you manage multimodal prompts, RAG, and location metadata.
Burgonet Gateway is an enterprise LLM gateway that provides secure access and compliance controls for AI systems.
A lazy, high-throughput, and blazing-fast structured text generation backend.