📄️ Getting Started - E2E Tutorial
End-to-End tutorial for LiteLLM Proxy to:
🗃️ Config.yaml
3 items
🗃️ Setup & Deployment
7 items
📄️ Demo App
Here is a demo of the proxy. To log in, pass in:
🗃️ Architecture
3 items
🔗 All Endpoints (Swagger)
📄️ ✨ Enterprise Features
To get a license, get in touch with us here
🗃️ Making LLM Requests
2 items
🗃️ Authentication
7 items
🗃️ Admin UI
3 items
🗃️ Team Management
1 item
🗃️ Spend Tracking + Budgets
6 items
🔗 Load Balancing, Routing, Fallbacks
🗃️ Logging, Alerting, Metrics
5 items
🗃️ [Beta] Guardrails
9 items
🗃️ Secret Managers
2 items
📄️ Prompt Management
LiteLLM supports using Langfuse for prompt management on the proxy.
📄️ Caching
Cache LLM Responses
📄️ Modify / Reject Incoming Requests
- Modify data before making LLM API calls on the proxy
📄️ Post-Call Rules
Use this to fail a request based on the output of an LLM API call.