https://github.com/langfuse/langfuse/assets/2834609/a94062e9-c782-4ee9-af59-dee6370149a8
Managed deployment by the Langfuse team, generous free-tier (hobby plan), no credit card required.
```sh
# Clone repository
git clone https://github.com/langfuse/langfuse.git
cd langfuse

# Run server and database
docker compose up -d
```
→ Learn more about deploying locally
Langfuse is simple to self-host and keep updated. It currently requires only a single Docker container. → Self Hosting Instructions
Templated deployments: Railway, GCP Cloud Run, AWS Fargate, Kubernetes and others
You need a Langfuse public and secret key to get started. Sign up here and find them in your project settings.
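For example, the keys can be provided via environment variables (the values below are placeholders — copy the real keys from your project settings):

```shell
# Placeholder values — replace with the keys from your project settings.
export LANGFUSE_PUBLIC_KEY="pk-lf-example"
export LANGFUSE_SECRET_KEY="sk-lf-example"
export LANGFUSE_HOST="https://cloud.langfuse.com"  # or your self-hosted URL
```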
Note: We recommend using our fully async, typed SDKs that allow you to instrument any LLM application with any underlying model. They are available in Python (Decorators) & JS/TS. The SDKs will always be the most fully featured and stable way to ingest data into Langfuse.
You may want to use another integration to get started quickly or to implement a use case that we do not yet support. However, we recommend migrating to the Langfuse SDKs over time to ensure performance and stability.
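To illustrate the decorator-based tracing idea behind the SDKs, here is a minimal pure-Python sketch: wrap a function, record its inputs, output, and latency as a span, and collect spans in a buffer that the real SDKs would batch and send asynchronously. `traced` and `TRACE_BUFFER` are illustrative names, not part of the actual Langfuse SDK.

```python
import functools
import time
import uuid

# Illustrative in-memory buffer; the real SDKs batch spans and
# flush them asynchronously to the Langfuse backend.
TRACE_BUFFER = []

def traced(func):
    """Hypothetical stand-in for an SDK tracing decorator."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        span = {
            "id": str(uuid.uuid4()),
            "name": func.__name__,
            "input": {"args": args, "kwargs": kwargs},
        }
        start = time.perf_counter()
        try:
            result = func(*args, **kwargs)
            span["output"] = result
            return result
        finally:
            span["latency_s"] = time.perf_counter() - start
            TRACE_BUFFER.append(span)
    return wrapper

@traced
def summarize(text: str) -> str:
    # Stand-in for an LLM call.
    return text[:10]

print(summarize("hello world, this is a demo"))  # prints "hello worl"
```

The decorator pattern keeps application code unchanged apart from one annotation, which is why it scales well to nested LLM pipelines.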
See the → Quickstart to integrate Langfuse.
| Integration | Supports | Description |
|---|---|---|
| SDK | Python, JS/TS | Manual instrumentation using the SDKs for full flexibility. |
| OpenAI | Python, JS/TS | Automated instrumentation using drop-in replacement of OpenAI SDK. |
| Langchain | Python, JS/TS | Automated instrumentation by passing callback handler to Langchain application. |
| LlamaIndex | Python | Automated instrumentation via LlamaIndex callback system. |
| Haystack | Python | Automated instrumentation via Haystack content tracing system. |
| LiteLLM | Python, JS/TS (proxy only) | Use any LLM as a drop-in replacement for GPT. Use Azure, OpenAI, Cohere, Anthropic, Ollama, vLLM, SageMaker, HuggingFace, Replicate (100+ LLMs). |
| API | | Directly call the public API. OpenAPI spec available. |
Packages that integrate with Langfuse:
| Name | Description |
|---|---|
| Instructor | Library to get structured LLM outputs (JSON, Pydantic) |
| Mirascope | Python toolkit for building LLM applications. |
| AI SDK by Vercel | TypeScript SDK that makes streaming LLM outputs super easy. |
| Flowise | JS/TS no-code builder for customized LLM flows. |
| Langflow | Python-based UI for LangChain, designed with react-flow to provide an effortless way to experiment and prototype flows. |
In order of preference, the best ways to communicate with us:
This repository is MIT licensed, except for the `ee` folders. See LICENSE and docs for more details.
The public API provides GET routes to use data in downstream applications (e.g. embedded analytics). You can also access them conveniently via the SDKs (docs).
We take data security and privacy seriously. Please refer to our Security and Privacy page for more information.
By default, Langfuse automatically reports basic usage statistics of self-hosted instances to a centralized server (PostHog).
This helps us to understand how Langfuse is used and to prioritize improvements. None of this data is shared with third parties, and it does not include any sensitive information. We want to be transparent about this, and you can find the exact data we collect here.
You can opt out by setting `TELEMETRY_ENABLED=false`.
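For a docker compose deployment, disabling telemetry could look like this (a sketch; you can equally set the variable in your `.env` or deployment config):

```shell
# Start a self-hosted instance with usage telemetry disabled.
TELEMETRY_ENABLED=false docker compose up -d
```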