Langfuse Integrations Overview
Integrate your application with Langfuse to explore production traces and metrics.
Objectives:
- Capture traces of your application
- Add scores to these traces to measure and evaluate the quality of outputs (see the sketch below this list)
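As an illustration, here is roughly what both objectives look like with the low-level Python SDK (a minimal sketch; the trace and score attributes shown are example values, and the method names follow the v2 Python SDK):

```python
from langfuse import Langfuse

# Reads LANGFUSE_PUBLIC_KEY, LANGFUSE_SECRET_KEY and LANGFUSE_HOST from the environment
langfuse = Langfuse()

# 1. Capture a trace of your application, including an LLM generation
trace = langfuse.trace(name="my-feature", user_id="user-123")
trace.generation(
    name="llm-call",
    model="gpt-4o",
    input=[{"role": "user", "content": "Hello"}],
    output="Hi there!",
)

# 2. Add a score to the trace to measure/evaluate output quality
trace.score(name="user-feedback", value=1, comment="Helpful answer")
```

The integrations below produce the same kind of trace data, mostly without manual instrumentation.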
There are currently several main ways to integrate with Langfuse:
Main Integrations
Integration | Supports | Description |
---|---|---|
SDK | Python, JS/TS | Manual instrumentation using the SDKs for full flexibility. |
OpenAI | Python, JS/TS | Automated instrumentation using drop-in replacement of OpenAI SDK. |
Langchain | Python, JS/TS | Automated instrumentation by passing callback handler to Langchain application. |
LlamaIndex | Python | Automated instrumentation via LlamaIndex callback system. |
Haystack | Python | Automated instrumentation via Haystack content tracing system. |
LiteLLM | Python, JS/TS (proxy only) | Use any LLM as a drop-in replacement for GPT. Use Azure, OpenAI, Cohere, Anthropic, Ollama, vLLM, SageMaker, Hugging Face, Replicate (100+ LLMs). |
Vercel AI SDK | JS/TS | TypeScript toolkit designed to help developers build AI-powered applications with React, Next.js, Vue, Svelte, Node.js. |
API | Any (HTTP) | Directly call the public API. OpenAPI spec available. |
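To make the "automated instrumentation" rows above concrete: the OpenAI integration is a drop-in replacement for the OpenAI Python SDK, so swapping the import is typically all that is required (a minimal sketch, assuming Langfuse credentials are set as environment variables):

```python
# Instead of: import openai
from langfuse.openai import openai  # drop-in replacement that traces all calls

completion = openai.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "What is Langfuse?"}],
)
```

The Langchain integration instead works by passing a Langfuse callback handler to your chain (again a sketch; the import path follows the v2 Python SDK and the chain itself is only a placeholder example):

```python
from langchain_core.prompts import ChatPromptTemplate
from langchain_openai import ChatOpenAI
from langfuse.callback import CallbackHandler

langfuse_handler = CallbackHandler()

# A minimal example chain; any Langchain runnable works the same way
prompt = ChatPromptTemplate.from_template("Answer briefly: {question}")
chain = prompt | ChatOpenAI(model="gpt-4o")

response = chain.invoke(
    {"question": "What is Langfuse?"},
    config={"callbacks": [langfuse_handler]},
)
```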
Packages integrated with Langfuse
Name | Description |
---|---|
Instructor | Library to get structured LLM outputs (JSON, Pydantic) |
DSPy | Framework that systematically optimizes language model prompts and weights |
Dify | Open source LLM app development platform with no-code builder. |
Ollama | Easily run open source LLMs on your own machine. |
Mirascope | Python toolkit for building LLM applications. |
Flowise | JS/TS no-code builder for customized LLM flows. |
Langflow | Python-based UI for LangChain, designed with react-flow to provide an effortless way to experiment and prototype flows. |
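As an example of how these packages typically combine with Langfuse, Instructor can be pointed at the Langfuse-wrapped OpenAI client so that structured-output calls are traced automatically (a sketch, assuming the current `instructor.from_openai` API and the v2 Python SDK's wrapped `OpenAI` client):

```python
import instructor
from pydantic import BaseModel
from langfuse.openai import OpenAI  # Langfuse-instrumented OpenAI client

class UserInfo(BaseModel):
    name: str
    age: int

# Wrap the traced client with Instructor to get validated, structured outputs
client = instructor.from_openai(OpenAI())

user = client.chat.completions.create(
    model="gpt-4o",
    response_model=UserInfo,
    messages=[{"role": "user", "content": "John Doe is 30 years old."}],
)
```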
Unsure which integration to choose? Ask us on Discord or in the chat.
Request a new integration
We use GitHub Discussions to track interest in new integrations. Please upvote an existing request or add a new one if you'd like to see a new integration.
End-to-end examples
If you want to see how things work together, you can look at the end-to-end examples below. They are Jupyter notebooks that you can easily run in Google Colab or locally.
Generally, we recommend reading the getting-started guide for each integration first.
Integrations
- Integration: Azure OpenAI + Langchain
- Integration: DSPy
- Integration: Haystack
- OSS Observability for Instructor
- Integration: Langchain
- Open Source Observability for LangGraph
- Integration: Langserve
- Integration: LiteLLM Proxy
- Integration: LlamaIndex
- Integration: LlamaIndex (Instrumentation)
- Integration: LlamaIndex + Milvus Lite
- Monitoring LlamaIndex applications with PostHog and Langfuse
- Integration: Mirascope
- Integration: Mistral SDK
- Ollama Observability and Tracing for local LLMs using Langfuse
- OSS Observability for OpenAI Assistants API
- Integration: OpenAI SDK
- Observe OpenAI Structured Outputs with Langfuse
- JS Integration: Langchain
- JS Integration: LiteLLM Proxy
- JS Integration: OpenAI
- JS Tracing Example: Vercel AI SDK