LangSmith, a prominent platform for monitoring AI applications, has announced an integration with OpenTelemetry to enhance distributed tracing and observability, according to LangChain. The integration lets LangSmith ingest traces in OpenTelemetry format, giving developers a comprehensive view of application performance.
OpenTelemetry integration details
OpenTelemetry is an open standard for distributed tracing and observability that supports a wide range of programming languages, frameworks, and monitoring tools. With this integration, LangSmith's API layer can directly accept traces in OpenTelemetry format. Developers can point any supported OpenTelemetry exporter at the LangSmith OTEL endpoint, and the resulting traces are collected and made available within LangSmith. This setup combines LLM observability with system telemetry to provide a unified view of application performance.
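As a rough illustration of what pointing an exporter at LangSmith involves, the following Python sketch configures a standard OTLP/HTTP exporter; the endpoint URL and header name are assumptions to be replaced with the values from the LangSmith documentation, not confirmed details from the announcement.

# Sketch: pointing a standard OTLP/HTTP exporter at a LangSmith OTEL endpoint.
# The endpoint URL and the x-api-key header below are illustrative assumptions.
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

exporter = OTLPSpanExporter(
    endpoint="https://api.smith.langchain.com/otel/v1/traces",  # assumed endpoint
    headers={"x-api-key": "<your-langsmith-api-key>"},          # assumed auth header
)
provider = TracerProvider()
provider.add_span_processor(BatchSpanProcessor(exporter))
trace.set_tracer_provider(provider)

tracer = trace.get_tracer("my-app")
with tracer.start_as_current_span("llm-call"):
    pass  # application code that produces telemetry goes here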
Semantic conventions and supported formats
OpenTelemetry defines semantic conventions for a variety of use cases, including databases, messaging systems, and protocols such as HTTP and gRPC. LangSmith is particularly focused on the conventions for generative AI, an area where few standards exist yet. LangSmith currently supports tracing in the OpenLLMetry format, which provides basic instrumentation for a range of LLM providers, vector databases, and common frameworks. Support for other evolving semantic conventions is planned.
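To illustrate what such a convention looks like in practice, the sketch below attaches GenAI-style attributes to a manually created span; the attribute keys follow the still-evolving OpenTelemetry GenAI semantic conventions, so the exact names should be treated as indicative rather than final.

# Sketch: recording an LLM call as a span with GenAI-style semantic attributes.
# The gen_ai.* attribute keys follow the evolving OpenTelemetry GenAI semantic
# conventions and may differ between versions and instrumentation libraries.
from opentelemetry import trace

tracer = trace.get_tracer("genai-demo")
with tracer.start_as_current_span("chat openai") as span:
    span.set_attribute("gen_ai.system", "openai")
    span.set_attribute("gen_ai.request.model", "gpt-4o-mini")
    span.set_attribute("gen_ai.usage.input_tokens", 42)    # example values only
    span.set_attribute("gen_ai.usage.output_tokens", 128)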
Getting started with OpenTelemetry
To take advantage of this functionality, developers can start with an OpenTelemetry-based client such as the OpenTelemetry Python client. After installing the necessary dependencies and configuring environment variables, developers can begin tracing their application; the LangSmith dashboard then displays these traces to provide insight into application performance.
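A rough outline of that setup, assuming the standard OpenTelemetry Python packages and the usual OTLP environment variables (the endpoint and header values shown are assumptions to adapt from the official LangSmith documentation), might look like this:

# Rough getting-started outline with the OpenTelemetry Python client.
# Install dependencies first, for example:
#   pip install opentelemetry-sdk opentelemetry-exporter-otlp-proto-http
# Then configure the standard OTLP environment variables (values shown are
# assumptions; use the endpoint and API key from your LangSmith project):
#   export OTEL_EXPORTER_OTLP_ENDPOINT="https://api.smith.langchain.com/otel"
#   export OTEL_EXPORTER_OTLP_HEADERS="x-api-key=<your-langsmith-api-key>"
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

# The exporter reads the endpoint and headers from the environment variables above.
trace.set_tracer_provider(TracerProvider())
trace.get_tracer_provider().add_span_processor(BatchSpanProcessor(OTLPSpanExporter()))

tracer = trace.get_tracer("getting-started")
with tracer.start_as_current_span("handle-request"):
    pass  # application and LLM calls made here are exported to LangSmith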
Additional SDK integrations
LangSmith also integrates with other SDKs such as Traceloop and the Vercel AI SDK. This gives developers the flexibility to send trace data through a variety of SDKs while remaining compatible with a broad range of AI models and frameworks. For example, the Traceloop SDK supports a wide range of integrations, and the Vercel AI SDK offers client-side trace export through an exporter provided by the LangSmith library.
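For instance, routing Traceloop's output to LangSmith might look roughly like the Python sketch below; the parameter names and endpoint are assumptions based on the Traceloop SDK and should be checked against its documentation (the Vercel AI SDK path is TypeScript-based and not shown here).

# Sketch: initializing the Traceloop SDK so its OpenTelemetry traces are sent
# to a LangSmith OTEL endpoint. The parameter names and endpoint value are
# assumptions to verify against the Traceloop and LangSmith documentation.
from traceloop.sdk import Traceloop

Traceloop.init(
    app_name="my-llm-app",
    api_endpoint="https://api.smith.langchain.com/otel",  # assumed endpoint
    headers={"x-api-key": "<your-langsmith-api-key>"},    # assumed auth header
)
# After init, Traceloop's automatic instrumentation of LLM clients, vector
# stores, and frameworks emits spans that LangSmith ingests like any other
# OTEL trace.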
These advancements position LangSmith as a powerful solution for developers seeking comprehensive observability and performance monitoring for AI applications, leveraging OpenTelemetry to provide a detailed, integrated view of system operations.