# A Complete Guide to LLM Observability With OpenTelemetry and Grafana Cloud

![rw-book-cover](https://grafana.com/media/blog/llm-observability/llm-observability-otel-meta.png)

URL:: https://grafana.com/blog/2024/07/18/a-complete-guide-to-llm-observability-with-opentelemetry-and-grafana-cloud/
Author:: Ishan Jain

## AI-Generated Summary

This guide explains how to monitor large language model (LLM) applications using OpenTelemetry and Grafana Cloud. Observability is crucial for understanding performance, managing costs, and improving user satisfaction in LLM applications. Tools like OpenLIT help simplify capturing and analyzing telemetry data.

## Highlights

> Why this isn’t just plain API monitoring
>
> While LLM observability does involve monitoring external API calls to LLMs, it goes much further than traditional API monitoring. In standard API monitoring, the focus is primarily on request and error tracking. However, LLM observability captures detailed and valuable information, such as prompts, responses, associated costs, and token usage. ([View Highlight](https://read.readwise.io/read/01j7znj3djwgr1g89mv9zg2hve))
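
To make the highlight concrete, here is a minimal Python sketch of the kind of instrumentation the article describes: OpenLIT auto-instruments an LLM SDK call and exports traces and metrics over OTLP, which Grafana Cloud can ingest. The OTLP endpoint, authorization header, and model name below are placeholder assumptions, not values taken from the article.

```python
# Minimal sketch: instrument an OpenAI call with OpenLIT and export via OTLP.
# Endpoint/header values are placeholders; replace with your Grafana Cloud
# OTLP gateway URL and credentials.
import os

import openlit
from openai import OpenAI

# Standard OpenTelemetry exporter settings (illustrative values).
os.environ.setdefault(
    "OTEL_EXPORTER_OTLP_ENDPOINT", "https://otlp-gateway-<zone>.grafana.net/otlp"
)
os.environ.setdefault(
    "OTEL_EXPORTER_OTLP_HEADERS", "Authorization=Basic <base64 instanceID:token>"
)

# Enable OpenLIT auto-instrumentation; subsequent LLM calls emit telemetry
# that includes prompts, responses, cost estimates, and token usage.
openlit.init()

client = OpenAI()  # reads OPENAI_API_KEY from the environment
completion = client.chat.completions.create(
    model="gpt-4o-mini",  # assumed model name for illustration
    messages=[{"role": "user", "content": "Summarize observability in one line."}],
)
print(completion.choices[0].message.content)
```

With this in place, each completion request shows up in Grafana Cloud not just as a latency/error data point but with the prompt, response, cost, and token-usage attributes the highlight calls out as the difference from plain API monitoring.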