Trace Your AI Applications and Collect User Feedback

Intermediate | Rating: 4.1 | 1,241 deployments | Duration: 60 min

Learn to implement OpenTelemetry tracing for AI applications using Azure AI Foundry, including automatic instrumentation and user feedback collection.

Skills Validated

Python · Azure AI Foundry · OpenTelemetry · Tracing

Lab Overview & Objectives

OpenTelemetry tracing is an observability framework that captures telemetry data from AI applications by automatically recording API calls, timing, and custom events. Azure AI Foundry integrates with OpenTelemetry to monitor AI model calls and application behavior, helping organizations track performance and debug issues in production environments.

In this lab, you will implement OpenTelemetry tracing for AI applications using Azure AI Foundry and collect user feedback data. You'll learn how to set up automatic instrumentation for AI model calls, create custom spans and attributes for application context, and implement user feedback collection to track user satisfaction alongside technical metrics.
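In practice, enabling tracing comes down to two steps: point OpenTelemetry at the project's Application Insights resource, then turn on automatic instrumentation for the OpenAI SDK. The sketch below shows one way to wire this up; the package names, environment variables, and instrumentor class (`azure-monitor-opentelemetry`, `opentelemetry-instrumentation-openai-v2`, `APPLICATIONINSIGHTS_CONNECTION_STRING`, `OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT`) are assumptions to verify against your own environment during the lab.

```python
import os


def configure_tracing() -> None:
    """Send traces to Application Insights and auto-instrument OpenAI calls.

    A sketch only: package names and environment variables are assumptions
    based on the Azure Monitor OpenTelemetry distro; adapt to your project.
    """
    # Exporter for Azure Monitor / Application Insights
    # (pip install azure-monitor-opentelemetry).
    from azure.monitor.opentelemetry import configure_azure_monitor

    # Automatic instrumentation for the OpenAI SDK
    # (pip install opentelemetry-instrumentation-openai-v2).
    from opentelemetry.instrumentation.openai_v2 import OpenAIInstrumentor

    # Capturing prompt/completion text on spans is off by default for privacy;
    # opt in explicitly if you want conversation content in your traces.
    os.environ.setdefault(
        "OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT", "true"
    )

    # The connection string comes from the Application Insights resource
    # attached to your Azure AI Foundry project.
    configure_azure_monitor(
        connection_string=os.environ["APPLICATIONINSIGHTS_CONNECTION_STRING"]
    )
    OpenAIInstrumentor().instrument()
```

Once `configure_tracing()` has run, subsequent chat-completion calls made through the OpenAI SDK are recorded as spans with no further code changes.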

Objectives

Upon completion of this intermediate-level lab, you will be able to:

  • Configure Azure AI Foundry environment with Application Insights for telemetry collection
  • Deploy AI models and set up authentication for development environments
  • Implement automatic OpenTelemetry instrumentation for the OpenAI SDK to capture API calls and conversation content
  • Create custom spans and attributes to add application context to traces
  • View trace data in Azure AI Foundry portal including spans, attributes, and events

Who is this lab for?

This lab is designed for:

  • Software developers building AI-powered applications who need to implement production-ready observability
  • DevOps engineers responsible for monitoring and maintaining AI applications in cloud environments
  • AI/ML engineers who want to understand how their models perform in real-world user interactions
  • Site reliability engineers focused on ensuring AI application performance and user experience
  • Technical architects designing observability strategies for enterprise AI solutions
  • Product managers who need to understand user satisfaction and AI application performance metrics

Real-Time Validation

Our platform uses an automated validation engine to verify your configurations as you work through the lab modules. No multiple choice; just real-world proficiency.

Modules: 4 · Duration: 60 min

Lab Curriculum

01

Log in to Your Azure Account Using the Azure Portal

02

Enable Tracing in Your Project

03

Instrument the OpenAI SDK

04

Add Custom Spans and User Feedback