LLM Calls Interface

The LLM calls interface allows you to monitor and analyze language model usage across your tasks.

Current Access

Use the “LLM Calls” tab inside a task to view the LLM calls made during that task. We are working on detailed documentation covering (an illustrative sketch follows this list):
  • Usage monitoring
  • Cost analysis
  • Performance metrics
  • Error tracking
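
Until that documentation is available, here is a minimal, hypothetical sketch of the kind of usage and cost analysis the LLM call data can support. The `LLMCall` record, its field names, the model name, and the per-token prices are illustrative assumptions, not the product's actual schema or pricing.

```python
# Hypothetical sketch: the record fields, model name, and per-1K-token prices
# below are illustrative assumptions, not the actual LLM Calls schema.
from dataclasses import dataclass
from statistics import mean

# Assumed (prompt, completion) prices in USD per 1K tokens -- placeholders only.
PRICE_PER_1K = {
    "gpt-4o": (0.0025, 0.01),
}

@dataclass
class LLMCall:
    model: str
    prompt_tokens: int
    completion_tokens: int
    latency_ms: float
    error: str | None = None  # set when the call failed

def call_cost(call: LLMCall) -> float:
    """Estimate the cost of one call from its token counts."""
    prompt_price, completion_price = PRICE_PER_1K[call.model]
    return (call.prompt_tokens / 1000) * prompt_price + (
        call.completion_tokens / 1000
    ) * completion_price

def summarize(calls: list[LLMCall]) -> dict:
    """Aggregate usage, estimated cost, latency, and error rate across calls."""
    succeeded = [c for c in calls if c.error is None]
    return {
        "calls": len(calls),
        "total_tokens": sum(c.prompt_tokens + c.completion_tokens for c in calls),
        "total_cost_usd": round(sum(call_cost(c) for c in calls), 4),
        "avg_latency_ms": round(mean(c.latency_ms for c in calls), 1),
        "error_rate": round(1 - len(succeeded) / len(calls), 3) if calls else 0.0,
    }

if __name__ == "__main__":
    sample = [
        LLMCall("gpt-4o", 1200, 350, 840.0),
        LLMCall("gpt-4o", 900, 0, 120.0, error="rate_limited"),
    ]
    print(summarize(sample))
```

Running the sketch prints an aggregate summary (call count, total tokens, estimated cost, average latency, error rate) for the sample records, which mirrors the categories listed above.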
Join our Discord community for updates on LLM monitoring best practices.