LLM Calls Monitoring
Monitor and analyze LLM usage in the admin interface
LLM Calls Interface
The LLM calls interface allows you to monitor and analyze language model usage across your tasks.
Current Access
Use the “LLM Calls” tab inside a task to view the LLM calls for the task.
We are working on detailed documentation covering:
Usage monitoring
Cost analysis
Performance metrics
Error tracking
Join our Discord community for updates on LLM monitoring best practices.
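While detailed documentation is in progress, the metrics above can be derived from raw call records today. The sketch below is a minimal TypeScript example, assuming a hypothetical `LlmCall` record shape (field names like `promptTokens` and `latencyMs` are illustrative, not Browserable's actual schema), that aggregates a list of calls into usage, cost, latency, and error statistics:

```typescript
// Hypothetical record shape for one LLM call; Browserable's actual
// schema may use different field names.
interface LlmCall {
  model: string;
  promptTokens: number;
  completionTokens: number;
  latencyMs: number;
  error?: string; // present if the call failed
}

// Aggregate the four categories listed above: usage monitoring,
// cost analysis, performance metrics, and error tracking.
function summarize(calls: LlmCall[], costPer1kTokens: number) {
  const totalTokens = calls.reduce(
    (sum, c) => sum + c.promptTokens + c.completionTokens,
    0
  );
  const errors = calls.filter((c) => c.error).length;
  const avgLatencyMs =
    calls.reduce((sum, c) => sum + c.latencyMs, 0) / (calls.length || 1);
  return {
    calls: calls.length,
    totalTokens,
    estimatedCost: (totalTokens / 1000) * costPer1kTokens,
    errorRate: calls.length ? errors / calls.length : 0,
    avgLatencyMs,
  };
}

// Example: two calls, one of which timed out.
const stats = summarize(
  [
    { model: "gpt-4o", promptTokens: 900, completionTokens: 100, latencyMs: 1200 },
    { model: "gpt-4o", promptTokens: 500, completionTokens: 500, latencyMs: 800, error: "timeout" },
  ],
  0.005 // assumed blended $/1k tokens for illustration
);
console.log(stats);
```

The same aggregation applies regardless of where the records come from, so it can be reused once exported call logs become available.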