Monitoring Quickstart
Monitor your LLM requests and application functions.
Installation
First, you’ll need a Parea API key. See Authentication to get started.
After you’ve followed those steps, you are ready to install the Parea SDK client.
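Assuming you are using the Python SDK and the package name is parea-ai (check the SDK reference for your language), installation is a single pip command:

```shell
pip install parea-ai
```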
Start Logging
Use any of the code snippets below to start logging your LLM requests.
Parea supports automatic logging for OpenAI, Anthropic, and LangChain, and for any other model through Parea’s completion method.
The @trace decorator lets you group multiple functions into a single trace.
View Logs
Now you can view your trace logs on the Logs page. You will see a table of your logs, and any chains will be expandable. The log table supports search, filtering, and sorting.
If you click a log, it will open the detailed trace view. Here, you can step through each span and view inputs, outputs, messages, metadata, and other key metrics associated with a given trace.
You can also add these traces to a test collection to try new prompts against historical inputs, or open a trace in the Playground to iterate on the example.
What’s Next?
Explore our cookbooks for more examples on how to use Parea’s SDKs.
Learn how to use evaluation metrics to run experiments.