It’s good practice to use templated messages for your LLM calls. To directly associate inputs with templated LLM calls, pass a template_inputs argument to wrapped OpenAI and Anthropic calls; it holds key-value pairs of template variables and their corresponding values. When you do, the template_inputs are visualized in the trace and used to format the messages in your LLM calls. Note that any variable in the templated message must be enclosed in double curly braces ({{}}) in order to be replaced. See the OpenAI example below:
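Conceptually, the substitution amounts to replacing each {{variable}} occurrence in a message with the matching value from template_inputs. A minimal sketch of that behavior (not the SDK's actual implementation, which may handle edge cases such as missing variables differently):

```python
import re


def format_template(content: str, template_inputs: dict) -> str:
    """Replace each {{variable}} in the message content with its value
    from template_inputs; unknown variables are left untouched here."""
    return re.sub(
        r"\{\{\s*(\w+)\s*\}\}",
        lambda m: str(template_inputs.get(m.group(1), m.group(0))),
        content,
    )


print(format_template("Make up {{number}} people", {"number": "three"}))
# → Make up three people
```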

OpenAI
import os

from openai import OpenAI
from parea import Parea

client = OpenAI()
p = Parea(api_key=os.getenv("PAREA_API_KEY"))
p.wrap_openai_client(client)

response = client.chat.completions.create(
    model="gpt-4",
    messages=[
        {"role": "user", "content": "Make up {{number}} people"},
    ],
    template_inputs={"number": "three"},
)

This will result in the following trace: