POST /api/parea/v1/completion

Authorizations

x-user-id
string
header
required

Body

application/json
cache
boolean
default: true

If true, the completion is cached so that subsequent completions with the same inputs avoid additional latency and cost.

deployment_id
string | null

This is the ID of a specific deployed prompt. You can find your deployed prompts on the Deployments tab. If a deployment_id is provided, Parea will fetch all of the associated configuration, including the model name, model parameters, and any associated functions. Any information provided in the llm_configuration field will be used instead of the deployed prompt's fields.

end_user_identifier
string | null

Special field to track the end user interacting with your LLM app.

eval_metric_ids
integer[] | null

List of evaluation metric IDs deployed on Parea which should be used to evaluate the completion output.

experiment_uuid
string | null

Experiment UUID which is used to associate the log with an experiment.

fallback_strategy
string[] | null

Deprecated field

inference_id
string | null

Deprecated field; same as trace_id.

llm_configuration
object

LLM configuration parameters such as messages, functions, etc.

llm_inputs
object | null

Key-value pairs used as inputs to the prompt template. Only needs to be provided if a deployment_id is given or llm_configuration.messages are templated.
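
Below is a minimal sketch of what a templated llm_configuration and its matching llm_inputs might look like, written as Python dictionaries. The key names model, messages, functions, and model_params come from the fields referenced on this page; the role/content message shape and the {{variable}} placeholder syntax are assumptions for illustration.

```python
# Sketch only: an llm_configuration whose messages are templated, plus the
# llm_inputs that fill the template. The {{question}} placeholder syntax and
# the exact model_params keys are assumptions, not the authoritative schema.
llm_configuration = {
    "model": "gpt-4o",                         # model name
    "model_params": {"temperature": 0.0},      # model parameters (keys may differ)
    "messages": [
        {"role": "user", "content": "Answer concisely: {{question}}"},
    ],
}

llm_inputs = {"question": "What does the cache field do?"}  # fills the template
```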

log_omit
boolean
default: false

Equivalent to setting both log_omit_inputs and log_omit_outputs to true.

log_omit_inputs
boolean
default: false

If true, the inputs, llm_configuration.messages, llm_configuration.functions, and llm_configuration.model_params will not be logged.

log_omit_outputs
boolean
default: false

If true, the generated response will not be logged.

log_sample_rate
number | null
default: 1

If specified, this log and its entire associated trace will be logged with this probability. Must be between 0 and 1 (inclusive). Defaults to 1.0 (i.e., all logs are kept).

Required range: 0 <= x <= 1

metadata
object | null

Key-value pairs to be associated with the log.

name
string | null

Deprecated field

parent_trace_id
string | null

UUID of the parent log. If given, it will be used to associate this generation with a chain and to create hierarchical, nested logs.

project_name
string | null
default: "default"

Project name which is used to associate the log with a project.

project_uuid
string | null

Project UUID which is used to associate the log with a project. Does not need to be provided if the project_name is provided.

provider_api_key
string | null

Provider API key used to generate the response. If not given, API keys saved on the platform will be used.

retry
boolean
default: false

Deprecated field

root_trace_id
string | null

UUID of the root log. If given, it will be used to associate this generation with a chain and to create hierarchical, nested logs.

stream
boolean
default: false

Deprecated field. Use /completion/stream instead.

tags
string[] | null

List of tags to be associated with the log.

target
string | null

Optional ground truth output for the inputs. Will be used for evaluation and can be used when creating a test case from the log.

trace_id
string | null

UUID of the generation log. If not given, will be auto-generated.

trace_name
string | null

Name of the generation log. If not given, will be auto-generated in the format llm-{provider}.
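
A hedged end-to-end request example using Python and the requests library. The host in the URL is a placeholder for your Parea API base URL, the x-user-id value stands in for your credentials, and the payload values are illustrative only.

```python
import requests

# Placeholder host: substitute your Parea API base URL.
URL = "https://<parea-api-host>/api/parea/v1/completion"

payload = {
    "llm_configuration": {
        "model": "gpt-4o",
        "messages": [{"role": "user", "content": "Summarize: {{text}}"}],
    },
    "llm_inputs": {"text": "A short passage to summarize."},
    "project_name": "default",
    "metadata": {"env": "staging"},
    "tags": ["example"],
    "cache": True,
}

# Authentication header from the Authorizations section above (placeholder value).
resp = requests.post(URL, json=payload, headers={"x-user-id": "<your-api-key>"})
resp.raise_for_status()
body = resp.json()
print(body["content"], body["total_tokens"])
```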

Response

200 - application/json
cache_hit
boolean
required

If true, the completion was fetched from the cache.

content
string
required

Generated completion content.

cost
number
required

Cost of the completion in USD.

end_timestamp
string
required

End timestamp of the completion.

inference_id
string
required

UUID of the completion log. Same as the trace_id in the request, if provided.

input_tokens
integer
required

Number of tokens in the input.

latency
number
required

Latency of the completion in seconds.

model
string
required

Model name.

output_tokens
integer
required

Number of tokens in the output.

provider
string
required

Provider name.

start_timestamp
string
required

Start timestamp of the completion.

status
string
required

Status of the completion. Either 'success' or 'error'.

total_tokens
integer
required

Total number of tokens in the input and output.

error
string | null

Error message if the completion failed.

trace_id
string | null

UUID of the completion log. Will be the same as the trace_id in the request, if provided.
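
For orientation, here is a sketch of a 200 response body with purely illustrative values; only the field names and types are taken from the schema above.

```python
# Every value below is made up; field names and types match the schema above.
example_response = {
    "status": "success",
    "provider": "openai",
    "model": "gpt-4o",
    "content": "A short summary of the passage...",
    "input_tokens": 42,
    "output_tokens": 17,
    "total_tokens": 59,
    "cost": 0.00031,                              # USD
    "latency": 1.24,                              # seconds
    "start_timestamp": "2024-01-01T00:00:00Z",
    "end_timestamp": "2024-01-01T00:00:01Z",
    "inference_id": "1f0e2d3c-4b5a-6789-abcd-ef0123456789",
    "trace_id": "1f0e2d3c-4b5a-6789-abcd-ef0123456789",
    "cache_hit": False,
    "error": None,
}
```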