Execute Prompt Stream

POST
Executes a deployed Prompt and streams back the results.

Request

This endpoint expects an object.
inputs
list of unionsRequired
The list of inputs defined in the Prompt's deployment with their corresponding values.
prompt_deployment_id
stringOptional

The ID of the Prompt Deployment. Must provide either this or prompt_deployment_name.

prompt_deployment_name
stringOptional

The name of the Prompt Deployment. Must provide either this or prompt_deployment_id.

release_tag
stringOptional
Optionally specify a release tag if you want to pin to a specific release of the Prompt Deployment.
external_id
stringOptional
"Optionally include a unique identifier for tracking purposes. Must be unique for a given prompt deployment.
expand_meta
objectOptional

Optionally specify which additional metadata fields should be included in the API response.

raw_overrides
objectOptional

Optional overrides applied to the raw API request sent to the model host.

expand_raw
list of stringsOptional

Returns the raw API response data sent from the model host. Combined with raw_overrides, it can be used to access new features from models.

metadata
map from strings to anyOptional

Arbitrary JSON-serializable metadata to associate with this execution, e.g. for tracking or monitoring.
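Putting the request parameters above together, the JSON body might be assembled as in the following sketch (Python; the deployment name and input values are placeholders, and the mutual-exclusivity check mirrors the parameter docs above):

```python
import json

def build_request_body(inputs, prompt_deployment_name=None,
                       prompt_deployment_id=None, release_tag=None,
                       external_id=None, metadata=None):
    """Assemble the JSON body for the execute-prompt-stream request.

    Exactly one of prompt_deployment_name / prompt_deployment_id
    must be provided, per the parameter descriptions above.
    """
    if (prompt_deployment_name is None) == (prompt_deployment_id is None):
        raise ValueError("Provide exactly one of prompt_deployment_name "
                         "or prompt_deployment_id")
    body = {"inputs": inputs}
    if prompt_deployment_name:
        body["prompt_deployment_name"] = prompt_deployment_name
    if prompt_deployment_id:
        body["prompt_deployment_id"] = prompt_deployment_id
    if release_tag:
        body["release_tag"] = release_tag
    if external_id:
        body["external_id"] = external_id
    if metadata:
        body["metadata"] = metadata
    return body

# Hypothetical deployment name and input, for illustration only:
body = build_request_body(
    inputs=[{"type": "STRING", "name": "string", "value": "string"}],
    prompt_deployment_name="my-deployment",
    release_tag="LATEST",
)
payload = json.dumps(body)  # send as the POST body
```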

Response

This endpoint returns a stream of unions, where each event is one of:
Initiated
OR
Streaming
OR
Fulfilled
OR
Rejected
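Because each streamed event is tagged with one of the four states above, a client typically branches on that tag. The sketch below assumes each event arrives as a JSON object with a `state` field and that STREAMING events carry incremental output text; the exact wire format may differ, so treat the field names as illustrative:

```python
import json

def consume_stream(lines):
    """Fold a stream of JSON event lines into the final output.

    Assumes (for illustration) each line is a JSON object with a
    "state" of INITIATED, STREAMING, FULFILLED, or REJECTED, and
    that STREAMING events carry a chunk of text under "output".
    """
    chunks = []
    for line in lines:
        if not line.strip():
            continue
        event = json.loads(line)
        state = event.get("state")
        if state == "STREAMING":
            chunks.append(event.get("output", ""))
        elif state == "FULFILLED":
            # Terminal success state: return the accumulated output.
            return "".join(chunks)
        elif state == "REJECTED":
            # Terminal failure state: surface the error to the caller.
            raise RuntimeError(event.get("error", "prompt execution rejected"))
        # INITIATED events carry no output and are ignored.
    return "".join(chunks)

# Synthetic events standing in for a real streamed response:
events = [
    '{"state": "INITIATED"}',
    '{"state": "STREAMING", "output": "Hello, "}',
    '{"state": "STREAMING", "output": "world"}',
    '{"state": "FULFILLED"}',
]
result = consume_stream(events)  # → "Hello, world"
```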
POST
curl -X POST https://predict.vellum.ai/v1/execute-prompt-stream \
  -H "X_API_KEY: <apiKey>" \
  -H "Content-Type: application/json" \
  -d '{
    "inputs": [
      {
        "type": "STRING",
        "name": "string",
        "value": "string"
      }
    ]
  }'
Streamed Response