Prompt Retry Logic
Prompt Nodes support two selectable outputs: one from the model when it returns a valid response, and one for non-deterministic errors. Model hosts fail for all sorts of reasons, including timeouts, rate limits, and server overload. You can make your production-grade LLM features resilient to these failures by adding retry logic to your Workflows!
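For comparison, here is a minimal sketch of the same idea in plain Python, assuming a hypothetical call_model function that raises a TransientError on timeouts, rate limits, or server overload. The Workflow steps below achieve equivalent behavior with nodes instead of code.

```python
import time

class TransientError(Exception):
    """Hypothetical error raised on timeouts, rate limits, or server overload."""

def call_model(prompt: str) -> str:
    """Stand-in for a model call; assumed to raise TransientError on transient failures."""
    raise NotImplementedError

def call_model_with_retries(prompt: str, max_attempts: int = 3) -> str:
    # Retry the model call on transient failures, backing off between attempts.
    for attempt in range(1, max_attempts + 1):
        try:
            return call_model(prompt)
        except TransientError:
            if attempt == max_attempts:
                raise  # give up after the final attempt
            time.sleep(2 ** attempt)  # simple exponential backoff
```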
Implementation Steps
Add a Conditional Node (Error Check)
This node will read the new Error output from the Prompt Node and check whether it is not null.
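Conceptually, the Conditional Node's check is just a null test on the error output. A minimal sketch, assuming a hypothetical PromptResult container that mirrors the Prompt Node's two outputs:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class PromptResult:
    # Hypothetical container mirroring the Prompt Node's two outputs.
    text: Optional[str] = None
    error: Optional[str] = None

def should_retry(result: PromptResult) -> bool:
    # Mirrors the Conditional Node: take the retry branch when the Error output is not null.
    return result.error is not None
```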