Build multi-step AI apps with Vellum’s Workflows

Vellum Workflows help you quickly prototype, deploy, version, and monitor complex chains of LLM calls and the business logic that ties them together.

Workflows provide a low-code interface for defining these chains, so you get rapid feedback on how they behave across a variety of test cases that you define. Once you’re happy with a Workflow, you can “deploy” it and hit an API to invoke that Workflow from your application.

Once a Workflow is deployed, future changes to its definition are versioned, and invocations made from your application are logged. For a given invocation, you can view the inputs, outputs, and latency of each step along the way.
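Invoking a deployed Workflow from your application is an HTTP call to Vellum’s API. The sketch below is a minimal Python example; the endpoint path, header name, payload shape, and the deployment name `my-workflow-deployment` are assumptions for illustration, so check Vellum’s API reference (or the code snippet generated in the app) for the exact request format.

```python
import os

import requests

# Minimal sketch of invoking a deployed Workflow.
# The endpoint path, auth header, payload shape, and deployment name are
# illustrative assumptions; Vellum's API reference has the exact format.
response = requests.post(
    "https://predict.vellum.ai/v1/execute-workflow",  # assumed endpoint
    headers={"X-API-KEY": os.environ["VELLUM_API_KEY"]},  # assumed header name
    json={
        "workflow_deployment_name": "my-workflow-deployment",  # hypothetical deployment
        "inputs": [
            {"name": "query", "type": "STRING", "value": "What is fine tuning?"},
        ],
    },
)
response.raise_for_status()
print(response.json())
```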

Concepts

Workflows make heavy use of the following concepts:

  1. Input Variables
  2. Scenarios
  3. Nodes
  4. Edges
  5. Final Outputs

Let’s take a look at each.

Input Variables

The behavior of most Workflows depends on one or more dynamic Input Variables. For example, you could define a single Input Variable named query that your Workflow depends on.

Workflow Input Variables

Scenarios

A Scenario is a set of values for your Input Variables. In the above example, we have Scenario 1, which assigns a value of “What is fine tuning?” to the query Input Variable.

You can define as many Scenarios as you want and swap between them to test that your Workflow behaves the way you expect for each.
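If you also want to exercise your Scenarios from code, for example as a smoke test after deploying, you can represent each Scenario as a set of Input Variable values and invoke the Workflow once per Scenario. The values and structure below are purely illustrative, reusing the hypothetical request format sketched earlier.

```python
# Each Scenario is simply a set of values for the Workflow's Input Variables.
# These mirror what you'd define in the Vellum UI (illustrative values).
scenarios = [
    {"query": "What is fine tuning?"},
    {"query": "How do I rotate my API key?"},
]

for values in scenarios:
    # Convert a Scenario into the `inputs` list used when invoking the
    # deployed Workflow (see the invocation sketch above).
    inputs = [
        {"name": name, "type": "STRING", "value": value}
        for name, value in values.items()
    ]
    print(inputs)
```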

Nodes

Nodes are the steps in your Workflow where some action will take place. Some Nodes generate Outputs, whereas others are used purely to direct the flow of execution.

For example, the Prompt Node passes Input Variables into a Prompt and executes a call to an LLM. It generates an output that can then be used as an input to downstream Nodes.

Workflow Nodes
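Conceptually, a Prompt Node behaves roughly like the sketch below: it fills Input Variables into a prompt template, calls the model you’ve configured, and exposes the completion as an output for downstream Nodes. The OpenAI client and template here are only stand-ins for whatever provider and prompt the Node is actually configured with.

```python
from openai import OpenAI

client = OpenAI()  # stand-in for whichever model provider the Prompt Node uses


def prompt_node(query: str) -> str:
    """Rough sketch of a Prompt Node: template in the Input Variable, call the LLM."""
    prompt = f"Answer the following question using the help center docs:\n\n{query}"
    completion = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model choice
        messages=[{"role": "user", "content": prompt}],
    )
    # The completion becomes this Node's output, available to downstream Nodes.
    return completion.choices[0].message.content
```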

Edges

Edges connect Nodes and define the order in which they are executed. They are represented as the lines between Nodes.

Workflow Edges

Note that a Node has access to the output data from all upstream Nodes, not just the Node(s) that it’s directly connected to via an Edge.
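To make that point concrete, here is a tiny, purely illustrative sketch of execution order and output visibility: Nodes run in the order defined by Edges, and each Node reads from a shared pool of outputs produced by everything that ran before it, not only its direct predecessor.

```python
# Purely illustrative: Nodes run in Edge order, and each Node can read the
# outputs of every upstream Node, not just its direct predecessor.
def search_node(outputs: dict) -> str:
    return f"docs relevant to: {outputs['query']}"


def prompt_node(outputs: dict) -> str:
    # Reads both the original Input Variable and the Search Node's output.
    return f"answer to {outputs['query']!r} using {outputs['search']!r}"


outputs = {"query": "What is fine tuning?"}  # Workflow Input Variables
for name, node in [("search", search_node), ("prompt", prompt_node)]:
    outputs[name] = node(outputs)

print(outputs["prompt"])
```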

Final Output

There’s a special type of Node called a “Final Output Node.” Final Output Nodes are used to indicate which Node outputs you actually care about and want to surface as the overall output of the Workflow.

In the example below, I have a Final Output Node named final-output that subscribes to a string output coming from the OpenAI Help Center Prompt Node.

Workflow Final Output

Final Output Nodes are particularly important when you Deploy a Workflow and invoke it via API. By default, only the data that Final Output Nodes subscribe to will be returned by the API.

Note that you can have as many Final Output Nodes in a Workflow as you want, and you can assign each a name to differentiate the data associated with it in API responses.
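When you invoke the deployed Workflow, you then pick out the value you need by that name. Continuing the invocation sketch from earlier, the response shape below (a list of named outputs under `data`) is an assumption for illustration; consult Vellum’s API reference for the exact structure your endpoint returns.

```python
# Assumed response shape for illustration: a list of Final Outputs, each carrying
# the name you assigned in the Workflow. Check Vellum's API reference for the
# exact structure. `response` is the requests.Response from the earlier sketch.
result = response.json()
outputs_by_name = {
    output["name"]: output["value"] for output in result["data"]["outputs"]
}

print(outputs_by_name["final-output"])
```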