Changelog | February 2025

Workflow SDK Code Preview

February 18th, 2025

If you're participating in the Workflow SDK beta, you can now preview the SDK-code representation of your Workflows directly in the UI. To do so, click the preview button:

Workflow SDK Code Preview Button

Doing so will open a side panel with a full representation of your Workflow in code form.

Workflow SDK Code Preview Side Panel

This feature is still in beta, so please let us know if you encounter any issues or have feedback! You can learn more about how to opt into the Workflows SDK beta here.

Restore Button Moved to History Cards

February 16th, 2025

The Restore button has been moved from the top-right of Sandboxes pages to the History Cards themselves. This makes it easier to find the Restore button and revert to a prior state of your Prompt or Workflow Sandbox.

Workflow Node Layout Improvements

February 16th, 2025

There has been a minor improvement to the layout of Workflow Nodes. Now, input variables are always displayed above output types, making it easier to skim and understand a Node from top to bottom.

Code Execution, Templating, and Final Output Nodes are all affected by this change.

Model Picker Improvements

February 16th, 2025

The Model Picker has been updated to make it easier to find the model you’re looking for. You can now sort models by name (alphabetically), by the date they were introduced, or by when they were last used by someone in your Workspace.

Model Picker Improvements

Additionally, filters and sort preferences are saved across sessions, so you don’t have to reapply them every time.

Multiple Chat History Variables in Prompts

February 16th, 2025

Until now, we’ve restricted you to a single Chat History variable in a Prompt. It’s quite uncommon to need more than one, but we understand that there are cases where it can be useful. We’ve lifted this restriction, so you can now have multiple Chat History variables in a Prompt.

Application-Wide Performance Improvements

February 10th, 2025

We recently overhauled core pieces of our webserver to improve the performance of the Vellum web application across the board. You should generally notice snappier page load times and more responsive interactions throughout the app.

Support for xAI as a Model Host and Grok Models

February 7th, 2025

We’ve added support for xAI as a Model Host, and along with it, the following Grok models:

  • Grok Beta
  • Grok Vision Beta
  • Grok 2 1212
  • Grok 2 Vision 1212

Support for Multiple Gemini Models via Vertex AI

February 7th, 2025

We’ve added support for several new Gemini models via Vertex AI.

Support for DeepSeek V3 via Fireworks AI

February 7th, 2025

We’ve added support for the DeepSeek V3 model via Fireworks AI.

Support for Gemini 2.0 Flash 001 via Gemini

February 7th, 2025

We’ve added support for the Gemini 2.0 Flash 001 model via Gemini.

Support for OpenAI’s o3-mini 2025-01-31 Snapshot via Azure

February 3rd, 2025

We’ve added support for OpenAI’s o3-mini 2025-01-31 snapshot to be used as a self-hosted model via Azure.
