Changelog | February 2025

Application-Wide Performance Improvements

February 10th, 2025

We recently overhauled core pieces of our web server to improve performance across the Vellum web application. You should notice snappier page loads and more responsive interactions throughout the app.

Support for xAI as a Model Host and Grok Models

February 7th, 2025

We’ve added support for xAI as a Model Host, along with the following Grok models:

  • Grok Beta
  • Grok Vision Beta
  • Grok 2 1212
  • Grok 2 Vision 1212
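
If you want to try these models outside of Vellum first, xAI serves them through an OpenAI-compatible Chat Completions endpoint. Here’s a minimal sketch using the standard openai Python client; the base URL and model identifier below are xAI’s, not part of Vellum’s SDK:

```python
# Minimal sketch: calling Grok 2 1212 directly through xAI's
# OpenAI-compatible Chat Completions API (not Vellum's SDK).
import os

from openai import OpenAI

client = OpenAI(
    api_key=os.environ["XAI_API_KEY"],   # your xAI API key
    base_url="https://api.x.ai/v1",      # xAI's OpenAI-compatible endpoint
)

response = client.chat.completions.create(
    model="grok-2-1212",                 # e.g. "grok-beta" for the Beta model
    messages=[{"role": "user", "content": "Summarize this changelog entry."}],
)
print(response.choices[0].message.content)
```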

Support for Multiple Gemini Models via Vertex AI

February 7th, 2025

We’ve added support for several new Gemini models via Vertex AI.
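
As a rough illustration of how these models are served, here’s a minimal sketch of calling a Gemini model directly through Google’s Vertex AI Python SDK; the project, location, and model name shown are placeholders for your own setup, not values taken from this release:

```python
# Minimal sketch: calling a Gemini model through Vertex AI using Google's
# Vertex AI SDK. Project, location, and model name are placeholders.
import vertexai
from vertexai.generative_models import GenerativeModel

vertexai.init(project="my-gcp-project", location="us-central1")

model = GenerativeModel("gemini-1.5-pro-002")  # substitute the Gemini model you need
response = model.generate_content("Summarize this changelog entry.")
print(response.text)
```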

Support for DeepSeek V3 via Fireworks AI

February 7th, 2025

We’ve added support for the DeepSeek V3 model via Fireworks AI.
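
For reference, Fireworks AI exposes an OpenAI-compatible endpoint, so a direct call to DeepSeek V3 outside of Vellum might look roughly like this; the model identifier below is the ID Fireworks publishes for DeepSeek V3 and is shown here for illustration only:

```python
# Minimal sketch: DeepSeek V3 through Fireworks AI's OpenAI-compatible API.
# The model identifier may differ from what Fireworks currently publishes.
import os

from openai import OpenAI

client = OpenAI(
    api_key=os.environ["FIREWORKS_API_KEY"],
    base_url="https://api.fireworks.ai/inference/v1",
)

response = client.chat.completions.create(
    model="accounts/fireworks/models/deepseek-v3",
    messages=[{"role": "user", "content": "Explain what changed in this release."}],
)
print(response.choices[0].message.content)
```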

Support for Gemini 2.0 Flash 001 via Gemini

February 7th, 2025

We’ve added support for the Gemini 2.0 Flash 001 model via Gemini.
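
For a quick check against the Gemini API directly, a minimal sketch with the google-generativeai package might look like this; the API key comes from Google AI Studio and the prompt is a placeholder:

```python
# Minimal sketch: Gemini 2.0 Flash 001 through the Gemini API using the
# google-generativeai package.
import os

import google.generativeai as genai

genai.configure(api_key=os.environ["GEMINI_API_KEY"])

model = genai.GenerativeModel("gemini-2.0-flash-001")
response = model.generate_content("Summarize this changelog entry.")
print(response.text)
```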

Support for OpenAI’s o3-mini 2025-01-31 Snapshot via Azure

February 3rd, 2025

We’ve added support for using OpenAI’s o3-mini 2025-01-31 snapshot as a self-hosted model via Azure.
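
If you host the snapshot on your own Azure OpenAI resource, a direct call with the openai package’s AzureOpenAI client might look roughly like this; the endpoint, deployment name, and API version are placeholders for your own Azure configuration:

```python
# Minimal sketch: OpenAI's o3-mini (2025-01-31 snapshot) through an Azure
# OpenAI resource. Endpoint, deployment name, and API version are placeholders.
import os

from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],  # e.g. https://<resource>.openai.azure.com
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-12-01-preview",                    # assumed; use a version that supports o3-mini
)

response = client.chat.completions.create(
    model="o3-mini",  # the name of your Azure deployment for the 2025-01-31 snapshot
    messages=[{"role": "user", "content": "Summarize this changelog entry."}],
)
print(response.choices[0].message.content)
```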
