Building a RAG Chatbot from Scratch
In this tutorial, we’ll build a simple chatbot in Vellum’s Workflow Builder, then enhance it with Retrieval-Augmented Generation (RAG) capabilities to provide more accurate, context-aware responses.
This tutorial is divided into two parts:
- Building a basic chatbot
- Adding document context with RAG (coming soon)
Part 1: Building a Basic Chatbot
Let’s start by creating a simple chatbot that can respond to user messages.
Step 1: Create a New Workflow
Begin by creating a new Workflow in Vellum. Once you’re in the Workflow Builder, you’ll see an empty canvas with an Entrypoint node.
Step 2: Add a Prompt Node
Click and drag from the Entrypoint node to create your first connection. This will open the node selection panel.
Select “Prompt” from the list of available nodes.
Step 3: Configure Workflow Inputs
Next, we need to set up our Workflow to accept chat history as an input. Click on the “Inputs” tab in the left sidebar.
Click the “Add” button and select “Chat History” from the dropdown menu. This creates a chat history variable that will store the conversation between the user and the chatbot.
You can test your workflow by adding a sample message in the chat history. Click on the chat history variable and add a test message like “What is the capital of Peru?”
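Under the hood, a chat history is just an ordered list of messages, each with a role and some text. A minimal Python sketch of that structure (the field names here are illustrative, not Vellum’s exact schema):

```python
# A chat history is an ordered list of role/text messages.
# Field names are illustrative, not Vellum's exact schema.
chat_history = [
    {"role": "USER", "text": "What is the capital of Peru?"},
]

def add_message(history, role, text):
    """Append a message and return the updated history."""
    history.append({"role": role, "text": text})
    return history

add_message(chat_history, "ASSISTANT", "The capital of Peru is Lima.")
print(len(chat_history))  # 2 messages after the assistant's reply
```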
Step 4: Configure the Prompt Node
Now, let’s configure our Prompt Node to use the chat history. Click on the Prompt Node to open its configuration panel.
In the System Prompt area, enter a simple instruction:
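The exact wording is up to you; something minimal works well for a first test, for example:

```
You are a helpful assistant. Answer the user's questions concisely.
```

(This wording is just an illustration, not a required prompt.)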
Next, we need to connect our chat history input to the Prompt Node. In the left sidebar of the Prompt Node, click “Add” under Input Variables and select “Chat History” from the dropdown.
Click the input field to open a dropdown of available values and connect it to the Workflow Input chat_history variable we created earlier.
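Conceptually, the Prompt Node combines the system prompt with the chat history variable into a single message list for the model. A rough sketch of that assembly, outside Vellum and with illustrative field names:

```python
def build_messages(system_prompt, chat_history):
    """Prepend the system instruction to the conversation messages."""
    messages = [{"role": "SYSTEM", "text": system_prompt}]
    messages.extend(chat_history)
    return messages

chat_history = [{"role": "USER", "text": "What is the capital of Peru?"}]
messages = build_messages("You are a helpful assistant.", chat_history)
print([m["role"] for m in messages])  # ['SYSTEM', 'USER']
```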
Step 5: Configure the Final Output
Now we need to connect our Prompt Node to the Final Output node and configure what the workflow will return.
Connect your Prompt Node to the Final Output node by dragging from the output port of the Prompt Node to the input port of the Final Output node.
In the Final Output node, select the Prompt Node as the value to use.
Step 6: Run Your Workflow
Now it’s time to test your chatbot! Click the “Run” button in the top right corner of the Workflow Builder.
After your workflow finishes running, you’ll see the response from the LLM. In our example, it should answer that the capital of Peru is Lima.
Step 7: Test Multi-Turn Conversations
You can also test multi-turn conversations by using the Chat History tab. This makes it easier to simulate a back-and-forth conversation with your chatbot.
Click on the “Chat History” tab, and you’ll see your initial question. The workflow’s output is automatically added to the conversation. You can then add another message to continue the conversation.
Click the “Run” button in the chat interface to execute the workflow with the updated chat history.
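The multi-turn loop above boils down to: add the user’s message, run the workflow, append the model’s reply to the chat history, and repeat. A toy simulation of that cycle, with a stubbed model call standing in for the real LLM behind the Prompt Node:

```python
def fake_llm(messages):
    """Stand-in for the real model call made by the Prompt Node."""
    user_turns = sum(1 for m in messages if m["role"] == "USER")
    return f"Reply to user turn {user_turns}"

def run_turn(chat_history, user_text):
    """One chat turn: add the user message, run the model, record the reply."""
    chat_history.append({"role": "USER", "text": user_text})
    reply = fake_llm(chat_history)
    chat_history.append({"role": "ASSISTANT", "text": reply})
    return reply

history = []
run_turn(history, "What is the capital of Peru?")
run_turn(history, "What is its population?")
print(len(history))  # 4: two user messages, two assistant replies
```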
Congratulations! You’ve built a basic chatbot using Vellum’s Workflow Builder. This chatbot can maintain context across multiple messages and provide responses to user questions.
Next Steps
In the next tutorial, we’ll enhance this chatbot by adding RAG capabilities, which will allow it to retrieve and use information from document indexes to provide more accurate, context-aware responses.
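As a preview, the core idea of RAG is to retrieve the most relevant document snippets and prepend them to the prompt before the model answers. A toy sketch using naive keyword overlap as the retrieval score (real systems, including Vellum’s document indexes, use vector embeddings instead):

```python
def retrieve(query, documents, top_k=1):
    """Rank documents by naive keyword overlap with the query."""
    query_words = set(query.lower().split())
    scored = sorted(
        documents,
        key=lambda doc: len(query_words & set(doc.lower().split())),
        reverse=True,
    )
    return scored[:top_k]

documents = [
    "Our refund policy allows returns within 30 days.",
    "Shipping takes 3-5 business days within the US.",
]
# The retrieved snippet would be prepended to the prompt as context.
context = retrieve("What is the refund policy", documents)
print(context[0])  # the refund-policy document
```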