If you want to look like you know AI, you'd better know what LLM chains are.
I'll tell you enough that you'll sound smart at your next team meeting.
In this article I'd like to introduce Flowise, an open-source drag-and-drop tool for building and deploying LLM chains. Okay, great, but what's an LLM chain?
LLM chains are what ChatGPT, Bard, and custom chatbots are made of. Understanding LLM chains will not only give you insight into the inner workings of AI platforms but also provide the technical foundation for designing complex, customized AI applications.
You may think of LLM (Large Language Model) applications like ChatGPT as a simple black box with input and output. You type something in, and ChatGPT returns its output text. Simple, right?
The reality is a little more complex. At the very least, a chatbot needs another input besides your latest prompt: the chat history. An LLM itself does not remember chat history; the history is maintained in a separate memory store. Each time you type a new prompt, it is combined with the chat history and fed into the LLM. The model's output is then appended back to the chat history.
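This history loop can be sketched in a few lines of code. The sketch below is purely illustrative: `fake_llm`, `Chatbot`, and the message format are made-up names for this example, not Flowise's or OpenAI's actual internals.

```python
def fake_llm(prompt: str) -> str:
    """Stand-in for a real LLM API call: just echoes the last user line."""
    last_line = prompt.strip().splitlines()[-1]
    return f"You said: {last_line.removeprefix('User: ')}"

class Chatbot:
    def __init__(self) -> None:
        self.history: list[str] = []  # the separate memory store

    def send(self, user_message: str) -> str:
        # Combine the stored history with the new prompt...
        prompt = "\n".join(self.history + [f"User: {user_message}"])
        reply = fake_llm(prompt)
        # ...then append both turns back into memory for the next call.
        self.history.append(f"User: {user_message}")
        self.history.append(f"Assistant: {reply}")
        return reply

bot = Chatbot()
bot.send("Hello")
bot.send("How are you?")
# After two exchanges, the memory holds four entries (two per turn).
```

The key point is that the "memory" lives outside the model: the model only ever sees the single combined prompt string it is handed on each call.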
The complexity doesn't stop there. Think of chatbots that can browse the internet or create images and files. Because multiple input and output elements operate together in such LLM applications, this structure of data flow is called an "LLM chain." If you understand how LLM chains work, you can use that knowledge to build your own custom chatbots.
Flowise is probably the best open-source application for building custom LLM chains visually, without writing code. That also makes it an excellent educational tool if you want to learn about LLM chains. Once a Flowise LLM chain is built, it can be deployed to your own web server, unlike ChatGPT's custom GPTs. To install the application, go to its GitHub page and follow the install instructions. Since Flowise runs on Node.js, some basic familiarity with GitHub and Node.js is required to install and run it. (You don't need to know how to code in Node.js, just how to set it up.) To use OpenAI inside Flowise, you also need an OpenAI API key.
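At the time of writing, the quickest route in the Flowise README is installing via npm; the exact commands may change, so check the GitHub page for the current instructions.

```shell
# Install Flowise globally via npm (requires a recent Node.js)
npm install -g flowise

# Start the server; by default it listens on port 3000
npx flowise start
```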
Once Flowise is running, you can access it by navigating to http://localhost:3000 in your browser.
Click “Add New” to create a new chat flow.
Click the Plus button to “Add Nodes.” Expand “Chains.”
You start by adding a "chain" object. I have to admit that despite Flowise's drag-and-drop UI, building an LLM chain with the tool is not always intuitive. You might imagine connecting input and output to an LLM model to build a chain. Instead, you start by inserting an LLM chain object and connecting an LLM model to it. The "LLM Chain" object is a bare-bones object with no slot for chat history, so to create a basic chatbot example with chat history, we need to insert a "Conversation Chain" instead.
Next, under "Add Nodes," expand "Chat Models."
As you can see, there are many choices here, not just OpenAI. For now, let's stick with "ChatOpenAI." After adding it to the flow, set the credentials using your OpenAI API key: inside the ChatOpenAI box, click "Connect Credential" → "Create New" and enter your API key. The "Credential Name" can be anything you want.
You also need to add "Buffer Memory" (under "Memory") to handle chat history. Then add "Chat Prompt Template" (under "Prompts") to handle the prompt input. For fun, let's set the parameters to customize the interaction: our chatbot will talk like a pirate all the time.
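What you put in the prompt template is up to you. As an illustrative example (the exact wording is mine, not Flowise's), a system message along these lines gives the chatbot its pirate persona:

```
You are a helpful assistant who always answers in the voice of a
grizzled pirate. Sprinkle in "Arrr," "matey," and nautical slang,
no matter what the user asks.
```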
Now that you have four nodes, wire them together.
Important: before you test-drive the chain, save the flow by clicking the save icon at the top right. Then click the chat icon to try talking to it.
That's it! Our first custom LLM chain, and a customized pirate chatbot! Unlike OpenAI's custom GPTs, this chatbot can be deployed to your own web server or website. Flowise's deployment documentation explains how to deploy it to various platforms, including AWS and DigitalOcean. Once your Flowise instance is deployed and running, you can embed the chatbot in your website with a simple HTML snippet.
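The snippet follows the pattern in Flowise's embed documentation at the time of writing; the chatflow ID and host below are placeholders you would replace with your own (the ID is shown in your Flowise dashboard).

```html
<script type="module">
  import Chatbot from "https://cdn.jsdelivr.net/npm/flowise-embed/dist/web.js"
  Chatbot.init({
    chatflowid: "your-chatflow-id",       // placeholder: your flow's ID
    apiHost: "https://your-flowise-host", // placeholder: your deployed server
  })
</script>
```

Dropping this into a page adds a chat bubble backed by your deployed flow; check the Flowise docs for the current options, since the embed API may evolve.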
Of course, we have only scratched the surface of what you can do with Flowise. The meat of custom LLM chains is using custom data sources and custom actions. I hope to cover this more advanced topic in a future issue.