Customizing Your LLM Model on Mendable.ai
Use the custom prompt feature to take control of your model
One of the key features of Mendable is the Custom Prompt feature, which offers several use cases to enhance your chatbot's performance and provide a tailored experience for your customers. In this tutorial, we will focus on the three main use cases of the Custom Prompt feature: Hallucination Gatekeeping, Maintaining Voice, and Structure/Format Control. By making edits in the custom prompt, we can customize the output of our answers. This example will be trained on the Mendable docs.
Use Case 1: Hallucination Gatekeeping
One of the main use cases of the custom prompt feature is hallucination gatekeeping. This allows you to control the chatbot's responses to unpredictable or potentially harmful inputs. By customizing your prompt on Mendable, you can guide the chatbot to provide accurate and safe responses. Let's dive into an example:
Example
Let's say our product requires a high degree of accuracy, and we want to prevent the model from "guessing", ensuring it only answers when the answer (or information supporting it) can actually be found in the documentation.
Before the prompt change, I asked Mendable "How does Mendable's ClickUp Integration work?". We do not have a ClickUp Integration (yet). Here is the output:
Notice how the chatbot tried to answer anyway. To prevent this, we only need one sentence! In the custom prompt we can write: "If nothing is picked up in the information retrieval, say 'Sorry, I don't have the answer for that. Please contact support at help.mendable.ai for assistance!'".
Now… let's test with the same query to see if our custom prompt change is successful.
Awesome! We just prevented a hallucination! Feel free to experiment with different prompts and refine them based on your specific requirements.
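If it helps to see exactly what that one sentence is asking for, here is a rough, hypothetical sketch of the behavior it encodes. This is not Mendable's internal implementation, and the `retrieve()` function below is a made-up stand-in for retrieval over your docs; the point is simply that when nothing relevant is picked up, the canned fallback is returned instead of a guess.

```python
# Conceptual sketch only -- NOT Mendable's internal code.
# It mimics the custom prompt rule: if nothing is picked up in
# information retrieval, return the fallback instead of guessing.

FALLBACK = (
    "Sorry, I don't have the answer for that. "
    "Please contact support at help.mendable.ai for assistance!"
)

# Toy "documentation index"; note there is no entry for a ClickUp integration.
DOCS = {
    "custom prompt": "The Custom Prompt feature lets you steer the model's answers.",
    "hallucination": "Gatekeeping prompts keep answers grounded in the docs.",
}

def retrieve(query: str) -> list[str]:
    """Hypothetical stand-in for retrieval over the documentation."""
    return [text for topic, text in DOCS.items() if topic in query.lower()]

def answer(query: str) -> str:
    chunks = retrieve(query)
    if not chunks:
        # Nothing was picked up in information retrieval: don't guess.
        return FALLBACK
    return f"(an answer grounded in: {chunks[0]})"

print(answer("How does Mendable's ClickUp Integration work?"))  # -> fallback message
print(answer("How do I use the custom prompt feature?"))        # -> grounded answer
```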
Use Case 2: Maintaining Voice
Maintaining Voice is crucial for creating a consistent user experience. The Custom Prompt feature allows you to define specific language patterns, tones, or styles that align with your brand or product. Let's try it out!
Example
In this example, we'll focus on a couple of ways to bring our company's voice to life. Let's build:
- A nice introductory message to make the user feel welcomed
- A supportive voice when the user is running into issues
Let's jump back into the custom prompt. Starting with the introductory message, let's input something like "At the beginning of the conversation, say 'Welcome to Mendable and thanks for reaching out!'".
By specifying "at the beginning of the conversation" in the prompt, we've pinned the welcome message so it only appears on the first message of the conversation.
Additionally, the model can pick up on the tone of our customer's messages. If it senses that they're frustrated, we can customize a message to be as helpful and supportive as possible. I'll set this up as such: "If the user is running into issues, say 'I'm sorry you're running into issues! I'll try my best to help'. Then try to answer. After the answer, say 'If this answer didn't help, please reach out to support at help.mendable.ai!'"
We've taken a few steps to improve CX:
- Given the bot an understanding, helpful tone as the user works through their error
- Tried to answer the question
- Given a clear and easy outlet to the company support channel in the event that the issue exceeds the bot's abilities.
Here is everything that was added to the custom prompt:
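In plain text, that's the gatekeeping sentence from Use Case 1 plus the two additions from this section:

```text
If nothing is picked up in the information retrieval, say "Sorry, I don't have the answer for that. Please contact support at help.mendable.ai for assistance!"

At the beginning of the conversation, say "Welcome to Mendable and thanks for reaching out!"

If the user is running into issues, say "I'm sorry you're running into issues! I'll try my best to help". Then try to answer. After the answer, say "If this answer didn't help, please reach out to support at help.mendable.ai!"
```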
As expected, we have our welcome message in the first message of the conversation:
In our follow-up question, we didn't specifically say that we were "running into issues". However, the way we wrote the prompt allows the model to detect the negative tone in the user's question, and it uses the prompted response as a result:
Congrats! You've now learned how to get started matching your company's voice to its LLM.
Use Case 3: Structure and Format
Custom Prompt empowers you to guide the chatbot's responses according to your desired structure and formatting requirements. Let's explore an example:
Example
Sometimes the answer comes out as one large paragraph, which can be difficult to read. Here's an example of what this looks like:
To combat this, I'll just add one short line to the custom prompt:
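```text
In your answer, start a different key point in another paragraph.
```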
With this quick change, we can make the format a bit easier to digest. Testing the same question, here is the new output:
Much easier to read! This is just one of many ways you can customize the format to your liking.
Conclusion: We covered three main use cases of the Custom Prompt feature in Mendable: Hallucination Gatekeeping, Maintaining Voice, and Structure/Format. By customizing prompts, you can be confident that your chatbot provides accurate, consistent, and controlled responses. Feel free to explore more prompts and experiment with different scenarios to enhance your chatbot's conversational capabilities.
If you have any questions, feel free to reach out to us!