You can go into the settings and, under the Chatbot section, enable the Discussions feature. This will add a new tab where you can view a history of all the conversations that have taken place in your chatbots, presented in a human-readable format.
All of this data remains entirely on your server—Meow Apps does not process or store any of it. You can read this article to learn more.

You can also add a list of discussions (similar to ChatGPT) next to your chatbot by using the mwai_discussions shortcode on your website, so each user can see their own conversation history.
[mwai_discussions id="default" text_new_chat="+ Start New Discussion"]

Local Memory
If you enable the Local Memory option for your chatbot, the current discussion thread will be saved in the user’s browser local storage—not in your database. This means the conversation will persist even if the user refreshes the page or navigates to a different one, effectively caching the chat locally.
However, since this is browser-based, the conversation will not be preserved if the user clears their local storage, switches devices, or uses a different browser.
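To illustrate the idea, here is a minimal sketch of browser-side persistence via localStorage. It is conceptual only: the storage key and message shape below are hypothetical and do not reflect AI Engine's actual schema.

```typescript
// Hypothetical key and message shape, for illustration only.
interface ChatMessage {
  role: "user" | "assistant";
  content: string;
}

const STORAGE_KEY = "chatbot-discussion"; // hypothetical key name

// Save the current thread after each exchange.
function saveThread(messages: ChatMessage[]): void {
  localStorage.setItem(STORAGE_KEY, JSON.stringify(messages));
}

// Restore the thread on page load (or return an empty thread).
function restoreThread(): ChatMessage[] {
  const raw = localStorage.getItem(STORAGE_KEY);
  return raw ? (JSON.parse(raw) as ChatMessage[]) : [];
}

// Clearing browser storage removes the thread, which is why the conversation
// does not survive a storage reset, a device switch, or a different browser.
function clearThread(): void {
  localStorage.removeItem(STORAGE_KEY);
}
```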

History Strategy
If the model you have selected supports Responses (OpenAI), you will find a new section in your chatbot's settings called "Advanced", where you can choose the History Strategy. This setting dictates how the previous messages from the thread are sent with the current query to maintain context.
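For context, previous_response_id comes from OpenAI's Responses API: each response has an ID, and a follow-up request can reference that ID instead of resending the whole transcript. Below is a minimal sketch of that chaining, written against the OpenAI Node SDK directly rather than AI Engine's internal code; the model name is only an example.

```typescript
import OpenAI from "openai";

const client = new OpenAI(); // reads OPENAI_API_KEY from the environment

async function demo() {
  // First turn: there is no previous response to reference yet.
  const first = await client.responses.create({
    model: "gpt-4o-mini",
    input: "My name is Alice. Remember that.",
  });

  // Second turn: send only the new message and point at the previous
  // response so OpenAI restores the context server-side.
  const second = await client.responses.create({
    model: "gpt-4o-mini",
    input: "What is my name?",
    previous_response_id: first.id,
  });

  console.log(second.output_text); // should mention "Alice"
}

demo();
```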

Available Options
Automatic
Best choice for most users, as it optimizes performance while ensuring reliability.
- Default setting that intelligently chooses the best strategy for each situation
- Uses Incremental mode when a valid previous_response_id is available
- Automatically falls back to Full History mode when:
  - No previous response ID exists (first message in the conversation)
  - The response ID has expired (after 30 days)
  - The response ID is invalid or corrupted
Incremental
- Uses OpenAI’s previous_response_id parameter to maintain conversation state
- Only sends the latest message instead of the entire conversation history
- More efficient in terms of:
  - Reduced token usage (potentially lower costs)
  - Faster processing times
  - Lower bandwidth usage
- Limitations:
  - Only works with valid response IDs (these expire after 30 days per OpenAI policy)
  - Requires the conversation to have been initiated with the Responses API
  - May fall back to Full History if the response ID becomes invalid
Full History
- Sends the complete conversation history with every request
- Traditional approach used by most chat implementations
- More reliable for complex conversations with many turns
- Higher resource usage:
  - Increased token consumption
  - Larger request payloads
  - May impact response times for long conversations
- Always works regardless of conversation age or API changes
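To make the three options concrete, here is a conceptual sketch of what each strategy roughly amounts to at the API level. This is not AI Engine's implementation; the Turn type and model name are assumptions for illustration, and only previous_response_id is the actual OpenAI parameter involved.

```typescript
import OpenAI from "openai";

const client = new OpenAI();

// Hypothetical shape for a stored conversation turn.
interface Turn {
  role: "user" | "assistant";
  content: string;
}

// Incremental: send only the latest message plus the previous response ID.
function sendIncremental(latest: string, previousResponseId: string) {
  return client.responses.create({
    model: "gpt-4o-mini",
    input: latest,
    previous_response_id: previousResponseId,
  });
}

// Full History: resend the entire transcript with every request.
function sendFullHistory(history: Turn[], latest: string) {
  return client.responses.create({
    model: "gpt-4o-mini",
    input: [...history, { role: "user" as const, content: latest }],
  });
}

// Automatic: prefer Incremental when a response ID is available; fall back
// to Full History when the ID is missing, expired (30 days), or rejected.
async function sendAutomatic(history: Turn[], latest: string, previousResponseId?: string) {
  if (previousResponseId) {
    try {
      return await sendIncremental(latest, previousResponseId);
    } catch {
      // Invalid or expired previous_response_id: fall through to Full History.
    }
  }
  return sendFullHistory(history, latest);
}
```

In short, Automatic behaves like Incremental whenever it safely can and degrades to Full History otherwise, which is why it is the recommended default.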