Adding memory to your Chatbot using LangGraph | How to use MemorySaver | Beginner's Guide | Part 3
- Pratibha
- Jun 12
- 4 min read

So far, we’ve built a chatbot with LangGraph that can talk and even use external tools like web search (check out the last blog if you missed it). But what happens if we want our chatbot to actually remember things across multiple messages or interactions?
That’s where memory — persistent memory — comes in.
In this blog, we'll show you how to plug in memory using LangGraph's MemorySaver, and explain:
- What memory really means in LangGraph
- How threads and configurations help organize chat sessions
- When the bot remembers and when it forgets (intentionally!)
Let’s get started.
Note: This blog continues from where we left off in our previous blog, where we constructed a chatbot with tool support using LangGraph. We'll be building on top of that same graph in this tutorial by adding memory support. Make sure to complete the setup in the previous blog first.
What Is Memory in LangGraph?
When we talk to people, we don’t repeat ourselves every time we speak. We assume the other person remembers the context of the conversation so far. For a chatbot to feel natural and useful, it needs this kind of memory too.
Memory in General Terms
In traditional chatbots or LLM workflows, memory refers to the ability to retain past interactions. Without memory, each message is treated like the first time you're ever speaking to the bot. Not great for natural conversations, right?
That’s why we need memory management systems that can store and retrieve conversation history.
LangGraph and Memory
LangGraph implements memory through a concept called checkpointing. Think of it like taking snapshots of the conversation state at different points in time.
A checkpoint is like saving your game progress.
If the app crashes or you want to revisit a past state, you can go back to that checkpoint.
You can also fork off new threads from a past checkpoint, making it great for workflows or session branching.
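The checkpointing idea above can be sketched in plain Python. This is an illustrative toy (not LangGraph's actual implementation): each save appends a snapshot of the state under a thread ID, and you can read back the latest snapshot or fork a new thread from an earlier one.

```python
import copy

class ToyCheckpointer:
    """Toy illustration of checkpointing: per-thread lists of state snapshots."""

    def __init__(self):
        self.checkpoints = {}  # thread_id -> list of state snapshots

    def save(self, thread_id, state):
        # Deep-copy so later mutations don't rewrite history
        self.checkpoints.setdefault(thread_id, []).append(copy.deepcopy(state))

    def latest(self, thread_id):
        snaps = self.checkpoints.get(thread_id)
        return copy.deepcopy(snaps[-1]) if snaps else None

    def fork(self, src_thread, new_thread, index):
        # Branch a new thread from an earlier checkpoint of another thread
        self.save(new_thread, self.checkpoints[src_thread][index])

cp = ToyCheckpointer()
cp.save("1", {"messages": ["Hi there! My name is Will."]})
cp.save("1", {"messages": ["Hi there! My name is Will.", "Remember my name?"]})
cp.fork("1", "2", 0)  # thread 2 starts from thread 1's first snapshot

print(len(cp.checkpoints["1"]))  # 2
print(cp.latest("2"))            # {'messages': ['Hi there! My name is Will.']}
```

The real checkpointer does much more (metadata, pending writes, versioning), but the mental model is the same: snapshots keyed by thread, with the option to resume or branch.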
LangGraph lets you plug in different types of checkpointers (memory backends). One of the simplest and most beginner-friendly ones is:
Introducing: MemorySaver
LangGraph gives you a built-in memory manager called MemorySaver:
from langgraph.checkpoint.memory import MemorySaver
memory = MemorySaver()
This does a few important things:
- It stores the entire graph state in RAM.
- It automatically keeps track of conversation history across messages.
- It supports multiple threads, meaning different users or sessions can be isolated from each other.
Note: MemorySaver is great for development and local testing — but it’s volatile. If you restart your program, the memory is lost. For production, use persistent options like Redis or SQLite.
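To see why persistence matters, here is a stdlib-only toy sketch of disk-backed state storage keyed by thread_id. This is not LangGraph's SqliteSaver API, just the underlying idea: state written to a database survives a process restart, unlike MemorySaver's in-RAM storage.

```python
import json
import sqlite3

def save_state(conn, thread_id, state):
    """Persist the latest state for a thread as a JSON blob."""
    conn.execute(
        "INSERT OR REPLACE INTO checkpoints (thread_id, state) VALUES (?, ?)",
        (thread_id, json.dumps(state)),
    )
    conn.commit()

def load_state(conn, thread_id):
    """Load a thread's state, or None if the thread has never been seen."""
    row = conn.execute(
        "SELECT state FROM checkpoints WHERE thread_id = ?", (thread_id,)
    ).fetchone()
    return json.loads(row[0]) if row else None

conn = sqlite3.connect(":memory:")  # use a file path for real persistence
conn.execute("CREATE TABLE checkpoints (thread_id TEXT PRIMARY KEY, state TEXT)")
save_state(conn, "1", {"messages": ["Hi there! My name is Will."]})
print(load_state(conn, "1"))
print(load_state(conn, "2"))  # None: thread 2 has no saved state
```

In a real app you would reach for one of LangGraph's database-backed checkpointers instead of rolling your own, but the isolation-by-thread-ID pattern is the same.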
Setting Up Memory with the Graph
Now, we’ll connect this memory system to our existing chatbot graph. We assume you have the graph_builder already defined (from the previous tutorial).
graph = graph_builder.compile(checkpointer=memory)
This line tells LangGraph: “Use this memory instance (MemorySaver) to store every state transition and message history from now on.”
LangGraph now becomes aware of memory tracking and allows for better control flow management, history lookup, and branching.
Threads and Configurations — How LangGraph Separates Conversations
A thread in LangGraph is like a conversation room. You can have multiple rooms, each with its own memory.
To identify a thread, we pass a unique thread_id using a config dictionary:
config = {"configurable": {"thread_id": "1"}}
This config acts like a session identifier. If you reuse this thread ID in subsequent runs, LangGraph continues from the last checkpoint in that thread. If you change it, a new thread (with fresh memory) starts.
Every thread_id maintains its own memory. Switch thread IDs, and you start fresh — like a brand new chat window.
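In practice you would mint one config per user session. The helper below, config_for, is a hypothetical name (not part of LangGraph); it just shows the pattern of deriving an isolated thread config from a session identifier.

```python
def config_for(session_id: str) -> dict:
    """Build a LangGraph-style config dict with an isolated thread_id.

    Hypothetical helper for illustration: any unique session identifier
    (user ID, chat window ID, etc.) can serve as the thread_id.
    """
    return {"configurable": {"thread_id": session_id}}

alice = config_for("user-alice")
bob = config_for("user-bob")
print(alice["configurable"]["thread_id"])  # user-alice
print(bob["configurable"]["thread_id"])    # user-bob
```

Passing alice's config always resumes alice's conversation; passing bob's starts and continues a completely separate one.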
Chatting with Memory
Let’s run a message and see memory in action. We’ll send a simple message introducing ourselves.
user_input = "Hi there! My name is Will."
# Run the message in thread 1
events = graph.stream(
    {"messages": [{"role": "user", "content": user_input}]},
    config,
    stream_mode="values",
)
for event in events:
    event["messages"][-1].pretty_print()
Output: the assistant replies with a greeting that acknowledges the introduction, e.g. "Hi Will! Nice to meet you."
Here’s what happens under the hood:
- We pass a message from the user.
- LangGraph routes it through the defined graph logic (chatbot > tools > chatbot).
- Each node updates the graph's state, and MemorySaver stores the changes under thread_id="1".
Cool! That exchange is now stored in memory. Let's test whether our chatbot remembers the user's name:
user_input = "Remember my name?"
events = graph.stream(
    {"messages": [{"role": "user", "content": user_input}]},
    config,
    stream_mode="values",
)
for event in events:
    event["messages"][-1].pretty_print()
The model now has access to all prior context stored in memory — so it should respond with something like "Yes, your name is Will!"
What Happens If We Change Threads?
Let’s try the same question in a new thread:
events = graph.stream(
    {"messages": [{"role": "user", "content": user_input}]},
    {"configurable": {"thread_id": "2"}},
    stream_mode="values",
)
for event in events:
    event["messages"][-1].pretty_print()
Since thread 2 has no memory of thread 1, the bot responds as if it has no idea who you are.
Pro Tip: Use thread IDs like session IDs — one for each user or conversation.
Bonus: Peek into the Bot’s Brain
Want to see what’s actually stored?
snapshot = graph.get_state(config)
snapshot
This will show you the full internal state of the graph for thread 1 — including all messages so far.
Great for debugging or showing how much the bot has remembered!
Visualise the Flow
If you’re a visual learner, LangGraph can show you a diagram of how data flows through the chatbot.
from IPython.display import Image, display
try:
    display(Image(graph.get_graph().draw_mermaid_png()))
except Exception:
    # Rendering needs optional extras; skip the diagram if they're missing
    pass
This shows the structure of the graph: how the chatbot node connects to the tool node, and how control flows between them.
Note: draw_mermaid_png relies on optional rendering dependencies, and by default it uses a remote Mermaid renderer, so it also needs network access. Install the extras or check connectivity if the diagram doesn't appear.
Recap:
You now know how to:
- Add conversation memory using MemorySaver
- Separate chat sessions using thread_id
- Pass configuration objects in LangGraph
- Debug state and visualize control flow
All of this builds upon our previous chatbot — and makes it much more powerful and realistic.
Looking to integrate LangGraph, tools, memory, or agents into your own app? Codersarts can help — we specialize in AI-driven development. Reach out via our website or social media links!
Happy building!
