
Adding tools to your LangGraph Chatbots | Beginner's Guide | Part 2

Updated: Jun 12


So, you've built your first chatbot using LangGraph (if not, check out our previous blog here), and now you're wondering: "How can I make it smarter? Maybe even give it some real-world superpowers?"


Well, buckle up! In this post, we’re going to show you how to plug external tools (like search engines) into your chatbot so it can fetch real-time info, just like ChatGPT with browsing.



What Is a Tool?


Before we start adding tools, let’s take a moment to understand what a tool really is in the context of LangGraph and large language models.


Think of tools as external abilities that your chatbot doesn’t possess natively — but can call upon when needed. Just like a human might use a calculator to do complex math or a browser to search for current news, a language model can be given tools to perform tasks it can't do on its own.


For example:


  • A search tool can fetch live information from the web.

  • A calculator tool can solve equations precisely.

  • A code execution tool can run Python code.


Tools expand the capabilities of your chatbot from just “chatting” to doing real-world work.
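To make that concrete, here's a tiny plain-Python sketch of the idea (not LangGraph code; the tool names and functions are made up): a tool is just a named function that the surrounding program can run on the model's behalf.

```python
# Hypothetical sketch (plain Python, not LangGraph): tools are just named
# functions that the surrounding program can run on the model's behalf.

def calculator(expression: str) -> str:
    """Evaluate a simple arithmetic expression and return it as text."""
    # eval() is for illustration only; a real tool would validate its input.
    return str(eval(expression, {"__builtins__": {}}))

def fake_search(query: str) -> str:
    """Stand-in for a real web search tool."""
    return f"Top result for: {query}"

# The "toolbox" the chatbot can draw from, keyed by tool name.
TOOLS = {"calculator": calculator, "search": fake_search}

# The model can't do precise math natively, so it asks for the calculator:
print(TOOLS["calculator"]("6 * 7"))  # prints "42"
```

The rest of this post is about letting the model decide *when* to reach into that toolbox.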



Tools in LangGraph


LangGraph lets you define and manage tools in a clean, structured way. Here’s how it works:


  1. You define your tools using existing integrations or your custom code.

  2. You bind those tools to your language model using .bind_tools().

  3. LangGraph handles the flow — if your model asks for a tool, LangGraph detects that and routes execution accordingly.


There’s even a built-in node called ToolNode that makes this logic super clean. It listens for tool calls and runs the appropriate one.
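As a rough mental model, here is a simplified, hypothetical sketch in plain Python of the dispatch logic a ToolNode encapsulates (the real implementation is more involved, and the message shapes here are made up for illustration):

```python
# Hypothetical sketch of what a ToolNode does internally: read the tool
# calls from the last message, run each named tool, and return the
# results as new "tool" messages.

def get_weather(city: str) -> str:
    """Stand-in tool for the example."""
    return f"Sunny in {city}"

tools_by_name = {"get_weather": get_weather}

def run_tools(last_message: dict) -> list:
    results = []
    for call in last_message.get("tool_calls", []):
        tool = tools_by_name[call["name"]]   # look up the requested tool
        output = tool(**call["args"])        # run it with the model's arguments
        results.append({"role": "tool", "name": call["name"], "content": output})
    return results

msg = {"role": "assistant",
       "tool_calls": [{"name": "get_weather", "args": {"city": "Paris"}}]}
print(run_tools(msg))  # one "tool" message carrying the weather result
```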


Now that you understand what tools are and how LangGraph manages them — let’s dive into building your own tool-using chatbot!




What We’ll Build


We’re going to build a chatbot that:


  • Remembers conversation history

  • Decides when to use a web search tool (via Tavily)

  • Uses a tool only when needed — otherwise, it keeps chatting naturally



By the end, you’ll have a bot that can chat and search the web, all in one smart flow.




Setup and Dependencies


Prerequisites


Before diving in, make sure you have:


  • The basic chatbot from Part 1 of this series

  • An OpenAI API key

  • A Tavily API key (free tier available)


Install Libraries


pip install -U "langchain[openai]"
pip install -U langgraph langsmith langchain_tavily



Let's Start with the Code


We’ll go step by step, with short intros before each block to help you understand the purpose and context. This way, even if you’re new to LangGraph or LangChain, you won’t feel lost.



Step 1: Set Up the Chat Model


We're starting with a brain for our bot — the language model.


import os
from langchain.chat_models import init_chat_model

os.environ["OPENAI_API_KEY"] = "sk-..."
llm = init_chat_model("openai:gpt-4.1")

What’s Happening:


  • We set the OpenAI API key as an environment variable.

  • Then we initialize a GPT-4.1-based chat model.


You can swap in a cheaper model (e.g. "openai:gpt-3.5-turbo") by changing the string passed to init_chat_model.


Step 2: Define the State


Next, we give our chatbot a memory.


from typing import Annotated
from typing_extensions import TypedDict
from langgraph.graph.message import add_messages

class State(TypedDict):
    messages: Annotated[list, add_messages]

Think of this as your bot's memory.


  • TypedDict defines the structure: a dictionary with a key messages.

  • Annotated[list, add_messages] attaches the add_messages reducer, so new messages are appended to the list instead of replacing it.


So, every time the user or assistant says something, it's added to this list.
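Here is a deliberately simplified, hypothetical reducer in plain Python that captures the spirit of add_messages (the real reducer is smarter, e.g. it also handles message IDs and updates):

```python
# Hypothetical, simplified reducer in the spirit of add_messages:
# instead of overwriting the "messages" key, new messages are appended.
def append_messages(existing: list, new: list) -> list:
    return existing + new

state = {"messages": []}
state["messages"] = append_messages(
    state["messages"], [{"role": "user", "content": "Hi!"}])
state["messages"] = append_messages(
    state["messages"], [{"role": "assistant", "content": "Hello!"}])
print(len(state["messages"]))  # prints "2": the history grows, it isn't replaced
```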

Now that the memory is ready, let’s prepare the brain’s workflow.



Step 3: Initialize the Graph


from langgraph.graph import StateGraph, START, END

graph_builder = StateGraph(State)

  • StateGraph is the container that holds all your steps (nodes) and how they’re connected (edges).

  • START and END are just labels for where the graph begins and ends.


🎯 Our goal: Make a path that starts at START, goes to chatbot, uses tools (if needed), then returns to chatbot.


Step 4: Add a Tool (Tavily Web Search)


Next up: giving the chatbot some real-world searching powers.


from langchain_tavily import TavilySearch

tool = TavilySearch(max_results=2)
tools = [tool]

What is Tavily? A simple web search API that LangChain supports. We wrap it into a tool that our chatbot can call when it needs to look something up.


🔧 You can create your own tools too! Like a weather fetcher or a database connector.


Step 5: Combine LLM and Tools


Now let’s bind the tool to our language model.

llm_with_tools = llm.bind_tools(tools)

This line says: "Hey chatbot, here are some tools you can use when you see fit."



Step 6: Define the Chatbot Node


Let’s now use this upgraded model in a chat function.

def chatbot(state: State):
    return {"messages": [llm_with_tools.invoke(state["messages"])]}

  • Feeds the full conversation to the model

  • Gets a response

  • Adds that response to the list of messages


🍔 Think of this function as placing an order (all messages), getting a fresh burger (response), and storing it back.


Step 7: Add the Chatbot Node


Next: connecting this chatbot function into our graph.


graph_builder.add_node("chatbot", chatbot)


This adds our chatbot function into the graph under the label "chatbot".



Step 8: Add a Tool Node


To support tools, let’s now add the ToolNode.


from langgraph.prebuilt import ToolNode, tools_condition

tool_node = ToolNode(tools=[tool])
graph_builder.add_node("tools", tool_node)


  • This is a special LangGraph feature: ToolNode handles invoking the correct tool.

  • tools_condition will decide whether the chatbot’s last response needs a tool.


🧠 Behind the scenes, it checks if the assistant tried to call a tool (like a search query). If yes, it routes to tools.
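In plain Python, that routing check might look roughly like this (a hypothetical sketch, not the actual tools_condition source; the message shapes are simplified):

```python
# Hypothetical sketch of the routing check tools_condition performs:
# if the assistant's last message contains tool calls, route to the
# "tools" node; otherwise end the turn.
END = "__end__"  # stand-in for langgraph's END constant

def route(state: dict) -> str:
    last = state["messages"][-1]
    return "tools" if last.get("tool_calls") else END

chatty = {"messages": [{"role": "assistant", "content": "Hi there!"}]}
searching = {"messages": [{"role": "assistant",
                           "tool_calls": [{"name": "tavily_search",
                                           "args": {"query": "LangGraph"}}]}]}
print(route(chatty))     # prints "__end__"
print(route(searching))  # prints "tools"
```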


Step 9: Wire Up the Logic


Time to wire up how this logic flows.


graph_builder.add_conditional_edges("chatbot", tools_condition)
graph_builder.add_edge("tools", "chatbot")
graph_builder.add_edge(START, "chatbot")
graph = graph_builder.compile()

  • From chatbot, if a tool is needed: go to tools.

  • After using a tool: return to chatbot.

  • Start the flow from START.

  • compile(): finalizes the graph.



♻️ This loop can continue multiple times — chat ➔ tool ➔ chat ➔ tool.
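Putting the pieces together, here is a hypothetical plain-Python simulation of that loop, with a fake model and a fake search tool standing in for the real ones:

```python
# Hypothetical plain-Python simulation of the compiled graph's loop:
# chatbot -> (tool needed?) -> tools -> chatbot -> ... -> end.

def fake_model(messages):
    """First turn: request a search. Once a tool result exists: answer."""
    if not any(m["role"] == "tool" for m in messages):
        return {"role": "assistant",
                "tool_calls": [{"name": "search", "args": {"query": "LangGraph"}}]}
    return {"role": "assistant", "content": "LangGraph is a graph framework."}

def fake_search(query):
    """Stand-in for the Tavily tool."""
    return {"role": "tool", "content": f"Result for {query}"}

messages = [{"role": "user", "content": "What is LangGraph?"}]
while True:
    reply = fake_model(messages)        # the "chatbot" node
    messages.append(reply)
    calls = reply.get("tool_calls")
    if not calls:                       # tools_condition: no call means end
        break
    for call in calls:                  # the "tools" node
        messages.append(fake_search(**call["args"]))

print(messages[-1]["content"])  # prints "LangGraph is a graph framework."
```

Note how the final answer only arrives after the loop has passed through the tool once, exactly the chat ➔ tool ➔ chat pattern described above.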


(Optional) Visualize the Graph


Want to see how this looks visually? Let’s do that next.


from IPython.display import Image, display

try:
    display(Image(graph.get_graph().draw_mermaid_png()))
except Exception:
    # Rendering needs optional dependencies; it's safe to skip.
    pass


This shows a flowchart of your graph — very cool if you want to see how things connect!


🧼 If this doesn’t work, you may be missing optional visualization dependencies; the diagram is nice to have but safe to skip.


Step 10: Stream Results


With the graph wired up, let's move on to using it.


def stream_graph_updates(user_input: str):
    for event in graph.stream({"messages": [{"role": "user", "content": user_input}]}):
        for value in event.values():
            print("Assistant:", value["messages"][-1].content)

This sends your message through the graph and prints the latest message from each step, so you'll see tool results as well as the assistant's replies.

Let’s now bring it all together into an interactive chat.



Step 11: Let’s Chat!


while True:
    try:
        user_input = input("User: ")
        if user_input.lower() in ["quit", "exit", "q"]:
            print("Goodbye!")
            break
        stream_graph_updates(user_input)
    except Exception:
        # Fallback for environments where input() isn't available
        user_input = "What do you know about LangGraph?"
        print("User: " + user_input)
        stream_graph_updates(user_input)
        break

An infinite chat loop — until you say quit!


What You Should See


Example:


The bot decides to use Tavily to fetch the answer and comes back with the result.



Things to Watch Out For


  • Tools need API keys too! Tavily has a free tier.

  • Use llm_with_tools, not the plain llm, once tools are bound.

  • The flow only works because of the tools_condition. Without it, the tool node won’t be triggered.
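On that first point: langchain_tavily reads the key from the TAVILY_API_KEY environment variable, so you can set it the same way as the OpenAI key earlier (the value below is a placeholder, not a real key):

```python
import os

# Like the OpenAI key earlier, the Tavily key is read from an environment
# variable; the value below is a placeholder, not a real key.
os.environ["TAVILY_API_KEY"] = "tvly-..."
```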




You just leveled up your chatbot — from a memory-retaining assistant to a real-world problem solver with tools!





LangGraph makes it easy to:


  • Add tools

  • Create conditional workflows

  • Keep chat history intact





This is just the beginning — you can add multiple tools, memory, even multi-agent collaboration.






If you need help building custom AI apps with LangChain, LangGraph, or tool-using chatbots, reach out to Codersarts — we help businesses and developers bring AI ideas to life.
