Building applications with large language models (LLMs) is one of the most exciting frontiers in AI today. From chatbots and coding assistants to autonomous agents, LLMs are transforming how we build intelligent systems. But as our applications grow more complex—especially when multiple LLMs need to work together—we run into a few big questions:
- How do we manage the flow of information between LLMs?
- How can we coordinate actions between tools, APIs, and agents?
- How do we ensure the system knows where it is in a task?
Enter LangGraph: a powerful open-source framework that lets you build multi-agent workflows using graph-based state machines.
This tutorial is your beginner’s guide to using LangGraph. With hands-on examples and complete code, you’ll learn how to design and build LLM workflows that are structured, interactive, and even include human-in-the-loop decision-making.
🧠 Understanding LangGraph
LangGraph is an extension of LangChain, designed to help developers model complex LLM applications using graphs instead of chains. In a LangGraph application, each node represents a processing step (often an LLM), and edges represent how data moves from one step to the next.
Think of it like a flowchart for your AI app, where:
- Nodes = agents, tools, or functions
- Edges = logical decisions or state transitions
With LangGraph, you can:
- Coordinate multiple LLMs
- Add conditional logic
- Maintain and update state across steps
- Include tools and external APIs
- Introduce human-in-the-loop control
- Support streaming outputs and persistent memory
🔑 Key Concepts of LangGraph
1. Graph Structures
LangGraph applications are graphs, not linear chains. This gives you more flexibility and power to model branching workflows, feedback loops, retries, or even asynchronous interactions.
2. State Management
Every LangGraph application has a shared graph state, often represented as a Python TypedDict, which is read and updated at each node.
3. Multi-Agent Systems
LangGraph enables coordination between multiple LLM agents, each with a specialized task. You can control how and when each one is called.
4. Persistence
LangGraph supports storing and retrieving state, which is useful for long-running or resumable workflows.
5. Human-in-the-Loop
You can insert human checkpoints—pauses in execution where user input is required to proceed.
6. Streaming
LangGraph supports real-time token streaming, allowing for more responsive interfaces and voice apps.
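As a rough illustration (using a compiled graph like the one built later in this tutorial), a compiled graph exposes stream() alongside invoke(): stream_mode="values" emits the full state after every node, while stream_mode="messages" surfaces LLM tokens as they are generated.

# Emit the full graph state after each node finishes
for state in graph.stream({"messages": []}, stream_mode="values"):
    print(state)

# For LLM-backed nodes, stream_mode="messages" yields tokens as they arrive:
# for token, metadata in graph.stream(inputs, stream_mode="messages"):
#     print(token.content, end="")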
💡 Why Use LangGraph?
LangGraph makes it easier to build:
- Smart agents with memory and tools
- Autonomous workflows
- Assistants that coordinate multiple tasks
- Applications that combine AI + human decisions
Instead of wiring together if-else logic and manually managing state, LangGraph lets you define a clean, composable architecture using Python and graph logic.
🛠️ Building the Simplest Graph
Let’s walk through the basic steps to build your first LangGraph app.
Step 1: Define the Graph State
from typing import TypedDict

class GraphState(TypedDict):
    messages: list
Step 2: Create the Nodes
Each node is a function that processes and updates the state.
def echo_node(state: GraphState) -> GraphState:
    state["messages"].append("Echo: Hello!")
    return state
Step 3: Add Conditional Logic
You can add branching by returning different paths based on the state.
def condition_fn(state: GraphState) -> str:
    return "end" if "stop" in state["messages"] else "echo"
Step 4: Construct the Graph
from langgraph.graph import StateGraph, END

builder = StateGraph(GraphState)
builder.add_node("echo", echo_node)

# Route the entry point: the "echo" label goes to the echo node, and the
# "end" label returned by condition_fn maps to the built-in END node
builder.set_conditional_entry_point(condition_fn, {"echo": "echo", "end": END})
builder.add_edge("echo", END)

graph = builder.compile()
Step 5: Invoke the Graph
result = graph.invoke({"messages": []})
print(result)  # prints something like: {'messages': ['Echo: Hello!']}
🤖 Building a Support Chatbot with LangGraph Using OpenAI
Let’s build something more useful: a support chatbot with tools, memory, and human review.
✅ Setup
pip install langgraph langchain langchain-openai
Set your OpenAI API key:
import os
os.environ["OPENAI_API_KEY"] = "your-key-here"
📦 Part 1: Build a Basic Chatbot
from langchain_openai import ChatOpenAI
from langgraph.prebuilt import create_react_agent

llm = ChatOpenAI(model="gpt-3.5-turbo")

# create_react_agent is LangGraph's prebuilt agent constructor; with no tools
# it simply passes the conversation to the LLM
executor = create_react_agent(llm, tools=[])

state = {"messages": [{"role": "user", "content": "How do I reset my password?"}]}
result = executor.invoke(state)
print(result["messages"][-1].content)
🛠️ Part 2: Enhancing the Chatbot with Tools
Add a tool for handling common tasks, like resetting passwords.
from langchain_core.tools import tool

@tool
def reset_password(email: str) -> str:
    """Send a password reset link to the given email address."""
    return f"Password reset link sent to {email}"

tools = [reset_password]
executor = create_react_agent(llm, tools=tools)
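With the tool registered, invoke the executor exactly as before. A minimal usage sketch (the email address is just an illustrative placeholder):

state = {"messages": [{"role": "user",
                       "content": "Please reset the password for jane@example.com"}]}
result = executor.invoke(state)

# The agent decides whether to call reset_password; its final reply
# should mention the reset link returned by the tool.
print(result["messages"][-1].content)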
🧠 Part 3: Add Memory to the Chatbot
LangGraph supports persistent memory, allowing your chatbot to remember previous user interactions.
You can store previous messages in the graph state and pass them forward through nodes.
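Here is a minimal sketch of that idea using LangGraph's built-in in-memory checkpointer (MemorySaver) plus a thread_id, so that successive calls share the same conversation history; the thread ID value below is just an illustrative choice:

from langgraph.checkpoint.memory import MemorySaver
from langgraph.prebuilt import create_react_agent

# Compile the agent with a checkpointer so state persists between invocations
memory = MemorySaver()
executor = create_react_agent(llm, tools=tools, checkpointer=memory)

# Calls that share the same thread_id see the same conversation history
config = {"configurable": {"thread_id": "support-session-1"}}

executor.invoke({"messages": [{"role": "user", "content": "My name is Alex."}]}, config)
result = executor.invoke({"messages": [{"role": "user", "content": "What is my name?"}]}, config)
print(result["messages"][-1].content)  # the agent can now recall earlier turns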
🙋 Part 4: Human-in-the-Loop
LangGraph supports user interruptions or decisions mid-graph.
You can create a node that pauses for human approval:
def human_review(state: GraphState) -> GraphState:
    print("Awaiting human review...")
    # simulate human input
    input("Press Enter to approve...")
    return state
Add this node into your graph as a step before completing certain actions.
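Alternatively, LangGraph can pause execution for you: compile the graph with interrupt_before naming the review node (interrupts require a checkpointer), and the run stops just before that node until you resume it. A rough sketch, assuming human_review has already been added as a node in your builder:

from langgraph.checkpoint.memory import MemorySaver

# Pause just before the human_review node; a checkpointer is required
graph = builder.compile(
    checkpointer=MemorySaver(),
    interrupt_before=["human_review"],
)

config = {"configurable": {"thread_id": "review-1"}}
graph.invoke({"messages": []}, config)  # runs until human_review, then stops

# Once the human has approved, resume from the saved checkpoint
graph.invoke(None, config)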
🚀 Deploying Your LangGraph Applications
LangGraph can be run on your local machine, deployed to servers, or integrated into web apps. You can pair it with LangServe for API deployment or Streamlit/Gradio for UI layers.
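As one rough sketch of the web-app route, assuming the compiled graph object from earlier in this tutorial (the endpoint name and request shape here are illustrative, not part of LangGraph):

from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class ChatRequest(BaseModel):
    message: str

@app.post("/chat")
def chat(req: ChatRequest):
    # Run the compiled LangGraph app once per request
    result = graph.invoke({"messages": [req.message]})
    return {"messages": result["messages"]}

# run locally with: uvicorn app:app --reload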
🔍 Use Cases of LangGraph
LangGraph is ideal for:
- Customer support bots with fallback logic
- Workflow automation with multiple AI agents
- Coding assistants that use multiple tools
- AI tutors that adapt to student input
- Healthcare triage systems that combine AI and doctor reviews
🧠 Conclusion
LangGraph is a powerful tool for anyone looking to build structured, reliable, and flexible LLM applications. By modeling workflows as graphs, it gives you full control over agent behavior, tool integration, and state flow—without the chaos of manual orchestration.
Whether you're building a multi-agent system, a chatbot with memory, or an app that blends AI with human input, LangGraph is the right place to start.
👉 Start simple. Build smarter. Let LangGraph organize your LLM world.