This template demonstrates a simple application built with LangGraph, designed to show how to get started with LangGraph Server and LangGraph Studio, a visual debugging IDE.
The core logic, defined in `src/agent/graph.py`, showcases a single-step application that responds with a fixed string and the configuration provided.
You can extend this graph to orchestrate more complex agentic workflows that can be visualized and debugged in LangGraph Studio.
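The sketch below shows what such a single-node graph might look like. It is a minimal illustration, not the template's exact code: the state field (`changeme`) and node name (`my_node`) are placeholder assumptions.

```python
# Minimal sketch of a single-node graph in the spirit of src/agent/graph.py.
# State field and node name are illustrative placeholders.
from dataclasses import dataclass
from typing import Any, Dict

from langchain_core.runnables import RunnableConfig
from langgraph.graph import StateGraph, START


@dataclass
class State:
    """State carried through the graph."""
    changeme: str = "example"


def my_node(state: State, config: RunnableConfig) -> Dict[str, Any]:
    """Single step: respond with a fixed string plus whatever configuration was provided."""
    configuration = config.get("configurable", {})
    return {"changeme": f"output from my_node. Configuration: {configuration}"}


builder = StateGraph(State)
builder.add_node("my_node", my_node)
builder.add_edge(START, "my_node")
graph = builder.compile()
```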
- Install dependencies, along with the LangGraph CLI, which will be used to run the server.
```bash
cd path/to/your/app
pip install -e . "langgraph-cli[inmem]"
```
- (Optional) Customize the code and project as needed. Create a `.env` file if you need to use secrets.
```bash
cp .env.example .env
```
If you want to enable LangSmith tracing, add your LangSmith API key to the `.env` file.
```
# .env
LANGSMITH_API_KEY=lsv2...
```
- Start the LangGraph Server.
```bash
langgraph dev
```
For more information on getting started with LangGraph Server, see the LangGraph Server documentation.
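Once the server is running (by default at http://127.0.0.1:2024 when started with `langgraph dev`), you can also invoke the graph programmatically. Below is a rough sketch using the LangGraph Python SDK (`langgraph-sdk`); the assistant name `agent` and the input key `changeme` are assumptions that should be matched to your `langgraph.json` and state schema.

```python
# Sketch: calling the locally running LangGraph Server with the Python SDK
# (pip install langgraph-sdk). The assistant name "agent" and the input key
# "changeme" are assumptions -- adjust them to your project.
import asyncio

from langgraph_sdk import get_client


async def main() -> None:
    client = get_client(url="http://127.0.0.1:2024")  # default `langgraph dev` address
    # Stream a one-off (thread-less) run through the graph and print each update.
    async for chunk in client.runs.stream(
        None,                      # no thread: stateless run
        "agent",                   # assistant/graph name from langgraph.json
        input={"changeme": "hello"},
        stream_mode="updates",
    ):
        print(chunk.event, chunk.data)


asyncio.run(main())
```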
- Define configurable parameters: Modify the `Configuration` class in the `graph.py` file to expose the arguments you want to configure. For example, in a chatbot application you may want to define a dynamic system prompt or which LLM to use. For more information on configuration in LangGraph, see the documentation (and the sketch after this list).
- Extend the graph: The core logic of the application is defined in `graph.py`. You can modify this file to add new nodes, edges, or change the flow of information.
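As a rough illustration of the first point, the sketch below exposes two hypothetical fields on a `Configuration` class and reads them back inside a node. The field names and the `from_runnable_config` helper are assumptions, not necessarily the template's exact code.

```python
# Sketch: exposing configurable parameters and reading them inside a node.
# Field names (system_prompt, model_name) and the from_runnable_config helper
# are illustrative assumptions.
from dataclasses import dataclass, fields
from typing import Any, Dict, Optional

from langchain_core.runnables import RunnableConfig


@dataclass(kw_only=True)
class Configuration:
    """Parameters a user can set per run from LangGraph Studio or the API."""

    system_prompt: str = "You are a helpful assistant."
    model_name: str = "my-favorite-llm"

    @classmethod
    def from_runnable_config(cls, config: Optional[RunnableConfig] = None) -> "Configuration":
        """Build a Configuration from the 'configurable' dict passed to each node."""
        configurable = (config or {}).get("configurable", {})
        known = {f.name for f in fields(cls)}
        return cls(**{k: v for k, v in configurable.items() if k in known})


def my_node(state: Dict[str, Any], config: RunnableConfig) -> Dict[str, Any]:
    cfg = Configuration.from_runnable_config(config)
    # Use cfg.system_prompt / cfg.model_name to drive the model call here.
    return {"changeme": f"prompt={cfg.system_prompt!r}, model={cfg.model_name!r}"}
```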
While iterating on your graph in LangGraph Studio, you can edit past state and rerun your app from previous states to debug specific nodes. Local changes will be automatically applied via hot reload.
Follow-up requests extend the same thread. You can create an entirely new thread, clearing previous history, using the `+` button in the top right.
For more advanced features and examples, refer to the LangGraph documentation. These resources can help you adapt this template for your specific use case and build more sophisticated conversational agents.
LangGraph Studio also integrates with LangSmith for more in-depth tracing and collaboration with teammates, allowing you to analyze and optimize your chatbot's performance.