
Commit 967e560

Enhance README.md with quick start guide for clients (#394)
Enhance README.md with quick start guide for clients and update environment variable instructions. Update uv.lock to reflect package version changes and add new dependencies. Notable updates include bumping graphiti-core to 0.10.5 and adding graph-service as a dependency.
1 parent 190b187 commit 967e560

File tree

2 files changed: +265, -220 lines


mcp_server/README.md (+61, -50)
@@ -21,6 +21,30 @@ The Graphiti MCP server exposes the following key high-level functions of Graphiti
 - **Group Management**: Organize and manage groups of related data with group_id filtering
 - **Graph Maintenance**: Clear the graph and rebuild indices
 
+## Quick Start for Claude Desktop, Cursor, and other clients
+
+1. Clone the Graphiti GitHub repo
+
+```bash
+git clone https://github.com/getzep/graphiti.git
+```
+
+or
+
+```bash
+gh repo clone getzep/graphiti
+```
+
+Note the full path to this directory.
+
+```
+cd graphiti && pwd
+```
+
+2. Install the [Graphiti prerequisites](#prerequisites).
+
+3. Configure Claude, Cursor, or other MCP client to use [Graphiti with a `stdio` transport](#integrating-with-mcp-clients). See the client documentation on where to find their MCP configuration files.
+
 ## Installation
 
 ### Prerequisites
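
The Quick Start added above boils down to two values that the stdio client configuration later in this diff needs: the absolute path to the cloned repository and the absolute path to the `uv` binary. A minimal sketch of collecting both (not part of this commit), assuming a Unix-like shell; the `curl` one-liner is Astral's documented installer and can be skipped if `uv` is already installed:

```bash
# Gather the two absolute paths the stdio client config needs.
git clone https://github.com/getzep/graphiti.git
cd graphiti/mcp_server && pwd       # absolute path to the MCP server directory
command -v uv >/dev/null || curl -LsSf https://astral.sh/uv/install.sh | sh
command -v uv                       # absolute path to the uv binary (if on PATH)
```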
@@ -94,25 +118,25 @@ Before running the Docker Compose setup, you need to configure the environment variables
 
 1. **Using a .env file** (recommended):
 
-- Copy the provided `.env.example` file to create a `.env` file:
-```bash
-cp .env.example .env
-```
-- Edit the `.env` file to set your OpenAI API key and other configuration options:
-```
-# Required for LLM operations
-OPENAI_API_KEY=your_openai_api_key_here
-MODEL_NAME=gpt-4.1-mini
-# Optional: OPENAI_BASE_URL only needed for non-standard OpenAI endpoints
-# OPENAI_BASE_URL=https://api.openai.com/v1
-```
-- The Docker Compose setup is configured to use this file if it exists (it's optional)
+- Copy the provided `.env.example` file to create a `.env` file:
+```bash
+cp .env.example .env
+```
+- Edit the `.env` file to set your OpenAI API key and other configuration options:
+```
+# Required for LLM operations
+OPENAI_API_KEY=your_openai_api_key_here
+MODEL_NAME=gpt-4.1-mini
+# Optional: OPENAI_BASE_URL only needed for non-standard OpenAI endpoints
+# OPENAI_BASE_URL=https://api.openai.com/v1
+```
+- The Docker Compose setup is configured to use this file if it exists (it's optional)
 
 2. **Using environment variables directly**:
-- You can also set the environment variables when running the Docker Compose command:
-```bash
-OPENAI_API_KEY=your_key MODEL_NAME=gpt-4.1-mini docker compose up
-```
+- You can also set the environment variables when running the Docker Compose command:
+```bash
+OPENAI_API_KEY=your_key MODEL_NAME=gpt-4.1-mini docker compose up
+```
 
 #### Neo4j Configuration
 
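
As a quick check that the values in `.env` (or the shell) are actually reaching the containers, `docker compose config` prints the compose file with variable substitution applied. Illustrative only and not part of this diff; it assumes you run it from the directory containing the compose file and that the compose file forwards `OPENAI_API_KEY` and `MODEL_NAME` into the service environment:

```bash
# Hypothetical sanity check: confirm the substituted values before `docker compose up`.
cp .env.example .env                          # then edit the values as described above
docker compose config | grep -E 'OPENAI_API_KEY|MODEL_NAME'
```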
@@ -151,23 +175,33 @@ This will start both the Neo4j database and the Graphiti MCP server. The Docker
 
 To use the Graphiti MCP server with an MCP-compatible client, configure it to connect to the server:
 
+> [!IMPORTANT]
+> You will need the Python package manager, `uv`, installed. Please refer to the [`uv` install instructions](https://docs.astral.sh/uv/getting-started/installation/).
+>
+> Ensure that you set the full path to the `uv` binary and your Graphiti project folder.
+
 ```json
 {
   "mcpServers": {
     "graphiti": {
       "transport": "stdio",
-      "command": "uv",
+      "command": "/Users/<user>/.local/bin/uv",
       "args": [
         "run",
-        "/ABSOLUTE/PATH/TO/graphiti_mcp_server.py",
+        "--isolated",
+        "--directory",
+        "/Users/<user>/dev/zep/graphiti/mcp_server",
+        "--project",
+        ".",
+        "graphiti_mcp_server.py",
         "--transport",
         "stdio"
       ],
       "env": {
         "NEO4J_URI": "bolt://localhost:7687",
         "NEO4J_USER": "neo4j",
-        "NEO4J_PASSWORD": "demodemo",
-        "OPENAI_API_KEY": "${OPENAI_API_KEY}",
+        "NEO4J_PASSWORD": "password",
+        "OPENAI_API_KEY": "sk-XXXXXXXX",
         "MODEL_NAME": "gpt-4.1-mini"
       }
     }
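
Before wiring the stdio configuration above into a client, it can help to run the same command by hand and confirm the server starts. A minimal smoke test, not part of this commit; the paths and the API key are the placeholders from the config and must be replaced with your own values:

```bash
# Sketch only: mirrors the "command", "args", and "env" entries of the stdio config.
# A healthy stdio server waits silently on stdin; stop it with Ctrl-C.
NEO4J_URI=bolt://localhost:7687 NEO4J_USER=neo4j NEO4J_PASSWORD=password \
OPENAI_API_KEY=sk-XXXXXXXX MODEL_NAME=gpt-4.1-mini \
"/Users/<user>/.local/bin/uv" run --isolated \
  --directory "/Users/<user>/dev/zep/graphiti/mcp_server" --project . \
  graphiti_mcp_server.py --transport stdio
```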
@@ -188,31 +222,6 @@ For SSE transport (HTTP-based), you can use this configuration:
 }
 ```
 
-Or start the server with uv and connect to it:
-
-```json
-{
-  "mcpServers": {
-    "graphiti": {
-      "command": "uv",
-      "args": [
-        "run",
-        "/ABSOLUTE/PATH/TO/graphiti_mcp_server.py",
-        "--transport",
-        "sse"
-      ],
-      "env": {
-        "NEO4J_URI": "bolt://localhost:7687",
-        "NEO4J_USER": "neo4j",
-        "NEO4J_PASSWORD": "demodemo",
-        "OPENAI_API_KEY": "${OPENAI_API_KEY}",
-        "MODEL_NAME": "gpt-4.1-mini"
-      }
-    }
-  }
-}
-```
-
 ## Available Tools
 
 The Graphiti MCP server exposes the following tools:
@@ -233,12 +242,14 @@ The Graphiti MCP server can process structured JSON data through the `add_episode`
 allows you to automatically extract entities and relationships from structured data:
 
 ```
+
 add_episode(
-name="Customer Profile",
-episode_body="{\"company\": {\"name\": \"Acme Technologies\"}, \"products\": [{\"id\": \"P001\", \"name\": \"CloudSync\"}, {\"id\": \"P002\", \"name\": \"DataMiner\"}]}",
-source="json",
-source_description="CRM data"
+name="Customer Profile",
+episode_body="{\"company\": {\"name\": \"Acme Technologies\"}, \"products\": [{\"id\": \"P001\", \"name\": \"CloudSync\"}, {\"id\": \"P002\", \"name\": \"DataMiner\"}]}",
+source="json",
+source_description="CRM data"
 )
+
 ```
 
 ## Integrating with the Cursor IDE
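
The `episode_body` in the example above is a JSON document embedded as an escaped string. Rather than escaping the quotes by hand, one way to produce that string is with `jq`; an illustrative sketch, not part of this diff:

```bash
# Build the compact JSON, then re-encode the single line as a JSON string literal.
jq -cn '{company: {name: "Acme Technologies"},
         products: [{id: "P001", name: "CloudSync"}, {id: "P002", name: "DataMiner"}]}' \
  | jq -R .
```

The second `jq -R .` pass reads the line as raw text and prints it as a quoted, escaped JSON string, which matches the form `episode_body` takes in the example above.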
