README.md: 16 additions & 12 deletions
@@ -1,6 +1,8 @@
 # Valkey MCP Task Management Server
 
+[![Publish Container Image](https://github.com/jbrinkman/valkey-ai-tasks/actions/workflows/publish-container-image.yml/badge.svg)](https://github.com/jbrinkman/valkey-ai-tasks/actions/workflows/publish-container-image.yml)
+
 A task management system that implements the Model Context Protocol (MCP) for seamless integration with agentic AI tools. This system allows AI agents to create, manage, and track tasks within plans using Valkey as the persistence layer.
@@ -13,6 +15,7 @@ A task management system that implements the Model Context Protocol (MCP) for se
 - Status tracking for tasks
 - Notes support with Markdown formatting for both plans and tasks
 - MCP server for AI agent integration
+- Supports STDIO, SSE and Streamable HTTP transport protocols
 - Docker container support for easy deployment
 
 ## Architecture
@@ -45,7 +48,7 @@ docker run -d --name valkey-mcp \
   -p 6379:6379 \
   -v valkey-data:/data \
   -e ENABLE_SSE=true \
-  valkey-tasks-mcp-server:latest
+  ghcr.io/jbrinkman/valkey-ai-tasks:latest
 ```
 
 #### Running with Streamable HTTP
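Reassembled from the hunk header context and the lines above, the complete SSE run command after this change reads as the sketch below; only lines visible in this hunk are reproduced, so anything the surrounding README adds outside the hunk is not shown.

```bash
# Sketch: the updated SSE invocation, reassembled from the diff context above.
# Only flags visible in this hunk are included.
docker run -d --name valkey-mcp \
  -p 6379:6379 \
  -v valkey-data:/data \
  -e ENABLE_SSE=true \
  ghcr.io/jbrinkman/valkey-ai-tasks:latest
```

The Streamable HTTP and STDIO hunks below apply the same image rename; only the `ENABLE_*` variable and the `docker run` flags differ.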
@@ -56,7 +59,7 @@ docker run -d --name valkey-mcp \
   -p 6379:6379 \
   -v valkey-data:/data \
   -e ENABLE_STREAMABLE_HTTP=true \
-  valkey-tasks-mcp-server:latest
+  ghcr.io/jbrinkman/valkey-ai-tasks:latest
 ```
 
 #### Running with STDIO (For direct process communication)
@@ -65,7 +68,7 @@ docker run -d --name valkey-mcp \
 docker run -i --rm --name valkey-mcp \
   -v valkey-data:/data \
   -e ENABLE_STDIO=true \
-  valkey-tasks-mcp-server:latest
+  ghcr.io/jbrinkman/valkey-ai-tasks:latest
 ```
 
 ### Using the Container Images
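The same reassembly for the STDIO variant gives the sketch below; the `-i` and `--rm` flags come straight from the hunk, since the container runs attached to the agent's stdin/stdout and is removed on exit.

```bash
# Sketch: the updated STDIO invocation, reassembled from the hunk above.
docker run -i --rm --name valkey-mcp \
  -v valkey-data:/data \
  -e ENABLE_STDIO=true \
  ghcr.io/jbrinkman/valkey-ai-tasks:latest
```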
@@ -75,7 +78,7 @@ The container images are published to GitHub Container Registry and can be pulle
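The body of this hunk is collapsed in this view. A minimal pull of the published image, using the GHCR reference introduced in the run examples above, would look like this sketch:

```bash
# Sketch: pull the published image from GitHub Container Registry.
# The :latest tag is the one used in the updated run examples.
docker pull ghcr.io/jbrinkman/valkey-ai-tasks:latest
```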
@@ -135,10 +138,12 @@ The server automatically selects the appropriate transport based on:
 
 ### Local MCP Configuration
 
-To configure an AI agent to use the local MCP server, add the following to your `~/.codeium/windsurf/mcp_config.json` file:
+To configure an AI agent to use the local MCP server, add the following to your MCP configuration file (the exact file location depends on your AI Agent):
 
 #### Using SSE Transport (Default)
 
+> Note: The docker container should already be running.
+
 ```json
 {
   "mcpServers": {
@@ -151,12 +156,13 @@ To configure an AI agent to use the local MCP server, add the following to your
 
 #### Using Streamable HTTP Transport
 
+> Note: The docker container should already be running.
+
 ```json
 {
   "mcpServers": {
     "valkey-tasks": {
-      "serverUrl": "http://localhost:8080/mcp",
-      "transport": "streamable-http"
+      "serverUrl": "http://localhost:8080/mcp"
     }
   }
 }
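Reading the context and added lines of this hunk together, the complete Streamable HTTP configuration after the change is:

```json
{
  "mcpServers": {
    "valkey-tasks": {
      "serverUrl": "http://localhost:8080/mcp"
    }
  }
}
```

The explicit `"transport": "streamable-http"` key is dropped entirely; only the `serverUrl` remains.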
@@ -178,11 +184,9 @@ For agentic tools that need to start and manage the MCP server process, use a co
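The body of this final hunk is also collapsed. Purely as a hypothetical illustration of a command-based entry for agentic tools that launch the server process themselves, the STDIO run command from the earlier hunk could be wired in roughly as below; the `command`/`args` shape follows common MCP client configuration conventions and is an assumption, not content taken from this diff.

```json
{
  "mcpServers": {
    "valkey-tasks": {
      "command": "docker",
      "args": [
        "run", "-i", "--rm", "--name", "valkey-mcp",
        "-v", "valkey-data:/data",
        "-e", "ENABLE_STDIO=true",
        "ghcr.io/jbrinkman/valkey-ai-tasks:latest"
      ]
    }
  }
}
```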