### `README.md` — 4 additions & 64 deletions
```diff
@@ -70,10 +70,6 @@ Claude Code, Cursor CLI, OpenAI Codex, Gemini CLI, Kiro CLI, OpenCode, and Ollam
 
 Terminal, process manager, git, file search, HTTP client, environment variables, network diagnostics, cron jobs, and system info all callable by the LLM.
 
-### 18 MCP Servers
-
-Connect GitHub, Brave Search, Puppeteer, PostgreSQL, MongoDB, Redis, Elasticsearch, AWS, GCP, Cloudflare, Vercel, Atlassian, Supabase, CircleCI, Postman, Stripe, ElevenLabs, and Kaggle as external tools via the Model Context Protocol.
-
 ### Session Logging
 
 Per-session logs accessible from the TUI. Follow live, view by index, auto-pruned after 7 days.
```
````diff
@@ -125,9 +121,9 @@ docker run -it \
   txtcode
 ```
 
-| Flag | Purpose |
-| :--- | :------ |
-|`-v $(pwd):/workspace`| Mounts your project directory into the container |
+| Flag | Purpose |
+| :--- | :------ |
+|`-v $(pwd):/workspace`| Mounts your project directory into the container|
 |`-v ~/.txtcode:/root/.txtcode`| Persists config, session data, and logs across runs |
 
 > **Note:** API keys are stored securely via your OS keychain when running natively. Inside Docker, txtcode uses an encrypted file-based fallback (`TXTCODE_DOCKER=1` is set automatically). You can also pass keys as environment variables with `-e`, e.g. `-e ANTHROPIC_API_KEY=sk-...`.
````
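Putting the flags from the table together with the `-e` option from the note, a full invocation might look like the sketch below. This is an assumption-laden example, not part of the diff: the `ANTHROPIC_API_KEY` value is a placeholder, and the image tag `txtcode` is taken from the context lines above.

```shell
# -v $(pwd):/workspace         → project directory visible inside the container
# -v ~/.txtcode:/root/.txtcode → config, session data, and logs persist across runs
# -e ANTHROPIC_API_KEY=...     → optional; passes the key instead of relying on
#                                the encrypted file-based fallback
docker run -it \
  -v $(pwd):/workspace \
  -v ~/.txtcode:/root/.txtcode \
  -e ANTHROPIC_API_KEY=sk-... \
  txtcode
```

Omitting the `-e` line is fine: per the note, Docker runs fall back to the encrypted file-based key store automatically.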
```diff
@@ -163,7 +159,7 @@ txtcode supports **9 LLM providers** for chat mode. Configure one or more during
 |**HuggingFace**|_Discovered at runtime_| Inference Providers API |
 |**OpenRouter**|_Discovered at runtime_| Unified API for 100+ models |
 
-All providers support tool calling and LLM can invoke any built-in tool or connected MCP server.
+All providers support tool calling and the LLM can invoke any built-in tool.
 
 ---
 
```
```diff
@@ -201,52 +197,6 @@ The primary LLM in chat mode has access to **9 built-in tools** that it can call
 
 ---
 
-## 📟 MCP Servers
-
-txtcode integrates with the **Model Context Protocol** to connect external tool servers. Configure during initial setup or later via **Configuration**→**Manage MCP Servers** in the TUI.
[… deleted MCP server table (old lines 207–245) not captured in this page …]
-> **stdio** = local process, **HTTP** = remote Streamable HTTP endpoint. You can also add custom MCP servers via **Configuration**→**Manage MCP Servers**.
-
----
-
 ## 💬 Chat Commands
 
 Send these commands in any messaging app while connected:
```
```diff
@@ -272,7 +222,6 @@ To modify settings, select **Configuration** from the main menu. Options include
 - Change Messaging Platform
 - Change Coding CLI Type
 - Change AI Provider
-- Manage MCP Servers (add/remove/enable/disable)
 - Change Project Path
 - View Current Config
 
```
```diff
@@ -327,13 +276,4 @@ Verbose and debug output goes to the log file; the terminal shows only key statu
 
 </details>
 
-<details>
-<summary><b>MCP server connection failures</b></summary>
-
-**stdio servers:** ensure the required npm package is installed (e.g. `npx @modelcontextprotocol/server-github`)
-**HTTP servers:** verify the token is correct via **Configuration**→**Manage MCP Servers**
[… remaining lines of the hunk (old 335–339) not captured in this page …]
```