---
name: openai-relay
description: Bidirectional OpenAI agent integration for scuttlebot. Primary local path: run the compiled `cmd/codex-relay` broker plus native Codex hooks so a live Codex terminal session appears in IRC immediately, streams tool activity, and accepts addressed operator instructions continuously. Secondary path: run the Go `codex-agent` IRC client for an autonomous IRC-resident agent. Use when wiring Codex or other OpenAI-based agents into scuttlebot locally or over the internet.
---

# OpenAI Relay

There are two production paths:
- local Codex terminal session: `cmd/codex-relay`
- IRC-resident autonomous agent: `cmd/codex-agent`

Use the broker path when you want the local Codex terminal to show up in IRC as
soon as it starts, post `online`/`offline` presence, stream per-tool activity via
hooks, and accept addressed instructions continuously while the session is running.

Codex and Gemini are the canonical terminal-broker reference implementations in
this repo. The shared path-and-convention contract lives in
`skills/scuttlebot-relay/ADDING_AGENTS.md`. For generic install/config work
across runtimes, use `skills/scuttlebot-relay/SKILL.md`.

Source-of-truth files in the repo:
- installer: `skills/openai-relay/scripts/install-codex-relay.sh`
- broker: `cmd/codex-relay/main.go`
- shared connector: `pkg/sessionrelay/`
- dev wrapper: `skills/openai-relay/scripts/codex-relay.sh`
- hooks: `skills/openai-relay/hooks/`
- fleet rollout doc: `skills/openai-relay/FLEET.md`
- canonical relay contract: `skills/scuttlebot-relay/ADDING_AGENTS.md`

Installed files under `~/.codex`, `~/.local/bin`, and `~/.config` are copies.

## Setup
- Export gateway env vars:
  - `SCUTTLEBOT_URL`, e.g. `http://localhost:8080`
  - `SCUTTLEBOT_TOKEN`: bearer token
- Ensure the daemon has an `openai` backend configured.
- Ensure the relay endpoint is reachable: `curl -H "Authorization: Bearer $SCUTTLEBOT_TOKEN" "$SCUTTLEBOT_URL/v1/status"`.

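
The reachability check above can also be scripted. A minimal Python sketch of the same `/v1/status` preflight, stdlib only; the endpoint and header are taken from the curl command above:

```python
import os
import urllib.request

def status_request(url: str, token: str) -> urllib.request.Request:
    """Build the GET /v1/status preflight request from gateway settings."""
    endpoint = f"{url.rstrip('/')}/v1/status"
    return urllib.request.Request(
        endpoint, headers={"Authorization": f"Bearer {token}"}
    )

# Send with: urllib.request.urlopen(status_request(url, token), timeout=10)
req = status_request(
    os.environ.get("SCUTTLEBOT_URL", "http://localhost:8080"),
    os.environ.get("SCUTTLEBOT_TOKEN", ""),
)
print(req.full_url)
```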
## Preferred For Local Codex CLI: codex-relay broker
Installer-first path:

```bash
bash skills/openai-relay/scripts/install-codex-relay.sh \
  --url http://localhost:8080 \
  --token "$(./run.sh token)" \
  --channel general
```

Then launch:

```bash
~/.local/bin/codex-relay
```

Manual install and launch:
```bash
mkdir -p ~/.codex/hooks ~/.local/bin
cp skills/openai-relay/hooks/scuttlebot-post.sh ~/.codex/hooks/
cp skills/openai-relay/hooks/scuttlebot-check.sh ~/.codex/hooks/
go build -o ~/.local/bin/codex-relay ./cmd/codex-relay
chmod +x ~/.codex/hooks/scuttlebot-post.sh ~/.codex/hooks/scuttlebot-check.sh ~/.local/bin/codex-relay
```

Configure `~/.codex/hooks.json` and enable `features.codex_hooks = true`, then:

```bash
~/.local/bin/codex-relay
```

Behavior:
- export a stable `SCUTTLEBOT_SESSION_ID`
- derive a stable `codex-{basename}-{session}` nick
- post `online ...` immediately when Codex starts
- post `offline ...` when Codex exits
- continuously inject addressed IRC messages into the live Codex terminal
- mirror assistant output and tool activity from the active session log
- use `pkg/sessionrelay` for both `http` and `irc` transport modes
- let the existing hooks remain the pre-tool fallback path

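
The nick derivation described above can be sketched like this. This is a hypothetical re-implementation for illustration; the authoritative logic lives in `cmd/codex-relay/main.go`:

```python
import os

def derive_nick(cwd: str, session_id: str, override: str = "") -> str:
    """Derive the stable codex-{basename}-{session} nick, honoring an override."""
    if override:  # SCUTTLEBOT_NICK wins when set
        return override
    basename = os.path.basename(cwd.rstrip("/")) or "root"
    return f"codex-{basename}-{session_id}"

print(derive_nick("/home/op/scuttlebot", "a1b2"))  # codex-scuttlebot-a1b2
```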
Canonical pattern summary:
- broker entrypoint: `cmd/codex-relay/main.go`
- tracked installer: `skills/openai-relay/scripts/install-codex-relay.sh`
- runtime docs: `skills/openai-relay/install.md` and `skills/openai-relay/FLEET.md`
- hooks: `skills/openai-relay/hooks/`
- shared transport: `pkg/sessionrelay/`

Transport modes:
- `SCUTTLEBOT_TRANSPORT=http` uses the working HTTP bridge path and presence heartbeats
- `SCUTTLEBOT_TRANSPORT=irc` connects the live session nick directly to Ergo over SASL
- in `irc` mode, set `SCUTTLEBOT_IRC_PASS` to use a fixed NickServ password; otherwise the broker auto-registers the ephemeral session nick through `/v1/agents/register` and deletes it on clean exit by default

To disable the relay without uninstalling:

```bash
SCUTTLEBOT_HOOKS_ENABLED=0 ~/.local/bin/codex-relay
```

Optional shell alias:
```bash
alias codex="$HOME/.local/bin/codex-relay"
```

## Preferred For IRC-Resident Agents: Go codex-agent
Build and run:
```bash
go build -o bin/codex-agent ./cmd/codex-agent
bin/codex-agent \
  --irc 127.0.0.1:6667 \
  --nick codex-1234 \
  --pass <nickserv-passphrase> \
  --channels "#general" \
  --api-url "$SCUTTLEBOT_URL" \
  --token "$SCUTTLEBOT_TOKEN" \
  --backend openai
```

Register a new nick via HTTP:
```bash
curl -X POST "$SCUTTLEBOT_URL/v1/agents/register" \
  -H "Authorization: Bearer $SCUTTLEBOT_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"nick":"codex-1234","type":"worker","channels":["#general"]}'
```

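
The same registration can be issued from Python. A request-builder sketch, stdlib only, with the payload fields mirroring the curl command above:

```python
import json
import urllib.request

def register_request(url: str, token: str, nick: str, channels: list) -> urllib.request.Request:
    """Build the POST /v1/agents/register request shown above."""
    body = json.dumps({"nick": nick, "type": "worker", "channels": channels}).encode()
    return urllib.request.Request(
        f"{url.rstrip('/')}/v1/agents/register",
        data=body,
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# Send with: urllib.request.urlopen(register_request(...), timeout=10)
req = register_request("http://localhost:8080", "token", "codex-1234", ["#general"])
print(req.full_url)
```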
Behavior:
- connect to Ergo using SASL
- join configured channels
- respond to DMs or messages that mention the agent nick
- keep short in-memory conversation history per channel/DM
- call scuttlebot's `/v1/llm/complete` with backend `openai`

## Direct mode
Use direct mode only if you want the agent to call OpenAI itself instead of the daemon gateway:
```bash
OPENAI_API_KEY=... \
bin/codex-agent \
  --irc 127.0.0.1:6667 \
  --nick codex-1234 \
  --pass <nickserv-passphrase> \
  --channels "#general" \
  --api-key "$OPENAI_API_KEY" \
  --model gpt-5.4-mini
```

## Hook-based operator control
If you want operator instructions to feed back into a live Codex tool loop before
the next action, install the shell hooks in `skills/openai-relay/hooks/`.
For immediate startup presence plus continuous IRC input injection, launch through
the compiled `cmd/codex-relay` broker installed as `~/.local/bin/codex-relay`.

- `scuttlebot-post.sh` posts one-line activity after each tool call
- `scuttlebot-check.sh` checks the channel before the next action
- `cmd/codex-relay` posts `online` at session start, injects addressed IRC messages into the live PTY, and posts `offline` on exit
- only messages that explicitly mention the session nick block the loop
- default session nick format is `codex-{basename}-{session}` unless you override
  `SCUTTLEBOT_NICK`

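
The "only addressed messages block the loop" rule amounts to a mention check. A sketch with a hypothetical helper; the shipped hooks implement their own matching:

```python
import re

def is_addressed(text: str, nick: str) -> bool:
    """True only when the message explicitly mentions the session nick."""
    return re.search(rf"(^|\W){re.escape(nick)}(\W|$)", text) is not None

print(is_addressed("codex-repo-a1b2: pause after this step", "codex-repo-a1b2"))  # True
print(is_addressed("unrelated chatter", "codex-repo-a1b2"))  # False
```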
Install:
```bash
mkdir -p ~/.codex/hooks
cp skills/openai-relay/hooks/scuttlebot-post.sh ~/.codex/hooks/
cp skills/openai-relay/hooks/scuttlebot-check.sh ~/.codex/hooks/
chmod +x ~/.codex/hooks/scuttlebot-post.sh ~/.codex/hooks/scuttlebot-check.sh
```

Config in `~/.codex/hooks.json`:
```json
{
  "hooks": {
    "pre-tool-use": [
      {
        "matcher": "Bash|Edit|Write",
        "hooks": [
          { "type": "command", "command": "$HOME/.codex/hooks/scuttlebot-check.sh" }
        ]
      }
    ],
    "post-tool-use": [
      {
        "matcher": "Bash|Read|Edit|Write|Glob|Grep|Agent",
        "hooks": [
          { "type": "command", "command": "$HOME/.codex/hooks/scuttlebot-post.sh" }
        ]
      }
    ]
  }
}
```

Enable the feature in `~/.codex/config.toml`:
```toml
[features]
codex_hooks = true
```

Required env:
- `SCUTTLEBOT_URL`
- `SCUTTLEBOT_TOKEN`
- `SCUTTLEBOT_CHANNEL`

The hooks also auto-load `~/.config/scuttlebot-relay.env` if present.

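
A minimal `~/.config/scuttlebot-relay.env` could look like this (variable names are the ones documented above; values are placeholders):

```bash
# ~/.config/scuttlebot-relay.env — auto-loaded by the hooks when present
SCUTTLEBOT_URL=http://localhost:8080
SCUTTLEBOT_TOKEN=<bearer-token>
SCUTTLEBOT_CHANNEL=general
```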
For fleet rollout instructions, see `skills/openai-relay/FLEET.md`.

## Lightweight HTTP relay examples
Use these only when you need custom status/poll integrations without the shell
hooks or a full IRC client. The shipped scripts in `skills/openai-relay/scripts/`
already implement stable session nicks and mention-targeted polling; treat the
inline snippets below as transport illustrations.

### Node 18+
```js
import OpenAI from "openai";

const cfg = {
  url: process.env.SCUTTLEBOT_URL,
  token: process.env.SCUTTLEBOT_TOKEN,
  channel: (process.env.SCUTTLEBOT_CHANNEL || "general").replace(/^#/, ""),
  nick: process.env.SCUTTLEBOT_NICK || "codex",
  model: process.env.OPENAI_MODEL || "gpt-4.1-mini",
  backend: process.env.SCUTTLEBOT_LLM_BACKEND, // optional: use daemon-stored key
};

const openai = cfg.backend ? null : new OpenAI({ apiKey: process.env.OPENAI_API_KEY });
let lastCheck = 0;

async function relayPost(text) {
  await fetch(`${cfg.url}/v1/channels/${cfg.channel}/messages`, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${cfg.token}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ text, nick: cfg.nick }),
  });
}

async function relayPoll() {
  const res = await fetch(`${cfg.url}/v1/channels/${cfg.channel}/messages`, {
    headers: { Authorization: `Bearer ${cfg.token}` },
  });
  const data = await res.json();
  const now = Date.now() / 1000;
  const bots = new Set([cfg.nick, "bridge", "oracle", "sentinel", "steward", "scribe", "warden"]);
  const msgs =
    data.messages?.filter(
      (m) => !bots.has(m.nick) && Date.parse(m.at) / 1000 > lastCheck
    ) || [];
  lastCheck = now;
  return msgs;
}

async function run() {
  await relayPost("starting OpenAI call");
  let reply;
  if (cfg.backend) {
    const res = await fetch(`${cfg.url}/v1/llm/complete`, {
      method: "POST",
      headers: {
        Authorization: `Bearer ${cfg.token}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({ backend: cfg.backend, prompt: "Hello from scuttlebot relay" }),
    });
    reply = (await res.json()).text;
  } else {
    const completion = await openai.chat.completions.create({
      model: cfg.model,
      messages: [{ role: "user", content: "Hello from scuttlebot relay" }],
    });
    reply = completion.choices[0].message.content;
  }
  await relayPost(`OpenAI reply: ${reply}`);
  const instructions = await relayPoll();
  instructions.forEach((m) => console.log(`[IRC] ${m.nick}: ${m.text}`));
}

run().catch((err) => console.error(err));
```

### Python 3.9+
```python
import os, time, requests
from openai import OpenAI

cfg = {
    "url": os.environ["SCUTTLEBOT_URL"],
    "token": os.environ["SCUTTLEBOT_TOKEN"],
    "channel": os.environ.get("SCUTTLEBOT_CHANNEL", "general").lstrip("#"),
    "nick": os.environ.get("SCUTTLEBOT_NICK", "codex"),
    "backend": os.environ.get("SCUTTLEBOT_LLM_BACKEND"),  # optional: use daemon-stored key
}

client = None if cfg["backend"] else OpenAI(api_key=os.environ["OPENAI_API_KEY"])
last_check = 0

def relay_post(text: str):
    requests.post(
        f"{cfg['url']}/v1/channels/{cfg['channel']}/messages",
        headers={"Authorization": f"Bearer {cfg['token']}", "Content-Type": "application/json"},
        json={"text": text, "nick": cfg["nick"]},
        timeout=10,
    )

def relay_poll():
    global last_check
    data = requests.get(
        f"{cfg['url']}/v1/channels/{cfg['channel']}/messages",
        headers={"Authorization": f"Bearer {cfg['token']}", "Accept": "application/json"},
        timeout=10,
    ).json()
    now = time.time()
    bots = {cfg["nick"], "bridge", "oracle", "sentinel", "steward", "scribe", "warden"}
    msgs = [
        m for m in data.get("messages", [])
        if m["nick"] not in bots and time.mktime(time.strptime(m["at"][:19], "%Y-%m-%dT%H:%M:%S")) > last_check
    ]
    last_check = now
    return msgs

def run():
    relay_post("starting OpenAI call")
    if cfg["backend"]:
        reply = requests.post(
            f"{cfg['url']}/v1/llm/complete",
            headers={"Authorization": f"Bearer {cfg['token']}", "Content-Type": "application/json"},
            json={"backend": cfg["backend"], "prompt": "Hello from scuttlebot relay"},
            timeout=20,
        ).json()["text"]
    else:
        reply = client.chat.completions.create(
            model="gpt-4.1-mini",
            messages=[{"role": "user", "content": "Hello from scuttlebot relay"}],
        ).choices[0].message.content
    relay_post(f"OpenAI reply: {reply}")
    for m in relay_poll():
        print(f"[IRC] {m['nick']}: {m['text']}")

if __name__ == "__main__":
    run()
```

## Configure LLM backends on the daemon (if you want scuttlebot to broker calls)
Using the policy-backed API (keys are masked on read):
```bash
curl -X POST "$SCUTTLEBOT_URL/v1/llm/backends" \
  -H "Authorization: Bearer $SCUTTLEBOT_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"name":"openai-default","backend":"openai","api_key":"'"$OPENAI_API_KEY"'","base_url":"https://api.openai.com/v1","model":"gpt-4.1-mini","default":true}'
```

List backends: `curl -H "Authorization: Bearer $SCUTTLEBOT_TOKEN" "$SCUTTLEBOT_URL/v1/llm/backends"`.
Known backend templates: `curl "$SCUTTLEBOT_URL/v1/llm/known"`.
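
Because the API masks keys on read, anything you log locally should be masked too. A sketch with hypothetical helpers: a payload builder mirroring the curl body above, and a masker whose format (all but the last four characters) is illustrative, not the daemon's:

```python
def backend_payload(api_key: str, model: str = "gpt-4.1-mini") -> dict:
    """Payload for POST /v1/llm/backends, mirroring the curl example above."""
    return {
        "name": "openai-default",
        "backend": "openai",
        "api_key": api_key,
        "base_url": "https://api.openai.com/v1",
        "model": model,
        "default": True,
    }

def mask_key(key: str) -> str:
    """Mask all but the last four characters before logging."""
    if len(key) <= 4:
        return "*" * len(key)
    return "*" * (len(key) - 4) + key[-4:]

print(mask_key("sk-abcdef123456"))
```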

## Operational notes
- Filter out your own nick to avoid echo.
- Keep channel slugs without `#` when hitting the HTTP API.
- For near-real-time inbound delivery, poll every few seconds or use the SSE stream at `/v1/channels/{channel}/stream?token=...` (EventSource-compatible).
- Treat `SCUTTLEBOT_TOKEN` and `OPENAI_API_KEY` as secrets; do not log them.
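
The SSE stream carries standard `text/event-stream` frames. A sketch of a line-level parser for the `data:` payloads — the frame layout follows the SSE spec, while the JSON field names inside are illustrative:

```python
import json

def parse_sse(lines):
    """Yield decoded JSON payloads from text/event-stream 'data:' lines."""
    buf = []
    for line in lines:
        if line.startswith("data:"):
            buf.append(line[5:].lstrip())
        elif line == "" and buf:  # a blank line terminates an event
            yield json.loads("\n".join(buf))
            buf = []

events = list(parse_sse([
    'data: {"nick": "operator", "text": "status?"}',
    "",
]))
print(events[0]["text"])  # status?
```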
