
Commit 834b673

OpenAPI (#1571)

* basic metadata working
* swagger ui
* run script
* ✨ Move serverHost declaration to improve clarity: the serverHost constant is declared earlier in the function.
* fixing cors
* fix schema
* ✨ Add externalDocs and params schema to OpenAPI config: enhanced the OpenAPI setup with external documentation and a params schema.
* docs
* Apply review suggestions to packages/cli/src/openapi.ts and docs/src/content/docs/reference/openapi-server.mdx
* ♻️ Refactor query parameter handling in OpenAPI server: replaced params with query, removed unnecessary schema definitions.
* typo
* ✨ fix: update tool run function to include file input: replaced the empty array with the 'files' argument in the run function.
* ♻️ Remove duplicate import from openapi module: eliminated the redundant ScriptFilterOptions import line.
* unlisted

Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>

1 parent cf3ba03 · commit 834b673

11 files changed: 916 additions & 149 deletions


docs/src/content/docs/reference/cli/commands.md

Lines changed: 99 additions & 23 deletions
````diff
@@ -531,20 +531,37 @@ Usage: genaiscript serve [options]
 Start a GenAIScript local web server
 
 Options:
-  -p, --port <number>            Specify the port number, default: 8003
-  -k, --api-key <string>         API key to authenticate requests
-  -n, --network                  Opens server on 0.0.0.0 to make it accessible
-                                 on the network
-  -c, --cors <string>            Enable CORS and sets the allowed origin. Use
-                                 '*' to allow any origin.
-  --dispatch-progress            Dispatch progress events to all clients
-  --github-copilot-chat-client   Allow github_copilot_chat provider to connect
-                                 to connected Visual Studio Code
-  --remote <string>              Remote repository URL to serve
-  --remote-branch <string>       Branch to serve from the remote
-  --remote-force                 Force pull from remote repository
-  --remote-install               Install dependencies from remote repository
-  -h, --help                     display help for command
+  --port <number>                   Specify the port number, default: 8003
+  -k, --api-key <string>            API key to authenticate requests
+  -n, --network                     Opens server on 0.0.0.0 to make it
+                                    accessible on the network
+  -c, --cors <string>               Enable CORS and sets the allowed origin.
+                                    Use '*' to allow any origin.
+  --dispatch-progress               Dispatch progress events to all clients
+  --github-copilot-chat-client      Allow github_copilot_chat provider to
+                                    connect to connected Visual Studio Code
+  --remote <string>                 Remote repository URL to serve
+  --remote-branch <string>          Branch to serve from the remote
+  --remote-force                    Force pull from remote repository
+  --remote-install                  Install dependencies from remote repository
+  -p, --provider <string>           Preferred LLM provider aliases (choices:
+                                    "openai", "azure", "azure_ai_inference",
+                                    "azure_serverless",
+                                    "azure_serverless_models", "github",
+                                    "ollama", "windows_ai", "anthropic",
+                                    "anthropic_bedrock", "google",
+                                    "huggingface", "mistral", "alibaba",
+                                    "deepseek", "transformers", "lmstudio",
+                                    "jan", "llamafile", "sglang", "vllm",
+                                    "litellm", "whisperasr", "echo")
+  -m, --model <string>              'large' model alias (default)
+  -sm, --small-model <string>       'small' alias model
+  -vm, --vision-model <string>      'vision' alias model
+  -em, --embeddings-model <string>  'embeddings' alias model
+  -ma, --model-alias <nameid...>    model alias as name=modelid
+  -re, --reasoning-effort <string>  Reasoning effort for o* models (choices:
+                                    "high", "medium", "low")
+  -h, --help                        display help for command
 ```
 
 ## `mcp`
@@ -555,15 +572,74 @@ Usage: genaiscript mcp|mcps [options]
 Starts a Model Context Protocol server that exposes scripts as tools
 
 Options:
-  --groups <string...>      Filter script by groups
-  --ids <string...>         Filter script by ids
-  --startup <string>        Startup script id, executed after the server is
-                            started
-  --remote <string>         Remote repository URL to serve
-  --remote-branch <string>  Branch to serve from the remote
-  --remote-force            Force pull from remote repository
-  --remote-install          Install dependencies from remote repository
-  -h, --help                display help for command
+  --groups <string...>              Filter script by groups
+  --ids <string...>                 Filter script by ids
+  --startup <string>                Startup script id, executed after the
+                                    server is started
+  --remote <string>                 Remote repository URL to serve
+  --remote-branch <string>          Branch to serve from the remote
+  --remote-force                    Force pull from remote repository
+  --remote-install                  Install dependencies from remote repository
+  -p, --provider <string>           Preferred LLM provider aliases (choices:
+                                    "openai", "azure", "azure_ai_inference",
+                                    "azure_serverless",
+                                    "azure_serverless_models", "github",
+                                    "ollama", "windows_ai", "anthropic",
+                                    "anthropic_bedrock", "google",
+                                    "huggingface", "mistral", "alibaba",
+                                    "deepseek", "transformers", "lmstudio",
+                                    "jan", "llamafile", "sglang", "vllm",
+                                    "litellm", "whisperasr", "echo")
+  -m, --model <string>              'large' model alias (default)
+  -sm, --small-model <string>       'small' alias model
+  -vm, --vision-model <string>      'vision' alias model
+  -em, --embeddings-model <string>  'embeddings' alias model
+  -ma, --model-alias <nameid...>    model alias as name=modelid
+  -re, --reasoning-effort <string>  Reasoning effort for o* models (choices:
+                                    "high", "medium", "low")
+  -h, --help                        display help for command
+```
+
+## `openapi`
+
+```
+Usage: genaiscript openapi|api [options]
+
+Starts an OpenAPI 3.1.1 server that exposes scripts as /api/tools/<id>
+endpoints
+
+Options:
+  -n, --network                     Opens server on 0.0.0.0 to make it
+                                    accessible on the network
+  --port <number>                   Specify the port number, default: 8003
+  -c, --cors <string>               Enable CORS and sets the allowed origin.
+                                    Use '*' to allow any origin.
+  --groups <string...>              Filter script by groups
+  --ids <string...>                 Filter script by ids
+  --startup <string>                Startup script id, executed after the
+                                    server is started
+  --remote <string>                 Remote repository URL to serve
+  --remote-branch <string>          Branch to serve from the remote
+  --remote-force                    Force pull from remote repository
+  --remote-install                  Install dependencies from remote repository
+  -p, --provider <string>           Preferred LLM provider aliases (choices:
+                                    "openai", "azure", "azure_ai_inference",
+                                    "azure_serverless",
+                                    "azure_serverless_models", "github",
+                                    "ollama", "windows_ai", "anthropic",
+                                    "anthropic_bedrock", "google",
+                                    "huggingface", "mistral", "alibaba",
+                                    "deepseek", "transformers", "lmstudio",
+                                    "jan", "llamafile", "sglang", "vllm",
+                                    "litellm", "whisperasr", "echo")
+  -m, --model <string>              'large' model alias (default)
+  -sm, --small-model <string>       'small' alias model
+  -vm, --vision-model <string>      'vision' alias model
+  -em, --embeddings-model <string>  'embeddings' alias model
+  -ma, --model-alias <nameid...>    model alias as name=modelid
+  -re, --reasoning-effort <string>  Reasoning effort for o* models (choices:
+                                    "high", "medium", "low")
+  -h, --help                        display help for command
 ```
 
 ## `parse`
````
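The `-ma, --model-alias <nameid...>` option above accepts `name=modelid` pairs. As an illustration only (this is a hypothetical sketch, not the CLI's actual parsing code), such pairs could be split like this:

```typescript
// Hypothetical sketch: parse "name=modelid" pairs as accepted by --model-alias.
// The real genaiscript CLI implementation may differ.
function parseModelAliases(pairs: string[]): Record<string, string> {
    const aliases: Record<string, string> = {}
    for (const pair of pairs) {
        const eq = pair.indexOf("=")
        if (eq <= 0) throw new Error(`invalid model alias: ${pair}`)
        // everything before the first '=' is the alias name,
        // everything after is the model id (which may itself contain '=' or ':')
        aliases[pair.slice(0, eq)] = pair.slice(eq + 1)
    }
    return aliases
}

console.log(parseModelAliases(["large=openai:gpt-4o", "small=ollama:phi3"]))
```

Splitting on the first `=` keeps model ids that contain separators intact.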
docs/src/content/docs/reference/openapi-server.mdx

Lines changed: 119 additions & 0 deletions

````mdx
---
title: OpenAPI Server
description: The OpenAPI Server exposes scripts as OpenAPI endpoints.
---

You can launch the [cli](/genaiscript/reference/cli) as an [OpenAPI server](https://swagger.io/specification/)
to serve scripts as OpenAPI endpoints.

```bash
genaiscript openapi
```

## Scripts as OpenAPI endpoints

The OpenAPI endpoint description is the script description.
**Make sure to carefully craft the description**, as it is how the LLM decides
which tool to use when running a script. If your tool does not get picked up by the LLM, it is probably a description issue.

The OpenAPI endpoint parameters are inferred automatically from the [script parameters](/genaiscript/reference/scripts/parameters) and files.
The OpenAPI parameters then populate the `env.vars` object in the script
as usual.

The OpenAPI endpoint output corresponds to the script's output: typically the last assistant message, or any content passed to [env.output](/genaiscript/reference/scripts/output-builder).

Let's see an example. Here is a script `task.genai.mjs` that takes a `task` parameter as input, builds a prompt,
and sends the LLM output back.

```js title="task.genai.mjs"
script({
    description: "You MUST provide a description!",
    parameters: {
        task: {
            type: "string",
            description: "The task to perform",
            required: true,
        },
    },
})

const { task } = env.vars // extract the task parameter

// ... genaiscript logic ...
$`... prompt ... ${task}` // output the result
```

A more advanced script might not use the top-level context and instead use `env.output` to pass the result.

```js title="task.genai.mjs"
script({
    description: "You should provide a description!",
    accept: "none", // this script does not use 'env.files'
    parameters: {
        task: {
            type: "string",
            description: "The task to perform",
            required: true,
        },
    },
})

const { output } = env // store the output builder
const { task } = env.vars // extract the task parameter

// ... genaiscript logic with inline prompts ...
const res = await runPrompt((_) => `... prompt ... ${task}`) // run an inner prompt
// ...

// build the output
output.fence(`The result is ${res.text}`)
```

## Startup script

You can specify a startup script id on the command line using the `--startup` option.
It will run after the server is started.

```sh
genaiscript openapi --startup load-resources
```

You can use this script to load resources or do any other setup you need.

### Filtering scripts

If you need to filter which scripts are exposed as OpenAPI endpoints, you can use the `--groups` flag and
set the `openapi` group in your scripts.

```js 'group: "openapi"' title="task.genai.mjs"
script({
    group: "openapi",
})
```

```bash
genaiscript openapi --groups openapi
```

## Running scripts from a remote repository

You can use the `--remote` option to load scripts from a remote repository.
GenAIScript will do a shallow clone of the repository and run the script from the cloned folder.

```sh
npx --yes genaiscript openapi --remote https://github.com/...
```

Additional flags control how the repository is cloned:

- `--remote-branch <branch>`: The branch to clone from the remote repository.
- `--remote-force`: Force the clone even if the cloned folder already exists.
- `--remote-install`: Install dependencies after cloning the repository.

:::caution

As usual, be careful when running scripts from a remote repository.
Make sure you trust the source before running the script, and consider locking to a specific commit.

:::
````
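The parameter inference described in this doc maps a script's `parameters` declaration onto an OpenAPI request schema. The sketch below is an assumption about the general shape of that mapping, not the actual code in `packages/cli/src/openapi.ts`:

```typescript
// Hypothetical sketch: derive an OpenAPI-style JSON schema object from a
// script's `parameters` declaration. Illustrative only; the real inference
// in the genaiscript CLI may differ.
type ParamSpec = { type: string; description?: string; required?: boolean }

function toRequestSchema(parameters: Record<string, ParamSpec>) {
    const properties: Record<string, { type: string; description?: string }> = {}
    const required: string[] = []
    for (const [name, spec] of Object.entries(parameters)) {
        // each script parameter becomes a schema property
        properties[name] = { type: spec.type, description: spec.description }
        // `required: true` parameters collect into the schema-level required list
        if (spec.required) required.push(name)
    }
    return { type: "object", properties, required }
}

console.log(
    JSON.stringify(
        toRequestSchema({
            task: { type: "string", description: "The task to perform", required: true },
        })
    )
)
```

Note how OpenAPI/JSON Schema expresses requiredness as an object-level `required` array rather than a per-property flag, which is why the sketch hoists it out of each parameter.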

package.json

Lines changed: 1 addition & 0 deletions
```diff
@@ -71,6 +71,7 @@
     "test:scripts:view": "cd packages/sample/ && yarn test:scripts:view",
     "serve:cli": "node --watch --watch-path=packages/cli/built packages/cli/built/genaiscript.cjs serve --dispatch-progress",
     "serve:web": "yarn --cwd packages/web watch",
+    "serve:openapi": "node --watch --watch-path=packages/cli/built packages/cli/built/genaiscript.cjs openapi --network --cors \"*\"",
     "serve": "yarn compile:cli && run-p serve:*",
     "docs": "cd docs && ./node_modules/.bin/astro telemetry disable && ./node_modules/.bin/astro dev --host",
     "slides": "cd slides && yarn run dev",
```

packages/cli/package.json

Lines changed: 6 additions & 2 deletions
```diff
@@ -63,6 +63,9 @@
     "@anthropic-ai/sdk": "0.51.0",
     "@azure/identity": "^4.10.0",
     "@azure/search-documents": "^12.1.0",
+    "@fastify/cors": "^11.0.1",
+    "@fastify/swagger": "^9.5.1",
+    "@fastify/swagger-ui": "^5.2.2",
     "@huggingface/jinja": "^0.5.0",
     "@inquirer/prompts": "^7.5.1",
     "@modelcontextprotocol/sdk": "^1.11.4",
@@ -71,10 +74,12 @@
     "@octokit/plugin-retry": "^8.0.1",
     "@octokit/plugin-throttling": "^11.0.1",
     "@octokit/rest": "^21.1.1",
+    "cross-fetch": "^4.1.0",
     "debug": "^4.4.1",
     "dockerode": "^4.0.6",
     "dompurify": "^3.2.6",
     "es-toolkit": "^1.38.0",
+    "fastify": "^5.3.3",
     "file-type": "^20.5.0",
     "fluent-ffmpeg": "^2.1.3",
     "gpt-tokenizer": "^2.9.0",
@@ -86,9 +91,8 @@
     "mathjs": "^14.5.0",
     "mermaid": "^11.6.0",
     "node-fetch": "^3.3.2",
-    "cross-fetch": "^4.1.0",
-    "pyodide": "^0.27.6",
     "openai": "^4.100.0",
+    "pyodide": "^0.27.6",
     "supports-color": "^10.0.0",
     "tabletojson": "^4.1.6",
     "toml": "^3.0.0",
```

packages/cli/src/cli.ts

Lines changed: 34 additions & 2 deletions
```diff
@@ -74,6 +74,7 @@ import { listRuns } from "./runs"
 import { startMcpServer } from "./mcpserver"
 import { error } from "./log"
 import { DEBUG_CATEGORIES } from "../../core/src/dbg"
+import { startOpenAPIServer } from "./openapi"
 
 /**
@@ -499,7 +500,7 @@ export async function cli() {
         .command("serve")
         .description("Start a GenAIScript local web server")
         .option(
-            "-p, --port <number>",
+            "--port <number>",
             `Specify the port number, default: ${SERVER_PORT}`
         )
         .option("-k, --api-key <string>", "API key to authenticate requests")
@@ -521,6 +522,7 @@
         )
         .action(startServer) // Action to start the server
     addRemoteOptions(serve) // Add remote options to the command
+    addModelOptions(serve)
 
     const mcp = program
         .command("mcp")
@@ -535,7 +537,37 @@
             "Starts a Model Context Protocol server that exposes scripts as tools"
         )
         .action(startMcpServer)
-    addRemoteOptions(mcp) // Add remote options to the command
+    addRemoteOptions(mcp)
+    addModelOptions(mcp)
+
+    const openapi = program
+        .command("openapi")
+        .option(
+            "-n, --network",
+            "Opens server on 0.0.0.0 to make it accessible on the network"
+        )
+        .option(
+            "--port <number>",
+            `Specify the port number, default: ${SERVER_PORT}`
+        )
+        .option(
+            "-c, --cors <string>",
+            "Enable CORS and sets the allowed origin. Use '*' to allow any origin."
+        )
+
+        .option("--groups <string...>", "Filter script by groups")
+        .option("--ids <string...>", "Filter script by ids")
+        .option(
+            "--startup <string>",
+            "Startup script id, executed after the server is started"
+        )
+        .alias("api")
+        .description(
+            "Starts an OpenAPI 3.1.1 server that exposes scripts as /api/tools/<id> endpoints"
+        )
+        .action(startOpenAPIServer)
+    addRemoteOptions(openapi)
+    addModelOptions(openapi)
 
     // Define 'parse' command group for parsing tasks
     const parser = program
```
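The `--network` and `--port` options wired up in the `openapi` command decide where the server listens. A minimal sketch of that mapping, assumed from the option descriptions (the actual server code in `packages/cli/src/openapi.ts` may differ):

```typescript
// Hypothetical sketch: map the openapi command's CLI flags to listen options.
// Per the help text, --network opens the server on 0.0.0.0; otherwise we
// assume a localhost bind. Default port is 8003 (SERVER_PORT in the CLI).
function resolveListenOptions(opts: { network?: boolean; port?: string }) {
    return {
        host: opts.network ? "0.0.0.0" : "127.0.0.1",
        port: opts.port ? parseInt(opts.port, 10) : 8003,
    }
}

console.log(resolveListenOptions({ network: true, port: "9000" }))
```

Binding to 0.0.0.0 is what makes the server reachable from other machines, which is also why the `serve:openapi` script in the root package.json pairs `--network` with `--cors "*"`.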
