
chore: generate documentation for cli #432


Merged (2 commits) on Jun 5, 2024
26 changes: 26 additions & 0 deletions .github/workflows/validate-docs.yaml
@@ -0,0 +1,26 @@
name: Validate docs build
on:
  push:
    paths:
      - docs/**
    branches:
      - main
  pull_request:
    paths:
      - docs/**
    branches:
      - main

jobs:
  validate-docs:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
        with:
          fetch-depth: 0
      - uses: actions/setup-go@v4
        with:
          cache: false
          go-version: "1.22"
      - run: make init-docs
      - run: make validate-docs
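
For reference, these workflow steps can be reproduced locally with the Make targets added in this PR (a sketch, assuming Docker and Go 1.22 are installed):

```
# Mirror the CI job locally
make init-docs       # pull node:18-buster and install the docs dependencies
make validate-docs   # build the docs and check generated CLI docs are in sync
```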
3 changes: 2 additions & 1 deletion .golangci.yml
@@ -2,7 +2,8 @@ run:
  timeout: 5m

output:
-  format: github-actions
+  formats:
+    - github-actions

linters:
  disable-all: true
16 changes: 16 additions & 0 deletions Makefile
@@ -44,3 +44,19 @@ ci: build

serve-docs:
	(cd docs && npm i && npm start)


# This will initialize the node_modules needed to run the docs dev server. Run this before running serve-docs
init-docs:
	docker run --rm --workdir=/docs -v $${PWD}/docs:/docs node:18-buster yarn install

# Ensure docs build without errors. Makes sure generated docs are in-sync with CLI.
validate-docs:
	docker run --rm --workdir=/docs -v $${PWD}/docs:/docs node:18-buster yarn build
	go run tools/gendocs/main.go
	if [ -n "$$(git status --porcelain --untracked-files=no)" ]; then \
		git status --porcelain --untracked-files=no; \
		echo "Encountered dirty repo!"; \
		git diff; \
		exit 1 \
	;fi
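
As an aside, the expected flow after changing a CLI flag would look roughly like this (a sketch; the output path of the generated files is inferred from this PR):

```
go run tools/gendocs/main.go   # regenerate docs/docs/100-reference/01-command-line/*.md
make init-docs                 # one-time: install docs dependencies via Docker
make serve-docs                # preview the docs site locally
make validate-docs             # same check CI runs: fails if generated docs are out of sync
```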
55 changes: 55 additions & 0 deletions docs/docs/100-reference/01-command-line/gptscript.md
@@ -0,0 +1,55 @@
---
title: "gptscript"
---
## gptscript



```
gptscript [flags] PROGRAM_FILE [INPUT...]
```

### Options

```
--cache-dir string Directory to store cache (default: $XDG_CACHE_HOME/gptscript) ($GPTSCRIPT_CACHE_DIR)
--chat-state string The chat state to continue, or null to start a new chat and return the state ($GPTSCRIPT_CHAT_STATE)
-C, --chdir string Change current working directory ($GPTSCRIPT_CHDIR)
--color Use color in output (default true) ($GPTSCRIPT_COLOR)
--config string Path to GPTScript config file ($GPTSCRIPT_CONFIG)
--confirm Prompt before running potentially dangerous commands ($GPTSCRIPT_CONFIRM)
--credential-context string Context name in which to store credentials ($GPTSCRIPT_CREDENTIAL_CONTEXT) (default "default")
--credential-override string Credentials to override (ex: --credential-override github.com/example/cred-tool:API_TOKEN=1234) ($GPTSCRIPT_CREDENTIAL_OVERRIDE)
--debug Enable debug logging ($GPTSCRIPT_DEBUG)
--debug-messages Enable logging of chat completion calls ($GPTSCRIPT_DEBUG_MESSAGES)
--default-model string Default LLM model to use ($GPTSCRIPT_DEFAULT_MODEL) (default "gpt-4o")
--disable-cache Disable caching of LLM API responses ($GPTSCRIPT_DISABLE_CACHE)
--disable-tui Don't use chat TUI but instead verbose output ($GPTSCRIPT_DISABLE_TUI)
--dump-state string Dump the internal execution state to a file ($GPTSCRIPT_DUMP_STATE)
--events-stream-to string Stream events to this location, could be a file descriptor/handle (e.g. fd://2), filename, or named pipe (e.g. \\.\pipe\my-pipe) ($GPTSCRIPT_EVENTS_STREAM_TO)
--force-chat Force an interactive chat session if even the top level tool is not a chat tool ($GPTSCRIPT_FORCE_CHAT)
--force-sequential Force parallel calls to run sequentially ($GPTSCRIPT_FORCE_SEQUENTIAL)
-h, --help help for gptscript
-f, --input string Read input from a file ("-" for stdin) ($GPTSCRIPT_INPUT)
--list-models List the models available and exit ($GPTSCRIPT_LIST_MODELS)
--list-tools List built-in tools and exit ($GPTSCRIPT_LIST_TOOLS)
--listen-address string Server listen address ($GPTSCRIPT_LISTEN_ADDRESS) (default "127.0.0.1:9090")
--no-trunc Do not truncate long log messages ($GPTSCRIPT_NO_TRUNC)
--openai-api-key string OpenAI API KEY ($OPENAI_API_KEY)
--openai-base-url string OpenAI base URL ($OPENAI_BASE_URL)
--openai-org-id string OpenAI organization ID ($OPENAI_ORG_ID)
-o, --output string Save output to a file, or - for stdout ($GPTSCRIPT_OUTPUT)
-q, --quiet No output logging (set --quiet=false to force on even when there is no TTY) ($GPTSCRIPT_QUIET)
--server Start server ($GPTSCRIPT_SERVER)
--sub-tool string Use tool of this name, not the first tool in file ($GPTSCRIPT_SUB_TOOL)
--ui Launch the UI ($GPTSCRIPT_UI)
--workspace string Directory to use for the workspace, if specified it will not be deleted on exit ($GPTSCRIPT_WORKSPACE)
```
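
As an illustration (not part of the generated page), a few hypothetical invocations built only from the flags documented above; `./tool.gpt` is a placeholder file:

```
gptscript ./tool.gpt                                        # run the first tool in the file
gptscript --disable-cache --sub-tool summarize ./tool.gpt   # run the tool named "summarize", no caching
gptscript --list-models                                     # list available models and exit
```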

### SEE ALSO

* [gptscript credential](gptscript_credential.md) - List stored credentials
* [gptscript eval](gptscript_eval.md) -
* [gptscript fmt](gptscript_fmt.md) -
* [gptscript parse](gptscript_parse.md) -

30 changes: 30 additions & 0 deletions docs/docs/100-reference/01-command-line/gptscript_credential.md
@@ -0,0 +1,30 @@
---
title: "gptscript credential"
---
## gptscript credential

List stored credentials

```
gptscript credential [flags]
```

### Options

```
--all-contexts List credentials for all contexts ($GPTSCRIPT_CREDENTIAL_ALL_CONTEXTS)
-h, --help help for credential
--show-env-vars Show names of environment variables in each credential ($GPTSCRIPT_CREDENTIAL_SHOW_ENV_VARS)
```

### Options inherited from parent commands

```
--credential-context string Context name in which to store credentials ($GPTSCRIPT_CREDENTIAL_CONTEXT) (default "default")
```
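
For illustration, hypothetical invocations using the flags documented above:

```
gptscript credential                                   # list credentials in the default context
gptscript credential --all-contexts --show-env-vars    # all contexts, plus env var names per credential
```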

### SEE ALSO

* [gptscript](gptscript.md) -
* [gptscript credential delete](gptscript_credential_delete.md) - Delete a stored credential

27 changes: 27 additions & 0 deletions docs/docs/100-reference/01-command-line/gptscript_credential_delete.md
@@ -0,0 +1,27 @@
---
title: "gptscript credential delete"
---
## gptscript credential delete

Delete a stored credential

```
gptscript credential delete <tool name> [flags]
```

### Options

```
-h, --help help for delete
```

### Options inherited from parent commands

```
--credential-context string Context name in which to store credentials ($GPTSCRIPT_CREDENTIAL_CONTEXT) (default "default")
```
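
A hypothetical example, reusing the tool reference from the --credential-override help text and an assumed context name "staging":

```
gptscript credential delete github.com/example/cred-tool
gptscript credential delete --credential-context staging github.com/example/cred-tool
```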

### SEE ALSO

* [gptscript credential](gptscript_credential.md) - List stored credentials

57 changes: 57 additions & 0 deletions docs/docs/100-reference/01-command-line/gptscript_eval.md
@@ -0,0 +1,57 @@
---
title: "gptscript eval"
---
## gptscript eval



```
gptscript eval [flags]
```

### Options

```
--chat Enable chat ($GPTSCRIPT_EVAL_CHAT)
-h, --help help for eval
--internal-prompt ($GPTSCRIPT_EVAL_INTERNAL_PROMPT)
--json Output JSON ($GPTSCRIPT_EVAL_JSON)
--max-tokens int Maximum number of tokens to output ($GPTSCRIPT_EVAL_MAX_TOKENS)
--model string The model to use ($GPTSCRIPT_EVAL_MODEL)
--temperature string Set the temperature, "creativity" ($GPTSCRIPT_EVAL_TEMPERATURE)
--tools strings Tools available to call ($GPTSCRIPT_EVAL_TOOLS)
```

### Options inherited from parent commands

```
--cache-dir string Directory to store cache (default: $XDG_CACHE_HOME/gptscript) ($GPTSCRIPT_CACHE_DIR)
--chat-state string The chat state to continue, or null to start a new chat and return the state ($GPTSCRIPT_CHAT_STATE)
-C, --chdir string Change current working directory ($GPTSCRIPT_CHDIR)
--color Use color in output (default true) ($GPTSCRIPT_COLOR)
--config string Path to GPTScript config file ($GPTSCRIPT_CONFIG)
--confirm Prompt before running potentially dangerous commands ($GPTSCRIPT_CONFIRM)
--credential-context string Context name in which to store credentials ($GPTSCRIPT_CREDENTIAL_CONTEXT) (default "default")
--credential-override string Credentials to override (ex: --credential-override github.com/example/cred-tool:API_TOKEN=1234) ($GPTSCRIPT_CREDENTIAL_OVERRIDE)
--debug Enable debug logging ($GPTSCRIPT_DEBUG)
--debug-messages Enable logging of chat completion calls ($GPTSCRIPT_DEBUG_MESSAGES)
--default-model string Default LLM model to use ($GPTSCRIPT_DEFAULT_MODEL) (default "gpt-4o")
--disable-cache Disable caching of LLM API responses ($GPTSCRIPT_DISABLE_CACHE)
--dump-state string Dump the internal execution state to a file ($GPTSCRIPT_DUMP_STATE)
--events-stream-to string Stream events to this location, could be a file descriptor/handle (e.g. fd://2), filename, or named pipe (e.g. \\.\pipe\my-pipe) ($GPTSCRIPT_EVENTS_STREAM_TO)
--force-chat Force an interactive chat session if even the top level tool is not a chat tool ($GPTSCRIPT_FORCE_CHAT)
--force-sequential Force parallel calls to run sequentially ($GPTSCRIPT_FORCE_SEQUENTIAL)
-f, --input string Read input from a file ("-" for stdin) ($GPTSCRIPT_INPUT)
--no-trunc Do not truncate long log messages ($GPTSCRIPT_NO_TRUNC)
--openai-api-key string OpenAI API KEY ($OPENAI_API_KEY)
--openai-base-url string OpenAI base URL ($OPENAI_BASE_URL)
--openai-org-id string OpenAI organization ID ($OPENAI_ORG_ID)
-o, --output string Save output to a file, or - for stdout ($GPTSCRIPT_OUTPUT)
-q, --quiet No output logging (set --quiet=false to force on even when there is no TTY) ($GPTSCRIPT_QUIET)
--workspace string Directory to use for the workspace, if specified it will not be deleted on exit ($GPTSCRIPT_WORKSPACE)
```
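
A hypothetical sketch; this section does not spell out how the tool body is supplied, so the trailing-arguments form and the tool reference below are assumptions:

```
# Assumption: eval takes the tool body as trailing arguments; the --tools value is a placeholder
gptscript eval --temperature 0.2 --tools github.com/example/some-tool "Say hello to the user"
```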

### SEE ALSO

* [gptscript](gptscript.md) -

51 changes: 51 additions & 0 deletions docs/docs/100-reference/01-command-line/gptscript_fmt.md
@@ -0,0 +1,51 @@
---
title: "gptscript fmt"
---
## gptscript fmt



```
gptscript fmt [flags]
```

### Options

```
-h, --help help for fmt
-w, --write Write output to file instead of stdout ($GPTSCRIPT_FMT_WRITE)
```

### Options inherited from parent commands

```
--cache-dir string Directory to store cache (default: $XDG_CACHE_HOME/gptscript) ($GPTSCRIPT_CACHE_DIR)
--chat-state string The chat state to continue, or null to start a new chat and return the state ($GPTSCRIPT_CHAT_STATE)
-C, --chdir string Change current working directory ($GPTSCRIPT_CHDIR)
--color Use color in output (default true) ($GPTSCRIPT_COLOR)
--config string Path to GPTScript config file ($GPTSCRIPT_CONFIG)
--confirm Prompt before running potentially dangerous commands ($GPTSCRIPT_CONFIRM)
--credential-context string Context name in which to store credentials ($GPTSCRIPT_CREDENTIAL_CONTEXT) (default "default")
--credential-override string Credentials to override (ex: --credential-override github.com/example/cred-tool:API_TOKEN=1234) ($GPTSCRIPT_CREDENTIAL_OVERRIDE)
--debug Enable debug logging ($GPTSCRIPT_DEBUG)
--debug-messages Enable logging of chat completion calls ($GPTSCRIPT_DEBUG_MESSAGES)
--default-model string Default LLM model to use ($GPTSCRIPT_DEFAULT_MODEL) (default "gpt-4o")
--disable-cache Disable caching of LLM API responses ($GPTSCRIPT_DISABLE_CACHE)
--dump-state string Dump the internal execution state to a file ($GPTSCRIPT_DUMP_STATE)
--events-stream-to string Stream events to this location, could be a file descriptor/handle (e.g. fd://2), filename, or named pipe (e.g. \\.\pipe\my-pipe) ($GPTSCRIPT_EVENTS_STREAM_TO)
--force-chat Force an interactive chat session if even the top level tool is not a chat tool ($GPTSCRIPT_FORCE_CHAT)
--force-sequential Force parallel calls to run sequentially ($GPTSCRIPT_FORCE_SEQUENTIAL)
-f, --input string Read input from a file ("-" for stdin) ($GPTSCRIPT_INPUT)
--no-trunc Do not truncate long log messages ($GPTSCRIPT_NO_TRUNC)
--openai-api-key string OpenAI API KEY ($OPENAI_API_KEY)
--openai-base-url string OpenAI base URL ($OPENAI_BASE_URL)
--openai-org-id string OpenAI organization ID ($OPENAI_ORG_ID)
-o, --output string Save output to a file, or - for stdout ($GPTSCRIPT_OUTPUT)
-q, --quiet No output logging (set --quiet=false to force on even when there is no TTY) ($GPTSCRIPT_QUIET)
--workspace string Directory to use for the workspace, if specified it will not be deleted on exit ($GPTSCRIPT_WORKSPACE)
```
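
A hypothetical sketch, assuming fmt reads the script through the inherited --input flag; tool.gpt is a placeholder:

```
# Assumption: the script is supplied via the inherited --input flag
gptscript fmt --input tool.gpt            # print the formatted script to stdout
gptscript fmt --input tool.gpt --write    # write the formatted output to a file instead
```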

### SEE ALSO

* [gptscript](gptscript.md) -

51 changes: 51 additions & 0 deletions docs/docs/100-reference/01-command-line/gptscript_parse.md
@@ -0,0 +1,51 @@
---
title: "gptscript parse"
---
## gptscript parse



```
gptscript parse [flags]
```

### Options

```
-h, --help help for parse
-p, --pretty-print Indent the json output ($GPTSCRIPT_PARSE_PRETTY_PRINT)
```

### Options inherited from parent commands

```
--cache-dir string Directory to store cache (default: $XDG_CACHE_HOME/gptscript) ($GPTSCRIPT_CACHE_DIR)
--chat-state string The chat state to continue, or null to start a new chat and return the state ($GPTSCRIPT_CHAT_STATE)
-C, --chdir string Change current working directory ($GPTSCRIPT_CHDIR)
--color Use color in output (default true) ($GPTSCRIPT_COLOR)
--config string Path to GPTScript config file ($GPTSCRIPT_CONFIG)
--confirm Prompt before running potentially dangerous commands ($GPTSCRIPT_CONFIRM)
--credential-context string Context name in which to store credentials ($GPTSCRIPT_CREDENTIAL_CONTEXT) (default "default")
--credential-override string Credentials to override (ex: --credential-override github.com/example/cred-tool:API_TOKEN=1234) ($GPTSCRIPT_CREDENTIAL_OVERRIDE)
--debug Enable debug logging ($GPTSCRIPT_DEBUG)
--debug-messages Enable logging of chat completion calls ($GPTSCRIPT_DEBUG_MESSAGES)
--default-model string Default LLM model to use ($GPTSCRIPT_DEFAULT_MODEL) (default "gpt-4o")
--disable-cache Disable caching of LLM API responses ($GPTSCRIPT_DISABLE_CACHE)
--dump-state string Dump the internal execution state to a file ($GPTSCRIPT_DUMP_STATE)
--events-stream-to string Stream events to this location, could be a file descriptor/handle (e.g. fd://2), filename, or named pipe (e.g. \\.\pipe\my-pipe) ($GPTSCRIPT_EVENTS_STREAM_TO)
--force-chat Force an interactive chat session if even the top level tool is not a chat tool ($GPTSCRIPT_FORCE_CHAT)
--force-sequential Force parallel calls to run sequentially ($GPTSCRIPT_FORCE_SEQUENTIAL)
-f, --input string Read input from a file ("-" for stdin) ($GPTSCRIPT_INPUT)
--no-trunc Do not truncate long log messages ($GPTSCRIPT_NO_TRUNC)
--openai-api-key string OpenAI API KEY ($OPENAI_API_KEY)
--openai-base-url string OpenAI base URL ($OPENAI_BASE_URL)
--openai-org-id string OpenAI organization ID ($OPENAI_ORG_ID)
-o, --output string Save output to a file, or - for stdout ($GPTSCRIPT_OUTPUT)
-q, --quiet No output logging (set --quiet=false to force on even when there is no TTY) ($GPTSCRIPT_QUIET)
--workspace string Directory to use for the workspace, if specified it will not be deleted on exit ($GPTSCRIPT_WORKSPACE)
```
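
Similarly hypothetical, assuming parse also reads the script through the inherited --input flag:

```
# Assumption: the script is supplied via the inherited --input flag
gptscript parse --input tool.gpt --pretty-print    # emit the parsed document as indented JSON
```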

### SEE ALSO

* [gptscript](gptscript.md) -

2 changes: 2 additions & 0 deletions go.mod
@@ -52,6 +52,7 @@ require (
github.com/charmbracelet/x/ansi v0.1.1 // indirect
github.com/connesc/cipherio v0.2.1 // indirect
github.com/containerd/console v1.0.4 // indirect
github.com/cpuguy83/go-md2man/v2 v2.0.3 // indirect
github.com/davecgh/go-spew v1.1.1 // indirect
github.com/dlclark/regexp2 v1.4.0 // indirect
github.com/dsnet/compress v0.0.1 // indirect
@@ -93,6 +94,7 @@ (
github.com/pmezard/go-difflib v1.0.0 // indirect
github.com/pterm/pterm v0.12.79 // indirect
github.com/rivo/uniseg v0.4.7 // indirect
github.com/russross/blackfriday/v2 v2.1.0 // indirect
github.com/ssor/bom v0.0.0-20170718123548-6386211fdfcf // indirect
github.com/therootcompany/xz v1.0.1 // indirect
github.com/tidwall/match v1.1.1 // indirect
2 changes: 2 additions & 0 deletions go.sum
@@ -87,6 +87,7 @@ github.com/connesc/cipherio v0.2.1/go.mod h1:ukY0MWJDFnJEbXMQtOcn2VmTpRfzcTz4OoV
github.com/containerd/console v1.0.3/go.mod h1:7LqA/THxQ86k76b8c/EMSiaJ3h1eZkMkXar0TQ1gf3U=
github.com/containerd/console v1.0.4 h1:F2g4+oChYvBTsASRTz8NP6iIAi97J3TtSAsLbIFn4ro=
github.com/containerd/console v1.0.4/go.mod h1:YynlIjWYF8myEu6sdkwKIvGQq+cOckRm6So2avqoYAk=
github.com/cpuguy83/go-md2man/v2 v2.0.3 h1:qMCsGGgs+MAzDFyp9LpAe1Lqy/fY/qCovCm0qnXZOBM=
github.com/cpuguy83/go-md2man/v2 v2.0.3/go.mod h1:tgQtvFlXSQOSOSIRvRPT7W67SCa46tRHOmNcaadrF8o=
github.com/creack/pty v1.1.9/go.mod h1:oKZEueFk5CKHvIhNR5MUki03XCEU+Q6VDXinZuGJ33E=
github.com/creack/pty v1.1.17 h1:QeVUsEDNrLBW4tMgZHvxy18sKtr6VI492kBhUfhDJNI=
@@ -302,6 +303,7 @@ github.com/rogpeppe/go-internal v1.12.0 h1:exVL4IDcn6na9z1rAb56Vxr+CgyK3nn3O+epU
github.com/rogpeppe/go-internal v1.12.0/go.mod h1:E+RYuTGaKKdloAfM02xzb0FW3Paa99yedzYV+kq4uf4=
github.com/rs/cors v1.11.0 h1:0B9GE/r9Bc2UxRMMtymBkHTenPkHDv0CW4Y98GBY+po=
github.com/rs/cors v1.11.0/go.mod h1:XyqrcTp5zjWr1wsJ8PIRZssZ8b/WMcMf71DJnit4EMU=
github.com/russross/blackfriday/v2 v2.1.0 h1:JIOH55/0cWyOuilr9/qlrm0BSXldqnqwMsf35Ld67mk=
github.com/russross/blackfriday/v2 v2.1.0/go.mod h1:+Rmxgy9KzJVeS9/2gXHxylqXiyQDYRxCVz55jmeOWTM=
github.com/rwcarlsen/goexif v0.0.0-20190401172101-9e8deecbddbd/go.mod h1:hPqNNc0+uJM6H+SuU8sEs5K5IQeKccPqeSjfgcKGgPk=
github.com/samber/lo v1.38.1 h1:j2XEAqXKb09Am4ebOg31SpvzUTTs6EN3VfgeLUhPdXM=