
Commit f8de905

Chore: Add documentation for new tool parameters
1 parent 12e80c2 commit f8de905

1 file changed: +15 -12 lines changed

docs/docs/07-gpt-file-reference.md

@@ -43,18 +43,21 @@ Tool instructions go here.
 
 Tool parameters are key-value pairs defined at the beginning of a tool block, before any instructional text. They are specified in the format `key: value`. The parser recognizes the following keys (case-insensitive and spaces are ignored):
 
-| Key               | Description |
-|-------------------|------------------------------------------------------------------------------------------------------------------------------------|
-| `Name`            | The name of the tool. |
-| `Model Name`      | The OpenAI model to use, by default it uses "gpt-4-turbo" |
-| `Description`     | The description of the tool. It is important that this properly describes the tool's purpose as the description is used by the LLM. |
-| `Internal Prompt` | Setting this to `false` will disable the built-in system prompt for this tool. |
-| `Tools`           | A comma-separated list of tools that are available to be called by this tool. |
-| `Credentials`     | A comma-separated list of credential tools to run before the main tool. |
-| `Args`            | Arguments for the tool. Each argument is defined in the format `arg-name: description`. |
-| `Max Tokens`      | Set to a number if you wish to limit the maximum number of tokens that can be generated by the LLM. |
-| `JSON Response`   | Setting to `true` will cause the LLM to respond in a JSON format. If you set true you must also include instructions in the tool. |
-| `Temperature`     | A floating-point number representing the temperature parameter. By default, the temperature is 0. Set to a higher number for more creativity. |
+| Key                 | Description |
+|---------------------|---------------------------------------------------------------------------------------------------------------------------------------|
+| `Name`              | The name of the tool. |
+| `Model Name`        | The OpenAI model to use; by default it uses "gpt-4". |
+| `Global Model Name` | The OpenAI model to use for all the tools. |
+| `Description`       | The description of the tool. It is important that this properly describes the tool's purpose, as the description is used by the LLM. |
+| `Internal Prompt`   | Setting this to `false` will disable the built-in system prompt for this tool. |
+| `Tools`             | A comma-separated list of tools that are available to be called by this tool. |
+| `Global Tools`      | A comma-separated list of tools that are available to be called by all tools. |
+| `Credentials`       | A comma-separated list of credential tools to run before the main tool. |
+| `Args`              | Arguments for the tool. Each argument is defined in the format `arg-name: description`. |
+| `Max Tokens`        | Set to a number if you wish to limit the maximum number of tokens that can be generated by the LLM. |
+| `JSON Response`     | Setting this to `true` will cause the LLM to respond in JSON format. If you set it to `true`, you must also include instructions in the tool. |
+| `Temperature`       | A floating-point number representing the temperature parameter. By default, the temperature is 0. Set it to a higher number for more creativity. |
+| `Chat`              | Setting this to `true` will enable an interactive chat session for the tool. |
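For reference, a tool block that exercises several of the parameters documented above might look like the following sketch. The tool name, argument, and instruction line are made up for illustration and are not part of this commit; only the `key: value` parameter lines and their placement before the instructional text follow the rules in the table.

```
name: summarize
description: Summarizes the text passed to the tool. The description is used by the LLM to decide when to call the tool.
model name: gpt-4
temperature: 0.2
max tokens: 500
args: text: The text to summarize

Summarize the provided text in three sentences.
```

Because the keys are parsed case-insensitively and spaces are ignored, `model name:` and `Model Name:` are equivalent; the parameters simply appear before any instructional text, as the paragraph above requires.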
