docs/docs/07-gpt-file-reference.md
+15-12
@@ -43,18 +43,21 @@ Tool instructions go here.
Tool parameters are key-value pairs defined at the beginning of a tool block, before any instructional text. They are specified in the format `key: value`. The parser recognizes the following keys (case-insensitive and spaces are ignored):
-|`Model Name`| The OpenAI model to use, by default it uses "gpt-4-turbo" |
-|`Description`| The description of the tool. It is important that this properly describes the tool's purpose as the description is used by the LLM. |
-|`Internal Prompt`| Setting this to `false` will disable the built-in system prompt for this tool. |
-|`Tools`| A comma-separated list of tools that are available to be called by this tool. |
-|`Credentials`| A comma-separated list of credential tools to run before the main tool. |
-|`Args`| Arguments for the tool. Each argument is defined in the format `arg-name: description`. |
-|`Max Tokens`| Set to a number if you wish to limit the maximum number of tokens that can be generated by the LLM. |
-|`JSON Response`| Setting to `true` will cause the LLM to respond in a JSON format. If you set true you must also include instructions in the tool. |
-|`Temperature`| A floating-point number representing the temperature parameter. By default, the temperature is 0. Set to a higher number for more creativity. |
+|`Model Name`| The OpenAI model to use, by default it uses "gpt-4". |
+|`Global Model Name`| The OpenAI model to use for all the tools. |
+|`Description`| The description of the tool. It is important that this properly describes the tool's purpose as the description is used by the LLM. |
+|`Internal Prompt`| Setting this to `false` will disable the built-in system prompt for this tool. |
+|`Tools`| A comma-separated list of tools that are available to be called by this tool. |
+|`Global Tools`| A comma-separated list of tools that are available to be called by all tools. |
+|`Credentials`| A comma-separated list of credential tools to run before the main tool. |
+|`Args`| Arguments for the tool. Each argument is defined in the format `arg-name: description`. |
+|`Max Tokens`| Set to a number if you wish to limit the maximum number of tokens that can be generated by the LLM. |
+|`JSON Response`| Setting to `true` will cause the LLM to respond in a JSON format. If you set true you must also include instructions in the tool. |
+|`Temperature`| A floating-point number representing the temperature parameter. By default, the temperature is 0. Set to a higher number for more creativity. |
+|`Chat`| Setting it to `true` will enable an interactive chat session for the tool. |
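
To see how these parameters fit together, here is a minimal sketch of a tool block, assuming a hypothetical weather tool. Only the parameter keys and the `key: value` / `arg-name: description` formats come from the table above; the description text, the `location` argument, and the instructions are illustrative.

```
Description: Returns a short weather summary for a given location.
Args: location: The city or region to look up.
Model Name: gpt-4
Temperature: 0.2
Max Tokens: 500
JSON Response: true

Respond with a JSON object containing the keys "location" and "summary"
describing the current weather for the provided location argument.
```

As the table notes, the parameters sit at the beginning of the block before any instructional text, and because `JSON Response` is set to `true`, the instructions spell out the expected JSON shape.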