Make sure you do not have an OpenAI key set as an environment variable or as the `sys.openai` credential.
Have the `github.com/gptscript-ai/claude3-anthropic-provider/credential` credential set.
Launching the UI with the default model set to the Anthropic provider results in the user being prompted for an OpenAI API key.
When you enter an invalid value, the UI gets stuck and the following errors are seen in the console:
⨯ unhandledRejection: error, status code: 401, message: Incorrect API key provided: asdf. You can find your API key at https://platform.openai.com/account/api-keys.
⨯ unhandledRejection: error, status code: 401, message: Incorrect API key provided: asdf. You can find your API key at https://platform.openai.com/account/api-keys.
Launching the UI by either of the following two means results in the same behavior:

- by passing `--default-model`:
  `gptscript --ui --disable-cache --default-model 'claude-3-5-sonnet-20240620 from github.com/gptscript-ai/claude3-anthropic-provider' --ui github.com/gptscript-ai/cli-demo`
- by setting the environment variable `GPTSCRIPT_DEFAULT_MODEL='claude-3-5-sonnet-20240620 from github.com/gptscript-ai/claude3-anthropic-provider'`
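The environment precondition from the first step can be double-checked before launching. This is a sketch, assuming `OPENAI_API_KEY` is the variable the OpenAI client reads (the credential store itself would need a separate check, e.g. via the `gptscript credential` listing):

```shell
# Sketch: confirm no OpenAI key is exported before launching the UI,
# so the Anthropic provider's credential path is the one exercised.
if env | grep -q '^OPENAI_API_KEY='; then
  echo "OPENAI_API_KEY is set -- unset it before reproducing"
else
  echo "no OpenAI key in the environment"
fi
```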
Logs:
gptscript --ui --disable-cache --ui github.com/gptscript-ai/cli-demo
18:23:16 WARNING: Changing the default model can have unknown behavior for existing tools. Use the model field per tool instead.
18:23:18 started [main] [input=--file=github.com/gptscript-ai/cli-demo]
18:23:18 started [context: github.com/gptscript-ai/context/os]
18:23:18 sent [context: github.com/gptscript-ai/context/os]
18:23:18 ended [context: github.com/gptscript-ai/context/os] [output=The local operating systems is Darwin, release 23.3.0]
18:23:20 started [provider: https://raw.githubusercontent.com/gptscript-ai/claude3-anthropic-provider/5b581e7b84bdd99f5d56df593d99397c9f91a8e3/tool.gpt:Anthropic Claude3 Model Provider]
18:23:20 launched [Anthropic Claude3 Model Provider][https://raw.githubusercontent.com/gptscript-ai/claude3-anthropic-provider/5b581e7b84bdd99f5d56df593d99397c9f91a8e3/tool.gpt:Anthropic Claude3 Model Provider] port [10359] [/opt/homebrew/bin/gptscript sys.daemon /usr/bin/env python3 /Users/sangeethahariharan/Library/Caches/gptscript/repos/5b581e7b84bdd99f5d56df593d99397c9f91a8e3/tool.gpt/python3.12/main.py]
18:23:21 ended [provider: https://raw.githubusercontent.com/gptscript-ai/claude3-anthropic-provider/5b581e7b84bdd99f5d56df593d99397c9f91a8e3/tool.gpt:Anthropic Claude3 Model Provider] [output=http://127.0.0.1:10359]
18:23:21 sent [main]
content [1] content | Waiting for model response...
content [1] content | <tool call> service -> null
content [2] content | The local operating systems is Darwin, release 23.3.0
content [2] content |
18:23:22 started [service(4)] [input=null]
18:23:22 launched [service][https://raw.githubusercontent.com/gptscript-ai/ui/a5946810bc100357327cc11c1548d6bf7e408a09/tool.gpt:service] port [10993] [/opt/homebrew/bin/gptscript sys.daemon /usr/bin/env npm run --prefix /Users/sangeethahariharan/Library/Caches/gptscript/repos/a5946810bc100357327cc11c1548d6bf7e408a09/tool.gpt/node21 dev]
> [email protected] dev
> node server.mjs
> Socket server is ready at http://localhost:10993
○ Compiling / ...
✓ Compiled / in 2.1s (4300 modules)
GET / 200 in 2374ms
18:23:27 ended [service(4)] [output=\u003c!DOCTYPE html\u003e\u003chtml lang=\"en\"\u003e\u003chead\u003e\u003cmeta charSet=\"utf-8\"/\u003e\u003cmeta name=\"viewport\" content=\"width=dev...]
18:23:27 continue [main]
POST / 200 in 16ms
18:23:27 started [context: github.com/gptscript-ai/context/os]
18:23:27 sent [context: github.com/gptscript-ai/context/os]
18:23:27 ended [context: github.com/gptscript-ai/context/os] [output=The local operating systems is Darwin, release 23.3.0]
18:23:27 sent [main]
content [1] content | Waiting for model response...
content [1] content | <tool call> port -> null content [5] content | The local operating systems is Darwin, release 23.3.0
content [5] content |
18:23:29 started [port(6)] [input=null]
✓ Compiled /api/port in 218ms (2304 modules)
POST /api/port 200 in 256ms
18:23:30 ended [port(6)] [output=10993]
18:23:30 continue [main]
18:23:30 started [context: github.com/gptscript-ai/context/os]
18:23:30 sent [context: github.com/gptscript-ai/context/os]
18:23:30 ended [context: github.com/gptscript-ai/context/os] [output=The local operating systems is Darwin, release 23.3.0]
18:23:30 sent [main]
content [1] content | Waiting for model response...
content [1] content | <tool call> openFileNix -> {"file": "github.com/gptscript-ai/cli-demo", "port": "10993"} content [7] content | The local operating systems is Darwin, release 23.3.0
content [7] content |
18:23:32 started [open-file-nix(8)] [input={"file": "github.com/gptscript-ai/cli-demo", "port": "10993"}]
18:23:32 sent [open-file-nix(8)]
18:23:32 ended [open-file-nix(8)]
18:23:32 continue [main]
18:23:32 started [context: github.com/gptscript-ai/context/os]
18:23:32 sent [context: github.com/gptscript-ai/context/os]
18:23:32 ended [context: github.com/gptscript-ai/context/os] [output=The local operating systems is Darwin, release 23.3.0]
18:23:32 sent [main]
content [1] content | Waiting for model response... ○ Compiling /run ...
✓ Compiled /run in 1253ms (4996 modules)
GET /run?file=github.com/gptscript-ai/cli-demo 200 in 1634ms
content [1] content | The task has been completed. The service was run, the port was obtained (10993), and the file "github.com/gptscript-ai/cli-demo" was opened using the openFileNix tool with the specified port.
content [9] content | The local operating systems is Darwin, release 23.3.0
content [9] content |
18:23:35 ended [main] [output=The task has been completed. The service was run, the port was obtained (10993), and the file \"githu...]
INPUT:
--file=github.com/gptscript-ai/cli-demo
OUTPUT:
The task has been completed. The service was run, the port was obtained (10993), and the file "github.com/gptscript-ai/cli-demo" was opened using the openFileNix tool with the specified port.
POST /run?file=github.com/gptscript-ai/cli-demo 200 in 1633ms
POST /run?file=github.com/gptscript-ai/cli-demo 200 in 11ms
POST /run?file=github.com/gptscript-ai/cli-demo 200 in 5ms
POST /run?file=github.com/gptscript-ai/cli-demo 200 in 9ms
POST /run?file=github.com/gptscript-ai/cli-demo 200 in 7ms
⨯ unhandledRejection: error, status code: 401, message: Incorrect API key provided: asdf. You can find your API key at https://platform.openai.com/account/api-keys.
⨯ unhandledRejection: error, status code: 401, message: Incorrect API key provided: asdf. You can find your API key at https://platform.openai.com/account/api-keys.
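The stuck UI together with the repeated `unhandledRejection` lines suggests the 401 from the OpenAI client rejects a promise that nothing awaits or catches, so the request flow silently dies. A minimal sketch of that failure mode and a guard for it (the handler and the simulated rejection below are illustrative, not the UI's actual `server.mjs` code):

```javascript
// Sketch: without an 'unhandledRejection' listener, a rejected promise with
// no .catch() surfaces exactly as the "⨯ unhandledRejection" lines above.
// Registering a handler keeps the process alive and logs the failure instead.
process.on('unhandledRejection', (reason) => {
  console.error('model provider call failed:', reason);
});

// Simulate the failing call seen in the logs: a fire-and-forget promise
// rejecting with the 401 "Incorrect API key provided" error.
Promise.reject(new Error('status code: 401, Incorrect API key provided'));
```

In the real server, the handler would also need to propagate the error back to the browser (not just log it) so the user sees the bad-key failure instead of a hang.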
Server: gptscript version v0.8.5-rc4+3033b05a