
Bug: /v1/models returns a 404 on v2 server #668

Open
chaosharmonic opened this issue Jan 6, 2025 · 0 comments
Contact Details

No response

What happened?

The new server appears to resolve external API requests as static routes on the UI instead of responding from the API directly. When trying to call it from SillyTavern I got a connection error, which I was able to reproduce using curl.

The old server still works without issue.

Version

llamafile 0.9

What operating system are you seeing the problem on?

Linux

Relevant log output

2025-01-06T12:42:39.401189 llamafile/server/listen.cpp:41 server listen http://127.0.0.1:8080
2025-01-06T12:42:53.720732 llamafile/server/client.cpp:679 45058 GET /v1/models
2025-01-06T12:42:53.720774 llamafile/server/client.cpp:736 45058 path not found: /zip/www/v1/models
2025-01-06T12:42:53.720780 llamafile/server/client.cpp:320 45058 error 404 Not Found
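The log suggests the v2 server falls back to looking up the request path as a static file under its embedded web root (`/zip/www`) before (or instead of) matching the `/v1/models` API route, so the request 404s. A minimal sketch of that assumed dispatch order, with hypothetical route names (not the actual llamafile implementation):

```python
import posixpath

# Embedded static asset root, as seen in the "path not found" log line.
WEB_ROOT = "/zip/www"

# Hypothetical API route table that is missing the /v1/models endpoint.
API_ROUTES = {"/completion", "/tokenize"}

def resolve(path: str) -> str:
    """Mimic a dispatcher that checks API routes first, then falls back
    to serving static files from the embedded zip web root."""
    if path in API_ROUTES:
        return f"api:{path}"
    # Fallback: treat the request as a static asset under WEB_ROOT.
    static = posixpath.join(WEB_ROOT, path.lstrip("/"))
    return f"static:{static}"

print(resolve("/v1/models"))  # static:/zip/www/v1/models -> 404 Not Found
```

Under this assumption, the fix would be registering `/v1/models` as an API route so the static-file fallback is never reached for it.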