Closed
Labels
bug-unconfirmed, medium severity (used to report medium severity bugs in llama.cpp, e.g. malfunctioning features that are still usable)
Description
What happened?
Building llama-server from scratch at the latest commit 8e752a7:
./examples/server/deps.sh
rm -rf build && cmake -S . -B build && cmake --build build --config Release -j $(sysctl -n hw.logicalcpu) --target llama-server
On running the server and then loading http://127.0.0.1:8080, I get the following logs:
main: server is listening on http://127.0.0.1:8080 - starting the main loop
srv update_slots: all slots are idle
request: GET / 127.0.0.1 200
request: GET /index.js 127.0.0.1 404
request: GET /completion.js 127.0.0.1 200
request: GET /json-schema-to-grammar.mjs 127.0.0.1 404
index.html requests /index.js and other files, which return 404. I have downloaded the dependencies, but a few files are still missing. Kindly check whether these files are actually downloaded by deps.sh.
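To narrow down which referenced assets are missing versus present, one can compare the files index.html references against what exists on disk. A minimal sketch (the file names come from the logs above; the directory layout here is a hypothetical demo, not the actual server tree):

```shell
# Demo setup: a sample index.html referencing the three files from the logs,
# with only completion.js present (mirroring the 200 vs. 404 responses seen).
tmp=$(mktemp -d)
cat > "$tmp/index.html" <<'EOF'
<script type="module" src="/index.js"></script>
<script src="/completion.js"></script>
<script type="module" src="/json-schema-to-grammar.mjs"></script>
EOF
touch "$tmp/completion.js"   # simulate: only this dependency was downloaded

# Extract each src="..." reference and check whether the file exists.
grep -oE 'src="[^"]+"' "$tmp/index.html" \
  | sed -E 's|src="/?([^"]+)"|\1|' \
  | while read -r f; do
      if [ -f "$tmp/$f" ]; then
        echo "OK      $f"
      else
        echo "MISSING $f"
      fi
    done
```

Run against the real public/ directory the server serves from, this would list exactly which assets deps.sh failed to fetch.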
Name and Version
$ ./build/bin/llama-cli --version
version: 4131 (8e752a7)
built with Homebrew clang version 18.1.5 for arm64-apple-darwin23.3.0
What operating system are you seeing the problem on?
Mac
Relevant log output
No response