Bug: llama-server not loading the UI #10404

Closed
@anagri

Description

What happened?

Building llama-server from scratch using the latest commit 8e752a7:

./examples/server/deps.sh
rm -rf build && cmake -S . -B build && cmake --build build --config Release -j $(sysctl -n hw.logicalcpu) --target llama-server
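As a quick sanity check (a sketch only; the examples/server/public path is an assumption based on the deps.sh location above and may differ at this commit), the downloaded UI assets can be listed before building:

# Assumption: deps.sh places the web UI assets under examples/server/public
ls -l examples/server/public/
# index.html and the files it references (index.js, completion.js,
# json-schema-to-grammar.mjs, ...) should all be present here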

On running the server and then loading http://127.0.0.1:8080, I get the following logs:

main: server is listening on http://127.0.0.1:8080 - starting the main loop
srv  update_slots: all slots are idle
request: GET / 127.0.0.1 200
request: GET /index.js 127.0.0.1 404
request: GET /completion.js 127.0.0.1 200
request: GET /json-schema-to-grammar.mjs 127.0.0.1 404

index.html requests /index.js and other files, which return 404. I have downloaded the dependencies, but a few files are still missing. Kindly check whether these files are actually downloaded by deps.sh.
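A minimal retry sketch, assuming the build directory from the commands above is still in place: re-run deps.sh and rebuild only the server target, so the assets served (or embedded) by llama-server match what index.html requests, then reload the page and check the request log for remaining 404s.

./examples/server/deps.sh
cmake --build build --config Release -j $(sysctl -n hw.logicalcpu) --target llama-server
# restart the server, reload http://127.0.0.1:8080, and watch the GET lines for 404s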

Name and Version

$ ./build/bin/llama-cli --version
version: 4131 (8e752a7)
built with Homebrew clang version 18.1.5 for arm64-apple-darwin23.3.0

What operating system are you seeing the problem on?

Mac

Relevant log output

No response

Labels

bug-unconfirmed, medium severity
