
Webui/server config defaults #18028

Merged
ngxson merged 3 commits into ggml-org:master from ServeurpersoCom:webui/server-config-defaults
Dec 17, 2025

Conversation

@ServeurpersoCom (Collaborator)


server/webui: add CLI support for configuring WebUI defaults

Summary

Implements issue #17940: adds '--webui-config-file' CLI argument and 'LLAMA_WEBUI_CONFIG' environment variable to configure WebUI default settings from the server side.

Problem

WebUI settings are hardcoded in the frontend with no server-side configuration. This causes problems:

  • Server operators can't set sensible defaults for their deployment
  • Users must reconfigure settings every time localStorage is cleared
  • Multi-user servers can't enforce defaults (like disabling auto-file-paste)

Example: Issue #17940 requests ability to set 'pasteLongTextToFileLen: 0' by default, but currently every user must manually configure this in every new browser session.

The TODO that guided this implementation

While exploring the codebase, I found this intentional placeholder in 'server-models.cpp' (lines 6-16):

// TODO: add support for this on web UI
{"role", "router"},
// this is a dummy response to make sure webui doesn't break
{"default_generation_settings", {
    {"params", json{}},  // <-- Empty JSON waiting for implementation
    {"n_ctx", 0},
}},

This empty JSON was left by the original developers as a clear signal for where webui settings should go. This PR completes that TODO.

Backend Changes (C++)

What was added

1. CLI argument ('common/arg.cpp')

  • New flag: '--webui-config-file <path>'
  • Loads JSON file with WebUI default settings

2. Environment variable ('server.cpp')

  • New env var: 'LLAMA_WEBUI_CONFIG'
  • Accepts JSON string, overrides file config
  • Example: 'LLAMA_WEBUI_CONFIG='{"pasteLongTextToFileLen":0}''

3. JSON parsing logic ('server.cpp')

  • Parses config at startup, before any router/child fork
  • Simple override: env var OR file (env var takes precedence)
  • Error handling: warn on missing file, error on invalid JSON

4. Expose in /props ('server-models.cpp' + 'server-context.cpp')

  • Added 'webui_settings' field to '/props' response
  • Works in both router mode and child mode
  • Fills in the TODO placeholder mentioned above

How it works

Server startup
  |
  v
Parse LLAMA_WEBUI_CONFIG env var (if set) OR config.json (if provided)
  |
  v
Store result in params.webui_config
  |
  v
Return in /props as webui_settings

The C++ backend is completely agnostic: it doesn't know what settings exist. It just:

  • Reads JSON
  • Passes through to frontend

This means adding new WebUI settings never requires C++ changes.
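As a sketch of what this pass-through looks like from the frontend's perspective: `webui_settings` is the real field added by this PR, but the type and helper names below are illustrative, not the actual webui code.

```typescript
// Illustrative sketch only: reading server-provided WebUI defaults from a
// /props payload. `webui_settings` is the real field name from this PR;
// `PropsResponse` and `readWebuiDefaults` are hypothetical names.
type PropsResponse = {
  webui_settings?: Record<string, string | number | boolean> | null;
};

function readWebuiDefaults(
  props: PropsResponse,
): Record<string, string | number | boolean> {
  // Old servers omit the field (or send null); fall back to an empty object
  // so the WebUI keeps its hardcoded defaults.
  return props.webui_settings ?? {};
}

console.log(readWebuiDefaults({ webui_settings: { pasteLongTextToFileLen: 0 } }));
// → { pasteLongTextToFileLen: 0 }
console.log(readWebuiDefaults({}));
// → {}
```

Because the backend forwards the JSON verbatim, all validation and interpretation stays on this side of the boundary.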

Frontend Changes (TypeScript)

What was added

1. API type ('types/api.d.ts')

  • Added optional 'webui_settings' field to props response
  • Type: 'Record<string, string | number | boolean>'

2. Syncable parameters ('services/parameter-sync.ts')

  • Added 14 WebUI settings to 'SYNCABLE_PARAMETERS':
    • pasteLongTextToFileLen
    • pdfAsImage
    • showThoughtInProgress
    • showToolCalls
    • disableReasoningFormat
    • keepStatsVisible
    • showMessageStats
    • askForTitleConfirmation
    • disableAutoScroll
    • renderUserContentAsMarkdown
    • autoMicOnEmpty
    • pyInterpreterEnabled
    • enableContinueGeneration
    • showSystemMessage
  • Updated 'extractServerDefaults()' to merge webui_settings

3. Store integration ('stores/server.svelte.ts' + 'stores/settings.svelte.ts')

  • Added getter for 'webuiSettings'
  • Plumbed through to settings store

4. Tests ('parameter-sync.spec.ts')

  • Test for webui_settings extraction
  • Validates user-specific settings are filtered out
  • Uses 'pasteLongTextToFileLen: 0' example from issue
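A minimal sketch of the allowlist-based merge described above. The setting names are the real ones listed in this PR, but the function and constant here are illustrative, not the actual 'extractServerDefaults()' implementation in 'parameter-sync.ts'.

```typescript
// Illustrative sketch: accept only allowlisted WebUI settings from the server.
// The setting names mirror the real list above; `mergeWebuiSettings` is a
// hypothetical helper, not the actual parameter-sync code.
const SYNCABLE_WEBUI_SETTINGS = new Set([
  "pasteLongTextToFileLen",
  "pdfAsImage",
  "showThoughtInProgress",
  "showToolCalls",
  "disableReasoningFormat",
  "keepStatsVisible",
  "showMessageStats",
  "askForTitleConfirmation",
  "disableAutoScroll",
  "renderUserContentAsMarkdown",
  "autoMicOnEmpty",
  "pyInterpreterEnabled",
  "enableContinueGeneration",
  "showSystemMessage",
]);

type SettingValue = string | number | boolean;

function mergeWebuiSettings(
  defaults: Record<string, SettingValue>,
  serverSettings: Record<string, SettingValue> | undefined,
): Record<string, SettingValue> {
  const merged = { ...defaults };
  for (const [key, value] of Object.entries(serverSettings ?? {})) {
    // Drop anything not allowlisted, e.g. user-specific keys like apiKey.
    if (SYNCABLE_WEBUI_SETTINGS.has(key)) merged[key] = value;
  }
  return merged;
}
```

For example, merging `{ pdfAsImage: false }` with a server payload of `{ pdfAsImage: true, apiKey: "secret" }` yields `{ pdfAsImage: true }`: the server default applies, and the user-specific `apiKey` key is silently dropped.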

Settings intentionally excluded

These are user-specific and should NOT be server-configurable:

  • apiKey (personal credentials)
  • systemMessage (personal prompt)
  • theme (UI preference)
  • mcpServers (personal MCP config)
  • custom (user's custom JSON)

Priority Order

Server operators set defaults, but users keep control:

1. WebUI hardcoded defaults (settings-config.ts)
2. Server config: LLAMA_WEBUI_CONFIG env var OR --webui-config-file
3. User's localStorage (always wins)

This means:

  • Server can provide sensible defaults
  • Users can still override anything
  • No breaking changes
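The precedence above can be sketched as a simple spread chain, where later sources win. The function name and the example values are illustrative, not the actual settings-store code or the real hardcoded defaults.

```typescript
// Illustrative sketch of the priority order: hardcoded defaults < server
// config < user's localStorage. Later spreads win. `resolveSettings` is a
// hypothetical name; the numeric defaults below are example values only.
type SettingValue = string | number | boolean;

function resolveSettings(
  hardcoded: Record<string, SettingValue>,
  serverDefaults: Record<string, SettingValue>,
  userLocal: Record<string, SettingValue>,
): Record<string, SettingValue> {
  return { ...hardcoded, ...serverDefaults, ...userLocal };
}

const resolved = resolveSettings(
  { pasteLongTextToFileLen: 2500, pdfAsImage: false }, // WebUI hardcoded defaults (example values)
  { pasteLongTextToFileLen: 0 },                       // server config
  { pdfAsImage: true },                                // user's localStorage
);
// resolved.pasteLongTextToFileLen === 0  (server default applies)
// resolved.pdfAsImage === true           (user's stored value always wins)
```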

Usage Examples

Config file

# webui-defaults.json
{
  "pasteLongTextToFileLen": 0,
  "pdfAsImage": true,
  "showThoughtInProgress": false
}

./llama-server --webui-config-file webui-defaults.json

Environment variable

LLAMA_WEBUI_CONFIG='{"pasteLongTextToFileLen":0}' ./llama-server

Both (env var overrides file)

LLAMA_WEBUI_CONFIG='{"pdfAsImage":true}' \
  ./llama-server --webui-config-file webui-defaults.json

Check it works

curl http://localhost:8080/props | jq .webui_settings

Testing

Backend

# Create test config
echo '{"pasteLongTextToFileLen":0}' > test.json

# Test file loading
./llama-server --webui-config-file test.json

# Test env var
LLAMA_WEBUI_CONFIG='{"pdfAsImage":true}' ./llama-server

# Check /props
curl http://localhost:8080/props | jq .webui_settings

Frontend

cd tools/server/webui
npm test -- parameter-sync.spec.ts

Backward Compatibility

No breaking changes:

  • 'webui_settings' is optional in /props
  • Old servers work fine (webui_settings: null or missing)
  • WebUI falls back to hardcoded defaults
  • No recompilation needed for existing deployments

What this enables

Now server operators can:

  • Set deployment-wide defaults without forking
  • Disable problematic features (like auto-file-paste)
  • Provide better defaults for their use case
  • Support multiple configurations via env vars

And adding new configurable settings only requires TypeScript changes: no C++ recompilation needed.

Fixes #17940

@ServeurpersoCom (Collaborator, Author)

ServeurpersoCom commented Dec 14, 2025

Testing...

cd /var/www/ia/models
/root/llama.cpp.pascal/build/bin/llama-server \
 -ngl 999 -ctk q8_0 -ctv q8_0 -fa on --mlock -np 4 -kvu --port 8082 \
 --models-max 1 --models-preset backend.ini --webui-config-file frontend.json \
 > llama-server.log 2>&1 &

frontend.json

{
 "showSystemMessage": true,
 "showThoughtInProgress": true,
 "showToolCalls": true,
 "disableReasoningFormat": false,
 "keepStatsVisible": true,
 "showMessageStats": true,
 "askForTitleConfirmation": false,
 "pasteLongTextToFileLen": 0,
 "pdfAsImage": false,
 "disableAutoScroll": false,
 "renderUserContentAsMarkdown": false,
 "autoMicOnEmpty": false,
 "pyInterpreterEnabled": false,
 "enableContinueGeneration": true
}

EDIT: should not contain the sampling parameters

@ServeurpersoCom (Collaborator, Author)

It works.

@allozaur (Collaborator) left a comment

Looking good on my end

@ggerganov (Member) left a comment

Can merge after rebase and green CI. @ngxson up to you

@ngxson (Collaborator)

ngxson commented Dec 17, 2025

Also, you need to re-generate the server docs via llama-gen-docs, and copy-paste it to the ## Usage section of server/README.md

@ServeurpersoCom (Collaborator, Author)

I will apply the optimizations and rebase / test

ServeurpersoCom self-assigned this on Dec 17, 2025
@ngxson (Collaborator) left a comment

Can be merged once you update the docs 👍

Add CLI arguments --webui-config (inline JSON) and --webui-config-file
(file path) to configure WebUI default settings from server side.

Backend changes:
- Parse JSON once in server_context::load_model() for performance
- Cache parsed config in webui_settings member (zero overhead on /props)
- Add proper error handling in router mode with try/catch
- Expose webui_settings in /props endpoint for both router and child modes

Frontend changes:
- Add 14 configurable WebUI settings via parameter sync
- Add tests for webui settings extraction
- Fix subpath support with base path in API calls

Addresses feedback from @ngxson and @ggerganov
ServeurpersoCom force-pushed the webui/server-config-defaults branch from 2977afc to f93fee0 on December 17, 2025, 14:06
@ServeurpersoCom (Collaborator, Author)

Docs regenerated. Fun fact: llama-gen-docs initializes my GPU just to write markdown 😂

@woof-dog (Contributor)

woof-dog commented Dec 17, 2025

By the way, this works great! Thank you for implementing this and helping me finally get it through! @mashdragon you will be glad to see this!

If I find the time, I can write up an example for the webui readme (in a separate PR perhaps).

ngxson merged commit 6ce3d85 into ggml-org:master on Dec 17, 2025
65 of 72 checks passed
@jbone1313

@ServeurpersoCom this is not working for me. I verified that the props page has the correct settings and ensured that they actually change when set via the config file. But, the settings in the web UI do not match the props, nor does the web UI behave according to what is set in the props page.

b7472
Windows

@ServeurpersoCom (Collaborator, Author)

ServeurpersoCom commented Dec 18, 2025

@ServeurpersoCom this is not working for me. I verified that the props page has the correct settings and ensured that they actually change when set via the config file. But, the settings in the web UI do not match the props, nor does the web UI behave according to what is set in the props page.

b7472 Windows

Your previous WebUI settings can override the server defaults. Server defaults only apply when you first visit the WebUI or after resetting settings.

Otherwise, can you clear your browser's localStorage/IndexedDB and retry?
(Browser console, tested on Firefox)

localStorage.clear();
sessionStorage.clear();
indexedDB.deleteDatabase('LlamacppWebui');
setTimeout(() => location.reload(), 1000);

@ServeurpersoCom (Collaborator, Author)

There is a bug! "Reset to default" works, but defaults aren't applied on first visit after clearing browser storage. I'm working on a patch.

@jbone1313

Yeah, I just tried "reset to default" and the code you gave me for Firefox. Neither worked. After I cleared all browser data it worked.

While I am here:

  1. Is there a way to force the server defaults to override the previous Web UI settings without having to reset? It would be nice to know that what is set in the defaults will win every time.
  2. Is there a list of the actual names of the settable parameters? I already checked -h and the llama-server readme. The list you posted up-thread has many but not all of them. For example, the new "always show sidebar parameter" is not in that list.

@ServeurpersoCom (Collaborator, Author)

ServeurpersoCom commented Dec 18, 2025

1. Is there a way to force the server defaults to override the previous Web UI settings without having to reset? It would be nice to know that what is set in the defaults will win every time.
2. Is there a list of the actual names of the settable parameters? I already checked -h and the llama-server readme. The list you posted up-thread has many but not all of them. For example, the new "always show sidebar parameter" is not in that list.
  1. This is a useful and frequently requested feature in businesses where administrators need complete control (exactly like GPOs on workstations in an Active Directory domain). We could add a simple field to the JSON to instruct the WebUI to always apply its default settings (a new "prohibit override" feature).
    -> Could you create a feature request for this? It's not a priority, though; for now, clicking the "reset settings" button should work.

  2. Debugging default values for new webui users and adding the missing settings are the priorities (what this PR must do): Misc. bug: WebUI, first application of default settings (JSON) #18185

@jbone1313

jbone1313 commented Dec 19, 2025

RE 1:

I am happy to write up a feature request, but after thinking about this, it seems like this feature needs to be implemented consistently with how this issue will be handled: #18129 (comment) (Misc. bug: Web-Frontend overrides sampling parameters from models preset in router mode), and I imagine that one will be treated with high priority.

I am not a UI expert, but it seems like some elegant and consistent solution could apply to both cases.

Perhaps it could work thus: for a given parameter, the server (either server config defaults or router config.ini) - IF set - is the source of truth unless the parameter is manually set in Web UI. Perhaps the Web UI could show the server-set values as "greyed-out," and when they are manually overridden, they would be shown as "normal." This requires no extra JSON fields, and the UI itself would provide information about where the parameter values are being set.

RE "prohibiting overrides" for enterprise-like cases: perhaps that could be considered low priority and implemented later.

RE 2:

Apologies, but I do not see how that addresses my question 2 above regarding how to determine the actual list of settable parameters we can set in the JSON file.

@ServeurpersoCom (Collaborator, Author)

ServeurpersoCom commented Dec 19, 2025

@jbone1313 What you're describing is exactly the design pattern we need for #18129 (sampling params override bug): server values as the source of truth (greyed out in the UI), with user overrides, and placeholders showing the current backend config. Same architecture for both webui settings and sampling parameters. We all independently converged on this.

I've been running shared llama.cpp servers for friends for a long time and wanted this exact behavior. Now deploying more and more at work: proper server governance is critical.

Anico2 added a commit to Anico2/llama.cpp that referenced this pull request Jan 15, 2026
* server/webui: add server-side WebUI config support

Add CLI arguments --webui-config (inline JSON) and --webui-config-file
(file path) to configure WebUI default settings from server side.

Backend changes:
- Parse JSON once in server_context::load_model() for performance
- Cache parsed config in webui_settings member (zero overhead on /props)
- Add proper error handling in router mode with try/catch
- Expose webui_settings in /props endpoint for both router and child modes

Frontend changes:
- Add 14 configurable WebUI settings via parameter sync
- Add tests for webui settings extraction
- Fix subpath support with base path in API calls

Addresses feedback from @ngxson and @ggerganov

* server: address review feedback from ngxson

* server: regenerate README with llama-gen-docs
blime4 referenced this pull request in blime4/llama.cpp Feb 5, 2026


Successfully merging this pull request may close these issues.

Feature Request: server/webui: allow configuring server props (default webui settings) from CLI args

6 participants