config llm profile
Configure LLM Proxy profiles.
```
config llm profile
    Description: Configure LLM Proxy profiles.
    edit <name>
        set replace-api-key [enable|disable]
        set unknown-api [enable|disable]
        set log [none|blocked|...]
        config chat
            Description: LLM Proxy chat completions API (/v1/chat/completions).
            set status [enable|disable]
            set max-completion-tokens {integer}
            set stream [bypass|block]
            set max-req-len {integer}
            set system-prompt-mode [bypass|replace|...]
            set system-prompt {var-string}
        end
        config response
            Description: LLM Proxy responses API (/v1/responses).
            set status [enable|disable]
            set max-output-tokens {integer}
            set stream [bypass|block]
            set max-req-len {integer}
            set instructions-mode [bypass|replace|...]
            set instructions {var-string}
        end
        config image-gen
            Description: LLM Proxy image generation API (/v1/images/generations).
            set status [enable|disable]
        end
        config list-models
            Description: LLM Proxy list models API (/v1/models).
            set status [enable|disable]
        end
    next
end
```
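As a minimal sketch, a profile can be created and its top-level options set as follows (the profile name `llm-profile-1` is hypothetical; all option values are taken from the syntax above):

```
config llm profile
    edit "llm-profile-1"
        # Substitute the client's API key with the one stored in the llm.server config.
        set replace-api-key enable
        # Reject requests to APIs the proxy does not recognize.
        set unknown-api disable
        # Log blocked requests (the default).
        set log blocked
    next
end
```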
config llm profile

| Parameter | Description | Type | Size | Default |
|---|---|---|---|---|
| name | LLM Proxy profile name. | string | Maximum length: 35 | |
| replace-api-key | Replace the API key in the request with the API key from the llm.server configuration. | option | - | enable |
| unknown-api | Support unknown APIs. | option | - | disable |
| log | Log option. | option | - | blocked |
config chat

| Parameter | Description | Type | Size | Default |
|---|---|---|---|---|
| status | Support the chat completions API. | option | - | enable |
| max-completion-tokens | Maximum number of completion tokens (0 - 10000000, default = 0, which means no limit). | integer | Minimum value: 0. Maximum value: 10000000. | 0 |
| stream | Support chat completions stream mode. | option | - | bypass |
| max-req-len | Maximum size of the chat completions request body in KiB (0 - 65535, default = 1024; 0 means no limit). | integer | Minimum value: 0. Maximum value: 65535. | 1024 |
| system-prompt-mode | System prompt processing mode. | option | - | bypass |
| system-prompt | Replace/append the chat completions system prompt. | var-string | Maximum length: 255 | |
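For example, the chat completions sub-settings might be tuned per profile as below (the profile name and the prompt text are illustrative; the parameter values fall within the documented ranges):

```
config llm profile
    edit "llm-profile-1"
        config chat
            set status enable
            # Cap completions at 4096 tokens instead of the unlimited default of 0.
            set max-completion-tokens 4096
            # Pass streamed responses through rather than blocking them.
            set stream bypass
            # Override the client-supplied system prompt with a fixed one.
            set system-prompt-mode replace
            set system-prompt "You are a helpful assistant."
        end
    next
end
```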
config response

| Parameter | Description | Type | Size | Default |
|---|---|---|---|---|
| status | Support the responses API. | option | - | enable |
| max-output-tokens | Maximum number of output tokens (0 - 10000000, default = 0, which means no limit). | integer | Minimum value: 0. Maximum value: 10000000. | 0 |
| stream | Support responses stream mode. | option | - | bypass |
| max-req-len | Maximum size of the responses request body in KiB (0 - 65535, default = 1024; 0 means no limit). | integer | Minimum value: 0. Maximum value: 65535. | 1024 |
| instructions-mode | Instructions processing mode. | option | - | bypass |
| instructions | Replace/append the responses instructions. | var-string | Maximum length: 255 | |
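The responses API accepts an analogous sketch; again, the profile name and instruction text are hypothetical and the numeric values sit within the documented ranges:

```
config llm profile
    edit "llm-profile-1"
        config response
            set status enable
            # Limit output to 4096 tokens (default 0 means no limit).
            set max-output-tokens 4096
            # Allow request bodies up to 2048 KiB (default is 1024).
            set max-req-len 2048
            # Override client-supplied instructions with a fixed string.
            set instructions-mode replace
            set instructions "Answer concisely."
        end
    next
end
```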
config image-gen

| Parameter | Description | Type | Size | Default |
|---|---|---|---|---|
| status | Support image generation API. | option | - | enable |
config list-models

| Parameter | Description | Type | Size | Default |
|---|---|---|---|---|
| status | Support list models API. | option | - | enable |
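Since both of these sub-tables expose only a `status` toggle, restricting a profile to text APIs is a matter of disabling them (the profile name is hypothetical):

```
config llm profile
    edit "llm-profile-1"
        # Block the image generation API (/v1/images/generations).
        config image-gen
            set status disable
        end
        # Block the list models API (/v1/models).
        config list-models
            set status disable
        end
    next
end
```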