
CLI Reference

config llm server

Configure LLM Proxy servers.

config llm server
    Description: Configure LLM Proxy servers.
    edit <name>
        set display-name {string}
        set type [built-in|customized]
        set built-in-server [openai|azure|...]
        set azure-resource-name {string}
        set end-point {var-string}
        set chat-completions-api [none|openai|...]
        set image-gen-api [none|openai|...]
        set anthropic-version {string}
        set azure-api-version {string}
        set models {string}
        set accept-custom-model [enable|disable]
        set custom-model-allow-regex {string}
        set custom-model-block-regex {string}
        set verify-cert [enable|disable]
        set api-key {string}
    next
end
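
For illustration, a hypothetical configuration with one built-in server and one customized server. The entry names, endpoint URL, model names, and API keys below are placeholders, not values from this reference:

config llm server
    edit "openai-proxy"
        set type built-in
        set built-in-server openai
        set api-key "sk-placeholder"
    next
    edit "internal-gateway"
        set type customized
        set end-point "https://llm.example.com/v1"
        set chat-completions-api openai
        set models "model-a model-b"
        set accept-custom-model enable
        set custom-model-allow-regex "^model-"
        set verify-cert enable
        set api-key "placeholder-key"
    next
end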

config llm server

name
    LLM Proxy server name.
    Type: string. Maximum length: 35.

display-name
    Display name of the LLM server.
    Type: string. Maximum length: 79.

type
    LLM server type.
    Type: option. Default: built-in.
    Options:
        built-in      Built-in LLM server.
        customized    User-customized LLM server.

built-in-server
    Built-in LLM server.
    Type: option. Default: openai.
    Options:
        openai                       OpenAI LLM server.
        azure                        Azure LLM server (for non-OpenAI models).
        azure-openai                 Azure LLM server (for OpenAI models).
        gemini                       Gemini LLM server.
        anthropic                    Anthropic LLM server.
        grok                         Grok LLM server.
        gemini-with-openai-api       Gemini LLM server with the OpenAI API.
        anthropic-with-openai-api    Anthropic LLM server with the OpenAI API.

azure-resource-name
    Azure resource name.
    Type: string. Maximum length: 63.

end-point
    Overwrite the vendor's default endpoint.
    Type: var-string. Maximum length: 255.

chat-completions-api
    Chat Completions API of this server.
    Type: option. Default: openai.
    Options:
        none            No Chat Completions API.
        openai          Follow the OpenAI Chat Completions API.
        azure-openai    Follow the Azure OpenAI Chat Completions API.
        gemini          Follow the Google Gemini chat API.
        anthropic       Follow the Anthropic chat API.

image-gen-api
    Image generation API of this server.
    Type: option. Default: openai.
    Options:
        none            No image generation API.
        openai          Follow the OpenAI image generation API.
        azure-openai    Follow the Azure OpenAI image generation API.

anthropic-version
    Anthropic version sent in the API.
    Type: string. Maximum length: 63. Default: 2023-06-01.

azure-api-version
    Azure API version.
    Type: string. Maximum length: 63. Default: 2024-02-01.

models
    Models of the LLM server.
    Type: string. Maximum length: 79.

accept-custom-model
    Accept custom models.
    Type: option. Default: disable.
    Options:
        enable     Allow custom models other than those listed in models.
        disable    Disallow custom models other than those listed in models.

custom-model-allow-regex
    Allow regex for custom models; all custom models are allowed if empty.
    Type: string. Maximum length: 255.

custom-model-block-regex
    Block regex for custom models; no custom models are blocked if empty.
    Type: string. Maximum length: 255.

verify-cert
    Enable/disable certificate verification.
    Type: option. Default: enable.
    Options:
        enable     Enable certificate verification.
        disable    Disable certificate verification.

api-key
    API key of the LLM server.
    Type: string. Maximum length: 255.
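
The custom-model regex behavior described above (allow all when the allow regex is empty, block none when the block regex is empty) can be sketched as follows. This is one plausible interpretation for illustration only; in particular, the assumption that a block-regex match takes precedence over an allow-regex match is not stated in this reference:

```python
import re

def custom_model_permitted(model: str, allow_regex: str = "", block_regex: str = "") -> bool:
    """Sketch of custom-model-allow-regex / custom-model-block-regex semantics.

    Assumptions (not confirmed by the reference):
      - patterns are matched anywhere in the model name (re.search);
      - a block-regex match wins over an allow-regex match.
    """
    # An empty block regex blocks nothing.
    if block_regex and re.search(block_regex, model):
        return False
    # An empty allow regex allows everything.
    if allow_regex:
        return re.search(allow_regex, model) is not None
    return True
```

Under this reading, setting only `custom-model-allow-regex "^model-"` would permit `model-x` but reject `other-x`, while adding `custom-model-block-regex "preview"` would additionally reject `model-x-preview`.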
