AI-Service

Models Configuration

Starting with AI-Service 3.x, the service uses a single config for defining the configured models (e.g. GPT-4o, GPT-3.5) and services (e.g. OpenAI, LocalAI). The services and models parameters from 2.6.x are deprecated and should no longer be used.

The modelsConfig parameter defines the configuration for the AI models in use. It has parameters for the default model and a list of all supported models the AI service can use. Clients send the desired model with each request to the service, and the service maps the requested model to the configured models. If a client requests a model that is not configured, the default model is used.

Info

This setting can be used to provide different pricing tiers to end users. A basic user plan could be based on a cheaper AI model like GPT-4o mini, while a premium plan is based on the larger GPT-4o.

Define Models

The models parameter defines an object of predefined AI service (openai, localai, etc.) and model (gpt-4, etc.) combinations that are supported by the deployment.

  • key The key serves as an ID. We recommend not using a specific model name, because model names usually change/evolve over time.
  • value Object with the following properties:
    • displayName: The display name of the AI model (per language/locale; 'en' works as fallback)
    • service: The name of the AI service. Allowed values are: openai, claude-aws and localai
    • consentId: User consent must be given for each consentId. Defaults to "default".
    • model: The technical name of the LLM you want to use, e.g. gpt-4o
    • maxTokens (optional): The maximum number of tokens you want to allow (must be lower than the model's actual token limit)

Next to models, you need to provide selectableModels, which serves two purposes:

  1. The first model in selectableModels defines the default model.
  2. If multiple models are defined and selectableModels contains more than one model, the end user can choose a model from that list (also see below).

Minimal example for a default modelsConfig:

modelsConfig:
  models:
    default:
      displayName:
        en: "ChatGPT 4o"
      service: "openai"
      consentId: "default"
      model: "gpt-4o"
  selectableModels:
    - default

Full-featured modelsConfig for advanced usage:

modelsConfig:
  models:
    standard:
      displayName:
        en: "ChatGPT 4o"
      service: "openai"
      consentId: "openai"
      model: "gpt-4o"
    alternative:
      displayName:
        en: "ChatGPT 4-mini"
      service: "openai"
      consentId: "openai"
      model: "gpt-4-mini"
    local:
      displayName:
        en: "LLaMA 2"
      service: "localai"
      consentId: "local"
      model: "Llama-2-7b"
  selectableModels:
    - standard
    - local

Using LocalAI with a self-hosted LLM

The AI Service supports LocalAI, which allows you to use a self-hosted LLM. LocalAI acts as a drop-in replacement that is compatible with the OpenAI API.

Set service to localai in your models config:

modelsConfig:
  models:
    local:
      displayName:
        en: "LLaMA 2"
      service: "localai"
      consentId: "local"
      model: "Llama-2-7b"
  selectableModels:
    - local

Additionally, set the localaiBaseUrl to point to your LocalAI endpoint. If no localaiBaseUrl is set but an openaiBaseUrl is configured, the latter will be used as a fallback.
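For example (a sketch only; the URL below is a placeholder for your own LocalAI deployment, and the exact position of localaiBaseUrl in the Helm values may differ, see the Helm Chart Readme):

localaiBaseUrl: "http://localai.ai.svc.cluster.local:8080"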

Consent dialog

Important

If you plan to offer AI services, you should review your data protection and legal requirements. The default consent text provided in the product may not fully meet your specific needs, so it is recommended to create a custom consent text that complies with applicable regulations and clearly explains the use of AI models.

You need to provide a configuration for the consent dialog for each consentId used in models. We recommend using one consentId per service, for example one for OpenAI and one for your LocalAI. The assumption is that, from a legal perspective, end users need to give consent per service (not per model/LLM).

The mandatory fields of a consent dialog configuration are:

  • name
  • location
  • aboutURL (URL)
  • dataUsagePolicyURL (URL)

Optional fields are:

  • button Text of the 'Save' button in the dialog
  • checkbox Text of the checkbox
  • teaser An optional text that is shown before the main text (text)
  • text The main text of the consent dialog
  • title The dialog title
  • trailer An optional text that is shown after the main text (text)
  • expires An optional ISO-8601 date string defining the consent expiry

All fields can be customized for different languages (en or de) or locales (en_US or de_DE). It is mandatory to provide at least one string for English (en) as the default.

modelsConfig:
  consentDialog:
    # keys on this level are consentIds (consentId defaults to the service name)
    openai:
      name:
        en: 'OpenAI'
      location:
        en: 'USA'
      aboutURL:
        en: 'https://openai.com/about'
      dataUsagePolicyURL:
        en: 'https://openai.com/policies/api-data-usage-policies'
      expires: '2026-06-01T12:00:00Z'

If you customize text, you might want to use a few placeholders. The default text is:

By using any of the AI-based features, you agree to submit the subject data to an external artificial intelligence-based text transformer operated by %1$s located in %2$s. This process will only occur upon your proactive engagement to ensure that no data is automatically transmitted. It will not be used to train the underlying AI model, but is subject to %3$s's Data Use Policy. Please note that you may be restricted by law or contract from providing confidential information (including personal information) to any third party, including an AI and its operator. You may return to this consent dialog at any time to review and change your consent status under Settings > AI.

  • %1$s: Placeholder for a link to the service, using name and aboutURL (see consentDialog/$consentId/aboutURL/*)
  • %2$s: Placeholder for location (see consentDialog/$consentId/location/*)
  • %3$s: Placeholder for name (see consentDialog/$consentId/name/*)
  • %4$s: Placeholder for a link to dataUsagePolicyURL (see consentDialog/$consentId/dataUsagePolicyURL/*)

Links need to be defined in Markdown syntax, i.e. [Text](Link). Plain HTML is not allowed and will be escaped.
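
For example, a custom text combining the placeholders with a Markdown link could look like this (the wording and the FAQ URL are illustrative only; required fields such as name, location, aboutURL and dataUsagePolicyURL are omitted here for brevity):

modelsConfig:
  consentDialog:
    openai:
      title:
        en: 'AI features'
      text:
        en: 'Your request is processed by %1$s, located in %2$s, in accordance with %4$s. See our [FAQ](https://example.com/ai-faq) for details.'
        de: 'Ihre Anfrage wird von %1$s mit Sitz in %2$s gemäß %4$s verarbeitet. Details finden Sie in unseren [FAQ](https://example.com/ai-faq).'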

Consent expiry date

There may be situations where the consent text must be updated, for example due to changes in legal requirements or terms and conditions. To ensure that all users, including those who have already provided consent, are prompted to agree to the updated version, you can define an expires date in UTC for each consent entry using the ISO-8601 format. Any user who gave consent before this date will be shown the consent dialog again.

modelsConfig:
  consentDialog:
    # keys on this level are consentIds (consentId defaults to the service name)
    openai:
      expires: '2026-06-01T12:00:00Z' # Use UTC by attaching the Z
      name:
        en: 'OpenAI'
      location:
        en: 'USA'
      aboutURL:
        en: 'https://openai.com/about'
      dataUsagePolicyURL:
        en: 'https://openai.com/policies/api-data-usage-policies'

Info

This mechanism does not check whether the consent configuration has actually changed. If you don't modify the configuration, users will still be prompted to consent again after the expiry date.

Model selection dialog

This configuration is only needed if multiple models are defined and listed in selectableModels and you want to customize the strings shown in the settings section (Settings > AI > Model).

The section contains a description and a radio button per model, which can be customized via options:

modelSelectionDialog:
  description:
    en: ''
  options:
    # keys on this level must be valid modelIds
    default:
      en: ''
    private:
      en: ''
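
For example, using the model keys from the full-featured config above (the wording is illustrative only):

modelSelectionDialog:
  description:
    en: 'Choose which AI model handles your requests.'
  options:
    standard:
      en: 'ChatGPT 4o, hosted by OpenAI'
    local:
      en: 'LLaMA 2, self-hosted'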

Please make sure that the UI setting io.ox/core//ai/model is not protected (e.g. by setting a default value), otherwise the model selection will not show up.

Special "private" mode for the model selection dialog

If, and only if, two models are defined in selectableModels and the second one uses private as its consentId, the dialog works with different defaults:

  • $displayName (Up to date knowledge, fast, hosted in US)
  • $displayName (Best privacy and data protection, hosted in EU)

Here, $displayName refers to the displayName of the corresponding model.
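
A sketch of such a setup, based on the examples above (model names are illustrative; a consentDialog entry for the private consentId is still required, see above):

modelsConfig:
  models:
    standard:
      displayName:
        en: "ChatGPT 4o"
      service: "openai"
      consentId: "openai"
      model: "gpt-4o"
    private:
      displayName:
        en: "LLaMA 2"
      service: "localai"
      consentId: "private"
      model: "Llama-2-7b"
  selectableModels:
    - standard
    - private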
