Parameters

See plugin common configurations for configuration options available to all plugins.

  • provider

    string


    required


valid values:

    openai, deepseek, or openai-compatible


    LLM service provider. When set to openai, the plugin will proxy the request to https://api.openai.com/chat/completions. When set to deepseek, the plugin will proxy the request to https://api.deepseek.com/chat/completions. When set to openai-compatible, the plugin will proxy the request to the custom endpoint configured in override.

  • auth

    object


    required


    Authentication configurations.

    • header

      object


Authentication headers. At least one of header or query should be configured.

    • query

      object


Authentication query parameters. At least one of header or query should be configured.
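
    For illustration, the auth object might carry the provider's API key in a request header. The Authorization header and Bearer scheme below are typical for OpenAI-style APIs; the exact header name and format depend on your provider, and the key value is a placeholder:

    ```json
    {
      "auth": {
        "header": {
          "Authorization": "Bearer <your-api-key>"
        }
      }
    }
    ```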

  • options

    object


    Model configurations.

    In addition to model, you can configure additional parameters and they will be forwarded to the upstream LLM service in the request body. For instance, if you are working with OpenAI, you can configure additional parameters such as temperature, top_p, and stream. See your LLM provider's API documentation for more available options.

    • model

      string


      Name of the LLM model, such as gpt-4 or gpt-3.5. See your LLM provider's API documentation for more available models.
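
      As a sketch, an options object for OpenAI might pin a model and forward sampling parameters in the request body (parameter names here follow the OpenAI chat completions API; consult your provider's documentation for the parameters it actually accepts):

      ```json
      {
        "options": {
          "model": "gpt-4",
          "temperature": 0.7,
          "top_p": 0.9,
          "stream": false
        }
      }
      ```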

  • override

    object


    Override setting.

    • endpoint

      string


      LLM provider endpoint. Required when provider is openai-compatible.

  • timeout

    integer


    default: 30000


valid value:

    greater than or equal to 1


    Request timeout in milliseconds when requesting the LLM service.

  • keepalive

    boolean


    default: true


If true, keep the connection alive when requesting the LLM service.

  • keepalive_timeout

    integer


    default: 60000


valid value:

    greater than or equal to 1000


Keepalive timeout in milliseconds when requesting the LLM service.

  • keepalive_pool

    integer


    default: 30


Keepalive pool size for connections with the LLM service.

  • ssl_verify

    boolean


    default: true


    If true, verify the LLM service's certificate.
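
Putting the parameters together, a configuration for a self-hosted OpenAI-compatible service might look like the following sketch. The endpoint URL, model name, and API key are placeholders, and the surrounding route and plugin wrapper depend on your gateway setup:

```json
{
  "provider": "openai-compatible",
  "auth": {
    "header": {
      "Authorization": "Bearer <your-api-key>"
    }
  },
  "options": {
    "model": "<your-model-name>"
  },
  "override": {
    "endpoint": "https://llm.internal.example.com/v1/chat/completions"
  },
  "timeout": 30000,
  "keepalive": true,
  "keepalive_timeout": 60000,
  "keepalive_pool": 30,
  "ssl_verify": true
}
```

Note that override.endpoint is required here because provider is set to openai-compatible; for openai or deepseek, the plugin proxies to the provider's fixed endpoint and override can be omitted.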
