LLM service vendor for Agentica Chat.

IAgenticaVendor is a type that represents an LLM (Large Language Model) vendor for Agentica.

Currently, Agentica is built on the OpenAI SDK. However, this does not mean that you can use only OpenAI's GPT models with Agentica. The OpenAI SDK is merely a connection tool to the LLM vendor's API, and you can use other LLM vendors by configuring the baseURL and API key.

Therefore, if you want to use another LLM vendor such as Claude or Gemini, configure the baseURL to point to that vendor's API, and set the IAgenticaController's schema model to "claude" or "gemini".
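For example, a vendor configuration that points the OpenAI SDK at an OpenAI-compatible third-party endpoint might look like the following sketch. The baseURL, environment variable name, and model string below are illustrative assumptions, not values prescribed by Agentica:

```typescript
import OpenAI from "openai";

// Hypothetical configuration: the baseURL and model name are placeholders
// for an OpenAI-compatible hosting endpoint (e.g. OpenRouter), not
// official Agentica defaults.
const vendor = {
  api: new OpenAI({
    apiKey: process.env.OPENROUTER_API_KEY!,     // assumed env variable
    baseURL: "https://openrouter.ai/api/v1",     // assumed compatible endpoint
  }),
  model: "anthropic/claude-3.5-sonnet",          // arbitrary model string
};
```

Because the endpoint speaks the OpenAI wire protocol, the same `OpenAI` client class works unchanged; only the connection parameters differ.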

Author: Samchon

interface IAgenticaVendor {
    api: OpenAI;
    model: ChatModel | ({} & string);
    options?: RequestOptions;
    semaphore?: number | Semaphore<number>;
}

Properties

api: OpenAI

OpenAI API instance.

model: ChatModel | ({} & string)

Chat model to be used.

The ({} & string) part allows arbitrary model names, so that models served by third-party hosting clouds (e.g. OpenRouter, AWS) can be used as well.
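Writing ({} & string) instead of a plain string is a common TypeScript idiom: the union keeps IDE autocompletion for the known ChatModel literals while still accepting any other string. A minimal sketch, with ChatModel simplified here for illustration:

```typescript
// Simplified stand-in for the OpenAI SDK's ChatModel literal union.
type ChatModel = "gpt-4o" | "gpt-4o-mini";

// A plain `ChatModel | string` would collapse to `string` and lose
// autocompletion of the known literals; `({} & string)` keeps them
// suggestible while still permitting any string.
type ModelName = ChatModel | ({} & string);

const known: ModelName = "gpt-4o";                           // suggested by the IDE
const thirdParty: ModelName = "anthropic/claude-3.5-sonnet"; // still allowed
```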

options?: RequestOptions

Options for the request.

semaphore?: number | Semaphore<number>

Number of concurrent requests allowed.

If you configure this property, Agentica will constrain the number of concurrent requests to the LLM vendor. If you want to share the semaphore instance with other agents, you can directly assign the Semaphore instance to this property.

Otherwise, Agentica will send requests asynchronously without any concurrency limit.