morpheus.llm.services.openai_chat_service.OpenAIChatService
- class OpenAIChatService(*, api_key=None, org_id=None, base_url=None, default_model_kwargs=None)[source]
Bases:
morpheus.llm.services.llm_service.LLMService
A service for interacting with OpenAI Chat models. This class should be used to create clients.
- Parameters
  - api_key
    The API key for the LLM service, by default None. If None, the API key will be read from the OPENAI_API_KEY environment variable. If neither is present, an error will be raised.
  - org_id
    The organization ID for the LLM service, by default None. If None, the organization ID will be read from the OPENAI_ORG_ID environment variable. This value is only required if the account associated with the api_key is a member of multiple organizations.
  - base_url
    The API host URL, by default None. If None, the URL will be read from the OPENAI_BASE_URL environment variable. If neither is present, the OpenAI default will be used.
  - default_model_kwargs
    Default arguments to use when creating a client via the get_client function. Any argument specified here will automatically be used when calling get_client. Arguments specified in the get_client call will overwrite the default values specified here. This is useful for setting model arguments before creating multiple clients. By default None.
Methods

- create(service_type, *service_args, **service_kwargs)
  Returns a service for interacting with LLM models.
- get_client(*, model_name[, set_assistant, ...])
  Returns a client for interacting with a specific model.

APIKey BaseURL OrgId

- class APIKey(value=None, use_env=True)[source]
Bases:
morpheus.utils.env_config_value.EnvConfigValue
- Attributes
- source
- use_env
- value
- class BaseURL(value=None, use_env=True)[source]
Bases:
morpheus.utils.env_config_value.EnvConfigValue
- Attributes
- source
- use_env
- value
- class OrgId(value=None, use_env=True)[source]
Bases:
morpheus.utils.env_config_value.EnvConfigValue
- Attributes
- source
- use_env
- value
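APIKey, BaseURL, and OrgId all derive from EnvConfigValue, which resolves a value from an explicit argument first and an environment variable second. A minimal sketch of that fallback order, assuming the behavior described in the constructor parameters above (resolve_config_value is a hypothetical stand-in, not the actual Morpheus implementation):

```python
import os

def resolve_config_value(value, env_var, use_env=True):
    # Hypothetical sketch: an explicit value wins; otherwise the named
    # environment variable is consulted (when use_env is True); if
    # neither is present, an error is raised.
    if value is not None:
        return value
    if use_env:
        from_env = os.environ.get(env_var)
        if from_env is not None:
            return from_env
    raise ValueError(f"no value given and {env_var} is not set")

os.environ["OPENAI_ORG_ID"] = "org-123"
org_id = resolve_config_value(None, "OPENAI_ORG_ID")  # falls back to the env var
```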
- static create(service_type, *service_args, **service_kwargs)[source]
Returns a service for interacting with LLM models.
- Parameters
  - service_type
    The type of the service to create.
  - service_args
    Additional arguments to pass to the service.
  - service_kwargs
    Additional keyword arguments to pass to the service.
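create is a factory that dispatches on service_type and forwards the remaining arguments to the chosen service. A generic sketch of that pattern follows; the registry and the "echo" type string here are hypothetical and do not reflect Morpheus's actual service mapping:

```python
# Hypothetical factory sketch: map a service-type key to a constructor
# and forward the remaining positional and keyword arguments to it.
_REGISTRY = {
    "echo": lambda *args, **kwargs: ("echo-service", args, kwargs),
}

def create(service_type, *service_args, **service_kwargs):
    try:
        factory = _REGISTRY[service_type]
    except KeyError:
        raise ValueError(f"unknown service type: {service_type!r}")
    return factory(*service_args, **service_kwargs)

service = create("echo", "a", key="v")
```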
- get_client(*, model_name, set_assistant=False, max_retries=10, json=False, **model_kwargs)[source]
Returns a client for interacting with a specific model. This method is the preferred way to create a client.
- Parameters
  - model_name
    The name of the model to create a client for.
  - set_assistant: bool, optional
    When True, a second input field named assistant will be used to provide additional context to the model, by default False.
  - max_retries: int, optional
    The maximum number of retries to attempt when making a request to the OpenAI API, by default 10.
  - json: bool, optional
    When True, the response will be returned as a JSON object, by default False.
  - model_kwargs
    Additional keyword arguments to pass to the model when generating text. Arguments specified here will overwrite the default_model_kwargs set in the service constructor.
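The max_retries parameter bounds how many times a failing request is attempted before the error propagates. A generic sketch of that pattern (illustrative only; the client's actual retry logic, backoff, and exception types are not shown here):

```python
def call_with_retries(fn, max_retries=10):
    # Illustrative retry loop: attempt fn up to max_retries times,
    # re-raising the last error if every attempt fails.
    last_exc = None
    for _ in range(max_retries):
        try:
            return fn()
        except Exception as exc:  # a real client would catch narrower errors
            last_exc = exc
    raise last_exc

attempts = {"n": 0}

def flaky_request():
    # Simulated request that fails twice before succeeding.
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise ConnectionError("transient failure")
    return "ok"

result = call_with_retries(flaky_request, max_retries=10)
```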