morpheus_llm.llm.services.openai_chat_service.OpenAIChatService
- class OpenAIChatService(*, api_key=None, org_id=None, base_url=None, default_model_kwargs=None)[source]
Bases:
morpheus_llm.llm.services.llm_service.LLMService
A service for interacting with OpenAI Chat models. This class should be used to create clients.
- Parameters
  - api_key : str, optional
    The API key for the LLM service, by default None. If None, the API key will be read from the OPENAI_API_KEY environment variable. If neither is present, an error will be raised.
  - org_id : str, optional
    The organization ID for the LLM service, by default None. If None, the organization ID will be read from the OPENAI_ORG_ID environment variable. This value is only required if the account associated with the api_key is a member of multiple organizations.
  - base_url : str, optional
    The API host URL, by default None. If None, the URL will be read from the OPENAI_BASE_URL environment variable. If neither is present, the OpenAI default will be used.
  - default_model_kwargs : dict, optional
    Default arguments to use when creating a client via the get_client function, by default None. Any argument specified here will automatically be used when calling get_client. Arguments specified in the get_client function will overwrite default values specified here. This is useful for setting model arguments before creating multiple clients.
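The precedence rule above (per-call arguments beat constructor defaults) can be sketched as a plain dictionary merge. This is a hypothetical helper, not the Morpheus implementation; it only illustrates the documented behavior.

```python
# Sketch of the documented precedence: default_model_kwargs from the service
# constructor are applied first, then any argument passed to get_client()
# overrides them. Hypothetical helper for illustration only.
def merge_model_kwargs(default_model_kwargs, **model_kwargs):
    merged = dict(default_model_kwargs or {})
    merged.update(model_kwargs)  # get_client() arguments win on conflict
    return merged

defaults = {"temperature": 0.0, "max_tokens": 256}
merged = merge_model_kwargs(defaults, temperature=0.7)
# merged == {"temperature": 0.7, "max_tokens": 256}
```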
Methods

- create(service_type, *service_args, **service_kwargs)
  Returns a service for interacting with LLM models.
- get_client(*, model_name[, set_assistant, ...])
  Returns a client for interacting with a specific model.

Attributes

- APIKey
- BaseURL
- OrgId

- class APIKey(value=None, use_env=True)[source]
Bases:
morpheus.utils.env_config_value.EnvConfigValue
- Attributes
- source
- use_env
- value
- class BaseURL(value=None, use_env=True)[source]
Bases:
morpheus.utils.env_config_value.EnvConfigValue
- Attributes
- source
- use_env
- value
- class OrgId(value=None, use_env=True)[source]
Bases:
morpheus.utils.env_config_value.EnvConfigValue
- Attributes
- source
- use_env
- value
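The APIKey, BaseURL, and OrgId classes all derive from EnvConfigValue, which backs a config value with an environment-variable fallback. A minimal sketch of that pattern, assuming (not quoting) the Morpheus behavior: an explicitly passed value wins; when the value is None and use_env is True, it is read from the environment.

```python
import os

# Minimal sketch of the EnvConfigValue pattern (assumed behavior, not the
# Morpheus source). An explicit value wins; otherwise, with use_env=True,
# the value falls back to an environment variable such as OPENAI_API_KEY.
class EnvBackedValue:
    _ENV_KEY = "OPENAI_API_KEY"  # e.g. what the APIKey class would use

    def __init__(self, value=None, use_env=True):
        self._source = "explicit"
        if value is None and use_env:
            value = os.environ.get(self._ENV_KEY)
            self._source = "environment"
        self._value = value
        self._use_env = use_env

    @property
    def value(self):
        return self._value

    @property
    def source(self):
        return self._source

    @property
    def use_env(self):
        return self._use_env
```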
- static create(service_type, *service_args, **service_kwargs)[source]
Returns a service for interacting with LLM models.
- Parameters
- service_type : str
  The type of the service to create.
- service_args : list
  Additional arguments to pass to the service.
- service_kwargs : dict[str, typing.Any]
  Additional keyword arguments to pass to the service.
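As a rough sketch of how a create()-style factory dispatches on service_type, here is a hypothetical registry-based version. The class, registry, and error handling are illustrative assumptions; the real Morpheus dispatch may differ.

```python
# Hypothetical sketch of a create()-style factory: the service_type string
# selects a service class, and the remaining positional and keyword arguments
# are forwarded to that class's constructor. Names here are stand-ins.
class _FakeOpenAIChatService:
    def __init__(self, *, api_key=None):
        self.api_key = api_key

_REGISTRY = {"openai": _FakeOpenAIChatService}  # assumed mapping

def create(service_type, *service_args, **service_kwargs):
    try:
        cls = _REGISTRY[service_type.lower()]
    except KeyError:
        raise ValueError(f"Unknown service type: {service_type!r}")
    return cls(*service_args, **service_kwargs)

svc = create("openai", api_key="sk-test")
# svc.api_key == "sk-test"
```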
- get_client(*, model_name, set_assistant=False, max_retries=10, json=False, **model_kwargs)[source]
Returns a client for interacting with a specific model. This method is the preferred way to create a client.
- Parameters
  - model_name : str
    The name of the model to create a client for.
  - set_assistant : bool, optional
    When True, a second input field named assistant will be used to provide additional context to the model, by default False.
  - max_retries : int, optional
    The maximum number of retries to attempt when making a request to the OpenAI API, by default 10.
  - json : bool, optional
    When True, the response will be returned as a JSON object, by default False.
  - model_kwargs : dict[str, typing.Any]
    Additional keyword arguments to pass to the model when generating text. Arguments specified here will overwrite the default_model_kwargs set in the service constructor.
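For intuition about what a max_retries budget like the one above guards against, here is a generic retry loop, illustrative only: the real client retries transient API errors internally, while this sketch retries any exception for simplicity.

```python
import time

# Illustrative sketch of a retry budget like get_client(..., max_retries=10).
# Each failed call is retried with exponential backoff until the budget is
# exhausted, at which point the last exception is re-raised.
def call_with_retries(fn, max_retries=10, base_delay=0.0):
    last_exc = None
    for attempt in range(max_retries):
        try:
            return fn()
        except Exception as exc:  # real clients retry only transient errors
            last_exc = exc
            time.sleep(base_delay * (2 ** attempt))  # exponential backoff
    raise last_exc

attempts = {"n": 0}

def flaky():
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise RuntimeError("transient failure")
    return "ok"

result = call_with_retries(flaky, max_retries=5)
# result == "ok" after three attempts
```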