morpheus.llm.services.openai_chat_service.OpenAIChatClient

class OpenAIChatClient(model_name, set_assistant=False, **model_kwargs)[source]

Bases: morpheus.llm.services.llm_service.LLMClient

Client for interacting with a specific OpenAI chat model. Instances of this class should be created via the OpenAIChatService.get_client method rather than constructed directly.

Parameters
model_name : str

The name of the model to interact with.

set_assistant : bool, optional, default=False

When True, a second input field named assistant will be used to provide additional context to the model.

model_kwargs : dict[str, typing.Any]

Additional keyword arguments to pass to the model when generating text.
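A minimal usage sketch of the documented interface. Because a real call requires a Morpheus install and OpenAI credentials, a stub class stands in for OpenAIChatClient here; the method names and signatures match this page, but the input key "prompt" and the stub's echo behavior are assumptions for illustration.

```python
# Hedged sketch: in a real pipeline the client would come from
#   from morpheus.llm.services.openai_chat_service import OpenAIChatService
#   client = OpenAIChatService().get_client(model_name="gpt-3.5-turbo", temperature=0.0)
# The stub below mirrors the documented methods so the call shapes can be shown offline.

class StubChatClient:
    """Stand-in with the same methods as OpenAIChatClient (input key 'prompt' is assumed)."""

    def __init__(self, model_name, set_assistant=False, **model_kwargs):
        self._model_name = model_name
        self._set_assistant = set_assistant
        self._model_kwargs = model_kwargs  # forwarded to the model on each request

    def get_input_names(self):
        # Per the set_assistant parameter docs, True adds a second "assistant" input field.
        return ["prompt", "assistant"] if self._set_assistant else ["prompt"]

    def generate(self, input_dict):
        # A real client would issue a chat-completion request; the stub just echoes.
        return f"[{self._model_name}] {input_dict['prompt']}"


client = StubChatClient("gpt-3.5-turbo", temperature=0.0)
print(client.get_input_names())           # ['prompt']
print(client.generate({"prompt": "hi"}))  # [gpt-3.5-turbo] hi
```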

Methods

generate(input_dict)
    Issue a request to generate a response based on a given prompt.
generate_async(input_dict)
    Issue an asynchronous request to generate a response based on a given prompt.
generate_batch(inputs)
    Issue a request to generate a list of responses based on a list of prompts.
generate_batch_async(inputs)
    Issue an asynchronous request to generate a list of responses based on a list of prompts.
get_input_names()
    Returns the names of the inputs to the model.
generate(input_dict)[source]

Issue a request to generate a response based on a given prompt.

Parameters
input_dict : dict

Input containing prompt data.

async generate_async(input_dict)[source]

Issue an asynchronous request to generate a response based on a given prompt.

Parameters
input_dict : dict

Input containing prompt data.
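A hedged sketch of the async call pattern. A stub coroutine stands in for generate_async (which has the same awaitable shape as documented above); the "prompt" key and the gathered-requests usage are illustrative assumptions, not part of this page.

```python
import asyncio

# Stub coroutine mirroring the awaitable shape of OpenAIChatClient.generate_async.
async def generate_async(input_dict):
    await asyncio.sleep(0)  # placeholder for the network round-trip
    return f"response to: {input_dict['prompt']}"

async def main():
    # The async variant lets several single-prompt requests overlap on one event loop.
    return await asyncio.gather(
        generate_async({"prompt": "first"}),
        generate_async({"prompt": "second"}),
    )

print(asyncio.run(main()))  # ['response to: first', 'response to: second']
```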

generate_batch(inputs)[source]

Issue a request to generate a list of responses based on a list of prompts.

Parameters
inputs : dict

Inputs containing prompt data.
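A hedged sketch of the batch call shape. The exact layout of inputs is not spelled out on this page; the assumption here is a mapping from input-field name to a parallel list of prompts, producing one response per prompt, with a stub function standing in for generate_batch.

```python
# Stub standing in for OpenAIChatClient.generate_batch. Assumed layout:
# `inputs` maps each input-field name to a list of prompts of equal length.
def generate_batch(inputs):
    prompts = inputs["prompt"]
    return [f"response to: {p}" for p in prompts]

batch = {"prompt": ["alpha", "beta", "gamma"]}
responses = generate_batch(batch)
print(responses)  # one response per prompt, in order
assert len(responses) == len(batch["prompt"])
```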

async generate_batch_async(inputs)[source]

Issue an asynchronous request to generate a list of responses based on a list of prompts.

Parameters
inputs : dict

Inputs containing prompt data.

get_input_names()[source]

Returns the names of the inputs to the model.

© Copyright 2023, NVIDIA. Last updated on Feb 2, 2024.