---
layout: overview
slug: nemo-curator/nemo_curator/models/client/openai_client
title: nemo_curator.models.client.openai_client
---
## Module Contents
### Classes
| Name | Description |
| ---------------------------------------------------------------------------------- | ----------------------------------------------------------------- |
| [`AsyncOpenAIClient`](#nemo_curator-models-client-openai_client-AsyncOpenAIClient) | A wrapper around OpenAI's Python async client for querying models |
| [`OpenAIClient`](#nemo_curator-models-client-openai_client-OpenAIClient) | A wrapper around OpenAI's Python client for querying models |
### API
```python
class nemo_curator.models.client.openai_client.AsyncOpenAIClient(
max_concurrent_requests: int = 5,
max_retries: int = 3,
base_delay: float = 1.0,
kwargs = {}
)
```
**Bases:** [AsyncLLMClient](/nemo-curator/nemo_curator/models/client/llm_client#nemo_curator-models-client-llm_client-AsyncLLMClient)
A wrapper around OpenAI's Python async client for querying models.
```python
nemo_curator.models.client.openai_client.AsyncOpenAIClient._query_model_impl(
messages: collections.abc.Iterable,
model: str,
conversation_formatter: nemo_curator.models.client.llm_client.ConversationFormatter | None = None,
generation_config: nemo_curator.models.client.llm_client.GenerationConfig | dict | None = None
) -> list[str]
```
*async*
Internal implementation of `query_model` without retry/concurrency logic.
```python
nemo_curator.models.client.openai_client.AsyncOpenAIClient.setup() -> None
```
Set up the client.
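A minimal usage sketch for the async client. It assumes the public `query_model` coroutine inherited from `AsyncLLMClient` (only the internal `_query_model_impl` is listed above), that messages follow the OpenAI chat-message dict format, and that the model name is a stand-in; the import is guarded so the sketch is illustrative even where `nemo_curator` is not installed.

```python
import asyncio

# OpenAI-style chat messages; the content here is a hypothetical prompt.
messages = [
    {"role": "user", "content": "List three data-curation steps."},
]

try:
    from nemo_curator.models.client.openai_client import AsyncOpenAIClient
except ImportError:  # sketch only; nemo_curator may not be installed
    AsyncOpenAIClient = None

async def main() -> list[str]:
    # max_concurrent_requests caps in-flight calls; failed requests are
    # retried up to max_retries times with delays starting at base_delay
    # seconds (matching the constructor defaults documented above).
    client = AsyncOpenAIClient(
        max_concurrent_requests=5,
        max_retries=3,
        base_delay=1.0,
    )
    client.setup()
    # Assumed public entry point inherited from AsyncLLMClient.
    return await client.query_model(messages=messages, model="gpt-4o-mini")

if AsyncOpenAIClient is not None:
    replies = asyncio.run(main())
```

The concurrency cap matters when fanning out many prompts at once: the client, not the caller, throttles requests to the API.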
```python
class nemo_curator.models.client.openai_client.OpenAIClient(
kwargs = {}
)
```
**Bases:** [LLMClient](/nemo-curator/nemo_curator/models/client/llm_client#nemo_curator-models-client-llm_client-LLMClient)
A wrapper around OpenAI's Python client for querying models.
```python
nemo_curator.models.client.openai_client.OpenAIClient.query_model(
messages: collections.abc.Iterable,
model: str,
conversation_formatter: nemo_curator.models.client.llm_client.ConversationFormatter | None = None,
generation_config: nemo_curator.models.client.llm_client.GenerationConfig | dict | None = None
) -> list[str]
```
```python
nemo_curator.models.client.openai_client.OpenAIClient.setup() -> None
```
Set up the client.
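A corresponding sketch for the synchronous client. The assumption here is that the constructor's `kwargs` are forwarded to the underlying `openai.OpenAI` client (e.g. `api_key`, `base_url`) and that the model name is a placeholder; neither is confirmed by this page, and the import is guarded for illustration.

```python
# OpenAI-style chat messages; the content is a hypothetical prompt.
messages = [
    {"role": "user", "content": "Summarize this passage in one sentence."},
]

try:
    from nemo_curator.models.client.openai_client import OpenAIClient
except ImportError:  # sketch only; nemo_curator may not be installed
    OpenAIClient = None

if OpenAIClient is not None:
    # kwargs presumably reach the wrapped OpenAI client (an assumption).
    client = OpenAIClient()
    client.setup()
    replies = client.query_model(messages=messages, model="gpt-4o-mini")
    for reply in replies:
        print(reply)
```

Unlike `AsyncOpenAIClient`, this class exposes no concurrency or retry knobs in its documented constructor, so it suits simple sequential querying.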