core._rank_utils#
Low-level rank utilities with minimal dependencies to avoid circular imports.
Module Contents#
Functions#
- safe_get_rank – Safely get the rank of the current process.
- log_single_rank – Log a message only on a single rank.
API#
- core._rank_utils.safe_get_rank() int#
Safely get the rank of the current process.
Returns the rank from torch.distributed if initialized, otherwise falls back to the RANK environment variable, defaulting to 0.
- Returns:
The rank of the current process.
- Return type:
int
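The described fallback order (initialized `torch.distributed` first, then the `RANK` environment variable, then 0) can be sketched as follows. This is an illustrative reimplementation, not the module's actual source; the `torch` import guard is added so the sketch also runs where PyTorch is absent:

```python
import os

try:
    import torch.distributed as dist
except ImportError:  # torch may not be installed in every environment
    dist = None


def safe_get_rank() -> int:
    """Return the current process rank.

    Prefer torch.distributed when the process group is initialized,
    otherwise fall back to the RANK environment variable, defaulting to 0.
    """
    if dist is not None and dist.is_available() and dist.is_initialized():
        return dist.get_rank()
    return int(os.environ.get("RANK", "0"))
```

Because of the environment-variable fallback, the function is safe to call before `torch.distributed.init_process_group`, e.g. during early logging setup.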
- core._rank_utils.log_single_rank(
- logger: logging.Logger,
- *args: Any,
- rank: int = 0,
- **kwargs: Any,
- ) None#
Log a message only on a single rank.
If torch.distributed is initialized, the message is logged only on the specified rank.
- Parameters:
logger – The logger to write the logs to.
*args – All logging.Logger.log positional arguments.
rank – The rank to write on. Defaults to 0.
**kwargs – All logging.Logger.log keyword arguments.
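A minimal sketch of this behavior, assuming the function simply forwards its arguments to `logging.Logger.log` when the current rank matches (or when torch.distributed is not initialized); the real implementation may differ:

```python
import logging

try:
    import torch.distributed as dist
except ImportError:  # torch may not be installed in every environment
    dist = None


def log_single_rank(
    logger: logging.Logger, *args, rank: int = 0, **kwargs
) -> None:
    """Log a message only on a single rank.

    With torch.distributed initialized, only `rank` emits the record;
    without it, the (single) process logs unconditionally.
    """
    if dist is not None and dist.is_available() and dist.is_initialized():
        if dist.get_rank() == rank:
            logger.log(*args, **kwargs)
    else:
        logger.log(*args, **kwargs)


logger = logging.getLogger(__name__)
logger.setLevel(logging.INFO)
# All positional/keyword arguments after `logger` follow Logger.log's
# signature: level first, then the message and its formatting args.
log_single_rank(logger, logging.INFO, "initialized model with %d layers", 24)
```

Since `*args` mirrors `Logger.log`, the log level must be passed explicitly as the first positional argument after `logger`.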