nemo_automodel.training.timers#
Megatron-based timers.
Module Contents#
Classes#
TimerBase – Timer base class.
DummyTimer – Dummy Timer.
Timer – Timer class with ability to start/stop.
Timers – Class for a group of Timers.
API#
- class nemo_automodel.training.timers.TimerBase(name: str)[source]#
Bases:
abc.ABC
Timer base class.
Initialization
Base class for Timers.
- Parameters:
name (str) – The name of the timer.
- with_barrier(barrier=True)[source]#
Set the barrier option for use in context manager.
- Parameters:
barrier (bool, optional) – Whether to use a barrier in the context manager. Defaults to True.
- Returns:
Returns self for chaining.
- Return type:
TimerBase
- __exit__(exc_type, exc_val, exc_tb)[source]#
Stop the timer when exiting a context using the configured barrier option.
- abstractmethod start(barrier=False)[source]#
Start the timer.
- Parameters:
barrier (bool, optional) – Synchronizes ranks before starting. Defaults to False.
- abstractmethod stop(barrier=False)[source]#
Stop the timer.
- Parameters:
barrier (bool, optional) – Synchronizes ranks before stopping. Defaults to False.
- abstractmethod elapsed(reset=True, barrier=False)[source]#
Calculates the elapsed time and restarts timer.
- Parameters:
reset (bool, optional) – Resets the timer before restarting. Defaults to True.
barrier (bool, optional) – Synchronizes ranks before stopping. Defaults to False.
- Returns:
Elapsed time.
- Return type:
float
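For orientation, the contract above can be sketched in a standalone way. The `WallClockTimer` subclass and its `perf_counter` bookkeeping below are illustrative assumptions, not the library's implementation; in single-process code the barrier flags are no-ops:

```python
import abc
import time


class TimerBase(abc.ABC):
    """Standalone sketch of the TimerBase contract (not the library class)."""

    def __init__(self, name: str):
        self.name = name
        self._barrier = True  # barrier option consumed by the context manager

    def with_barrier(self, barrier=True):
        # Configure the barrier option; return self so calls can be chained.
        self._barrier = barrier
        return self

    def __enter__(self):
        self.start(barrier=self._barrier)
        return self

    def __exit__(self, exc_type, exc_val, exc_tb):
        # Stop with the configured barrier option when leaving the context.
        self.stop(barrier=self._barrier)

    @abc.abstractmethod
    def start(self, barrier=False): ...

    @abc.abstractmethod
    def stop(self, barrier=False): ...

    @abc.abstractmethod
    def elapsed(self, reset=True, barrier=False): ...


class WallClockTimer(TimerBase):
    """Hypothetical subclass: wall-clock timing, barrier flags ignored."""

    def __init__(self, name):
        super().__init__(name)
        self._total = 0.0
        self._start_time = None

    def start(self, barrier=False):
        self._start_time = time.perf_counter()

    def stop(self, barrier=False):
        self._total += time.perf_counter() - self._start_time
        self._start_time = None

    def elapsed(self, reset=True, barrier=False):
        total = self._total
        if reset:  # elapsed() resets by default, matching the docs above
            self._total = 0.0
        return total


with WallClockTimer("demo").with_barrier(False) as t:
    time.sleep(0.01)
e = t.elapsed()  # accumulated time; resets the accumulator by default
```

The chaining style (`.with_barrier(False)`) works because `with_barrier` returns `self`, so the configured object can be used directly in a `with` statement.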
- class nemo_automodel.training.timers.DummyTimer[source]#
Bases:
nemo_automodel.training.timers.TimerBase
Dummy Timer.
Initialization
Dummy timer init.
- class nemo_automodel.training.timers.Timer(name)[source]#
Bases:
nemo_automodel.training.timers.TimerBase
Timer class with ability to start/stop.
A note on barrier: if this flag is passed, all caller processes wait until every rank reaches the timing routine. It is up to the user to make sure that all ranks in barrier_group call it; otherwise the call will hang.
A note on barrier_group: by default it is None, which in torch.distributed corresponds to the global communicator.
Initialization
Initialize Timer.
- Parameters:
name (str) – Name of the timer.
- set_barrier_group(barrier_group)[source]#
Sets barrier group.
- Parameters:
barrier_group (ProcessGroup) – Torch ProcessGroup for the barrier.
- start(barrier=False)[source]#
Start the timer.
- Parameters:
barrier (bool, optional) – Synchronizes ranks before starting. Defaults to False.
- stop(barrier=False)[source]#
Stop the timer.
- Parameters:
barrier (bool, optional) – Synchronizes ranks before stopping. Defaults to False.
- elapsed(reset=True, barrier=False)[source]#
Calculates the elapsed time and restarts timer.
- Parameters:
reset (bool, optional) – Resets the timer before restarting. Defaults to True.
barrier (bool, optional) – Synchronizes ranks before stopping. Defaults to False.
- Returns:
Elapsed time.
- Return type:
float
- class nemo_automodel.training.timers.Timers(log_level, log_option)[source]#
Class for a group of Timers.
Initialization
Initialize group of timers.
- Parameters:
log_level (int) – Log level to control which timers are enabled.
log_option (str) – Setting for logging statistics over ranks for all the timers. Allowed values: 'max', 'minmax', 'all'.
- __call__(name, log_level=None, barrier=False)[source]#
Call timer with name and log level.
Returns a timer object that can be used as a context manager.
- Parameters:
name (str) – Name of the timer.
log_level (int, optional) – Log level of the timer. Defaults to None.
barrier (bool, optional) – Whether to use a barrier in the context manager. Defaults to False.
.. rubric:: Example
with timers("my_timer"):
    # Code to time
    ...
With barrier#
with timers("my_timer", barrier=True):
    # Code to time
    ...
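The log-level gating performed by `__call__` can be sketched in isolation. The class names, the `> log_level` comparison, and the no-op fallback below are assumptions about the behavior for illustration, not the library code:

```python
class _NoOpTimer:
    """Stand-in for DummyTimer: a do-nothing context manager."""

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc_val, exc_tb):
        return None


class _TimerGroup:
    """Hypothetical sketch of Timers.__call__ dispatch by log level."""

    def __init__(self, log_level):
        self.log_level = log_level  # timers above this level are disabled
        self.active = {}

    def __call__(self, name, log_level=0):
        if log_level > self.log_level:
            # Too verbose for the configured level: timing becomes a no-op.
            return _NoOpTimer()
        # Reuse one timer object per name (a stand-in object here).
        return self.active.setdefault(name, _NoOpTimer())


timers = _TimerGroup(log_level=1)
enabled = timers("fwd", log_level=0)     # registered and returned
disabled = timers("debug", log_level=2)  # returned as a no-op, not registered
```

Because disabled timers still return a context manager, call sites never need to branch on the configured log level.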
- _get_elapsed_time_all_ranks(names, reset, barrier)[source]#
Returns elapsed times of timers in names.
Assumptions:
- All ranks call this function.
- names are identical on all ranks.
If these assumptions are not met, calling this function will result in a hang.
- Parameters:
names (List[str]) – List of timer names.
reset (bool) – Reset the timer after recording the elapsed time.
barrier (bool) – If set, do a global barrier before time measurements.
- Returns:
Tensor of size [world_size, len(names)] with times in float.
- Return type:
torch.Tensor
- _get_global_min_max_time(names, reset, barrier, normalizer)[source]#
Report only min and max times across all ranks.
- _get_global_min_max_time_string(
- names,
- reset,
- barrier,
- normalizer,
- max_only,
- )[source]#
Report strings for max/minmax times across all ranks.
- _get_all_ranks_time_string(names, reset, barrier, normalizer)[source]#
Report times across all ranks.
- get_all_timers_string(
- names: List[str] = None,
- normalizer: float = 1.0,
- reset: bool = True,
- barrier: bool = False,
- )[source]#
Returns the output string with logged timer values according to configured options.
- Parameters:
names (List[str]) – Names of the timers to log. If None, all registered timers are fetched. Defaults to None.
normalizer (float, optional) – Normalizes the timer values by the factor. Defaults to 1.0.
reset (bool, optional) – Whether to reset timer values after logging. Defaults to True.
barrier (bool, optional) – Whether to do a global barrier before time measurements. Defaults to False.
- Raises:
Exception – Raised if the log option is invalid.
- Returns:
Formatted string with the timer values.
- Return type:
str
- log(
- names: List[str],
- rank: int = None,
- normalizer: float = 1.0,
- reset: bool = True,
- barrier: bool = False,
- )[source]#
Logs the timers passed in names to stdout.
For example, to log the average per-step value of timer 'foo', call this function with the normalizer set to the logging interval.
- Parameters:
names (List[str]) – Names of the timers to log.
rank (int, optional) – Logs the timers on a specific rank. If set to None, logs on the last rank. Defaults to None.
normalizer (float, optional) – Normalizes the timer values by the factor. Defaults to 1.0.
reset (bool, optional) – Whether to reset timer values after logging. Defaults to True.
barrier (bool, optional) – Whether to do a global barrier before time measurements. Defaults to False.
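The normalizer arithmetic can be illustrated with made-up numbers (the 2.5 s total and 100-iteration logging interval below are hypothetical):

```python
# Hypothetical numbers: timer 'foo' accumulated 2.5 s since the last log call.
total_elapsed = 2.5
normalizer = 100.0  # logging interval in iterations

# Dividing by the interval turns the accumulated total into a per-step value.
per_step_ms = total_elapsed / normalizer * 1000.0
print(f"foo: {per_step_ms:.2f} ms/iteration")  # foo: 25.00 ms/iteration
```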
- write(
- names: List[str],
- writer,
- iteration: int,
- normalizer: float = 1.0,
- reset: bool = True,
- barrier: bool = False,
- )[source]#
Write timers to a tensorboard writer.
Note that we only report maximum time across ranks to tensorboard.
- Parameters:
names (List[str]) – Names of the timers to log.
writer (SummaryWriter) – TensorBoard SummaryWriter object.
iteration (int) – Current iteration.
normalizer (float, optional) – Normalizes the timer values by the factor. Defaults to 1.0.
reset (bool, optional) – Whether to reset timer values after logging. Defaults to True.
barrier (bool, optional) – Whether to do a global barrier before time measurements. Defaults to False.
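The max-across-ranks reduction that this method reports can be sketched against a stand-in writer. The `FakeWriter` class, the `times_by_rank` layout, and the `-time` tag suffix are all assumptions for illustration, not the library's actual data flow:

```python
class FakeWriter:
    """Minimal stand-in for torch.utils.tensorboard.SummaryWriter."""

    def __init__(self):
        self.records = []

    def add_scalar(self, tag, value, step):
        self.records.append((tag, value, step))


def write_max_times(times_by_rank, writer, iteration, normalizer=1.0):
    # Only the maximum time across ranks is reported, per the note above.
    for name, per_rank in times_by_rank.items():
        writer.add_scalar(f"{name}-time", max(per_rank) / normalizer, iteration)


w = FakeWriter()
write_max_times({"fwd": [0.8, 1.0, 0.9]}, w, iteration=10, normalizer=2.0)
# w.records == [("fwd-time", 0.5, 10)]
```

Anything exposing `add_scalar(tag, value, step)` can stand in for the real SummaryWriter, which is convenient for unit-testing logging code without TensorBoard installed.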