# Optimize model and use for offline inference
In this example, we show how to obtain a runner from a `.nav` package and use it for inference.
We recommend running this example in an NVIDIA NGC PyTorch container. To run the example, simply execute the `optimize.py` script:

```shell
./optimize.py
```
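Once optimization completes, the resulting `.nav` package can be loaded and used for offline inference. The following is a minimal sketch assuming the Model Navigator Python API; the package path `model.nav`, the workspace directory, the input name `input__0`, and the input shape are all illustrative placeholders, not values produced by this example:

```python
import numpy as np
import model_navigator as nav

# Load the optimized package produced by optimize.py
# ("model.nav" is an assumed example path).
package = nav.package.load("model.nav", workspace="load_workspace")

# Obtain a runner for the best runtime found during optimization;
# MaxThroughputStrategy is one possible runtime selection strategy.
runner = package.get_runner(strategy=nav.MaxThroughputStrategy())

# Activate the runner and perform offline inference.
# "input__0" and the tensor shape are placeholders for the model's
# actual input signature.
with runner:
    output = runner.infer(
        {"input__0": np.random.rand(1, 3, 224, 224).astype(np.float32)}
    )
```

The runner is used as a context manager so that the underlying runtime is activated before inference and released afterwards.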