The NVIDIA team is excited about all the new features and capabilities that DeepStream 6.3 brings to the table. However, there are a few important items you need to understand before you start developing with DeepStream 6.3. This is especially true if you are an existing developer and plan to bring existing models to the latest release of DeepStream.
Make sure you understand how to migrate your older DeepStream custom models to DeepStream 6.3 before you start.
DeepStream 6.3 moves to a newer TensorRT 8.x release for both x86 and Jetson. This TensorRT release does not maintain compatibility with previous versions of TensorRT 8.x, and models created with older versions of TensorRT require updated calibration files. Depending on where you are in your development journey, there are a few steps you need to take before you get started.
If you are new to DeepStream, or you do not need to reuse any old models, you are good to go! Make sure you download the latest models from NGC.
If you want to bring models that were developed for previous versions of DeepStream or TensorRT, you’ll need to create new calibration cache files. This can be easily done using native TAO Toolkit tools. An example of how to perform this task is available in the Exporting a Model section of the TAO Toolkit documentation.
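As a rough illustration, regenerating an INT8 calibration cache with the TAO Toolkit export command looks something like the sketch below. The network name (`detectnet_v2`), file paths, and encryption key are placeholders, and the exact flags vary by model type and TAO version, so treat this as an outline and consult the Exporting a Model section of the TAO Toolkit documentation for the authoritative syntax.

```
# Illustrative only: re-export a trained model and regenerate the INT8
# calibration cache for the new TensorRT version. Paths, the model name,
# and the key are placeholders for your own values.
tao detectnet_v2 export \
    -m /workspace/models/my_model.tlt \          # trained model to export
    -k $ENCRYPTION_KEY \                         # key used when the model was trained
    -o /workspace/export/my_model.etlt \         # exported model output
    --data_type int8 \                           # request INT8 precision
    --cal_image_dir /workspace/data/calibration \  # representative images for calibration
    --batches 10 \                               # number of calibration batches
    --cal_cache_file /workspace/export/cal.bin   # new calibration cache to generate
```

The regenerated `cal.bin` is what you then point your DeepStream nvinfer configuration at in place of the cache produced with the older TensorRT version.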