Framework Inference

For NSFW models, the inference script generates a score for each image. Values closer to 1 indicate strongly NSFW content, while values closer to -1 indicate strongly safe content.
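The score convention can be illustrated with a small filtering sketch. The `scores` dictionary and the threshold of 0.0 here are hypothetical examples, not the launcher's actual output format:

```python
# Hypothetical per-image scores in [-1, 1]:
# closer to 1 means strongly NSFW, closer to -1 means strongly safe.
scores = {
    "img_001.jpg": 0.92,   # strongly NSFW
    "img_002.jpg": -0.85,  # strongly safe
    "img_003.jpg": 0.10,   # borderline
}

# Example cutoff; tune this for your precision/recall requirements.
THRESHOLD = 0.0

# Collect images whose score exceeds the cutoff.
flagged = [name for name, score in scores.items() if score > THRESHOLD]
print(flagged)
```

Raising the threshold flags fewer images (higher precision, lower recall); lowering it flags more.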

To enable the inference stage with an NSFW model, update the configuration files as follows:

  1. In the defaults section of conf/config.yaml, update the fw_inference field to point to the desired NSFW inference configuration file. For example, to use the nsfw/nsfw.yaml configuration, set the fw_inference field to nsfw/nsfw:

    defaults:
      - fw_inference: nsfw/nsfw
      ...

  2. In the stages field of conf/config.yaml, make sure the fw_inference stage is included. For example:

    stages:
      - fw_inference
      ...

  3. Configure the image_path field of conf/fw_inference/nsfw/nsfw.yaml.

  4. Execute the launcher pipeline: python3 main.py
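Putting steps 1-3 together, the edited configuration fragments would look roughly like this. The image_path value below is a placeholder for illustration, not a real path:

```yaml
# conf/config.yaml (excerpt)
defaults:
  - fw_inference: nsfw/nsfw

stages:
  - fw_inference

# conf/fw_inference/nsfw/nsfw.yaml (excerpt)
# Placeholder: point this at the directory of images to score.
image_path: /path/to/images
```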

© Copyright 2023-2024, NVIDIA. Last updated on Apr 25, 2024.