Training with Predefined Configurations


NVIDIA provides a configuration for one ChatGLM model size, 6B, which works for both ChatGLM2-6B and ChatGLM3-6B.

To run ChatGLM training, update conf/config.yaml:


defaults:
  - training: chatglm/chatglm3-6b

stages:
  - training

Specify chatglm, the model version (2 or 3), and the desired model size for the training configuration in the form chatglm/chatglm<version>-6b.

Execute the launcher pipeline: python3 main.py
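If you prefer not to edit conf/config.yaml, the same selection can typically be passed as Hydra command-line overrides (this assumes the launcher's standard Hydra interface), for example: python3 main.py training=chatglm/chatglm2-6b stages=[training]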

Configuration

Default configurations for model-size-specific training can be found in the folder conf/training/chatglm. The configuration is divided into four sections: run, trainer, exp_manager, and model.


run:
  name: chatglm3_6b
  results_dir: ${base_results_dir}/${.name}
  time_limit: "0-04:00:00"
  dependency: "singleton"

Set the number of nodes and devices for training:


trainer:
  num_nodes: 16
  devices: 8
  max_steps: 300000 # consumed_samples = global_step * global_batch_size
  max_time: "05:23:30:00" # days:hours:minutes:seconds
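As a quick sanity check on the training budget implied by max_steps, the comment above can be turned into a small calculation. This is only an illustration; the global_batch_size value below is an assumption and is set elsewhere in the model configuration:

# Illustrative only: relate max_steps to total samples consumed.
global_batch_size = 2048            # assumed value; configured under model
max_steps = 300_000
consumed_samples = max_steps * global_batch_size
print(f"{consumed_samples:,} samples")   # 614,400,000 samples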

Set configurations for creating a checkpoint:


exp_manager:
  create_checkpoint_callback: True
  checkpoint_callback_params:
    monitor: val_loss
    save_top_k: 10
    mode: min
    always_save_nemo: False # saves nemo file during validation, not implemented for model parallel
    save_nemo_on_train_end: False # not recommended when training large models on clusters with short time limits
    filename: 'megatron_chatglm--{val_loss:.2f}-{step}-{consumed_samples}'
    model_parallel_size: ${multiply:${training.model.tensor_model_parallel_size}, ${training.model.pipeline_model_parallel_size}}
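The model_parallel_size value above is computed by a multiply resolver over the two parallelism settings. The sketch below only illustrates how such an interpolation resolves with OmegaConf; the launcher registers its own resolver, and the parallelism values here are hypothetical:

from omegaconf import OmegaConf

# Minimal stand-in for a "multiply" resolver; illustration only.
OmegaConf.register_new_resolver("multiply", lambda x, y: x * y, replace=True)

cfg = OmegaConf.create({
    "tensor_model_parallel_size": 2,     # hypothetical values
    "pipeline_model_parallel_size": 4,
    "model_parallel_size": "${multiply:${tensor_model_parallel_size},${pipeline_model_parallel_size}}",
})
print(cfg.model_parallel_size)  # -> 8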

Set wandb configurations:


exp_manager:
  create_wandb_logger: True
  wandb_logger_kwargs:
    project: nemo_chatglm3
    name: ${training.run.name}

Set tensor parallel and pipeline parallel size:


model:
  tensor_model_parallel_size: 1
  pipeline_model_parallel_size: 1
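Together with the trainer settings shown earlier, these two values determine how the GPUs are split between model parallelism and data parallelism. A minimal sketch of that arithmetic, assuming the usual tensor x pipeline x data decomposition (not launcher code):

# World size and data-parallel size implied by the settings above.
num_nodes = 16
devices_per_node = 8
tensor_model_parallel_size = 1
pipeline_model_parallel_size = 1

world_size = num_nodes * devices_per_node                                # 128 GPUs
model_parallel_size = tensor_model_parallel_size * pipeline_model_parallel_size
data_parallel_size = world_size // model_parallel_size                   # 128 when TP = PP = 1
print(world_size, model_parallel_size, data_parallel_size)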

Set data distribution configuration:


model:
  data:
    data_prefix:
      - .0333
      - ${data_dir}/my-chatglm_00_text_document
      - .0333
      - ${data_dir}/my-chatglm_01_text_document
      ...
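data_prefix is a flat list of alternating blend weights and preprocessed dataset prefixes. For many shards, a small helper like the hypothetical sketch below can generate an equally weighted list; the shard count and naming pattern are assumptions based on the example above:

# Hypothetical helper, not part of the launcher: build an equally weighted
# data_prefix list for shards named my-chatglm_XX_text_document.
num_shards = 30                      # assumed number of shards
weight = round(1 / num_shards, 4)    # ~0.0333 per shard

data_prefix = []
for i in range(num_shards):
    data_prefix.append(weight)
    data_prefix.append(f"${{data_dir}}/my-chatglm_{i:02d}_text_document")

print(data_prefix[:4])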
