Train a Large-Scale NLP Model with NeMo Megatron
Base Command
  Overview
  Step #1: Logging Into Base Command
    Accessing the Base Command Environment
    Logging into NVIDIA NGC
  Step #2: Configuring NGC CLI
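The NGC CLI configuration in Step #2 can be sketched as below. `ngc config set` is interactive (it prompts for your API key, output format, org, team, and ACE), so the sketch only composes and prints the command; the org name shown is a hypothetical placeholder, not a value from this tutorial.

```shell
# Sketch: configuring the NGC CLI before using Base Command.
# "ngc config set" prompts interactively; nothing here sends real credentials.
NGC_ORG="nv-launchpad-org"      # hypothetical org name -- substitute your own
CONFIG_CMD="ngc config set"
# After running this command, paste the API key generated from the NGC Setup
# page, then select your org, team, and ACE at the prompts.
echo "${CONFIG_CMD}"
```

You can confirm the active configuration afterwards with `ngc config current`.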
Train a Large-Scale NLP Model with NeMo Megatron
  Overview
  Intro to BigNLP Scripts
  Step #1: Create a Workspace
    Example to Create Workspace
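The workspace creation in Step #1 can be sketched with the NGC CLI as below. The workspace name and ACE name are hypothetical placeholders; substitute the ACE assigned to your LaunchPad environment.

```shell
# Sketch: creating a Base Command workspace to hold BigNLP data and results.
WS_NAME="bignlp-ws"             # hypothetical workspace name
ACE="nv-launchpad-ace"          # hypothetical ACE name -- use your assigned ACE
CREATE_CMD="ngc workspace create --name ${WS_NAME} --ace ${ACE}"
# The composed command is printed rather than executed, since it requires a
# configured NGC CLI session to run.
echo "${CREATE_CMD}"
```

Once created, the workspace can be mounted read-write into batch jobs so that downloaded data and checkpoints persist across runs.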
  Step #2: Download and Preprocess the Data
    Use BigNLP to Download and Prepare the Pile Dataset
  Step #3: Training Using NGC Batch Run
    Training NGC Batch Command
    NGC Batch Run Mount Setup
    Training
    Resume Training
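The training launch in Step #3 can be sketched as an `ngc batch run` invocation with the workspace mounted read-write. The image tag, training command, and mount path are hypothetical placeholders (the `--name`, `--image`, `--instance`, `--workspace`, `--result`, and `--commandline` flags are standard NGC Base Command CLI options); the sketch only composes and prints the command.

```shell
# Sketch: launching a BigNLP training job on Base Command.
# All values below are placeholders for illustration.
WS_NAME="bignlp-ws"             # hypothetical workspace created earlier
RUN_CMD="ngc batch run \
  --name bignlp-train \
  --image nvcr.io/nvidia/bignlp-training:<tag> \
  --instance dgxa100.80g.8.norm \
  --workspace ${WS_NAME}:/mount/workspace:RW \
  --result /results \
  --commandline 'bash /mount/workspace/train.sh'"   # hypothetical entrypoint
printf '%s\n' "${RUN_CMD}"
```

Because the workspace mount is `RW`, checkpoints written under `/mount/workspace` survive the job, which is what makes the Resume Training step possible: a later `ngc batch run` with the same mount picks up from the last saved checkpoint.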
  Step #4: Convert the BigNLP Model from PyTorch to NeMo
  Step #5: Evaluate the BigNLP Model
  Step #6: Inference with BigNLP Model
© Copyright 2022-2023, NVIDIA.
Last updated on Feb 2, 2023.