Eosc/pretraining/nvidia

From Nordic Language Processing Laboratory

Background

This page provides a recipe for large-scale pre-training of a BERT neural language model, using the NVIDIA BERT implementation from the NVIDIA DeepLearningExamples repository (which is based on TensorFlow, in contrast to the NVIDIA Megatron code).

Software Installation

Data Preparation

Training Example