Eosc/pretraining/nvidia


Background

This page provides a recipe for the large-scale pre-training of a BERT neural language model, using the high-efficiency NVIDIA BERT implementation (which is based on TensorFlow, in contrast to the PyTorch-based NVIDIA Megatron-LM code).

Software Installation
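The NVIDIA BERT code is distributed through the NVIDIA DeepLearningExamples repository and runs on a TensorFlow 1.x stack (the repository location and TensorFlow version are assumptions to be checked against the current upstream). Once the software is in place, a minimal sanity check is to confirm that the installed TensorFlow build is GPU-enabled and can enumerate the local devices:

  # Minimal installation sanity check (assumes a TensorFlow 1.x stack,
  # as used by the NVIDIA BERT implementation).
  import tensorflow as tf

  print("TensorFlow version:", tf.__version__)
  # TF 1.x device listing; under TF 2.x the equivalent call is
  # tf.config.list_physical_devices("GPU").
  from tensorflow.python.client import device_lib
  gpus = [d.name for d in device_lib.list_local_devices() if d.device_type == "GPU"]
  print("Visible GPUs:", gpus)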

Data Preparation
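The pre-training scripts consume tokenised text as TFRecord files of fixed-length instances. The sketch below serialises a single toy instance using the feature names of the original google-research BERT format, which the NVIDIA code inherits; the token ids are placeholders, and a real pipeline would also add the masked-LM and next-sentence features (masked_lm_positions, masked_lm_ids, masked_lm_weights, next_sentence_labels) produced by the repository's data-preparation script.

  # Hedged sketch: write one toy pre-training instance in the BERT-style
  # TFRecord format. Token ids and the file name are placeholders.
  import tensorflow as tf

  def int_feature(values):
      return tf.train.Feature(int64_list=tf.train.Int64List(value=list(values)))

  max_seq_length = 128
  num_tokens = 6  # toy ids standing in for [CLS] this is a test [SEP]
  input_ids = [101, 2023, 2003, 1037, 3231, 102] + [0] * (max_seq_length - num_tokens)
  features = {
      "input_ids": int_feature(input_ids),
      "input_mask": int_feature([1] * num_tokens + [0] * (max_seq_length - num_tokens)),
      "segment_ids": int_feature([0] * max_seq_length),
  }
  example = tf.train.Example(features=tf.train.Features(feature=features))
  with tf.io.TFRecordWriter("toy_pretraining.tfrecord") as writer:
      writer.write(example.SerializeToString())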

Training Example
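Pre-training is launched through the repository's run_pretraining.py entry point. The following sketch wraps such a launch from Python; the flag names follow the upstream google-research BERT conventions that the NVIDIA code is derived from, and every path and hyper-parameter value is a placeholder to be replaced with project-specific settings. For multi-GPU runs, the repository's own launcher scripts, built around Horovod, should be preferred.

  # Hedged sketch: single-process launch of run_pretraining.py.
  # All paths and hyper-parameter values below are placeholders.
  import subprocess

  cmd = [
      "python", "run_pretraining.py",
      "--input_file=data/pretraining/*.tfrecord",  # TFRecords from the data preparation step
      "--output_dir=checkpoints/bert_base",
      "--bert_config_file=configs/bert_config.json",
      "--do_train=True",
      "--train_batch_size=32",
      "--max_seq_length=128",
      "--max_predictions_per_seq=20",
      "--num_train_steps=100000",
      "--learning_rate=1e-4",
  ]
  subprocess.run(cmd, check=True)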