Eosc/pretraining/nvidia

From Nordic Language Processing Laboratory

Revision as of 14:36, 6 November 2020

Background

This page provides a recipe for large-scale pre-training of a BERT neural language model, using the NVIDIA BERT implementation (https://github.com/NVIDIA/DeepLearningExamples), which is based on TensorFlow, in contrast to the NVIDIA Megatron code.

Software Installation

Data Preparation

Training Example