Eosc/pretraining


Background

This page provides an informal, technically oriented survey of available (and commonly used) architectures and implementations for large-scale pre-training (and fine-tuning) of contextualized neural language models.

The NLPL use case will install, validate, and maintain a selection of these implementations, in an automated and uniform manner, on multiple HPC systems.
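
For illustration, below is a minimal sketch of what loading and applying one of these pre-trained models typically looks like, using the widely used Hugging Face Transformers library; the checkpoint name "bert-base-cased" is only an example and not necessarily one of the installations the use case will provide.

 from transformers import AutoModel, AutoTokenizer
 
 # Load an example pre-trained BERT checkpoint and its matching tokenizer.
 tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")
 model = AutoModel.from_pretrained("bert-base-cased")
 
 # Encode a sample sentence and produce contextualized token representations.
 inputs = tokenizer("A sample sentence.", return_tensors="pt")
 outputs = model(**inputs)
 print(outputs.last_hidden_state.shape)  # (batch size, number of tokens, hidden size)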

ELMo

BERT

RoBERTa

ELECTRA