Eosc/pretraining

Background

This page provides an informal, technically oriented survey of available (and commonly used) architectures and implementations for large-scale pre-training (and fine-tuning) of contextualized neural language models.

The NLPL use case will install, validate, and maintain a selection of these implementations in an automated and uniform manner on multiple HPC systems.
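To make the notion of pre-training and fine-tuning more concrete, the sketch below shows a minimal masked-language-model training loop. It assumes the Hugging Face Transformers and PyTorch libraries, which are not prescribed by this page; the checkpoint name, toy corpus, and hyper-parameters are illustrative placeholders, not NLPL-recommended settings.

 # Minimal, illustrative masked-language-model fine-tuning sketch.
 # Assumes: transformers, torch; checkpoint name and hyper-parameters are placeholders.
 import torch
 from transformers import AutoTokenizer, AutoModelForMaskedLM, DataCollatorForLanguageModeling
 
 model_name = "bert-base-multilingual-cased"   # placeholder checkpoint
 tokenizer = AutoTokenizer.from_pretrained(model_name)
 model = AutoModelForMaskedLM.from_pretrained(model_name)
 
 # Toy corpus standing in for a real pre-training or domain-adaptation corpus.
 texts = [
     "Contextualized language models are pre-trained on large corpora.",
     "Fine-tuning adapts a pre-trained model to a downstream task or domain.",
 ]
 encodings = tokenizer(texts, truncation=True, padding=True, max_length=64)
 
 # Randomly mask 15% of the tokens, as in BERT-style masked-language-model training.
 collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15)
 features = [
     {"input_ids": ids, "attention_mask": mask}
     for ids, mask in zip(encodings["input_ids"], encodings["attention_mask"])
 ]
 batch = collator(features)                    # adds masked inputs and MLM labels
 
 optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)
 model.train()
 for step in range(3):                         # a handful of steps, for illustration only
     outputs = model(**batch)                  # loss is computed against the MLM labels
     outputs.loss.backward()
     optimizer.step()
     optimizer.zero_grad()
     print(f"step {step}: MLM loss = {outputs.loss.item():.3f}")

The same pattern scales, in principle, to a full pre-training run by replacing the toy corpus with a sharded dataset and the hand-rolled loop with a distributed training framework; the implementations surveyed below provide such machinery.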

ELMo

BERT

RoBERTa

ELECTRA