Eosc/pretraining

Background

This page provides an informal, technically oriented survey of available (and commonly used) architectures and implementations for large-scale pre-training (and fine-tuning) of contextualized neural language models.

The NLPL use case will install, validate, and maintain a selection of these implementations, in an automated and uniform manner, on multiple HPC systems.
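As a rough illustration of what validating such an installation could involve, the following is a minimal smoke test sketch, assuming the Hugging Face transformers library is among the selected implementations; the model name and example sentence are placeholders and not part of any actual NLPL installation procedure.

 # Minimal installation smoke test: load a pre-trained masked language model
 # and run a single forward pass. Model name and input text are illustrative only.
 import torch
 from transformers import AutoModelForMaskedLM, AutoTokenizer
 
 model_name = "bert-base-cased"  # hypothetical model chosen for the check
 tokenizer = AutoTokenizer.from_pretrained(model_name)
 model = AutoModelForMaskedLM.from_pretrained(model_name)
 
 inputs = tokenizer("Pre-training validation check.", return_tensors="pt")
 with torch.no_grad():
     outputs = model(**inputs)
 
 # Expected shape: (1, sequence_length, vocabulary_size)
 print(outputs.logits.shape)

An analogous check could in principle be scripted for each installed implementation and HPC system as part of the automated maintenance mentioned above.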

ELMo

BERT