Eosc/pretraining


Background

This page provides an informal, technically oriented survey of available (and commonly used) architectures and implementations for large-scale pre-training (and fine-tuning) of contextualized neural language models. The NLPL use case will install, validate, and maintain a selection of these implementations, in an automated and uniform manner, on multiple HPC systems.
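To make the pre-training/fine-tuning distinction concrete, the following is a minimal sketch (not part of the NLPL installation itself) of loading a pre-trained contextualized model and taking one fine-tuning step, using the Hugging Face transformers library; the model name, label count, learning rate, and example data are illustrative assumptions.

 # Minimal fine-tuning sketch; model name and hyperparameters are illustrative.
 import torch
 from transformers import BertTokenizer, BertForSequenceClassification
 
 # Load weights produced by large-scale pre-training.
 tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
 model = BertForSequenceClassification.from_pretrained(
     "bert-base-uncased", num_labels=2
 )
 
 # Fine-tuning reuses the pre-trained encoder and trains a small task head.
 optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
 batch = tokenizer(["an example sentence"], return_tensors="pt", padding=True)
 labels = torch.tensor([1])
 
 model.train()
 outputs = model(**batch, labels=labels)
 loss = outputs[0]  # first element is the classification loss
 loss.backward()
 optimizer.step()

In practice, pre-training the encoder itself is the computationally expensive step that motivates shared HPC installations; fine-tuning, as sketched above, is comparatively cheap and task-specific.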

ELMo

BERT