Eosc/pretraining

From Nordic Language Processing Laboratory

Background

This page provides an informal, technically oriented survey of available (and commonly used) architectures and implementations for large-scale pre-training (and fine-tuning) of contextualized neural language models. The NLPL use case will install, validate, and maintain a selection of these implementations, in an automated and uniform manner, on multiple HPC systems.

ELMo
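
ELMo (Embeddings from Language Models) provides deep contextualized word representations computed by a bidirectional LSTM language model. As a minimal sketch of consuming a pre-trained ELMo model, the example below uses the ElmoEmbedder from the AllenNLP implementation; the options and weight file paths are placeholders for files obtained from a pre-trained ELMo distribution, not paths assumed by this page.

# Minimal sketch (assumed setup): embedding one sentence with a pre-trained
# ELMo model via AllenNLP's ElmoEmbedder. The two file paths below are
# placeholders for a downloaded pre-trained model.
from allennlp.commands.elmo import ElmoEmbedder

options_file = "elmo_options.json"  # placeholder path
weight_file = "elmo_weights.hdf5"   # placeholder path

elmo = ElmoEmbedder(options_file=options_file, weight_file=weight_file)

# embed_sentence() takes a list of tokens and returns a numpy array of shape
# (3, num_tokens, 1024): one 1024-dimensional vector per token from each of
# the three biLM layers.
vectors = elmo.embed_sentence(["Pre-training", "is", "expensive", "."])
print(vectors.shape)  # (3, 4, 1024)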

BERT
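
BERT (Bidirectional Encoder Representations from Transformers) is pre-trained with a masked language modeling objective and is typically fine-tuned for downstream tasks. As a minimal sketch of loading a pre-trained BERT model and extracting contextualized representations, the example below uses the Hugging Face transformers library; the checkpoint name bert-base-uncased is one commonly used pre-trained model, chosen here for illustration.

# Minimal sketch (assumed setup): contextualized token representations from a
# pre-trained BERT model via the Hugging Face transformers library.
import torch
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")
model.eval()

inputs = tokenizer("Pre-training is expensive.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# last_hidden_state has shape (batch_size, sequence_length, hidden_size),
# i.e. one 768-dimensional vector per (sub)word token for bert-base models.
print(outputs.last_hidden_state.shape)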