Translation/mttools

From Nordic Language Processing Laboratory

Using the mttools module

  • Activate the NLPL software repository and load the module:
    module use -a /projappl/nlpl/software/modules/etc         # Puhti
    module use -a /cluster/shared/nlpl/software/modules/etc   # Saga
    module load nlpl-mttools/
  • Module-specific help is available by typing:
    module help nlpl-mttools
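
After loading the module, one can quickly check that the main executables are visible on PATH. This is only a sanity check; the exact output varies with cluster and module version:
    which sacrebleu subword-nmt multeval.sh compare-mt
    sacrebleu --version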

The following scripts are part of this module (short, illustrative usage sketches follow the list):

  • moses-scripts
    • Tokenization, casing, corpus cleaning and evaluation scripts from Moses
    • Source: https://github.com/moses-smt/mosesdecoder (scripts directory)
    • Installed revision: 3990724
    • The subfolders generic, recaser, tokenizer, and training are in PATH
  • sacremoses
  • subword-nmt
    • Unsupervised Word Segmentation (a.k.a. Byte Pair Encoding) for Machine Translation and Text Generation
    • Source: https://github.com/rsennrich/subword-nmt
    • Installed version: 0.3.8
    • The subword-nmt executable is in PATH
  • sentencepiece
  • sacreBLEU
    • Reference BLEU implementation that auto-downloads test sets and reports a version string to facilitate cross-lab comparisons
    • Source: https://github.com/mjpost/sacreBLEU
    • Installed version: 2.2.1
    • The sacrebleu executable is in PATH
  • multeval
    • Tool to evaluate machine translation with various scores (BLEU, TER, METEOR) and to perform statistical significance testing with bootstrap resampling
    • Source: https://github.com/jhclark/multeval
    • Installed version: 0.5.1 with METEOR 1.5
    • The multeval.sh script is in PATH
  • compare-mt
    • Compares the output of multiple systems for language generation tasks such as machine translation, summarization, and dialog response generation. It computes common evaluation scores and runs analyses to find salient differences between the systems.
    • To run METEOR, consult the module-specific help page for the exact path.
    • Source: https://github.com/neulab/compare-mt
    • Installed version: 0.2.10
    • The compare-mt executable is in PATH
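
The Moses scripts read from standard input and write to standard output. A minimal preprocessing sketch (the file names, the language codes, and the sentence-length limits are illustrative, not prescribed by the module):
    # Tokenize and truecase English text
    tokenizer.perl -l en < train.raw.en > train.tok.en
    train-truecaser.perl --model truecase-model.en --corpus train.tok.en
    truecase.perl --model truecase-model.en < train.tok.en > train.tc.en
    # Drop empty and overly long sentence pairs from a parallel corpus (train.tc.de / train.tc.en)
    clean-corpus-n.perl train.tc de en train.clean 1 80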
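
For subword-nmt, a typical sketch is to learn a BPE model on tokenized training data and then apply it to all data splits (the number of merge operations and the file names are illustrative):
    subword-nmt learn-bpe -s 10000 < train.tok.en > bpe.codes.en
    subword-nmt apply-bpe -c bpe.codes.en < train.tok.en > train.bpe.en
    subword-nmt apply-bpe -c bpe.codes.en < dev.tok.en > dev.bpe.en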
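
sacreBLEU expects detokenized system output and can score against either a local reference file or an official test set that it downloads by name (the file names, test set, and language pair below are illustrative):
    # Score against a local reference, reporting BLEU and chrF
    sacrebleu ref.detok.en -i hyp.detok.en -m bleu chrf
    # Score against an official WMT test set (downloaded automatically)
    sacrebleu -t wmt20 -l de-en -i hyp.detok.en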
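
multeval compares a baseline against one or more contrastive systems, ideally with several optimizer runs per system so that the bootstrap significance tests are meaningful. A sketch following the tool's documented invocation (the file name patterns are illustrative; each glob is assumed to expand to one file per optimizer run):
    multeval.sh eval --refs refs.tok.en.* \
                     --hyps-baseline baseline.tok.en.opt* \
                     --hyps-sys1 improved.tok.en.opt* \
                     --meteor.language en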
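
compare-mt takes a reference and two or more system outputs as plain text files with one sentence per line (the file names and output directory are illustrative); the --output_directory option writes a formatted report to the given directory:
    compare-mt ref.tok.en sys1.out.en sys2.out.en --output_directory compare_output/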


Contact: Yves Scherrer, University of Helsinki, firstname.lastname@helsinki.fi