This document details how to register for the challenge, download the development datasets and evaluation code, download the surprise datasets, and submit your results for official evaluation. If you experience any issue, please contact us at firstname.lastname@example.org.
Any recent Linux system can be used. The evaluation code has been tested on Ubuntu 16.04, Debian Jessie and CentOS 6, and should also work on Mac OS. It runs faster on a multicore machine (10 cores is a good number).
Registration & Datasets download
You first need to register by sending an email to email@example.com. You can then download the datasets available on the data page and open the archives using the password received after registration.
Registration is necessary to download the challenge’s data and submit your results for evaluation.
We will keep you informed in case there is any update.
Development datasets and evaluation code
The Dev datasets consist of three languages. To download the evaluation code, please follow the instructions from the challenge’s zerospeech2017 github repository.
Once done, you are ready to work on your model(s) and tune them on the dev data. Please bear in mind that the objective is to find the optimal hyperparameters that will generalize well to the two unknown surprise languages. To avoid overfitting on the dev languages, we encourage you to use a cross-validation scheme in which you fit your hyperparameters on two languages and evaluate them on the third.
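The leave-one-language-out scheme described above can be sketched as follows. This is only an illustration: the language names, the candidate hyperparameter settings, and the `score_model` function are hypothetical placeholders that you would replace with your own training and evaluation code.

```python
# Hypothetical dev languages and candidate hyperparameter settings;
# substitute your own model's search space here.
DEV_LANGUAGES = ["lang_a", "lang_b", "lang_c"]
CANDIDATE_PARAMS = [{"dim": 39}, {"dim": 64}, {"dim": 128}]

def score_model(params, train_langs, test_lang):
    """Placeholder: train with `params` on `train_langs` and return an
    error score (lower is better) on the held-out `test_lang`."""
    raise NotImplementedError

def select_hyperparameters(score_fn=score_model):
    """Leave-one-language-out cross-validation: for each candidate setting,
    fit on two dev languages, evaluate on the third, and keep the setting
    with the best average held-out score."""
    best_params, best_score = None, float("inf")
    for params in CANDIDATE_PARAMS:
        scores = []
        for held_out in DEV_LANGUAGES:
            train = [lang for lang in DEV_LANGUAGES if lang != held_out]
            scores.append(score_fn(params, train, held_out))
        avg = sum(scores) / len(scores)
        if avg < best_score:
            best_params, best_score = params, avg
    return best_params, best_score
```

Selecting the setting with the best *average* held-out score, rather than the best score on any single language, is what guards against tuning to one dev language at the expense of the others.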
Surprise datasets and result submission
The Surprise dataset consists of two languages. For precise instructions on submitting your system and results through the creation of a DOI and a zip file, please follow the instructions from the challenge’s zerospeech2017_surprise github repository.
In principle, each team should make only ONE submission. A small number of submissions per team is tolerated (for instance, to compare a few models; capped at 5 per team), provided you pledge to report and discuss all of your submissions in the journal article.
Be careful when you submit your DOI: each submission is irreversible, as a DOI object cannot be erased. Once submitted, your results will appear on the leaderboard result page together with a time stamp. Your submission is therefore public: everybody will have access to your system and results, including the reviewers of the paper(s) you may submit to conferences or journals, who will be urged to verify that you report and discuss all of your submitted results.