Launch experiments from a single file, either locally or on a cluster running SLURM.
The `experiment_launcher` package provides a way to run multiple experiments using SLURM or Joblib with minimal effort: you just have to set the `local` parameter to `True` for Joblib, and to `False` for SLURM.
You can do a minimal installation of `experiment_launcher` with:

```
pip3 install -e .
```
- Create in your own project two files, `test.py` and `launch_test.py`
- Create an experiment file as in `test.py` (a minimal sketch is shown after this list)
  - This file consists of two base functions, `experiment` and `parse_args`, and the `if __name__ == '__main__'` block
  - The function `experiment` is the core of your experiment
    - It takes as arguments your experiment settings (e.g., the number of layers in a neural network, the learning rate, ...)
    - The arguments need to be assigned a default value in the function definition
    - The arguments `seed` and `results_dir` must always be included
    - By default, `results_dir` is `/path_to_your_sub_experiment`
  - The function `parse_args` includes a CLI `ArgumentParser`
    - In this function you should define the command line arguments
    - These arguments must be the same as the ones defined in the function `experiment`
    - You don't need to define the arguments `seed` and `results_dir` - they are defined in `add_launcher_base_args`
  - In `if __name__ == '__main__'` simply include:
    ```python
    if __name__ == '__main__':
        args = parse_args()
        run_experiment(experiment, args)
    ```
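Below is a minimal sketch of what such a `test.py` could look like. The settings `n_layers` and `lr`, their default values, and the exact imports of `run_experiment` and `add_launcher_base_args` are illustrative assumptions; refer to `test.py` in the `examples` folder for the actual file.

```python
from argparse import ArgumentParser

# Assumed import path; the package provides run_experiment and
# add_launcher_base_args (see the examples folder).
from experiment_launcher import add_launcher_base_args, run_experiment


def experiment(
        n_layers: int = 2,            # illustrative setting with a default value
        lr: float = 1e-3,             # illustrative setting with a default value
        seed: int = 0,                # always required by the launcher
        results_dir: str = './logs',  # always required by the launcher
):
    # Core of the experiment: use the settings and the seed, and write
    # any output to results_dir.
    print(f'n_layers={n_layers}, lr={lr}, seed={seed}')
    print(f'results will be saved to {results_dir}')


def parse_args():
    parser = ArgumentParser()
    # Define the same arguments as in experiment(), except seed and
    # results_dir, which add_launcher_base_args is assumed to add.
    parser.add_argument('--n-layers', type=int, default=2)
    parser.add_argument('--lr', type=float, default=1e-3)
    parser = add_launcher_base_args(parser)
    args = parser.parse_args()
    return vars(args)


if __name__ == '__main__':
    args = parse_args()
    run_experiment(experiment, args)
```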
- Create a launcher file as in `launch_test.py` (a minimal sketch is shown after this list)
  - Specify the running configurations by calling the `Launcher` constructor:
    - `n_exps` is the number of random seeds for each single experiment configuration
    - If `joblib_n_jobs > 0`, then each node will run `joblib_n_jobs` experiments, possibly in parallel. E.g., if `joblib_n_jobs` is `3`, then `3` jobs will run in parallel, even if `n_cores` is `1`. For better performance, one should specify `n_cores >= joblib_n_jobs * 1`
  - Create a single experiment configuration
    - Use `launcher.add_default_params` to add parameters shared across configurations (e.g., the dataset)
    - Use `launcher.add_experiment` to create a particular configuration (e.g., different learning rates)
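Below is a minimal sketch of what such a `launch_test.py` could look like. Apart from `n_exps`, `joblib_n_jobs`, `n_cores`, `add_default_params`, and `add_experiment`, the constructor arguments (e.g., `exp_name`, `exp_file`), their values, and the `launcher.run(LOCAL)` call are assumptions about the API; refer to `launch_test.py` in the `examples` folder for the exact signatures.

```python
from experiment_launcher import Launcher

# True: run locally with Joblib; False: submit the jobs to SLURM
LOCAL = True

launcher = Launcher(
    exp_name='test',     # assumed argument: name of the experiment
    exp_file='test',     # assumed argument: experiment file (test.py) to run
    n_exps=5,            # number of random seeds per configuration
    n_cores=3,           # cores per node
    joblib_n_jobs=3,     # parallel experiments per node (keep n_cores >= joblib_n_jobs)
)

# Parameters shared across all configurations (e.g., the dataset)
launcher.add_default_params(dataset='mnist')

# One configuration per call (e.g., different learning rates)
launcher.add_experiment(lr=1e-2)
launcher.add_experiment(lr=1e-3)

# Launch all configurations with Joblib or SLURM
launcher.run(LOCAL)
```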
- To run the experiment, call:
  ```bash
  cd examples
  python launch_test.py
  ```
- Log files will be placed in:
  - `./logs` if running locally
  - `/work/scratch/USERNAME` if running on SLURM (the default for the Lichtenberg-Hochleistungsrechner of the TU Darmstadt)
- The seeds are created sequentially from `0` to `n_exps`