DRAGON, for DiRected Acyclic Graphs OptimizatioN, is an open-source Python package for the optimization of Deep Neural Network hyperparameters and architectures [1]. DRAGON is not a no-code package, but the Documentation makes it quick to get familiar with.
A flexible search space
- The search space is based on Directed Acyclic Graphs (DAGs), where the nodes can be any PyTorch layer (custom or not) and the edges are the connections between them (see the sketch after this list).
- The code implementing the DAG-based search space was inspired by the zellij package, developed for hyperparameter optimization.
- The DRAGON search space includes cell-based search spaces [4].
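As a rough illustration of the idea only (not DRAGON's actual API), a DAG of layers can be pictured as PyTorch modules for the nodes and parent lists for the edges; all names below are hypothetical:

import torch
import torch.nn as nn

# Hypothetical sketch: nodes are PyTorch layers, edges say which node
# outputs feed which nodes. This illustrates the DAG encoding concept,
# it is not the DRAGON API.
class TinyDAG(nn.Module):
    def __init__(self):
        super().__init__()
        # Nodes: any PyTorch layer, custom or not.
        self.nodes = nn.ModuleDict({
            "a": nn.Linear(8, 16),
            "b": nn.Linear(16, 16),
            "c": nn.Linear(16, 16),
            "out": nn.Linear(32, 1),
        })
        # Edges: node -> parents whose outputs are concatenated.
        self.edges = {"b": ["a"], "c": ["a"], "out": ["b", "c"]}

    def forward(self, x):
        outputs = {"a": torch.relu(self.nodes["a"](x))}
        for name in ("b", "c", "out"):
            inp = torch.cat([outputs[p] for p in self.edges[name]], dim=-1)
            outputs[name] = self.nodes[name](inp)
        return outputs["out"]

print(TinyDAG()(torch.randn(4, 8)).shape)  # torch.Size([4, 1])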
Flexible optimization algorithms
- The search algorithms defined in DRAGON are built on search operators that modify elements of the search space (e.g., mutations, neighborhoods, crossover) and can be combined to develop new search algorithms (a sketch follows this list).
- Efficient algorithms are also implemented in DRAGON, such as Random Search, Evolutionary Algorithm, Mutant-UCB, and HyperBand.
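To make the operator idea concrete, here is a minimal sketch of how a mutation operator can drive a simple (1+1)-evolutionary loop; mutate and evaluate are stand-ins invented for this example, not DRAGON functions:

import random

# Hypothetical sketch: a mutation operator produces a neighboring
# configuration, and the loop keeps it when it improves the loss.
def mutate(config):
    neighbor = dict(config)
    neighbor["learning_rate"] *= random.uniform(0.5, 2.0)
    return neighbor

def evaluate(config):
    # Placeholder objective; in practice this would train a network
    # and return its validation loss.
    return abs(config["learning_rate"] - 1e-2)

best = {"learning_rate": 1e-3}
best_loss = evaluate(best)
for _ in range(50):
    candidate = mutate(best)
    loss = evaluate(candidate)
    if loss < best_loss:  # keep the better configuration
        best, best_loss = candidate, loss
print(best, best_loss)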
Applications to various tasks
- The flexibility of DRAGON makes it usable for various applications.
- For example: image classification, time series forecasting, electricity consumption forecasting, wind power forecasting, or tabular data.
Easy parallelization over multiple GPUs
- The distributed version lets evaluations run in parallel over multiple GPUs; it relies on the mpi4py package (see the Distributed version section below).
Basic concepts
- The Search Space is a mixed-variable search space: numerical, categorical, and graph objects may be jointly optimized. Each object is associated with a variable, which defines what values the object can take.
- Based on this search space, several Search Operators are defined, describing how the objects can be manipulated to find neighboring values (a sketch follows this list).
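As a loose illustration of the variable concept (with invented names, not DRAGON's actual classes), each variable defines what values its object can take and how to sample them:

import random

# Hypothetical sketch of a mixed-variable search space: each variable
# knows its admissible values and how to draw one at random.
class FloatVar:
    def __init__(self, low, high):
        self.low, self.high = low, high
    def random(self):
        return random.uniform(self.low, self.high)

class CatVar:
    def __init__(self, choices):
        self.choices = choices
    def random(self):
        return random.choice(self.choices)

search_space = {
    "learning_rate": FloatVar(1e-5, 1e-1),   # numerical
    "activation": CatVar(["relu", "gelu"]),  # categorical
    # A graph variable would sample DAG architectures in the same spirit.
}
config = {name: var.random() for name, var in search_space.items()}
print(config)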
Basic version
After cloning the git repository, install DRAGON using:
pip install -e dragon
Distributed version
If you plan on using the distributed version, you also need the mpi4py package (which itself requires an MPI implementation):
pip install mpi4py
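A run can then be launched over several processes with a standard MPI launcher, e.g. (main.py is a hypothetical entry script name):
mpiexec -n 4 python main.py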
Documentation
Additional dependencies are required to run the documentation notebooks:
pip install -e dragon[docs]
Contact
- Julie Keisler: julie.keisler.rfo@gmail.com
References
[1] Keisler, J., Talbi, E. G., Claudel, S., & Cabriel, G. (2024). An algorithmic framework for the optimization of deep neural networks architectures and hyperparameters. Journal of Machine Learning Research.