
Preferred Networks Releases Optuna v1.0, Open-source Hyperparameter Optimization Framework for Machine Learning

2020.01.14

January 14, 2020, Tokyo, Japan – Preferred Networks, Inc. (PFN, Head Office: Tokyo, President & CEO: Toru Nishikawa) has released Optuna™ v1.0, the first major version of the open-source hyperparameter optimization framework for machine learning. Projects using the existing beta version can be updated to Optuna v1.0 with minimal changes to the code.

In machine learning and deep learning, it is critical to optimize hyperparameters *1, which control the behavior of an algorithm during the training process, in order to obtain a trained model with better accuracy.
Optuna automates the trial-and-error process of optimizing hyperparameters, finding hyperparameter values that enable the algorithm to perform well. Since its beta release as open-source software (OSS) in December 2018, Optuna has received development support from numerous contributors and added a number of new features based on feedback from the OSS community as well as from within the company.
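
As a rough illustration of this automated trial-and-error loop, the following minimal sketch minimizes a simple quadratic function. The objective, the parameter name "x", and its range are hypothetical and chosen only for illustration; they are not taken from this announcement.

    import optuna

    def objective(trial):
        # Ask Optuna to suggest a value for a single hyperparameter "x"
        # (hypothetical name and range, used only for illustration).
        x = trial.suggest_uniform("x", -10.0, 10.0)
        # Return the value to be minimized; Optuna repeats this
        # trial-and-error loop and focuses on promising values of x.
        return (x - 2) ** 2

    study = optuna.create_study()            # minimization by default
    study.optimize(objective, n_trials=100)  # run 100 trials
    print(study.best_params)                 # best value of x found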

Main features of Optuna v1.0 include:

  •  Efficient hyperparameter tuning with state-of-the-art optimization algorithms
  •  Support for various machine learning libraries including PyTorch, TensorFlow, Keras, FastAI, scikit-learn, LightGBM, and XGBoost
  •  Support for parallel execution across multiple computing machines to significantly reduce the optimization time
  •  Search spaces can be described with Python control statements (see the sketch after this list)
  •  Various visualization techniques that allow users to conduct diverse analyses of the optimization results
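
As an example of defining a search space with Python control statements, the sketch below samples different hyperparameters depending on which classifier is chosen. It uses scikit-learn, one of the supported libraries listed above; the specific classifiers, parameter names, and ranges are assumptions made for illustration.

    import optuna
    from sklearn.datasets import load_iris
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score
    from sklearn.svm import SVC

    def objective(trial):
        X, y = load_iris(return_X_y=True)

        # Ordinary Python control flow defines the search space: which
        # hyperparameters get sampled depends on the classifier chosen here.
        classifier = trial.suggest_categorical("classifier", ["SVC", "RandomForest"])
        if classifier == "SVC":
            c = trial.suggest_loguniform("svc_c", 1e-3, 1e3)
            model = SVC(C=c, gamma="auto")
        else:
            max_depth = trial.suggest_int("rf_max_depth", 2, 32)
            model = RandomForestClassifier(max_depth=max_depth, n_estimators=100)

        # Higher cross-validation accuracy is better, so the study maximizes it.
        return cross_val_score(model, X, y, cv=3).mean()

    study = optuna.create_study(direction="maximize")
    study.optimize(objective, n_trials=50)
    print(study.best_trial.params)

The parallel execution listed above follows the same pattern: multiple worker processes share a single study through a common storage backend such as a relational database, so trials can be distributed across machines.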

Official website of Optuna: https://optuna.org/

Optuna has received many contributions from external developers. PFN will continue to quickly incorporate the results of the latest machine learning research into the development of Optuna and work with the OSS community to promote the use of Optuna.

*1: Hyperparameters include learning rate, batch size, number of training iterations, number of neural network layers, and number of channels.

 

About Optuna™, the hyperparameter optimization framework for machine learning
Optuna was open-sourced by PFN in December 2018 as a hyperparameter optimization framework written in Python. Optuna automates the trial-and-error process of finding hyperparameters that deliver good performance. Optuna is used in many PFN projects and was an important factor in the PFDet team's award-winning performance in the first Kaggle Open Images object detection competition.
https://optuna.org/

Contact

Contact us here.