HippoTrainer: Gradient-Based Hyperparameter Optimization for PyTorch

We release HippoTrainer, a Python library for gradient-based hyperparameter optimization. It implements state-of-the-art algorithms that leverage automatic differentiation to tune hyperparameters efficiently.

March 2025 · 4 min · Daniil Dorin, Igor Ignashin, Nikita Kiselev, Andrey Veprikov
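The core idea behind gradient-based hyperparameter optimization is to differentiate a validation loss through the training updates themselves, yielding a "hypergradient" with respect to the hyperparameters. The post's announcement does not spell out the API, so the sketch below is a minimal, library-agnostic illustration in plain PyTorch: it tunes an L2-regularization strength by backpropagating the validation loss through a single unrolled training step (all variable names and the toy setup are our own assumptions, not HippoTrainer's interface).

```python
import torch

# Toy data: linear regression, with an L2 penalty whose strength we tune.
torch.manual_seed(0)
X_tr, y_tr = torch.randn(32, 5), torch.randn(32)
X_val, y_val = torch.randn(32, 5), torch.randn(32)

w = torch.zeros(5, requires_grad=True)           # model parameters
log_lam = torch.zeros((), requires_grad=True)    # hyperparameter: log L2 strength
inner_lr, hyper_lr = 0.1, 0.01

for step in range(100):
    # One differentiable inner step: w' = w - lr * grad of the training loss.
    train_loss = ((X_tr @ w - y_tr) ** 2).mean() + log_lam.exp() * (w ** 2).sum()
    g = torch.autograd.grad(train_loss, w, create_graph=True)[0]
    w_new = w - inner_lr * g

    # Hypergradient: differentiate the validation loss at w' w.r.t. the
    # hyperparameter, through the inner update above.
    val_loss = ((X_val @ w_new - y_val) ** 2).mean()
    hyper_grad = torch.autograd.grad(val_loss, log_lam)[0]

    with torch.no_grad():
        w.copy_(w_new)                 # commit the inner step
        log_lam -= hyper_lr * hyper_grad  # hyperparameter update
```

Unrolling more inner steps before the hypergradient (or approximating it with implicit differentiation) trades memory and compute for accuracy; that trade-off is what the algorithms in this family differ on.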

Just Relax It! Leveraging relaxation for discrete variables optimization

We release a Python library designed to streamline the optimization of discrete probability distributions in neural networks, offering a suite of relaxation techniques compatible with PyTorch.

December 2024 · 12 min · Daniil Dorin, Igor Ignashin, Nikita Kiselev, Andrey Veprikov
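Relaxation techniques replace a non-differentiable discrete sample with a continuous surrogate so that gradients can flow to the distribution's parameters. The announcement above does not name the library's API, so here is a minimal, hedged sketch of one such technique, the Gumbel-Softmax (straight-through) relaxation, using only built-in PyTorch:

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)
# Unnormalized log-probabilities over 3 categories, for a batch of 4.
logits = torch.randn(4, 3, requires_grad=True)

# Soft relaxed sample: fully differentiable, each row sums to 1.
soft = F.gumbel_softmax(logits, tau=0.5, hard=False)

# Straight-through sample: one-hot on the forward pass,
# soft gradients on the backward pass.
hard = F.gumbel_softmax(logits, tau=0.5, hard=True)

# Gradients reach the logits despite the discrete forward sample.
loss = (hard * torch.arange(3.0)).sum()
loss.backward()
```

Lowering the temperature `tau` makes the soft sample closer to one-hot but increases gradient variance; annealing it over training is the usual compromise.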