Open Source Vizier: Reliable and Flexible Blackbox Optimization.
Documentation | Installation | Code Structure | Citing Vizier
What is Open Source (OSS) Vizier?
OSS Vizier is a Python-based interface for blackbox optimization and research, based on Google Vizier, one of the first hyperparameter tuning services designed to work at scale. Please see our ReadTheDocs documentation for detailed information.
It consists of three main APIs:
- User API: Allows a user to set up a main Vizier Server, which can host blackbox optimization algorithms serving multiple clients simultaneously in a fault-tolerant manner to tune their objective functions.
- Developer API: Defines abstractions and utilities for implementing new optimization algorithms, both for research and for hosting in the service.
- Benchmarking API: A wide collection of objective functions and methods to benchmark and compare algorithms.
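Together, these APIs support the standard loop of blackbox optimization: ask for suggested parameters, evaluate the objective, and report the result back. As a self-contained sketch of that loop — with simple random search standing in for a real Vizier server, and all names hypothetical rather than taken from the Vizier API:

```python
import random

def evaluate(params):
    # Hypothetical user objective: maximize a simple quadratic.
    x = params["x"]
    return -(x - 0.3) ** 2

def suggest():
    # Stand-in for the service's suggestion endpoint (here: random search).
    return {"x": random.uniform(0.0, 1.0)}

best_params, best_value = None, float("-inf")
for _ in range(100):
    params = suggest()        # ask for a trial's parameters
    value = evaluate(params)  # run the user's objective function
    if value > best_value:    # record the reported measurement
        best_params, best_value = params, value
```

A real client would talk to the Vizier Server over the User API instead of calling a local `suggest()`, but the ask/evaluate/report structure is the same.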
Installation
The simplest way is to run:
pip install google-vizier
which will download the code and launch the provided install.sh. This script installs the necessary dependencies and builds the relevant protobuf libraries needed for the service. Verify that all unit tests pass by running run_tests.sh. OSS Vizier requires Python 3.9+.
Code Structure
Frequently used import targets
Below is a brief summary of important symbols and modules.
Service users
`from vizier.service import pyvizier as vz`: Exposes the same set of symbol names as `vizier.pyvizier`. `vizier.service.pyvizier.Foo` is a subclass or an alias of `vizier.pyvizier.Foo`, and can be converted into protobufs.
`from vizier.service import ...`: Includes binaries and internal utilities.
Developer essentials
`from vizier import pyvizier as vz`: Pure Python building blocks of Vizier. Cross-platform code, including Pythia policies, must use this pyvizier instance. `Trial` and `StudyConfig` are the most important classes.
`from vizier.pyvizier import converters`: Converts between pyvizier objects and numpy arrays.
- `TrialToNumpyDict`: Converts parameters (and metrics) into a dict of numpy arrays. This is the preferred conversion method if you intend to train an embedding of categorical/discrete parameters, or if the data includes missing parameters or metrics.
- `TrialToArrayConverter`: Converts parameters (and metrics) into an array.
`from vizier.interfaces import serializable`: `PartiallySerializable`, `Serializable`.
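As a concept sketch of what the converters above do — flattening trial parameters into numeric arrays, with categoricals one-hot encoded — here is a plain-Python stand-in (lists in place of numpy arrays; the helper name is hypothetical, not the converter API):

```python
def trial_to_array(params, categorical_vocab):
    """Flatten a dict of parameters into a numeric list.

    Float parameters pass through; categorical parameters are
    one-hot encoded against their vocabulary.
    """
    out = []
    for name in sorted(params):  # fixed ordering so arrays are comparable
        value = params[name]
        if name in categorical_vocab:
            out.extend(1.0 if value == v else 0.0
                       for v in categorical_vocab[name])
        else:
            out.append(float(value))
    return out

arr = trial_to_array(
    {"lr": 0.1, "optimizer": "adam"},
    categorical_vocab={"optimizer": ["adam", "sgd"]},
)
# arr == [0.1, 1.0, 0.0]
```

The real converters additionally handle missing parameters and metrics, which is why `TrialToNumpyDict` is preferred when data may be incomplete.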
Algorithm abstractions
`from vizier import pythia`:
- `Policy`, `PolicySupporter`: Key abstractions.
- `LocalPolicyRunner`: Use it for running a `Policy` in RAM.
`from vizier import algorithms`:
- `Designer`: Algorithm abstraction.
- `DesignerPolicy`: Wraps a `Designer` into a Pythia `Policy`.
- `GradientFreeMaximizer`: For optimizing acquisition functions.
- `(Partially)SerializableDesigner`: For designers that wish to optimize performance by saving state.
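Conceptually, a designer is a stateful suggest/update interface: it proposes new trials and absorbs completed ones. A minimal sketch of that shape, using random search (class and method names here are illustrative, not the actual `Designer` signature):

```python
import random

class RandomSearchDesigner:
    """Toy designer: suggests random points, remembers completed trials."""

    def __init__(self, low, high, seed=0):
        self._rng = random.Random(seed)
        self._low, self._high = low, high
        self.completed = []

    def suggest(self, count=1):
        # Propose `count` new parameter settings to evaluate.
        return [{"x": self._rng.uniform(self._low, self._high)}
                for _ in range(count)]

    def update(self, params, objective):
        # A real designer would use this feedback to refine suggestions.
        self.completed.append((params, objective))

designer = RandomSearchDesigner(0.0, 1.0)
for params in designer.suggest(count=5):
    designer.update(params, objective=-(params["x"] - 0.5) ** 2)
```

`DesignerPolicy` exists precisely so that a designer written against this kind of simple interface can be hosted as a full Pythia `Policy` in the service.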
Tensorflow modules
`from vizier import tfp`: TensorFlow Probability utilities.
- `acquisitions`: Acquisition functions module.
  - `AcquisitionFunction`: Abstraction.
  - `UpperConfidenceBound`, `ExpectedImprovement`, etc.
- `bijectors`: Bijectors module.
  - `YeoJohnson`: Implements both Yeo-Johnson and Box-Cox transformations.
  - `optimal_power_transformation`: Returns the optimal power transformation.
  - `flip_sign`: Returns a sign-flip bijector.
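For intuition about the acquisition and bijector modules above, here are dependency-free sketches of the textbook UCB/EI formulas and the Yeo-Johnson transformation — standard definitions, not the tfp implementations:

```python
import math

def upper_confidence_bound(mean, stddev, beta=2.0):
    # UCB: posterior mean (exploitation) plus scaled stddev (exploration).
    return mean + beta * stddev

def expected_improvement(mean, stddev, best_so_far):
    # EI for maximization under a Gaussian posterior.
    if stddev == 0.0:
        return max(0.0, mean - best_so_far)
    z = (mean - best_so_far) / stddev
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    return (mean - best_so_far) * cdf + stddev * pdf

def yeo_johnson(x, lam):
    # Yeo-Johnson power transformation: extends Box-Cox (which requires
    # positive inputs) to the whole real line.
    if x >= 0:
        return math.log1p(x) if lam == 0 else ((x + 1.0) ** lam - 1.0) / lam
    if lam == 2:
        return -math.log1p(-x)
    return -(((-x + 1.0) ** (2.0 - lam) - 1.0) / (2.0 - lam))
```

At `lam = 1` the Yeo-Johnson transform is the identity, which is a handy sanity check when tuning the power parameter.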
`from vizier import keras as vzk`:
- `vzk.layers`: Layers, usually wrapping tfp classes.
  - `variable_from_prior`: Utility layer for handling regularized variables.
- `vzk.optim`: Wrappers around optimizers in tfp or keras.
- `vzk.models`: Most of the useful models don't easily fit into Keras' `Model` abstraction, but we may add some for display.
Citing Vizier
If you find this code useful, please consider citing the OSS Vizier paper as well as the original Google Vizier paper. Thanks!
@inproceedings{oss_vizier,
author = {Xingyou Song and
Sagi Perel and
Chansoo Lee and
Greg Kochanski and
Daniel Golovin},
title = {Open Source Vizier: Distributed Infrastructure and API for Reliable and Flexible Blackbox Optimization},
booktitle = {Automated Machine Learning Conference, Systems Track (AutoML-Conf Systems)},
year = {2022},
}
@inproceedings{google_vizier,
author = {Daniel Golovin and
Benjamin Solnik and
Subhodeep Moitra and
Greg Kochanski and
John Karro and
D. Sculley},
title = {Google Vizier: {A} Service for Black-Box Optimization},
booktitle = {Proceedings of the 23rd {ACM} {SIGKDD} International Conference on
Knowledge Discovery and Data Mining, Halifax, NS, Canada, August 13
- 17, 2017},
pages = {1487--1495},
publisher = {{ACM}},
year = {2017},
url = {https://doi.org/10.1145/3097983.3098043},
doi = {10.1145/3097983.3098043},
}
