Documentation (develop) | Documentation (release) | Tutorials | API reference
Trieste (pronounced tree-est) is a Bayesian optimization toolbox built on TensorFlow. Trieste is named after the bathyscaphe Trieste, the first vehicle to take a crew to Challenger Deep in the Mariana Trench, the lowest point on the Earth’s surface: the literal global minimum.
Why Trieste?
- Highly modular and easily customizable design. Extend it with your own models or acquisition functions. Ideal for practitioners who want to use it in their systems and for researchers wishing to implement their latest ideas.
- Seamless integration with TensorFlow. Trieste fully leverages TensorFlow's automatic differentiation (no more hand-writing gradients for your acquisition functions!) and its scalability via support for highly parallel modern hardware (e.g. GPUs).
- General-purpose toolbox. Advanced algorithms covering all corners of Bayesian optimization and active learning - batch, asynchronous, constrained, multi-fidelity, multi-objective - you name it, Trieste has it.
- Versatile model support out of the box. From gold-standard Gaussian processes (GPs; GPflow) to alternatives such as sparse variational GPs, deep GPs (GPflux) and deep ensembles (Keras), which scale much better with the number of function evaluations.
- Real-world oriented. Our Ask-Tell interface allows users to apply Bayesian optimization across a range of non-standard real-world settings where control over the black-box function is only partial (see the sketch just after this list). Built on TensorFlow and with comprehensive testing, Trieste is production-ready.
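As an illustration of that last point, here is a minimal sketch of the Ask-Tell workflow, in which you evaluate the black box yourself. AskTellOptimizer and its ask/tell methods are taken from the Trieste documentation; evaluate_externally is a hypothetical placeholder for your own evaluation pipeline, and search_space, initial_data and model are built as in the quickstart below.

from trieste.ask_tell_optimization import AskTellOptimizer

ask_tell = AskTellOptimizer(search_space, initial_data, model)

for _ in range(5):
    query_points = ask_tell.ask()                 # points Trieste suggests evaluating next
    new_data = evaluate_externally(query_points)  # hypothetical: run your experiment or simulator
    ask_tell.tell(new_data)                       # feed the resulting Dataset back to Trieste

result = ask_tell.to_result()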
Here's a quick overview of the main components of a Bayesian optimization loop. For more details, see our Documentation, where we have multiple Tutorials covering both the basic functionality of the toolbox and more advanced usage.
Let's set up a synthetic black-box objective function that we wish to minimize, for example the popular Branin benchmark function, and generate some initial data:
from trieste.objectives import Branin, mk_observer
observer = mk_observer(Branin.objective)
initial_query_points = Branin.search_space.sample(5)
initial_data = observer(initial_query_points)
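The observer returns a trieste.data.Dataset that stores the query points and the corresponding observations as tensors; a quick sanity check (the shapes shown assume the two-dimensional Branin domain) could be:

print(initial_data.query_points.shape)  # (5, 2): five query points in the 2D Branin search space
print(initial_data.observations.shape)  # (5, 1): one observed objective value per query point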
The first step is to create a probabilistic model of the objective function, for example a Gaussian process model:
from trieste.models.gpflow import build_gpr, GaussianProcessRegression
gpflow_model = build_gpr(initial_data, Branin.search_space)
model = GaussianProcessRegression(gpflow_model)
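The BayesianOptimizer used below fits the model for you during the loop, but the wrapper can also be driven directly; a short sketch using its optimize and predict methods, as we understand them from the Trieste model interface:

model.optimize(initial_data)                          # fit the GP hyperparameters to the initial data
mean, variance = model.predict(initial_query_points)  # posterior mean and variance at those points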
The next ingredient is to choose an acquisition rule and acquisition function:
from trieste.acquisition import EfficientGlobalOptimization, ExpectedImprovement
acquisition_rule = EfficientGlobalOptimization(ExpectedImprovement())
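Other rules and acquisition functions can be swapped in just as easily; for instance, here is a hedged sketch of a batch setup that proposes several points per step (BatchMonteCarloExpectedImprovement and the num_query_points argument are assumed from the Trieste API reference):

from trieste.acquisition import BatchMonteCarloExpectedImprovement, EfficientGlobalOptimization

# propose three points per optimization step, scoring batches with Monte Carlo sampling
batch_rule = EfficientGlobalOptimization(
    BatchMonteCarloExpectedImprovement(sample_size=100),
    num_query_points=3,
)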
Finally, we optimize the acquisition function using our model for a number of steps and check the obtained minimum:
from trieste.bayesian_optimizer import BayesianOptimizer
bo = BayesianOptimizer(observer, Branin.search_space)
num_steps = 15
result = bo.optimize(num_steps, initial_data, model)
query_point, observation, arg_min_idx = result.try_get_optimal_point()
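The result also carries the full history of the run; a short follow-up using try_get_final_dataset (see the Trieste docs) to inspect every evaluated point:

dataset = result.try_get_final_dataset()  # all query points and observations collected during the run
print(dataset.query_points[arg_min_idx])  # location of the best point found
print(dataset.observations[arg_min_idx])  # objective value at that point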
Trieste supports Python 3.9+ and TensorFlow 2.5+, and uses semantic versioning.
To install the latest (stable) release of the toolbox from PyPI, use pip:
$ pip install trieste
or to install from sources, run
$ pip install .
in the repository root.
To install this project in editable mode, run the commands below from the root directory of the trieste repository:
git clone https://github.com/secondmind-labs/trieste.git
cd trieste
pip install -e .
For an installation that can also run the quality checks, as well as other details, see the guidelines for contributors.
Trieste has a documentation site with tutorials on how to use the library, and an API reference. You can also run the tutorials interactively. They can be found in the notebooks directory, and are written as Python scripts for running with Jupytext. To run them, first install trieste from sources as above, then install additional dependencies with
$ pip install -r notebooks/requirements.txt
Finally, run the notebooks with
$ jupyter-notebook notebooks
Alternatively, you can copy and paste the tutorials into fresh notebooks and avoid installing the library from source. To ensure you have the required plotting dependencies, simply run:
$ pip install trieste[plotting]
Like tensorflow-probability, Trieste currently uses Keras 2. When using TensorFlow versions 2.16 onwards (which default to Keras 3), Keras therefore needs to be imported from tf_keras rather than tf.keras. Alternatively, for a shortcut that works with all versions of TensorFlow, you can write:
from gpflow.keras import tf_keras
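For example, here is a minimal sketch of building a small Keras network through this alias (e.g. as a component of a deep ensemble); the layer calls are standard Keras, and the alias resolves to the appropriate Keras 2 module for your TensorFlow version:

from gpflow.keras import tf_keras

# a small fully connected network defined via the compatibility alias
network = tf_keras.Sequential([
    tf_keras.layers.Dense(32, activation="relu"),
    tf_keras.layers.Dense(1),
])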
Bugs, feature requests, pain points, annoying design quirks, etc: Please use GitHub issues to flag up bugs/issues/pain points, suggest new features, and discuss anything else related to the use of Trieste that in some sense involves changing the Trieste code itself. We positively welcome comments or concerns about usability, and suggestions for changes at any level of design. We aim to respond to issues promptly, but if you believe we may have forgotten about an issue, please feel free to add another comment to remind us.
We have a public Secondmind Labs Slack workspace. Please use this invite link and join the #trieste channel, whether you'd just like to ask short informal questions or want to be involved in the discussion and future development of Trieste.
All constructive input is very much welcome. For detailed information, see the guidelines for contributors.
To cite Trieste, please reference our arXiv paper, where we review the framework and describe its design. Sample BibTeX is given below:
@misc{trieste2023,
author = {Picheny, Victor and Berkeley, Joel and Moss, Henry B. and Stojic, Hrvoje and Granta, Uri and Ober, Sebastian W. and Artemev, Artem and Ghani, Khurram and Goodall, Alexander and Paleyes, Andrei and Vakili, Sattar and Pascual-Diaz, Sergio and Markou, Stratis and Qing, Jixiang and Loka, Nasrulloh R. B. S and Couckuyt, Ivo},
title = {Trieste: Efficiently Exploring The Depths of Black-box Functions with TensorFlow},
publisher = {arXiv},
year = {2023},
doi = {10.48550/ARXIV.2302.08436},
url = {https://arxiv.org/abs/2302.08436}
}