
Update docs
madeline-scyphers committed Jul 22, 2024
1 parent 34bbf23 commit 803b6d4
Showing 3 changed files with 13 additions and 32 deletions.
32 changes: 13 additions & 19 deletions README.md
@@ -1,33 +1,27 @@
# Bayesian Optimization for Anything

BOA is a high-level Bayesian optimization framework and model-wrapping toolkit, designed to be highly flexible and easy to use. It is built on the lower-level packages [Ax](https://ax.dev) (Adaptive Experimentation Platform) and [BoTorch](https://botorch.org), which do the heavy lifting of the Bayesian optimization process and subsequent analysis, and it supplements them with model-wrapping tools, language-agnostic features, and a flexible interface framework.

## Key features

- **Language-Agnostic**: Although BOA itself is written in Python, users do not need to write any Python code to use it. The user's model, as well as the model wrapper, can be written in any programming language (see the `Script Wrapper` docs for details). Users can configure and run an optimization, save outputs, and view results entirely without writing Python. This lets users write their code in any language they want, or even reuse processing code they already have, while still having access to two of the most full-featured Bayesian optimization (BoTorch) and Gaussian process (GPyTorch) libraries available today.
- **Scalability and Parallelization**: BOA handles optimization tasks of any size, from small problems to large, complex models. It supports parallel evaluations, allowing multiple optimization trials to run at the same time. This greatly reduces optimization time, especially on powerful computing resources such as supercomputing clusters. In many other BO packages, even when batched trial evaluation is supported, the actual parallelization implementation is left as an exercise for the user.
- **Reducing Boilerplate Code**: BOA aims to reduce the boilerplate code often needed to set up and launch optimizations. It does this by providing an application programming interface (API) to the lower-level BO libraries it is built upon, BoTorch and Ax. This API is responsible for initializing, starting, and controlling the user's optimization, and it can be accessed and controlled almost entirely through a human-readable, text-based YAML configuration file (see the sketch after this list), reducing the need to write boilerplate setup code.
- **Automatic Saving and Resuming**: BOA automatically saves the state of an optimization process, allowing users to pause and resume optimizations easily. This ensures continuous progress and makes it easy to recover and retrieve results even after interruptions or system crashes, which makes the workflow more resilient and user-friendly. Users can also add more trials to a completed optimization or explore incoming results while the optimization is still running.
- **Support for Multi-Objective Optimization**: Streamlined and customizable support for multi-objective optimization.
- **Handling High-Dimensional and Complex Models**: Support for high-dimensional problems and for complex models that require substantial computational resources.
- **Customizability**: BOA allows customization of the optimization process as needed, including adding constraints, adjusting the kernel or acquisition function, or incorporating an early stopping criterion.

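To make the YAML-driven workflow above concrete, here is a minimal configuration sketch. The key names (`objective`, `parameters`, `scheduler`, `script_options`, `run_model`) and values are illustrative assumptions rather than BOA's exact schema; consult the BOA documentation for the real configuration options.

```yaml
# Minimal, hypothetical BOA-style configuration sketch.
# Key names are illustrative assumptions, not BOA's exact schema.
objective:
  metrics:
    - name: rmse             # metric to minimize (built-in or user-defined)
parameters:
  x0:
    type: range
    bounds: [0.0, 1.0]       # continuous search range for parameter x0
  x1:
    type: range
    bounds: [-5.0, 5.0]
scheduler:
  n_trials: 50               # total number of optimization trials
script_options:
  # The model can be any executable in any language; BOA launches it for
  # each trial and reads the resulting metric values back.
  run_model: Rscript run_model.R
```

Because the entire optimization is specified in a file like this, the same workflow applies whether the model is a Python function or an external program, which is how the language-agnostic and low-boilerplate features above fit together.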

## Next Steps

Head over to the [installation guide](https://boa-framework.readthedocs.io/en/latest/user_guide/getting_started.html#installation) to get started with installing BOA.

Or

Head over to the [Bayesian Optimization Guide](https://boa-framework.readthedocs.io/en/stable/user_guide/bo_overview.html) to read about Bayesian optimization and how it works.

12 changes: 0 additions & 12 deletions docs/gallery.rst

This file was deleted.

1 change: 0 additions & 1 deletion docs/index.rst
@@ -26,7 +26,6 @@ Contents
troubleshooting
changelog
Code reference <code_reference>
- gallery
contributing


