Building a benchmark dataset with qNegIntegratedPosteriorVariance: use of ScalarizedPosteriorTransform instead of ScalarizedObjective (DEPRECATED) type objectives #1312
Comments
Mostly curious if there's an easy way to plug this in via the Service API, and if not (which is totally fine), whether you'd recommend using the Developer API or BoTorch. FYI, this isn't a major blocker for me, just something I want to keep in mind for later. For now, I might go with quasi-random generation of candidates.
Thanks for the question @sgbaird. So basically you're asking: is there a way to use a scalarized objective? I would just drop into the dev API for that and use `ax_client.experiment.objective = ScalarizedObjective(...)`. We don't support it from the Service API.
@danielcohenlive thanks! I'll give that a try.
I'm going to close this for now, but feel free to reopen if you have further questions.
…edPosteriorTransform: The example usage at the end was already updated to use `ScalarizedPosteriorTransform`. facebook/Ax#1312
…edPosteriorTransform (#1898) Summary/Motivation: The example usage at the end was already updated to use `ScalarizedPosteriorTransform`. facebook/Ax#1312. Pull Request resolved: #1898. Reviewed By: esantorella. Differential Revision: D46882718.
Aside: The docs for

For ease, here's the link: the BoTorch tutorial on writing a custom acquisition function shows usage of `ScalarizedPosteriorTransform`:

```python
import torch
from botorch.acquisition.objective import ScalarizedPosteriorTransform
from botorch.acquisition.analytic import UpperConfidenceBound

# `gp` is assumed to be an already-fitted GP model with two outputs
pt = ScalarizedPosteriorTransform(weights=torch.tensor([0.1, 0.5]))
SUCB = UpperConfidenceBound(gp, beta=0.1, posterior_transform=pt)
```

I have a notebook that I'd like to get functioning as a demonstration example. I'll give this a go.
I want to build a dataset meant for benchmarking, and I figured that using qNIPV would make sense here (#930). The inputs are red, green, and blue LED powers, and the outputs are eight discrete wavelengths (sparks-baird/self-driving-lab-demo#121). While I could use quasi-random methods for generation, in a more real-world scenario where experiments can take much longer to run, creating an example with something more sophisticated seemed like the way to go.
(Aside: maybe it would make sense to use qNEHVI + SAASBO here?)
I'm noticing that the BoTorch docs say to use `ScalarizedPosteriorTransform` instead.
Here is a self-contained example using vanilla single-objective optimization based on #460 (comment):