LatentGP API: approximate posteriors & wrapping ApproxPosteriorGP #223
To my mind I would be in favour of …
You're right, we can just construct

```julia
function AbstractGPs.posterior(la::LaplaceApproximation, lfx::LatentFiniteGP, ys)
    cache = # ...
    f_post = ApproxPosteriorGP(la, lfx.fx, cache)
    return LatentGP(f_post, lfx.lik, lfx.fx.Σy)
end
```

The one really annoying thing is that …
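For context, a minimal usage sketch of the method proposed above; `LaplaceApproximation` is the hypothetical approximation type from this discussion, and the kernel, likelihood, and data below are just placeholders:

```julia
using AbstractGPs, KernelFunctions, GPLikelihoods

# Hypothetical usage; posterior(::LaplaceApproximation, ...) is the method sketched
# above, not an existing AbstractGPs method.
x = rand(20)
ys = rand(Bool, 20)
lf = LatentGP(GP(SEKernel()), BernoulliLikelihood(), 1e-6)
lfx = lf(x)                                            # LatentFiniteGP at the training inputs
lf_post = posterior(LaplaceApproximation(), lfx, ys)   # LatentGP wrapping an ApproxPosteriorGP
```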
That's a good point -- it is annoying. IIRC this is just a limitation of our abstractions at the minute. I think we can be fairly confident that …
When I say abstraction, I think I should really say it's an implementation detail. In particular, we implement the LatentFiniteGP in terms of a FiniteGP and a likelihood, and the jitter is placed inside the FiniteGP. Probably we should just make the LatentFiniteGP have all four fields that it needs (latent process, inputs, jitter, and likelihood) to make things more explicit and to avoid depending on implementation details of the FiniteGP. I think it probably is fine to write what you've written above though, because I think we can be confident that …
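A rough sketch of what that more explicit version could look like; the struct name and type parameters here are hypothetical, not the current AbstractGPs definition:

```julia
# Hypothetical: a LatentFiniteGP that stores its four ingredients directly,
# instead of hiding the jitter inside a wrapped FiniteGP.
struct ExplicitLatentFiniteGP{Tf<:AbstractGPs.AbstractGP,Tx<:AbstractVector,TΣy,Tlik}
    f::Tf      # latent process
    x::Tx      # inputs
    Σy::TΣy    # jitter / noise added to the latent process at x
    lik::Tlik  # likelihood mapping latent function values to observations
end
```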
How about adding a constructor along the pseudocode lines of

```julia
function update_latent_posterior(latent::Union{LatentGP, LatentFiniteGP}, new_f_posterior::ApproxPosteriorGP)
    # Σy lives on the LatentGP itself, or on the wrapped FiniteGP for a LatentFiniteGP
    Σy = latent isa LatentFiniteGP ? latent.fx.Σy : latent.Σy
    return LatentGP(new_f_posterior, latent.lik, Σy)
end
```

to hide the implementation detail for now?
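If that helper existed, the `posterior` method sketched earlier could avoid touching `lfx.fx.Σy` directly. A hypothetical rewrite:

```julia
# Hypothetical rewrite of the earlier posterior sketch using the proposed helper,
# so the call site no longer depends on LatentFiniteGP internals.
function AbstractGPs.posterior(la::LaplaceApproximation, lfx::LatentFiniteGP, ys)
    cache = # ...
    f_post = ApproxPosteriorGP(la, lfx.fx, cache)
    return update_latent_posterior(lfx, f_post)
end
```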
Oh, the other part that's missing is that the only API implemented for …
That would require duplicating a whole bunch of methods though, instead of simply being able to call …
Sounds good to me.
I think this makes sense. IIRC there are three that we might want: the latent, the latent passed through the inverse link function, and the marginal distribution in the observed space. I'm not sure whether we want to implement them on …
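For concreteness, a hypothetical sketch of those three quantities for a fitted latent posterior `lf_post::LatentGP` and test inputs `x_test`, assuming a GPLikelihoods-style likelihood with an `invlink` field; the observation-space marginal generally needs quadrature or sampling:

```julia
using AbstractGPs, Statistics  # lf_post and x_test are assumed to come from an earlier fit

fx_test = lf_post.f(x_test, 1e-6)              # FiniteGP for the latent process at x_test

# 1. latent marginals
latent_marginals = marginals(fx_test)

# 2. latent mean pushed through the inverse link (a plug-in approximation)
link_means = lf_post.lik.invlink.(mean(fx_test))

# 3. observation-space marginals: integrate the likelihood over each latent marginal,
#    e.g. via Gauss-Hermite quadrature or Monte Carlo over samples from latent_marginals.
```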
Good point. The current implementation is probably fine.
And/or should we have a separate LatentApproxPosteriorGP?...