
Use log-space representation for positive arguments #20

Closed
theogf opened this issue Jan 7, 2020 · 7 comments

theogf (Member) commented Jan 7, 2020

Should the positive parameters (for example in ScaleTransform and ARDTransform) already be defined in log-space to avoid problems? This is probably safer, but it might confuse users.
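For illustration, a minimal sketch of what such a log-space parameterization could look like. This is not the actual KernelFunctions.jl API; `LogScaleTransform`, `from_scale`, and `apply` are hypothetical names invented for this example.

```julia
# Hypothetical sketch (not the actual KernelFunctions.jl API): store the
# scale parameter in log-space so that any real-valued update keeps the
# effective parameter strictly positive.
struct LogScaleTransform{T<:Real}
    log_s::T   # unconstrained real; the effective scale is exp(log_s) > 0
end

# Build the transform from a positive scale by storing its logarithm.
function from_scale(s::Real)
    s > 0 || throw(DomainError(s, "scale must be positive"))
    return LogScaleTransform(log(s))
end

# Applying the transform always multiplies by a positive scale.
apply(t::LogScaleTransform, x) = exp(t.log_s) .* x

t = from_scale(2.0)
apply(t, 3.0)   # ≈ 6.0
```

The upside is that an optimizer can update `log_s` freely over the whole real line; the downside, as noted above, is that users inspecting the struct see the log of the value they set.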

trthatcher commented Jan 7, 2020

theogf (Member, Author) commented Jan 7, 2020

And you also did something similar in MLKernels with HyperParameter, which was very nice; I am just wondering if this would be too heavy...

trthatcher commented:

Honestly, I'm not a good person to get feedback from on this issue. My use of the kernel methods didn't require any kind of automatic differentiation, so I only needed the kernels to be a container for some scalar parameters. All I really cared about was having something simple so I could do this:

ExponentialKernel().α
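The "kernel as a plain parameter container" pattern described above can be sketched in a few lines. The struct layout and the default value here are illustrative assumptions, not the actual MLKernels.jl or KernelFunctions.jl definitions.

```julia
# Hypothetical sketch of a kernel type that is nothing more than a
# container for a scalar parameter (names and default are illustrative).
struct ExponentialKernel{T<:Real}
    α::T   # kernel parameter, stored directly as a plain scalar field
end

# A default parameter value, purely for illustration.
ExponentialKernel() = ExponentialKernel(1.0)

ExponentialKernel().α   # direct field access, as in the comment above
```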

I did play around with the HyperParameter type since there was interest in autodiff; however, I felt I was charging in blind, because I didn't have a good sense of how it would be used, never having worked with GPs. The HyperParameter type was based on scikit-learn's approach to kernels:

https://github.com/scikit-learn/scikit-learn/blob/b194674c42d54b26137a456c510c5fdba1ba23e0/sklearn/gaussian_process/kernels.py#L48-L374

However, I backtracked when this issue was opened:

trthatcher/MLKernels.jl#65

In the case of KernelFunctions, you have a transformation type instead of raw parameters. Given that the complexity is already a little higher, it might not be too bad to store the data in log-space. However, I'd get feedback from the others: @willtebbutt @devmotion @IsakFalk

willtebbutt (Member) commented:

My feeling on this is that it doesn't belong in this package, it belongs in downstream code e.g. GaussianProcesses.jl / AugmentedGPs.jl / Stheno.jl etc. The transformation isn't unique, so to my mind baking in log feels kind of arbitrary.

IsakFalk (Contributor) commented:

I agree with @willtebbutt, it's probably better for downstream packages to work out how they want to represent parameters rather than to enforce it here.

theogf (Member, Author) commented Jan 15, 2020

> The transformation isn't unique, so to my mind baking in log feels kind of arbitrary.

True, but one could say that using no transformation is an arbitrary choice as well 😄

> it belongs in downstream code

Also agreed, and I already did it for my case, but my concern was: if a user's update makes the values negative, there is no direct check on this at evaluation time. If we leave it like this, we should at least warn about it, for example in the documentation.

theogf (Member, Author) commented Jun 24, 2020

Problem deferred to #127

@theogf theogf closed this as completed Jun 24, 2020