How can I add custom gradients? #77

If I want to give a custom or known gradient for a function, how can I do that in this library? (I don't want to autodifferentiate through this function.) I am using the grad function.

If the library doesn't provide this feature, is there some way I can easily implement this functionality myself, perhaps by changing the definitions of leaf nodes or by editing the dual numbers that presumably carry the numerical gradients?

Here's a concrete example of what I mean: say I have some function I want to take the gradient of, say f(x, y) = x^2 + 3 * g(x, y)^2. Then say that g(x, y) is a function whose definition is complicated and involves lots of Haskell code, but whose gradient I've already calculated analytically and is quite simple. Thus, when I take grad f and evaluate it at a point (x, y), I'd like to just plug in my custom gradient for g instead of autodiffing through it: something like my_nice_grad_of_g (x, y).

I see other autodiff libraries do provide this feature; for example, Stan and TensorFlow both allow users to define gradients of a function.

Thanks!
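For concreteness, the setup in the question can be written against ad's actual grad function like this. The body of g below is a trivial stand-in (not from the question, which describes g as complicated); everything else is the standard Numeric.AD API:

import Numeric.AD (grad)

-- f(x, y) = x^2 + 3 * g(x, y)^2, written over a two-element list of inputs
f :: Num a => [a] -> a
f [x, y] = x ^ 2 + 3 * g [x, y] ^ 2

-- Stand-in for the "complicated" g. grad currently differentiates
-- through this definition, which is what the question wants to avoid.
g :: Num a => [a] -> a
g [x, y] = x * y

Here grad f [2, 3] evaluates to [112, 72], with g's contribution computed by autodiff; the feature request is to substitute my_nice_grad_of_g at that point instead.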
Comments

#15 may help; the code in that version was https://github.com/ekmett/ad/blob/master/src/Numeric/AD/Jacobian.hs

Point being, I believe this answers your question, though please chime in if you want more examples or links, or if you hit a new problem. But closing for now :)
Thanks! I have a nice minimal example working like this:

λ> import Numeric.AD
λ> import Numeric.AD.Jacobian
λ> import Numeric.AD.Internal.Forward
λ> (lift1 (\x -> x^2) (\x -> 2 * x)) (Forward 3 1)
Forward 9 6

However, the Jacobian module only seems to provide primitives for defining derivatives for scalar or two-argument functions. I'd like to define a custom gradient for a function f : R^n -> R, for example something like

liftN (\[x, y, z] -> x * y * z) (\[x, y, z] -> [y * z, x * z, x * y])

Any thoughts on how I could do this? It doesn't seem efficient to try to hack this functionality together in terms of lift1.
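One possible workaround for forward mode, sketched against the Forward representation used in the example above (Lift and Zero are that type's other constructors). The name liftN and the helper itself are hypothetical, not part of the library, and this covers forward mode only — it says nothing about reverse mode or higher derivatives, which is the gap discussed below:

import Numeric.AD.Internal.Forward (Forward (..))

-- Hypothetical n-ary analogue of lift1, forward mode only: the output
-- tangent is the dot product of the analytic gradient with the input
-- tangents, so grad-style use would need n passes with unit tangents.
liftN :: Num a => ([a] -> a) -> ([a] -> [a]) -> [Forward a] -> Forward a
liftN f df xs = Forward (f ps) (sum (zipWith (*) (df ps) ds))
  where
    ps = map val xs
    ds = map tang xs
    val (Forward a _)  = a
    val (Lift a)       = a
    val Zero           = 0
    tang (Forward _ d) = d
    tang _             = 0

For example, liftN (\[x, y, z] -> x * y * z) (\[x, y, z] -> [y * z, x * z, x * y]) [Forward 2 1, Lift 3, Lift 4] gives Forward 24 12, the directional derivative in the x direction at (2, 3, 4).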
I'll try to mull it over myself. I think Edward is likely to have better thoughts if he has the time to muse on them.
@hypotext it is a gap, and it's currently not quite possible to do gradients etc. with "custom rules", which is a problem I think we'd both agree on.
Yes, I would really appreciate it if that were a feature!
OK, let's start with several sub-questions:

1. What should/would this API look like?
2. How should/would this interact with calculating higher derivatives etc.?

What types would you expect/hope the interfaces to have?
I'd like to be able to specify that when the autodiff hits a function with a particular name, that function is a "leaf" of the autodiff tree, and to use a custom function as its derivative. I can't remember other details now, as this issue is from a while ago, but I would trust your judgement as library designers!
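One possible shape for such an interface — hypothetical, not part of ad — is to pair a primal definition with a polymorphic analytic gradient. Keeping the gradient polymorphic in Num is what would let the library differentiate it again for higher derivatives (question 2 above). A minimal sketch, with invented names:

{-# LANGUAGE RankNTypes #-}

-- Hypothetical registration record for a custom-gradient "leaf".
-- Nothing here exists in ad; it only illustrates the types involved.
data CustomFn = CustomFn
  { primalFn :: forall a. Num a => [a] -> a   -- the function itself
  , gradFn   :: forall a. Num a => [a] -> [a] -- its hand-derived gradient
  }

-- Example: a complicated g registered with its analytic gradient.
gCustom :: CustomFn
gCustom = CustomFn
  { primalFn = \[x, y] -> x * y   -- stand-in for the real g
  , gradFn   = \[x, y] -> [y, x]
  }

Because both fields are ordinary polymorphic Haskell functions, each mode (forward, reverse, towers) could in principle instantiate them at its own number type.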
OK :)
I'm honestly thinking about experimenting with writing a GHC Core plugin to support better optimization of autodiff computations at the GHC Core level sometime this Jan/Feb.
Hi! May I ask about the status of this issue? I am in a similar situation to the author. Thank you!