make 3-arg dot rrule partially lazy #796

Open · wants to merge 1 commit into main

Conversation

mohamed82008 (Member)

This addresses #788. I had to remove the projection to make it work; otherwise I get the following error due to a missing projection method. Projecting the lazy array to a dense array when A is dense would partially defeat the purpose of this PR, so I am leaving it up to the review process to decide what to do here. I can define a projection method if that's preferred.

julia> show(err)
1-element ExceptionStack:
MethodError: no method matching (::ChainRulesCore.ProjectTo{AbstractArray, @NamedTuple{element::ChainRulesCore.ProjectTo{Float64, @NamedTuple{}}, axes::Tuple{Base.OneTo{Int64}, Base.OneTo{Int64}}}})(::Base.Broadcast.Broadcasted{Base.Broadcast.DefaultArrayStyle{2}, Tuple{Base.OneTo{Int64}, Base.OneTo{Int64}}, typeof(*), Tuple{Base.Broadcast.Broadcasted{Base.Broadcast.DefaultArrayStyle{1}, Nothing, typeof(*), Tuple{Float64, Vector{Float64}}}, Adjoint{Float64, Vector{Float64}}}})

Closest candidates are:
  (::ChainRulesCore.ProjectTo{T})(::ChainRulesCore.NotImplemented) where T
   @ ChainRulesCore ~/.julia/packages/ChainRulesCore/zgT0R/src/projection.jl:121
  (::ChainRulesCore.ProjectTo)(::ChainRulesCore.Thunk)
   @ ChainRulesCore ~/.julia/packages/ChainRulesCore/zgT0R/src/projection.jl:124
  (::ChainRulesCore.ProjectTo{AbstractArray})(::Number)
   @ ChainRulesCore ~/.julia/packages/ChainRulesCore/zgT0R/src/projection.jl:253
  ...

Stacktrace:
 [1] (::ChainRules.var"#1966#1970"{Adjoint{Float64, Vector{Float64}}, Float64, Vector{Float64}, ChainRulesCore.ProjectTo{AbstractArray, @NamedTuple{element::ChainRulesCore.ProjectTo{Float64, @NamedTuple{}}, axes::Tuple{Base.OneTo{Int64}, Base.OneTo{Int64}}}}})()
   @ ChainRules ~/.julia/dev/ChainRules/src/rulesets/LinearAlgebra/dense.jl:39
 [2] unthunk
   @ ~/.julia/packages/ChainRulesCore/zgT0R/src/tangent_types/thunks.jl:204 [inlined]
 [3] wrap_chainrules_output
   @ ~/.julia/packages/Zygote/nsBv0/src/compiler/chainrules.jl:110 [inlined]
 [4] map
   @ ./tuple.jl:293 [inlined]
 [5] map
   @ ./tuple.jl:294 [inlined]
 [6] wrap_chainrules_output
   @ ~/.julia/packages/Zygote/nsBv0/src/compiler/chainrules.jl:111 [inlined]
 [7] ZBack
   @ ~/.julia/packages/Zygote/nsBv0/src/compiler/chainrules.jl:211 [inlined]
 [8] (::Zygote.var"#75#76"{Zygote.ZBack{ChainRules.var"#dot_pullback#1968"{Vector{Float64}, Matrix{Float64}, Vector{Float64}, Vector{Float64}, ChainRulesCore.ProjectTo{AbstractArray, @NamedTuple{element::ChainRulesCore.ProjectTo{Float64, @NamedTuple{}}, axes::Tuple{Base.OneTo{Int64}}}}, ChainRulesCore.ProjectTo{AbstractArray, @NamedTuple{element::ChainRulesCore.ProjectTo{Float64, @NamedTuple{}}, axes::Tuple{Base.OneTo{Int64}, Base.OneTo{Int64}}}}, ChainRulesCore.ProjectTo{AbstractArray, @NamedTuple{element::ChainRulesCore.ProjectTo{Float64, @NamedTuple{}}, axes::Tuple{Base.OneTo{Int64}}}}}}})(Δ::Float64)
   @ Zygote ~/.julia/packages/Zygote/nsBv0/src/compiler/interface.jl:91
 [9] top-level scope
   @ REPL[6]:1

This is related to the discussion in FluxML/Zygote.jl#1507.
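
For reference, a minimal sketch of the idea (not necessarily this PR's exact diff): it restricts to real eltypes for clarity and deliberately omits the projection, which is exactly the trade-off discussed above.

using LinearAlgebra
using ChainRulesCore

function ChainRulesCore.rrule(::typeof(dot), x::AbstractVector{<:Real},
                              A::AbstractMatrix{<:Real}, y::AbstractVector{<:Real})
    Ay = A * y
    function dot_pullback(ΔΩ)
        # dx and dy stay dense; they are only as big as x and y anyway.
        dx = @thunk ΔΩ .* Ay
        dy = @thunk ΔΩ .* (A' * x)
        # dA is the costly cotangent: materializing ΔΩ .* x .* y' builds a
        # dense matrix the size of A. Returning the Broadcasted keeps it lazy.
        dA = @thunk Broadcast.broadcasted(*, Broadcast.broadcasted(*, ΔΩ, x), y')
        return (NoTangent(), dx, dA, dy)
    end
    return dot(x, Ay), dot_pullback
end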

mohamed82008 (Member, Author)

This is also technically a breaking change if we don't project.

oxinabox requested a review from mcabbott on May 31, 2024 at 14:59
oxinabox (Member)

We may need to teach LazyArrays about projections, so that they project lazily?

mohamed82008 (Member, Author)

I like the idea of lazy projections. Maybe that would also provide a way to opt out of the projection at the gradient level by calling a get_projection_parent function to undo the last projection.
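
A rough sketch of what that might look like (every name here is hypothetical; no such API exists yet):

# A lazy projection that remembers what it wrapped, so a later caller
# can opt out and recover the raw tangent.
struct LazyProjected{P,X}
    project::P   # the ProjectTo to apply on materialization
    parent::X    # the raw, unprojected tangent
end

# Undo the last projection at the gradient level.
get_projection_parent(t::LazyProjected) = t.parent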

albertomercurio

Hello, are there any updates on this? I really need to compute the gradient of dot(x, A, y), with x and y dense vectors and A a sparse matrix. The current rule converts the matrix into a dense one, which is very inefficient for large matrices.

mohamed82008 (Member, Author)

I suppose you could just define your own mydot function for now and define a rule for it following this PR (example below).
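
For example (mydot is just an illustrative name; its rule body could be the lazy pullback sketched earlier in the thread):

using LinearAlgebra

# A function you own, so attaching a custom rrule to it is not
# type piracy against the existing rule for Base's dot.
mydot(x, A, y) = dot(x, A, y)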

mohamed82008 (Member, Author)

Maybe even create a package, LazyLinearAlgebra, that defines most linear-algebra operations in a lazy fashion, along with their lazy rules.

albertomercurio

What makes it so difficult to apply the rrule directly to the standard dot function? Sorry for the simple question, but I'm not so familiar with automatic differentiation. I just need this to compute expectation values of an operator.

oxinabox (Member)

The problem is that it is generally required to project the tangent back down onto the tangent space once it has been computed. Explaining why is a little involved, but it is probably in the docs. It shows up with things like: the tangent to a real number that has been multiplied by a complex number must be real, and structured matrices (like diagonal matrices) need tangents that follow the same structure. See the illustration below.
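
A quick REPL illustration of projection, using standard ChainRulesCore behavior:

julia> using ChainRulesCore, LinearAlgebra

julia> ProjectTo(1.0)(2.0 + 3.0im)   # tangent for a real primal is projected to real
2.0

julia> ProjectTo(Diagonal([1.0, 2.0]))([1.0 2.0; 3.0 4.0])   # structure is enforced
2×2 Diagonal{Float64, Vector{Float64}}:
 1.0   ⋅
  ⋅   4.0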

Anyway, that projection also needs to be done lazily if we want to return a lazy array from the 3-arg dot rrule.

That can be done, but it requires overloading ProjectTo for the lazy array type.
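
One possible shape for that overload, as a sketch rather than a vetted design: the tangent in the error above is actually a Base Broadcasted rather than a LazyArrays type, so projecting it lazily could look like this. Axis checks and structured projections (Diagonal etc.) are ignored, and it assumes the projector carries an element-wise part, as in the MethodError above.

using ChainRulesCore
using Base.Broadcast: Broadcasted, broadcasted

# Fuse the element-wise projection into the broadcast instead of
# materializing the Broadcasted tangent first; the result is still lazy.
function (project::ChainRulesCore.ProjectTo{AbstractArray})(bc::Broadcasted)
    return broadcasted(project.element, bc)
end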
