
primal_dual_hybrid_gradient solver: why compute the derivative? #1619

Open
Zeqiang-Lai opened this issue May 17, 2022 · 1 comment

Comments

@Zeqiang-Lai

I can't quite match the math to the implementation of PDHG.

It seems that we do not need to compute the derivative when updating x:

[image: PDHG update math]

`L.derivative(x).adjoint(y, out=primal_tmp)`
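For context (my summary, not part of the original post), in one common convention PDHG solves min_x f(x) + g(Lx), and the primal step with step size $\tau$ for a linear $L$ reads

```latex
x_{k+1} = \operatorname{prox}_{\tau f}\!\bigl(x_k - \tau\, L^{*} \bar{y}_{k+1}\bigr),
```

while the generalization to a non-linear $L$, which the line above implements, replaces $L^{*}$ by the adjoint of the derivative at the current iterate:

```latex
x_{k+1} = \operatorname{prox}_{\tau f}\!\bigl(x_k - \tau\, [\partial L(x_k)]^{*} \bar{y}_{k+1}\bigr).
```

For linear $L$ one has $\partial L(x_k) = L$, so the two steps coincide.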

@sbanert
Contributor

sbanert commented Jun 27, 2022

The derivative of a linear operator is the operator itself, so this surely doesn't harm anything. On the other hand, the code in this form also works for a non-linear operator, e.g., $Lx = A x + a$ with $A$ being linear, where "works" means that the iteration has the correct fixed points.
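A minimal NumPy sketch of this point (hypothetical stand-in classes, not ODL's actual API): for a linear operator the derivative is the operator itself, and for an affine operator $Lx = Ax + a$ the derivative is the linear part $A$, so `derivative(x).adjoint(y)` gives the same result in both cases.

```python
import numpy as np

class LinearOperator:
    """Minimal stand-in for a linear operator L(x) = A @ x (not ODL's API)."""
    def __init__(self, A):
        self.A = np.asarray(A, dtype=float)

    def __call__(self, x):
        return self.A @ x

    def derivative(self, x):
        # The (Frechet) derivative of a linear operator is the operator
        # itself, independent of the point x.
        return self

    def adjoint(self, y):
        # For a real matrix operator, the adjoint is the transpose.
        return self.A.T @ y


class AffineOperator:
    """Affine operator L(x) = A @ x + a; non-linear in the operator sense."""
    def __init__(self, A, a):
        self.A = np.asarray(A, dtype=float)
        self.a = np.asarray(a, dtype=float)

    def __call__(self, x):
        return self.A @ x + self.a

    def derivative(self, x):
        # The constant shift a drops out: the derivative is the linear part A.
        return LinearOperator(self.A)


A = np.array([[1.0, 2.0], [3.0, 4.0]])
a = np.array([0.5, -1.0])
x = np.array([1.0, 1.0])
y = np.array([2.0, 0.0])

L_lin = LinearOperator(A)
L_aff = AffineOperator(A, a)

# Linear case: derivative(x).adjoint(y) is just A^T y.
print(L_lin.derivative(x).adjoint(y))  # [2. 4.]
# Affine case: same adjoint action, since a does not enter the derivative.
print(L_aff.derivative(x).adjoint(y))  # [2. 4.]
```

This is why writing the update through `derivative(x)` costs nothing in the linear case but keeps the iteration's fixed points correct when $L$ is merely affine.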
