Gradient Descent Optimization algorithms for .NET Core
- Adagrad
- Adam
- Adadelta
- RMSprop
PM> Install-Package Gdo.koryakinp
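Alternatively, assuming the package is published on nuget.org under the same id, it can be added from the command line with the standard .NET CLI command:

dotnet add package Gdo.koryakinp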
Compute a derivative dx and provide it to the Update() method of the optimizer:
var opt1 = new Adagrad(0.1);
opt1.SetValue(10);
opt1.Update(dx);
var res1 = opt1.Value;
The Adagrad optimizer will use a learning rate of 0.1 to update the value.
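For a fuller picture, the sketch below minimizes f(x) = x^2 with the same API by recomputing the derivative dx = 2x on every step. Only Adagrad, SetValue(), Update() and Value come from the package; the loss function, starting point and iteration count are illustrative assumptions (usings omitted, as in the snippets above):

var opt = new Adagrad(0.1);
opt.SetValue(10);               // start the search at x = 10
for (int i = 0; i < 100; i++)
{
    var dx = 2 * opt.Value;     // derivative of f(x) = x^2 at the current value
    opt.Update(dx);             // one gradient descent step
}
var res = opt.Value;            // has moved from 10 toward the minimum at x = 0

The same loop works with any of the other optimizers below, since they all expose the same SetValue(), Update() and Value members.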
Similarly, you can use the other optimizers (the textbook update rules behind all four are sketched after these examples):
var opt2 = new Adam(0.01, 100, 1000);
opt2.SetValue(10);
opt2.Update(dx);
var res2 = opt2.Value;
var opt3 = new RMSprop(0.1, 25);
opt3.SetValue(10);
opt3.Update(dx);
var res3 = opt3.Value;
var opt4 = new Adadelta(0.1, 25);
opt4.SetValue(10);
opt4.Update(dx);
var res4 = opt4.Value;
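For reference, the textbook update rules these optimizers are named after are given below, where g_t is the derivative passed to Update(), x_t is the current value, \eta is the learning rate, \rho and \beta_1, \beta_2 are decay rates, and \epsilon is a small constant for numerical stability. How the constructor arguments above map onto these hyperparameters is package-specific, so treat the formulas as background rather than as a specification of this implementation.

Adagrad (learning rate \eta):
G_t = G_{t-1} + g_t^2
x_{t+1} = x_t - \frac{\eta}{\sqrt{G_t} + \epsilon}\, g_t

RMSprop (learning rate \eta, decay \rho):
E[g^2]_t = \rho\, E[g^2]_{t-1} + (1-\rho)\, g_t^2
x_{t+1} = x_t - \frac{\eta}{\sqrt{E[g^2]_t} + \epsilon}\, g_t

Adadelta (decay \rho):
E[g^2]_t = \rho\, E[g^2]_{t-1} + (1-\rho)\, g_t^2
\Delta x_t = -\frac{\sqrt{E[\Delta x^2]_{t-1}} + \epsilon}{\sqrt{E[g^2]_t} + \epsilon}\, g_t
E[\Delta x^2]_t = \rho\, E[\Delta x^2]_{t-1} + (1-\rho)\, \Delta x_t^2
x_{t+1} = x_t + \Delta x_t

Adam (learning rate \eta, decay rates \beta_1, \beta_2):
m_t = \beta_1 m_{t-1} + (1-\beta_1)\, g_t
v_t = \beta_2 v_{t-1} + (1-\beta_2)\, g_t^2
\hat{m}_t = m_t / (1-\beta_1^t), \quad \hat{v}_t = v_t / (1-\beta_2^t)
x_{t+1} = x_t - \frac{\eta\, \hat{m}_t}{\sqrt{\hat{v}_t} + \epsilon}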
Pavel Koryakin [email protected]
This project is licensed under the MIT License - see the LICENSE.md file for details.