Learning Parameters
Learning parameters are stored and handled in the base_optimiser_type derived type. For a brief overview of what is available in that derived type, refer to its wiki page.
The following optimisation methods are available (see the sketch after this list):
- None
- SGD (momentum-based)
- RMSprop
- Adam
- AdaGrad
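As a concrete illustration of two of these update rules, the sketch below implements SGD with momentum and Adam in plain Fortran. It uses the standard textbook forms of the updates; the variable names, hyperparameter values, and formulas are assumptions for illustration, not the library's actual base_optimiser_type implementation.

```fortran
! Illustrative sketch of SGD with momentum and Adam update rules.
! Hypothetical names and values; not the library's API.
program optimiser_sketch
  implicit none
  integer, parameter :: n = 3
  real :: param(n), grad(n), velocity(n)
  real :: m(n), v(n), m_hat(n), v_hat(n)
  real, parameter :: lr = 0.01, momentum = 0.9
  real, parameter :: beta1 = 0.9, beta2 = 0.999, eps = 1.0e-8
  integer :: t

  param = [1.0, 2.0, 3.0]
  grad  = [0.1, -0.2, 0.3]

  ! SGD with momentum: accumulate a velocity and step along it
  velocity = 0.0
  velocity = momentum * velocity - lr * grad
  param = param + velocity

  ! Adam: bias-corrected first and second moment estimates
  m = 0.0; v = 0.0; t = 1
  m = beta1 * m + (1.0 - beta1) * grad
  v = beta2 * v + (1.0 - beta2) * grad**2
  m_hat = m / (1.0 - beta1**t)
  v_hat = v / (1.0 - beta2**t)
  param = param - lr * m_hat / (sqrt(v_hat) + eps)

  print *, 'updated parameters:', param
end program optimiser_sketch
```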
The following regularisation methods are available in the library (see the sketch after this list):
- None
- L1
- L2
- L1L2
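The sketch below shows how these penalties are typically folded into a weight gradient during training. The lambda values and the factor-of-two convention on the L2 term are illustrative assumptions and may differ from the library's definitions.

```fortran
! Illustrative sketch of L1, L2, and combined L1L2 penalty gradients.
! Hypothetical values and conventions; not the library's API.
program regularisation_sketch
  implicit none
  integer, parameter :: n = 3
  real :: w(n), grad(n)
  real, parameter :: l1 = 1.0e-4, l2 = 1.0e-3

  w    = [0.5, -1.5, 2.0]
  grad = [0.1,  0.2, -0.3]

  ! L1 adds l1 * sign(w) to the gradient:
  !   grad = grad + l1 * sign(1.0, w)
  ! L2 adds 2 * l2 * w (weight decay; some conventions drop the 2):
  !   grad = grad + 2.0 * l2 * w
  ! L1L2 (elastic net) combines both penalties:
  grad = grad + l1 * sign(1.0, w) + 2.0 * l2 * w

  print *, 'regularised gradient:', grad
end program regularisation_sketch
```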
The following learning rate decay methods are available in the library (see the sketch after this list):
- None
- Exponential
- Step
- Inverse
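The sketch below evaluates common textbook forms of these three schedules over a few epochs. The decay rate, step size, and exact formulas are assumptions for illustration; the library's definitions may differ.

```fortran
! Illustrative sketch of exponential, step, and inverse learning rate
! decay schedules. Hypothetical constants; not the library's API.
program decay_sketch
  implicit none
  real, parameter :: lr0 = 0.1        ! initial learning rate
  real, parameter :: decay = 0.05     ! decay rate (assumed value)
  integer, parameter :: step_size = 10
  integer :: epoch
  real :: lr_exp, lr_step, lr_inv

  do epoch = 0, 30, 10
     ! Exponential: lr = lr0 * exp(-decay * epoch)
     lr_exp = lr0 * exp(-decay * real(epoch))
     ! Step: halve the rate every step_size epochs
     lr_step = lr0 * 0.5**(epoch / step_size)
     ! Inverse (time-based): lr = lr0 / (1 + decay * epoch)
     lr_inv = lr0 / (1.0 + decay * real(epoch))
     print '(a,i3,3(a,f8.5))', 'epoch ', epoch, &
          '  exp: ', lr_exp, '  step: ', lr_step, '  inv: ', lr_inv
  end do
end program decay_sketch
```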