Submodule | Maintainers | Contact Info |
---|---|---|
gelu | @AakashKumarNain @WindQAQ | [email protected] [email protected] |
hardshrink | @WindQAQ | [email protected] |
lisht | @WindQAQ | [email protected] |
mish | @digantamisra98 @WindQAQ | [email protected], [email protected] |
softshrink | @WindQAQ | [email protected] |
sparsemax | @AndreasMadsen | [email protected] |
tanhshrink | @fsx950223 | [email protected] |
rrelu | @fsx950223 | [email protected] |

Submodule | Activation | Reference |
---|---|---|
gelu | gelu | https://arxiv.org/abs/1606.08415 |
hardshrink | hardshrink | |
lisht | lisht | https://arxiv.org/abs/1901.05894 |
mish | mish | https://arxiv.org/abs/1908.08681 |
softshrink | softshrink | |
sparsemax | sparsemax | https://arxiv.org/abs/1602.02068 |
tanhshrink | tanhshrink | |
rrelu | rrelu | https://arxiv.org/abs/1505.00853 |
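
Each submodule listed above exposes its activation as a plain function under `tfa.activations`. A minimal usage sketch (the tensor values and layer size here are arbitrary):

```python
import tensorflow as tf
import tensorflow_addons as tfa

# Activations can be applied directly to tensors...
x = tf.constant([-1.0, 0.0, 1.0])
print(tfa.activations.gelu(x))

# ...or passed to a Keras layer like any built-in activation.
layer = tf.keras.layers.Dense(8, activation=tfa.activations.lisht)
```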
In order to conform with the current API standard, all activations must:
- Be a `tf.function` unless it is a straightforward call to a custom op or likely to be retraced.
- Register as a keras global object so it can be serialized properly (a minimal sketch follows this list): `@tf.keras.utils.register_keras_serializable(package='Addons')`
- Add the addon to the `py_library` in this sub-package's BUILD file.
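
A minimal sketch of an activation written to this standard. The `scaled_tanh` name and formula are invented for illustration; only the decorators and registration pattern come from the rules above:

```python
import tensorflow as tf


@tf.keras.utils.register_keras_serializable(package="Addons")
@tf.function
def scaled_tanh(x, scale=2.0):
    """Hypothetical activation: computes `scale * tanh(x)` element-wise."""
    x = tf.convert_to_tensor(x)
    return scale * tf.math.tanh(x)
```

Registration under the `Addons` package is what lets Keras serialize and deserialize models that use the activation without an explicit `custom_objects` mapping.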
- Simple unittests that demonstrate the layer is behaving as expected.
- When applicable, run all unittests with TensorFlow's `@run_in_graph_and_eager_modes` (for test method) or `run_all_in_graph_and_eager_modes` (for TestCase subclass) decorator; a minimal test sketch follows this list.
- Add a `py_test` to this sub-package's BUILD file.
- Add the activation name to activations_test.py to test serialization.
- Update the table of contents in this sub-package's README.
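
As a rough idea of what such a unittest might look like, here is a sketch that exercises one of the listed activations in both graph and eager mode. The `test_utils` import path mirrors how other sub-packages re-export TensorFlow's decorator and is an assumption here:

```python
import numpy as np
import tensorflow as tf
import tensorflow_addons as tfa

# Assumed import path: other sub-packages re-export TensorFlow's
# run_all_in_graph_and_eager_modes decorator from this module.
from tensorflow_addons.utils import test_utils


@test_utils.run_all_in_graph_and_eager_modes
class TanhshrinkTest(tf.test.TestCase):
    def test_tanhshrink(self):
        # tanhshrink(x) = x - tanh(x); compare against a NumPy reference.
        x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0], dtype=np.float32)
        expected = x - np.tanh(x)
        result = tfa.activations.tanhshrink(tf.constant(x))
        self.assertAllClose(self.evaluate(result), expected)


if __name__ == "__main__":
    tf.test.main()
```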