Lava Deep Learning 0.1.0
Lava Deep Learning Library
This first release of lava-dl under the BSD-3 license provides two new modes of training deep event-based neural networks: either directly with SLAYER 2.0, or through hybrid ANN/SNN training using the Bootstrap module.
SLAYER 2.0 (lava.lib.dl.slayer) provides direct training of heterogeneous event-based computational blocks, with support for a variety of learnable neuron models, complex synaptic computation, arbitrary recurrent connections, and more. The API provides high-level building blocks that are fully autograd enabled, along with training utilities that make getting started with training SNNs extremely simple.
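To give a feel for the block-based style of network construction described above, here is a minimal plain-Python sketch. It is illustrative only: the class names, parameters, and hard-threshold neuron are assumptions for this sketch, not the real lava.lib.dl.slayer API (which provides autograd-enabled PyTorch modules).

```python
# Illustrative sketch of block-based SNN construction (NOT the lava-dl API):
# each block pairs a dense synapse with a simple threshold neuron, and a
# network is just a sequence of such blocks.
import random

class DenseBlock:
    """A dense synapse followed by a hard-threshold neuron (illustrative only)."""
    def __init__(self, in_features, out_features, threshold=1.0):
        rng = random.Random(0)  # fixed seed for a reproducible sketch
        self.weights = [[rng.uniform(-0.5, 0.5) for _ in range(in_features)]
                        for _ in range(out_features)]
        self.threshold = threshold

    def forward(self, spikes):
        # Weighted sum of input spikes, then a threshold -> output spikes.
        out = []
        for row in self.weights:
            current = sum(w * s for w, s in zip(row, spikes))
            out.append(1.0 if current >= self.threshold else 0.0)
        return out

class Network:
    """Blocks chained sequentially, mirroring a feed-forward SNN description."""
    def __init__(self, blocks):
        self.blocks = blocks

    def forward(self, spikes):
        for block in self.blocks:
            spikes = block.forward(spikes)
        return spikes

net = Network([DenseBlock(8, 4), DenseBlock(4, 2)])
out = net.forward([1.0] * 8)
```

In the real library the analogous blocks are differentiable, so the whole stack can be trained end to end with backpropagation.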
Bootstrap (lava.lib.dl.bootstrap) is a new training method for rate-coded SNNs. In contrast to prior ANN-to-SNN conversion schemes, it relies on an equivalent “shadow” ANN during training to maintain fast training speed while also dramatically accelerating post-training SNN inference, which then requires only a few spikes. Although Bootstrap is currently separate from SLAYER, its API mirrors the familiar SLAYER API, enabling fast hybrid ANN-SNN training with minimal performance loss in ANN-to-SNN conversion.
At this point in time, Lava processes cannot be trained directly with backpropagation. Therefore, we will soon release the Network Exchange (lava.lib.dl.netx) module for automatic generation of Lava processes from SLAYER or Bootstrap-trained networks. At that point, networks trained with SLAYER or Bootstrap can be executed in Lava.
Open-source contributions to these libraries are highly welcome. You are invited to extend the collection of neuron models supported by both SLAYER and Bootstrap. Check out the Neurons and Dynamics tutorial to learn how to create custom neuron models from the fundamental linear dynamics API.
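The idea behind composing neurons from linear dynamics can be sketched in plain Python: a current-based (CUBA) leaky integrate-and-fire neuron is just two cascaded first-order leaky integrators (synaptic current, then membrane voltage) plus a threshold. The decay constants, reset behavior, and function names below are illustrative assumptions, not the tutorial's exact API.

```python
# Minimal sketch: building a neuron from first-order linear dynamics.
# A CUBA LIF neuron = leaky integrator (current) -> leaky integrator
# (voltage) -> threshold. All parameters here are illustrative.

def leaky_integrator(decay, state, inp):
    """One step of the first-order dynamic x[t] = (1 - decay) * x[t-1] + inp."""
    return (1.0 - decay) * state + inp

def cuba_lif(spike_train, current_decay=0.5, voltage_decay=0.5, threshold=1.0):
    """Run input spikes through current and voltage dynamics; emit output spikes."""
    current, voltage, out = 0.0, 0.0, []
    for s in spike_train:
        current = leaky_integrator(current_decay, current, s)
        voltage = leaky_integrator(voltage_decay, voltage, current)
        if voltage >= threshold:
            out.append(1)
            voltage = 0.0            # hard reset after a spike
        else:
            out.append(0)
    return out

out = cuba_lif([1, 1, 1, 1, 0, 0, 0, 0])
```

Swapping in different dynamics (more decay stages, adaptive thresholds, refractory states) yields different neuron models from the same primitive, which is the extension point the tutorial describes.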
New Features and Improvements
- lava.lib.dl.slayer is an extension of SLAYER for natively training a combination of different neuron models and architectures including arbitrary recurrent connections. The library is fully autograd compatible with custom CUDA acceleration when supported by the hardware.
- lava.lib.dl.bootstrap is a new method for accelerated training of rate-based SNNs using a dynamically estimated ANN, as well as hybrid training with fully spiking layers for low-latency rate-coded SNNs.
Bug Fixes and Other Changes
- This is the first release of lava-dl. No bug fixes or other changes.
Breaking Changes
- This is the first release of lava-dl. No breaking changes.
Known Issues
- No known issues at this point.
What's Changed
New Contributors
- @bamsumit made their first contribution in #5
- @mgkwill made their first contribution in #1
- @mathisrichter made their first contribution in #6
Full Changelog: https://github.com/lava-nc/lava-dl/commits/v0.1.0