Updates to paper.md
AlessandroPierro committed Nov 2, 2023
1 parent c659212 commit f7a63e9
Showing 1 changed file (paper.md) with 40 additions and 22 deletions.
---
title: 'Lava Optimization: a Python package for neuromorphic mathematical optimization'
tags:
  - Neuromorphic computing
  - Mathematical Optimization
  - Spiking Neural Networks
  - QUBO
  - QP
  - Operations Research
  - Python
authors:
  - name: Gabriel A. Fonseca Guerra
    orcid: 0000-0001-5403-4634
    equal-contrib: false
    affiliation: 1
  - name: Alessandro Pierro
    orcid: 0000-0002-5682-627X
    equal-contrib: false
    affiliation: 1
  - name: Philipp Stratmann
    orcid: 0000-0001-6791-9159
    equal-contrib: false
    affiliation: 1
  - name: Sumedh R. Risbud
    orcid: 0000-0003-4777-1139
    equal-contrib: false
    affiliation: 1
  - name: Ashish Rao Mangalore
    orcid: 0000-0002-8496-7678
    equal-contrib: false
    affiliation: 1
  - name: Timothy Shea
    orcid: 0000-0001-5705-0904
    equal-contrib: false
    affiliation: 1
  - name: Andreas Wild
    orcid: 0000-0003-0380-5675
    equal-contrib: false
    affiliation: 1
affiliations:
affiliations:
This collaborative effort showcases the potential of `Lava Optimization` in adva

# Statement of need

Constrained optimization searches for the values of input variables that minimize or maximize a given objective function, subject to constraints on the variables.
This kind of problem is ubiquitous throughout scientific domains and industries.
Constrained optimization is a promising application for neuromorphic computing as it naturally aligns with the dynamics of spiking neural networks [@davies2021advancing].
When individual neurons represent states of variables, the neuronal connections can directly encode constraints between the variables: in its simplest form, recurrent inhibitory synapses connect neurons that represent mutually exclusive variable states, while recurrent excitatory synapses link neurons representing reinforcing states.
Implemented on massively parallel neuromorphic hardware, such a spiking neural network can simultaneously evaluate conflicts and cost functions involving many variables, and update all variables accordingly.
This enables rapid convergence toward an optimal state.
In addition, the fine-scale timing dynamics of SNNs allow them to readily escape from local minima.
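As a concrete illustration of the encoding described above (not part of the paper; a minimal NumPy sketch with toy weights): a one-hot "mutual exclusion" constraint between variable states maps to positive (inhibitory) off-diagonal couplings in a QUBO matrix, while the diagonal rewards activating a state.

```python
import numpy as np
from itertools import product

# Toy QUBO over 3 binary variables: pick exactly one of three
# mutually exclusive states. Diagonal entries reward activating
# a state; off-diagonal entries penalize co-activation, playing
# the role of recurrent inhibitory synapses.
reward = -1.0   # hypothetical diagonal weight
penalty = 4.0   # hypothetical inhibitory coupling
q = np.full((3, 3), penalty)
np.fill_diagonal(q, reward)

def cost(x: np.ndarray) -> float:
    """QUBO objective x^T Q x for a binary vector x."""
    return float(x @ q @ x)

# Brute-force check over all 2^3 assignments: the minimum is a
# one-hot vector, i.e. exactly one active state.
best = min((np.array(bits) for bits in product([0, 1], repeat=3)),
           key=cost)
```

Here any assignment with two or more active states pays the inhibitory penalty twice per conflicting pair, so the one-hot states (cost `reward`) are the global minima.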

Several libraries are available for modeling spiking neural networks:
- SpynNaker [@rhodes2018spynnaker]
- NENGO [@bekolay2014nengo]
- NESTML [@plotnikov2016nestml]
- PyNN [@davison2009pynn]
- Brian [@goodman2009brian]

However, these libraries have limitations for optimization workloads:
- They are primarily suited for computational neuroscience and are hard to use for programming general algorithms.
- They offer limited abstraction, making neuromorphic hardware difficult to program.

# Overview of the package

- Lava Optimization introduces a clear separation of abstraction layers, favoring modularity, composability, orthogonality, and the DRY principle, making the package easy to change and extend.
- Best ongoing solutions are tracked on-chip and online, instead of post-processing recorded activity, improving performance and reducing memory and bandwidth requirements.
- Lava Optimization includes the mathematical primitives needed to formulate optimization problems and build optimization algorithms.
- The package ships analysis tools specifically for optimization: a solver tuner as well as visualization, reporting, and profiling utilities.
- The architecture is easy to extend; possible future directions include regularization terms and higher-order optimization.
- Lava Optimization also includes application-specific solvers and interfaces (scheduling, TSP, clustering, VRP).
- Development follows modern software practices, with DevOps pipelines and automated testing; the package is distributed via pip.

- Representative past or ongoing research projects using the software include Lava BO and Lava MPC.
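The on-chip tracking of the best ongoing solution mentioned above can be contrasted with post-processing in a small host-side sketch (plain NumPy, not the Lava API; the search strategy and all weights are illustrative assumptions): a stochastic single-bit-flip search that keeps only the incumbent best state instead of recording the full activity trace.

```python
import numpy as np

def solve_qubo(q: np.ndarray, steps: int = 2000, seed: int = 0):
    """Stochastic single-bit-flip search minimizing x^T Q x.

    Keeps only the best (cost, state) pair seen so far -- analogous
    to on-chip best-solution tracking -- rather than storing the
    whole search trajectory for post-processing.
    """
    rng = np.random.default_rng(seed)
    n = q.shape[0]
    x = rng.integers(0, 2, size=n)
    cur = float(x @ q @ x)
    best_cost, best_x = cur, x.copy()
    for _ in range(steps):
        i = rng.integers(n)                  # propose a single bit flip
        x[i] ^= 1
        new = float(x @ q @ x)
        if new <= cur or rng.random() < 0.05:  # accept, or escape uphill
            cur = new
            if cur < best_cost:              # update the incumbent only
                best_cost, best_x = cur, x.copy()
        else:
            x[i] ^= 1                        # reject the move
    return best_x, best_cost

# Toy 4-variable problem (illustrative weights): the global minimum
# is x = [1, 0, 0, 1] with cost -2.
q = np.array([[-1, 2, 2, 0],
              [2, -1, 2, 2],
              [2, 2, -1, 2],
              [0, 2, 2, -1]], dtype=float)
x, c = solve_qubo(q)
```

The memory footprint is constant in the number of steps, which is the point of tracking the incumbent online instead of replaying recorded activity.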

# Documentation

The package is extensively documented at [https://lava-nc.org/optimization.html](https://lava-nc.org/optimization.html).

# Acknowledgements

We acknowledge contributions from Chinonso Onah, Gavin Parpart, and Shay Snyder.
