Add reference
AlessandroPierro authored Oct 20, 2023
1 parent 8800526 commit 17b2428
Showing 2 changed files with 18 additions and 2 deletions.
17 changes: 17 additions & 0 deletions paper.bib
@@ -0,0 +1,17 @@
@inproceedings{10.1145/3589737.3605998,
author = {Snyder, Shay and Risbud, Sumedh R. and Parsa, Maryam},
title = {Neuromorphic Bayesian Optimization in Lava},
year = {2023},
isbn = {9798400701757},
publisher = {Association for Computing Machinery},
address = {New York, NY, USA},
url = {https://doi.org/10.1145/3589737.3605998},
doi = {10.1145/3589737.3605998},
abstract = {The ever-increasing demands of computationally expensive and high-dimensional problems require novel optimization methods to find near-optimal solutions in a reasonable amount of time. Bayesian Optimization (BO) stands as one of the best methodologies for learning the underlying relationships within multi-variate problems. This allows users to optimize time consuming and computationally expensive black-box functions in feasible time frames. Existing BO implementations use traditional von-Neumann architectures, in which data and memory are separate. In this work, we introduce Lava Bayesian Optimization (LavaBO) as a contribution to the open-source Lava Software Framework. LavaBO is the first step towards developing a BO system compatible with heterogeneous, fine-grained parallel, in-memory neuromorphic computing architectures (e.g., Intel's Loihi platform). We evaluate the algorithmic performance of the LavaBO system on multiple problems such as training state-of-the-art spiking neural networks through back-propagation and evolutionary learning. Compared to traditional algorithms (such as grid and random search), we highlight the ability of LavaBO to explore the parameter search space with fewer expensive function evaluations, while discovering the optimal solutions.},
booktitle = {Proceedings of the 2023 International Conference on Neuromorphic Systems},
articleno = {9},
numpages = {5},
keywords = {bayesian optimization, neuromorphic computing, asynchronous computing},
location = {Santa Fe, NM, USA},
series = {ICONS '23}
}
3 changes: 1 addition & 2 deletions paper.md
@@ -52,7 +52,6 @@ As this short list shows, JOSS papers are only expected to contain a limited set

# Summary

A summary describing the high-level functionality and purpose of the software for a diverse, non-specialist audience.

- Challenges of optimization and opportunities of neuromorphic computing
- Scalability, low latency, optimality, energy
@@ -61,7 +60,7 @@
- `Lava Optimization` increases productivity in developing and testing novel neuromorphic algorithms and applications
- The library abstracts away the neuromorphic aspects of the backend, exposing an API typical of constrained optimization (variables, constraints, cost, etc.)
- Supports the community in developing algorithms that are iterative, discrete, and distributed
- We leveraged the library architecture to develop multi-backend QUBO and QP solvers, and received contributions from the community for Bayesian and LCA solvers
- We leveraged the library architecture to develop multi-backend QUBO and QP solvers, and received contributions from the community for Bayesian `[@10.1145/3589737.3605998]` and LCA solvers
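To illustrate the kind of problem the multi-backend QUBO solvers target, here is a minimal, library-independent sketch. Note that this is a plain NumPy brute-force enumeration for exposition only, not the Lava Optimization API; the max-cut instance and all names below are hypothetical.

```python
import itertools

import numpy as np


def solve_qubo_brute_force(q: np.ndarray) -> tuple[np.ndarray, float]:
    """Exhaustively minimize x^T Q x over binary vectors x in {0, 1}^n.

    Exponential in n; intended only to illustrate the QUBO formulation
    that dedicated solvers tackle at scale.
    """
    n = q.shape[0]
    best_x, best_cost = None, float("inf")
    for bits in itertools.product([0, 1], repeat=n):
        x = np.array(bits)
        cost = float(x @ q @ x)
        if cost < best_cost:
            best_x, best_cost = x, cost
    return best_x, best_cost


# Hypothetical example: max-cut of a triangle graph encoded as a QUBO.
# Diagonal entries hold -degree(i); each edge (i, j) contributes +2
# to the upper-triangular off-diagonal term.
Q = np.array([[-2, 2, 2],
              [0, -2, 2],
              [0, 0, -2]])
x_opt, cost = solve_qubo_brute_force(Q)
# The optimum cuts 2 of the 3 edges, giving cost -2.
```

Heterogeneous backends can then be swapped behind the same problem definition, which is the productivity argument made above.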


# Statement of need
