Plateau Neuron - Fixed Point Model #781
base: main
Conversation
Update to lava main 20230816
Signed-off-by: kevin <[email protected]>
Signed-off-by: Marcus G K Williams <[email protected]>
from lava.magma.core.process.ports.ports import InPort, OutPort

class Plateau(AbstractProcess):
Just a minor point: since this is a LIF neuron, did you consider letting it inherit from the AbstractLIF process and adding 'LIF' to the class name? Not sure if it makes sense in this specific example, though. Up to you.
I originally didn't do this since I thought of it as two combined LIF neurons instead of a modified LIF neuron. I'll look over the AbstractLIF process and see if it makes sense to inherit from that.
I don't think the class should inherit from AbstractLIF, since it does not have current or bias vars.
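For readers following along, here is a rough sketch of the Process being discussed. The Var names are taken from the commit messages in this PR; the port names, signature, and initial values are illustrative assumptions, not the PR's actual code:

from lava.magma.core.process.process import AbstractProcess
from lava.magma.core.process.variable import Var
from lava.magma.core.process.ports.ports import InPort, OutPort


class Plateau(AbstractProcess):
    """Two coupled compartments (dendrite and soma), each with its own
    voltage, decay, and threshold -- but no current (u) or bias Vars,
    which is why AbstractLIF is a poor fit."""

    def __init__(self, shape, dv_dend, dv_soma, vth_dend, vth_soma):
        super().__init__(shape=shape, dv_dend=dv_dend, dv_soma=dv_soma,
                         vth_dend=vth_dend, vth_soma=vth_soma)
        self.a_dend_in = InPort(shape=shape)   # synaptic input, dendrite
        self.a_soma_in = InPort(shape=shape)   # synaptic input, soma
        self.s_out = OutPort(shape=shape)      # spike output
        self.v_dend = Var(shape=shape, init=0)
        self.v_soma = Var(shape=shape, init=0)
        self.dv_dend = Var(shape=(1,), init=dv_dend)
        self.dv_soma = Var(shape=(1,), init=dv_soma)
        self.vth_dend = Var(shape=(1,), init=vth_dend)
        self.vth_soma = Var(shape=(1,), init=vth_soma)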
    0, 6400, 4800, 3600, 2700, 2025, 1518, 1138, 853, 639
]
self.assertListEqual(expected_v_dend, test_v_dend)
self.assertListEqual(expected_v_soma, test_v_soma)
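As a side note, the expected dendrite voltages are consistent with a 12-bit fixed-point decay at factor 0.75, i.e. a decay constant of 1024 out of 4096. That constant is inferred from the numbers, not stated in the diff:

# Hypothetical reconstruction of the decay behind the expected values:
# v <- (v * (4096 - dv_dend)) >> 12 with dv_dend = 1024 (factor 0.75).
# Integer truncation explains e.g. 2025 -> 1518 rather than 1518.75.
v = 6400
for _ in range(8):
    v = (v * (4096 - 1024)) >> 12
    print(v)  # 4800, 3600, 2700, 2025, 1518, 1138, 853, 639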
Could you check if linting passes?
'flakeheaven lint src/lava tests'
My guess is that there are a few points that must change, including missing lines at the end of files. Not functionally relevant, but important to keep a clean code base :-)
I've run the linting (flakeheaven and bandit), and it passes on my local copy of the code. I also have the empty line at the end of the files locally, but it doesn't seem to show up in the GitHub versions. Does GitHub just not show the empty line at the end of a file?
src/lava/proc/plateau/models.py
Outdated

Precisions of state variables

- du_dend : unsigned 12-bit integer (0 to 4095)
In the Process, you specify that the variables are float. If you want them to be integers, I would specify that in the Process.
Also, is this model really using fixed-point precision? Or is it bit-accurate to Loihi?
In the latter case, we tend to call it "...ModelBitAcc" rather than "...ModelFixed".
I wasn't sure if bit-accurate would be appropriate for this, since I don't have a version of this that runs on Loihi. I tried to give the variables the same precision as on Loihi.
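For reference, the convention the reviewer is pointing to: lava tags plain fixed-point ProcessModels with 'fixed_pt', while models verified against the chip also carry 'bit_accurate_loihi' (as PyLifModelBitAcc does). A sketch under that assumption, reusing the Plateau name from the sketch above:

from lava.magma.core.decorator import implements, requires, tag
from lava.magma.core.model.py.model import PyLoihiProcessModel
from lava.magma.core.resources import CPU
from lava.magma.core.sync.protocols.loihi_protocol import LoihiProtocol


@implements(proc=Plateau, protocol=LoihiProtocol)
@requires(CPU)
@tag('fixed_pt')  # a plain fixed-point model; a Loihi-verified one
#                   would carry @tag('bit_accurate_loihi', 'fixed_pt')
class PlateauModelFixed(PyLoihiProcessModel):
    ...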
src/lava/proc/plateau/models.py
Outdated

Precisions of state variables

- du_dend : unsigned 12-bit integer (0 to 4095)
- du_soma : unsigned 12-bit integer (0 to 4095)
Please note: on Loihi, variables typically have 8-, 16-, or 24-bit precision. Thus, you may want to avoid 12-bit.
I chose 12-bit for the du and dv values since that was the precision of those values in PyLifModelBitAcc. Since I'm choosing the precisions based on bit-accurate models, should I change this model to bit-accurate, as suggested in a different comment?
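For concreteness, the 12-bit declarations in question follow this pattern in lava's bit-accurate LIF model; mirrored here for the Plateau variables (attribute list abridged, and worth checking against the current lava source):

import numpy as np
from lava.magma.core.model.py.type import LavaPyType
from lava.magma.core.model.py.model import PyLoihiProcessModel

class PlateauModelFixed(PyLoihiProcessModel):  # decorators as in the sketch above
    # 24-bit voltages and 12-bit decay constants, mirroring PyLifModelBitAcc
    v_dend: np.ndarray = LavaPyType(np.ndarray, np.int32, precision=24)
    v_soma: np.ndarray = LavaPyType(np.ndarray, np.int32, precision=24)
    du_dend: int = LavaPyType(int, np.uint16, precision=12)  # 0 to 4095
    du_soma: int = LavaPyType(int, np.uint16, precision=12)  # 0 to 4095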
Thanks for your contribution. Great code quality!
Just a few minor points I would ask you to change.
Please reach out if you have any questions.
Thanks for the feedback! I'll go through the code and make the requested changes.
Signed-off-by: kevin <[email protected]>
Signed-off-by: kevin <[email protected]>
…nd, and vth_soma Signed-off-by: kevin <[email protected]>
…nd, dv_soma, vth_dend, and vth_soma Signed-off-by: kevin <[email protected]>
Signed-off-by: kevin <[email protected]>
…ateauModelFixed docstring Signed-off-by: kevin <[email protected]>
…nd and dv_soma (0, 4095) Signed-off-by: kevin <[email protected]>
Signed-off-by: kevin <[email protected]>
Signed-off-by: kevin <[email protected]>
I updated my local branch with several of the recommended changes, and commented on some of the requested changes that I have questions about.
@mgkwill Thanks for fixing the Codacy issues! I'm still a bit new to GitHub, especially using it collaboratively, so I had started making changes before merging that pull request. I added a commit to sort out some of the merge conflicts I created and then merged the pull request.
Signed-off-by: kevin <[email protected]>
…plateau-neuron
I've made most of the changes requested to the code.
There are two points I left unresolved:
- Fixed vs BitAcc: I've modeled the behavior in a bit-accurate way, but there is no actual Nc implementation of this model.
- Inheriting AbstractLIF: AbstractLIF has several attributes (u, bias_mant, etc.) which this model does not have, so I don't think it should inherit from that class.
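To round off the thread, a hypothetical usage sketch of the resulting process, consistent with the Plateau sketch above; the parameter values are made up, and the fixed-point model is selected by tag:

from lava.magma.core.run_conditions import RunSteps
from lava.magma.core.run_configs import Loihi2SimCfg

plateau = Plateau(shape=(1,), dv_dend=1024, dv_soma=1024,
                  vth_dend=6400, vth_soma=6400)
plateau.run(condition=RunSteps(num_steps=10),
            run_cfg=Loihi2SimCfg(select_tag='fixed_pt'))
print(plateau.v_dend.get())  # read back the dendrite voltage
plateau.stop()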
Issue Number: #775
Objective of pull request: Add Plateau neuron model to lava as described in the above issue.
Pull request checklist
Your PR fulfills the following requirements:
- Lint (flakeheaven lint src/lava tests/) and (bandit -r src/lava/.) pass locally
- Build tests (pytest) passes locally

Pull request type
Please check your PR type:
What is the current behavior?
What is the new behavior?
Does this introduce a breaking change?
Supplemental information