Plateau Neuron - Fixed Point Model #781
base: main
In the Process, you specify that the variables are float. If you want them to be integers, I would specify that in the Process.
Also, is this model really using fixed-point precision? Or is it bit-accurate to Loihi?
In the latter case, we tend to call it "...ModelBitAcc" rather than "...ModelFixed".
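For context, here is a minimal sketch of how a fixed-point ProcessModel can make the integer intent explicit through LavaPyType precision declarations. The class and variable names are illustrative, not the ones in this PR, and the @implements/@requires decorators are omitted:

```python
import numpy as np
from lava.magma.core.model.py.model import PyLoihiProcessModel
from lava.magma.core.model.py.type import LavaPyType


class PyPlateauModelFixedSketch(PyLoihiProcessModel):
    """Illustrative only: integer dtypes and precisions declared
    on the model side, so the fixed-point intent is explicit."""
    # 24-bit signed state, matching typical Loihi variable widths
    v_dend: np.ndarray = LavaPyType(np.ndarray, np.int32, precision=24)
    v_soma: np.ndarray = LavaPyType(np.ndarray, np.int32, precision=24)
    # 12-bit unsigned decay constants (see the precision discussion below)
    dv_dend: int = LavaPyType(int, np.uint16, precision=12)
    dv_soma: int = LavaPyType(int, np.uint16, precision=12)
```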
I wasn't sure if bit-accurate would be appropriate for this, since I don't have a version of this that runs on Loihi. I tried to give the variables the same precision as on Loihi.
Please note: on Loihi, variables typically have 8-, 16-, or 24-bit precision. Thus, you may want to avoid 12-bit.
I chose 12-bit for the du and dv values since that was the precision of those values in PyLifModelBitAcc. Since I'm choosing the precisions based on bit-accurate models, should I change this model to bit-accurate, as suggested in the other comment?
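For readers following the thread: a 12-bit decay constant is typically applied roughly as in the sketch below, in the style of Lava's bit-accurate LIF model. The function and constants are an assumption for illustration, not code from this PR:

```python
import numpy as np

DECAY_SHIFT = 12           # 12-bit decay constants
DECAY_UNITY = 1 << DECAY_SHIFT  # 4096


def decay_voltage(v: np.ndarray, dv: int) -> np.ndarray:
    """Decay a signed integer voltage with a 12-bit decay constant
    dv in [0, 4096]. Widen to int64 to avoid overflow, then
    right-shift the magnitude so rounding is toward zero."""
    decayed = np.int64(v) * (DECAY_UNITY - dv)
    return (np.sign(decayed)
            * np.right_shift(np.abs(decayed), DECAY_SHIFT)).astype(np.int32)


v = np.array([1000, -1000], dtype=np.int32)
print(decay_voltage(v, dv=409))  # roughly 10% decay per step: [900, -900]
```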
Codacy Production / Codacy Static Code Analysis: warnings on lines 54, 55, and 128 of src/lava/proc/plateau/models.py.
Just a minor point: since this is a LIF neuron, did you consider letting it inherit from the AbstractLIF process and adding 'LIF' to the class name? Not sure if it makes sense in this specific example, though. Up to you.
I originally didn't do this since I thought of it as two combined LIF neurons rather than a modified LIF neuron. I'll look over the AbstractLIF process and see if it makes sense to inherit from it.
I don't think the class should inherit from AbstractLIF, since it does not have current or bias Vars.
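To illustrate the point: AbstractLIF carries current and bias state roughly as in the stand-in below (a sketch, not the exact source), which is why a two-voltage plateau neuron without current or bias Vars would inherit state it never uses:

```python
from lava.magma.core.process.process import AbstractProcess
from lava.magma.core.process.variable import Var


class AbstractLIFLike(AbstractProcess):
    """Illustrative stand-in for AbstractLIF's state variables."""

    def __init__(self, shape=(1,)):
        super().__init__(shape=shape)
        self.u = Var(shape=shape, init=0)          # synaptic current
        self.v = Var(shape=shape, init=0)          # membrane voltage
        self.du = Var(shape=shape, init=0)         # current decay
        self.dv = Var(shape=shape, init=0)         # voltage decay
        self.bias_mant = Var(shape=shape, init=0)  # bias mantissa
        self.bias_exp = Var(shape=shape, init=0)   # bias exponent
```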
Codacy Production / Codacy Static Code Analysis: warnings on lines 8, 12, 14, 20, 23, and 128 of tests/lava/proc/plateau/test_models.py.
Could you check if linting passes?
'flakeheaven lint src/lava tests'
My guess is that a few points will need to change, including missing newlines at the end of files. Not functionally relevant, but important for keeping a clean code base :-)
I've run the linting, (flakeheaven and bandit) and they pass on my local copy of the code. Also, I have the empty line at the end of the files locally, but it doesn't seem to show up on the github versions. Does github just not show the empty line at the end?