major renaming using functional categories: act- learn- init- etc crossed by structural types: -net -layer -path -- all the algo code goes in there, and then the former "base" files are just layer.go network.go and path.go. much cleaner and well-organized.
rcoreilly committed Nov 4, 2024
1 parent 9027d2a commit 37c5166
Showing 74 changed files with 39,254 additions and 11,202 deletions.
10 changes: 6 additions & 4 deletions README.md
@@ -21,6 +21,8 @@ See the [ra25 example](examples/ra25/README.md) for a complete working example,

# Current Status / News

* November 2024: **v2.0.0-dev-x.x.x**: ongoing updates using the new [goal](https://cogentcore.org/core/goal) _Go augmented language_ framework, which supports direct multidimensional tensor indexing, advanced `#` math mode expressions, and major improvements to `gosl` for converting Go to GPU shader language (based on WGPU) that now eliminates _all_ hand-written GPU code: everything is generated entirely from the original Go source. Previously, we were maintaining a fair amount of redundant CPU and GPU code. All of the logging and data analysis code will be completely rewritten to take advantage of the `goal` expressions, which make it much cleaner to directly compute all stats; these are managed by the `datafs` data filesystem that gives direct, flexible, general-purpose browser access to all data. Many files now have `.goal` extensions, which auto-generate corresponding `.go` files. New tooling in Cogent Code makes it easy to manage this, and support for VS Code and other editors is forthcoming.
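The "direct multidimensional tensor indexing" that goal provides is sugar over the strided offset arithmetic that otherwise has to be written by hand in plain Go. A minimal sketch of that underlying mechanism, using a hypothetical `Tensor2D` type (not the real goal/axon tensor API):

```go
package main

import "fmt"

// Tensor2D is a minimal stand-in for a strided multidimensional
// tensor; direct indexing like t[i, j] in goal abstracts exactly
// this kind of row-major offset arithmetic.
type Tensor2D struct {
	Values []float32
	Rows   int
	Cols   int
}

// At returns the element at row i, column j via row-major strides.
func (t *Tensor2D) At(i, j int) float32 {
	return t.Values[i*t.Cols+j]
}

// Set writes the element at row i, column j.
func (t *Tensor2D) Set(i, j int, v float32) {
	t.Values[i*t.Cols+j] = v
}

func main() {
	t := &Tensor2D{Values: make([]float32, 6), Rows: 2, Cols: 3}
	t.Set(1, 2, 42)
	fmt.Println(t.At(1, 2)) // prints 42
}
```

Because the strides are explicit data rather than baked into the type, the same layout logic can be reordered for CPU vs GPU memory access patterns.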

* August 2024: **v2.0.0-dev-x.x.x**: in-process transition to Cogent Core infrastructure, and major improvements in the [Rubicon](Rubicon.md) framework. Also finally figured out how to avoid the computationally expensive integration of calcium at each individual synapse at a cycle-by-cycle level, using a reasonable linear model based on activity windows on the sending and receiving neurons, with multiplicative factors (r^2 is .96 at capturing the cycle-by-cycle values). A full "2.0" release will be made once Cogent Core gets to a 1.0 stable release, and all the tests etc. are updated.

* June 2023: **v1.8.0** Neuron and Synapse memory now accessed via methods with arbitrary strides so GPU and CPU can each have optimal memory ordering -- NVIDIA A100 performance now comparable to Mac M1, which also improved by 25%. Includes data parallel processing (multiple input patterns processed in parallel using shared weights) which makes GPU faster than CPU even with small networks (e.g., ra25). See the [GPU](GPU.md) docs and the [data parallel](#data_parallel) section.
@@ -39,13 +41,13 @@ Use `core next-release` to push a tag at the next patch increment, or `core rele

# Design and Organization

- * `ActParams` (in [act.go](axon/act.go)), `InhibParams` (in [inhib.go](axon/inhib.go)), and `LearnNeurParams` / `LearnSynParams` (in [learn.go](axon/learn.go)) provide the core parameters and functions used.
+ * `ActParams` (in [act.goal](axon/act.goal)), `InhibParams` (in [inhib.goal](axon/inhib.goal)), and `LearnNeurParams` / `LearnSynParams` (in [learn.goal](axon/learn.goal)) provide the core parameters and functions used. These are organized into `LayerParams` (accessed via `Layer.Params`) and `PathParams` (accessed via `Path.Params`) structs, which are fully GPU-friendly parameter-only containers, in [layerparams.go](layerparams.go) and [pathparams.go](pathparams.go) respectively.
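The key property of a "GPU-friendly parameter-only container" is that it holds nothing but plain values, so the whole struct can be copied verbatim into a GPU uniform or storage buffer. A simplified sketch (field names and contents here are illustrative, not the real axon fields):

```go
package main

import "fmt"

// ActParams is a hypothetical, pared-down version of the real
// activation parameters: plain numeric fields only -- no pointers,
// slices, maps, or strings -- so the struct has a fixed layout that
// can be memcpy'd into a GPU buffer.
type ActParams struct {
	Gbar  float32 // maximal conductance (illustrative field)
	Decay float32 // decay rate (illustrative field)
}

// LayerParams aggregates the per-category param structs by value.
type LayerParams struct {
	Act ActParams // embedded as a value, preserving flat layout
}

// Layer holds a CPU-side handle to its params; the pointed-to
// contents remain GPU-copyable.
type Layer struct {
	Name   string
	Params *LayerParams
}

func main() {
	ly := &Layer{Name: "Hidden", Params: &LayerParams{Act: ActParams{Gbar: 1, Decay: 0.2}}}
	fmt.Println(ly.Params.Act.Decay) // prints 0.2
}
```

Keeping the parameter structs free of Go-managed memory is what lets a single source of truth drive both the CPU and generated GPU code paths.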

- * There are 3 main levels of structure: `Network`, `Layer` and `Path` (pathway). The network calls methods on its Layers, and Layers iterate over both `Neuron` data structures (which have only a minimal set of methods) and the `Path`s, to implement the relevant computations. The `Path` fully manages everything about a pathway of connectivity between two layers, including the full list of `Synapse` elements in the connection. The Layer also has a set of `Pool` elements, one for each level at which inhibition is computed (there is always one for the Layer, and then optionally one for each Sub-Pool of units).
+ * There are 3 main levels of structure: `Network`, `Layer` and `Path` (pathway). The `Path` fully manages everything about a pathway of connectivity between two layers, including the full list of `Synapse` elements in the connection. The Layer also has a set of `Pool` elements, one for each level at which inhibition is computed: there is always one for the Layer, and then optionally one for each Sub-Pool of units.
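The containment relationships among the three levels can be sketched as skeletal Go types (all fields here are illustrative stand-ins, not the real axon definitions):

```go
package main

import "fmt"

// Synapse is one connection; the real struct carries weights and
// learning state.
type Synapse struct{ Wt float32 }

// Path manages one pathway between two layers, owning its synapses.
type Path struct {
	Send, Recv *Layer
	Syns       []Synapse
}

// Pool computes inhibition at one level (whole layer or sub-pool).
type Pool struct{ Gi float32 }

// Layer holds its pools and receiving pathways.
type Layer struct {
	Name  string
	Pools []Pool  // Pools[0] is always the whole-layer pool
	Paths []*Path // receiving pathways
}

// Network is the top level, calling methods on its Layers.
type Network struct{ Layers []*Layer }

func main() {
	in, hid := &Layer{Name: "Input"}, &Layer{Name: "Hidden"}
	hid.Pools = []Pool{{}} // the obligatory whole-layer pool
	pt := &Path{Send: in, Recv: hid, Syns: make([]Synapse, 4)}
	hid.Paths = append(hid.Paths, pt)
	net := &Network{Layers: []*Layer{in, hid}}
	fmt.Println(len(net.Layers), len(hid.Paths[0].Syns)) // prints 2 4
}
```

Because the `Path` owns the synapse storage outright, all synapse-level computation and memory layout decisions are localized to one type.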

- * The `networkbase.go`, `layerbase.go`, and `pathbase.go` code builds on the [emergent](https://github.com/emer/emergent) infrastructure to manage all the core structural aspects (data structures etc), while the non-base code implements algorithm-specific functions. Everything is defined on the same core types (e.g., `axon.Network`). The [layer_compute.go](axon/layer_compute.go) file breaks out the core algorithm-specific code, while [layer.go](axon/layer.go) has other algorithm-specific code.
+ * The `network.goal`, `layer.goal`, and `path.goal` code builds on the [emergent](https://github.com/emer/emergent) infrastructure to manage all the core structural aspects (data structures etc).

- * To enable the [GPU](GPU.md) implementation, all of the layer parameters are in `LayerParams` (accessed via `Layer.Params`) and path params in `PathParams` (accessed via `Path.Params`), in [layerparams.go](layerparams.go) and [pathparams.go](pathparams.go) respectively. `LayerParams` contains `ActParams` field (named `Act`), etc.
+ * Algorithm-specific code is organized by the broad `act`, `inhib`, `learn`, and `init` functional categories, and then by structural level within that. Thus, `act-layer.goal` contains the layer-level code for computing activations (spiking), while `learn-path.goal` has the synaptic-level learning algorithms, etc.

* The ability to share parameter settings across multiple layers etc is achieved through a **styling**-based paradigm -- you apply parameter "styles" to relevant layers -- see [Params](https://github.com/emer/emergent/wiki/Params) for more info. We adopt the CSS (cascading-style-sheets) standard where parameters can be specified in terms of the Name of an object (e.g., `#Hidden`), the *Class* of an object (e.g., `.TopDown` -- where the class name TopDown is manually assigned to relevant elements), and the *Type* of an object (e.g., `Layer` applies to all layers). Multiple space-separated classes can be assigned to any given element, enabling a powerful combinatorial styling strategy.
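The three CSS-style selector forms can be illustrated with a toy matcher (the real API lives in the emergent params package; the `Styled` type and `Matches` function here are hypothetical simplifications):

```go
package main

import (
	"fmt"
	"strings"
)

// Styled is a toy stand-in for anything that can receive parameter
// styles: it has a Name, space-separated Classes, and a Type.
type Styled struct {
	Name    string
	Classes string
	Type    string
}

// Matches reports whether sel applies to obj, following the CSS
// convention described above: "#Name", ".Class", or a bare type name.
func Matches(sel string, obj *Styled) bool {
	switch {
	case strings.HasPrefix(sel, "#"): // name selector
		return obj.Name == sel[1:]
	case strings.HasPrefix(sel, "."): // class selector
		for _, c := range strings.Fields(obj.Classes) {
			if c == sel[1:] {
				return true
			}
		}
		return false
	default: // type selector
		return obj.Type == sel
	}
}

func main() {
	hid := &Styled{Name: "Hidden", Classes: "TopDown Sparse", Type: "Layer"}
	fmt.Println(Matches("#Hidden", hid), Matches(".TopDown", hid), Matches("Layer", hid))
	// prints true true true
}
```

Since an element can carry multiple classes, one `Sel`-style entry per class composes combinatorially, which is what makes the styling approach scale across many layers.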

