Releases: ENOT-AutoDL/ENOT_Tutorials

v3.5.2

15 Feb 11:23
feat: PReLU operation pruning
feat: GRU operation pruning

fix: permute with a tuple of dimensions now works
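For context, plain PyTorch accepts the permuted dimensions either as separate integers or packed in a single tuple; a sketch of the two call styles a tracing/pruning tool has to handle (plain torch, not the ENOT API):

```python
import torch

x = torch.randn(2, 3, 4)

# Both call styles are valid PyTorch and produce the same result.
a = x.permute(0, 2, 1)    # dims as separate ints
b = x.permute((0, 2, 1))  # dims packed in a single tuple

assert a.shape == b.shape == torch.Size([2, 4, 3])
assert torch.equal(a, b)
```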

🍰

v3.5.1

27 Dec 08:00
5f4ffbb
feat: InstanceNorm pruning
fix: pruning info deserialization error on cpu-only devices (map_location argument)
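The `map_location` mechanism this fix relies on is standard PyTorch: a checkpoint saved on a CUDA device fails to load on a CPU-only machine unless its storages are remapped. A minimal sketch with plain `torch.load` (the dict contents here are illustrative, not ENOT's actual pruning-info format):

```python
import io
import torch

# Serialize some state (in ENOT's case, pruning info; here just a tensor).
buffer = io.BytesIO()
torch.save({"scale": torch.ones(3)}, buffer)
buffer.seek(0)

# map_location="cpu" remaps all storages to the local CPU device,
# so the load succeeds even when the file was written on a GPU machine.
state = torch.load(buffer, map_location="cpu")
assert state["scale"].device.type == "cpu"
```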

🍰

v3.5.0

04 Dec 10:08
5f4ffbb
feat: support Python 3.10
refactor: rm NAS
docs: clarify distill context manager and adapters behavior

🍰

v3.4.8

09 Nov 13:20
59630af
feat: support PyTorch 2.x
feat: new quantization distiller interface - context decorator
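A context decorator is a standard Python pattern: one object usable both as a `with` block and as a function decorator. A stdlib sketch of the idea (the `distill` class and its `tag` argument are hypothetical stand-ins, not the ENOT interface):

```python
from contextlib import ContextDecorator

class distill(ContextDecorator):
    """Hypothetical distillation context: works both as
    `with distill(...):` and as `@distill(...)` on a training step."""

    def __init__(self, tag):
        self.tag = tag
        self.active = False

    def __enter__(self):
        self.active = True   # e.g. attach hooks, switch model mode
        return self

    def __exit__(self, *exc):
        self.active = False  # e.g. detach hooks, restore mode
        return False

ctx = distill("quant")

@ctx  # decorator form: each call runs inside the context
def train_step():
    return ctx.active

with ctx:  # context-manager form
    assert ctx.active
assert train_step() is True
assert ctx.active is False
```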

fix: decrease memory consumption of quantization distiller

🍰

v3.4.7

17 Oct 08:06
feat: handle latency measurement errors in OptimalPruningLabelSelector
feat: starting points generator and tutorial
feat: score logging for the knapsack label selector

docs: ENOT prunable modules (separate package)

refactor: rm integrated module replacement from ENOT (moved to the ENOT prunable modules package), rm redundant deps

🍰

v3.4.6

11 Sep 15:04
d6a0cad
feat: Linear + BatchNorm1d fusion for FakeQuantized model
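Folding an eval-mode BatchNorm1d into the preceding Linear is standard algebra: with scale = γ/√(σ²+ε), the fused weight is scale·W and the fused bias is scale·(b − μ) + β. A plain-PyTorch sketch of that fusion (the helper name is ours; ENOT applies the same idea inside FakeQuantized models):

```python
import torch
import torch.nn as nn

def fuse_linear_bn1d(linear: nn.Linear, bn: nn.BatchNorm1d) -> nn.Linear:
    """Fold eval-mode BatchNorm1d statistics into the preceding Linear."""
    scale = bn.weight / torch.sqrt(bn.running_var + bn.eps)
    fused = nn.Linear(linear.in_features, linear.out_features)
    with torch.no_grad():
        fused.weight.copy_(linear.weight * scale[:, None])
        bias = linear.bias if linear.bias is not None else torch.zeros(linear.out_features)
        fused.bias.copy_((bias - bn.running_mean) * scale + bn.bias)
    return fused

linear = nn.Linear(8, 4)
bn = nn.BatchNorm1d(4).eval()  # eval mode: BN uses its running statistics
with torch.no_grad():          # make the running stats nontrivial
    bn.running_mean.uniform_(-1, 1)
    bn.running_var.uniform_(0.5, 1.5)

x = torch.randn(3, 8)
fused = fuse_linear_bn1d(linear, bn)
assert torch.allclose(fused(x), bn(linear(x)), atol=1e-5)
```

The fused layer reproduces `bn(linear(x))` exactly (up to float tolerance) while halving the number of layers on the inference path.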

🍰

v3.4.5

08 Sep 19:50
d6a0cad
feat(pruning): knapsack label selector
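A knapsack label selector frames pruning-group selection as a 0/1 knapsack problem: keep the groups that maximize total importance score under a latency budget. A pure-Python DP sketch of that framing (function name, integer costs, and the toy numbers are illustrative, not the ENOT API):

```python
def knapsack_select(scores, costs, budget):
    """0/1 knapsack over pruning groups: maximize total kept score
    subject to an integer latency budget."""
    # best[c] = (best total score, chosen group indices) at cost <= c
    best = [(0.0, []) for _ in range(budget + 1)]
    for i in range(len(scores)):
        # iterate costs downward so each group is taken at most once
        for c in range(budget, costs[i] - 1, -1):
            cand = best[c - costs[i]][0] + scores[i]
            if cand > best[c][0]:
                best[c] = (cand, best[c - costs[i]][1] + [i])
    return best[budget]

score, kept = knapsack_select(scores=[6.0, 10.0, 12.0], costs=[1, 2, 3], budget=5)
# groups 1 and 2 fit the budget of 5 with the highest total score, 22.0
assert score == 22.0 and kept == [1, 2]
```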

🍰

v3.4.4

08 Sep 11:45
d6a0cad
fix: broken GroupNorm weight revert
refactor: revert method that restores the original (non-prunable) model

v3.4.3

01 Sep 15:49
d6a0cad
feat: latency-prioritized trust-region formation
feat: pruning state serialization/deserialization

fix: AvgPool/Flatten and output quantization for STM quantization scheme
fix: arbitrary dims in prunable GroupNorm and memory optimization

🍰