Releases: ENOT-AutoDL/ENOT_Tutorials
v3.5.2
v3.5.1
feat: InstanceNorm pruning
fix: pruning info deserialization error on CPU-only devices (map_location argument)
🍰
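The fix above refers to PyTorch's `map_location` argument. A minimal sketch of the underlying mechanism (plain `torch.save`/`torch.load`, not ENOT's actual deserialization code): without `map_location`, loading a checkpoint saved on a GPU fails on a CPU-only machine.

```python
import os
import tempfile

import torch

# Save a tensor checkpoint, then load it on a CPU-only machine.
# Without map_location, torch.load tries to restore each tensor to the
# device it was saved on and raises an error when CUDA is unavailable.
state = {"weight": torch.ones(2, 3)}

with tempfile.TemporaryDirectory() as tmp:
    path = os.path.join(tmp, "pruning_info.pt")
    torch.save(state, path)
    # map_location="cpu" remaps every tensor's storage to CPU on load.
    restored = torch.load(path, map_location="cpu")

print(restored["weight"].device)  # prints: cpu
```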
v3.5.0
feat: support Python 3.10
refactor: rm NAS (neural architecture search)
docs: clarify distill context manager and adapters behavior
🍰
v3.4.8
feat: support PyTorch 2.x
feat: new quantization distiller interface - context decorator
fix: decrease memory consumption of quantization distiller
🍰
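The "context decorator" interface mentioned above follows a standard Python pattern: one object usable both as a context manager and as a decorator. A hedged sketch using the stdlib `contextlib.ContextDecorator` with a hypothetical `distill` name (ENOT's real distiller API is not shown here):

```python
from contextlib import ContextDecorator

events = []  # records the order of setup/teardown for illustration


class distill(ContextDecorator):
    """Hypothetical sketch: the same object serves as a context manager
    (`with distill(): ...`) and as a decorator (`@distill()`)."""

    def __enter__(self):
        events.append("enter")  # e.g. attach distillation hooks
        return self

    def __exit__(self, *exc):
        events.append("exit")   # e.g. remove hooks, free buffers
        return False


# Used as a context manager:
with distill():
    events.append("body")


# Used as a decorator: setup/teardown wrap every call.
@distill()
def calibrate():
    events.append("calibrate")


calibrate()
# events == ["enter", "body", "exit", "enter", "calibrate", "exit"]
```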
v3.4.7
feat: handle latency measurement errors in OptimalPruningLabelSelector
feat: starting points generator and tutorial
feat: score logging for the knapsack label selector
docs: ENOT prunable modules (separate package)
refactor: rm integrated module replacement from ENOT (moved to the ENOT prunable modules package); rm redundant deps
🍰
v3.4.6
feat: Linear + BatchNorm1d fusion for FakeQuantized model
🍰
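Folding inference-mode BatchNorm statistics into the preceding Linear layer is a standard transform; the math behind the fusion above can be sketched in NumPy (generic sketch only, ENOT's FakeQuantized fusion internals are not shown):

```python
import numpy as np


def fuse_linear_bn(W, b, gamma, beta, mean, var, eps=1e-5):
    """Fold BatchNorm1d (eval mode, running stats) into the preceding
    Linear layer so one affine op replaces the pair:
        BN(Wx + b) = W'x + b'  with per-output-channel rescaling."""
    scale = gamma / np.sqrt(var + eps)   # per-output-channel scale
    W_fused = W * scale[:, None]         # rescale each output row
    b_fused = (b - mean) * scale + beta
    return W_fused, b_fused


rng = np.random.default_rng(0)
W = rng.normal(size=(4, 3))
b = rng.normal(size=4)
gamma = rng.normal(size=4)
beta = rng.normal(size=4)
mean = rng.normal(size=4)
var = rng.random(4) + 0.1
x = rng.normal(size=3)

# Reference: Linear followed by BatchNorm1d in eval mode.
y_ref = gamma * ((W @ x + b) - mean) / np.sqrt(var + 1e-5) + beta
W_f, b_f = fuse_linear_bn(W, b, gamma, beta, mean, var)
assert np.allclose(W_f @ x + b_f, y_ref)
```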
v3.4.5
feat(pruning): knapsack label selector
🍰
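A knapsack label selector frames pruning as a 0/1 knapsack problem: pick which channel groups to keep so total importance score is maximized while total latency cost stays within a budget. A generic dynamic-programming sketch of that idea (integer costs assumed; not ENOT's actual implementation):

```python
def knapsack_select(scores, costs, budget):
    """0/1 knapsack DP: choose items (e.g. channel groups to keep)
    maximizing total score subject to a total-cost budget.
    Costs and budget are assumed to be non-negative integers."""
    n = len(scores)
    # best[c] = max score achievable with total cost <= c
    best = [0.0] * (budget + 1)
    # choice[i][c]: whether item i is taken at capacity c
    choice = [[False] * (budget + 1) for _ in range(n)]
    for i in range(n):
        # Iterate capacity downward so each item is used at most once.
        for c in range(budget, costs[i] - 1, -1):
            cand = best[c - costs[i]] + scores[i]
            if cand > best[c]:
                best[c] = cand
                choice[i][c] = True
    # Backtrack to recover the selected item indices.
    picked, c = [], budget
    for i in range(n - 1, -1, -1):
        if choice[i][c]:
            picked.append(i)
            c -= costs[i]
    return best[budget], sorted(picked)


score, picked = knapsack_select([6.0, 10.0, 12.0], [1, 2, 3], budget=5)
# Items 1 and 2 fit the budget (cost 2 + 3 = 5) with score 22.0.
```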
v3.4.4
fix: broken revert of GroupNorm weights
refactor: revert method for restoring original (non-prunable) models
v3.4.3
feat: latency-prioritized trust region formation
feat: pruning state serialization/deserialization
fix: AvgPool/Flatten and output quantization for STM quantization scheme
fix: arbitrary dims in prunable GroupNorm and memory optimization
🍰