Implementing Contrastive Self-Supervised Learning with Radiation Augmentations, SimCLR, PyTorch Lightning, and Hyperparameter Optimization #48
Closed
author Jordan Stomps <[email protected]> 1666192691 -0400
committer Jordan Stomps <[email protected]> 1691503697 -0400

Squashed commit messages:

- removing accidental jupyter notebook inclusion
- implementing contrastive learning with pytorch lightning, pytorch-metric-learning, and designed augmentations
- creating background augmentation
- adding sig2bckg augmentation
- adding masking augmentation
- testing an implementation of gain shift
- formalizing gain-shift method in augmentation class
- adding fit functions and implementation for resolution augmentation
- experimenting with new gain shift
- correcting positive gain drift formulation
- adding resampler as second candidate for drift
- adding gain-shift algorithm
- manual testing
- adding resampling noise to resolution transformation
- rough draft nuclear interactions
- complete design of nuclear interactions
- condensing gain_shift algorithms
- cleaning and finalizing docs for gain_shift
- addressing edge cases with DANSE.resolution
- [WIP] attempting to improve escape peak intensities
- correcting fit roi for nuclear interactions
- bug fix for mask augmentation
- adding a peak count conservation method to resolution augmentation
- adding init to scripts folder
- overhaul of augmentations to address experience in example use
- expect background spectra to be resampled before being used
- initializing necessary PyTorch and SimCLR scripts
- collecting more NT-Xent implementations
- making classes for augmentations and data management
- finish draft adaptation for minos
- WIP bugfixing dry run
- hunting a float/long type error
- debugging projection head output
- debugged ballooning representations and supervised raw_scores; learning rates too high
- adding ability for different minos data
- major refactor to pytorch-metric-learning by Kevin Musgrave
- churning results and adding projection head
- saving pytorch lightning implementation
- adding functionality for background subtraction in contrastive learning
- pep8
- bug fixing semi-supervised labeled loss alpha scaling term
- changing resample from Poisson->Binomial
- bugfixing and removing extraneous print statements
- adding effective learning rate for small batch size and potential functionality for projection head EMA
- added some functionalities for using AdamW instead of LARS
- adding input arg for specifying augmentations
- adjusting syntax errors
- adding CNN functionality
- working functionality for squeezing vectors dependent on convolution
- using os
- catching missing max pooling
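The commit history above mentions statistical resampling of spectra (including a switch from Poisson to Binomial resampling) as part of the augmentations. As context only, here is a minimal sketch of what such a resampling augmentation could look like; the function name, signature, and defaults are assumptions for illustration, not the actual API in `augs.py`.

```python
import numpy as np

def resample_spectrum(counts, frac=0.9, rng=None, mode="binomial"):
    """Statistically resample a gamma-ray spectrum (illustrative sketch only;
    the name and behavior here are assumptions, not the repository's API).

    mode="poisson":  redraw each channel from a Poisson distribution whose
                     mean is the observed count (total counts not conserved).
    mode="binomial": thin each channel binomially, keeping each recorded
                     count with probability `frac` (never exceeds the
                     original counts, one motivation for Poisson -> Binomial).
    """
    rng = np.random.default_rng() if rng is None else rng
    counts = np.asarray(counts)
    if mode == "poisson":
        return rng.poisson(counts).astype(float)
    return rng.binomial(counts.astype(int), frac).astype(float)

# Example: two stochastic "views" of the same spectrum for a contrastive pair.
spectrum = np.random.default_rng(0).poisson(50.0, size=1000)
view_a = resample_spectrum(spectrum, mode="binomial")
view_b = resample_spectrum(spectrum, mode="binomial")
```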
Closing in favor of #52.
This PR constitutes the bulk of my code (excluding notebooks used to generate and analyze results) used in my dissertation. Some highlights:

- Radiation augmentations (`augs.py`), including a way to use them in contrastive learning (`transforms.py`).
- Neural network architectures (`ann.py` / `lightModel.py`): a multilayer perceptron (`LinearNN`), a convolutional neural network (`ConvNN`), and a projection head (`critic.py`).
- SimCLR contrastive learning implemented in PyTorch (`SlimCLR.py`) and PyTorch Lightning (`SlimCLRLight.py`), using the package `pytorch-metric-learning` for a normalized cross-entropy loss function based on SimCLR (see the sketch after this list).
- Hyperparameter optimization for the self-supervised model (`SSLHyperOpt.py`) and the projection head (`ProjHyperOpt.py`) using the package `hyperopt` (a sketch follows at the end of this description).
- A conda environment file (`contrastive-environment.yml`).
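For context, a minimal sketch of a SimCLR-style contrastive step using `NTXentLoss` from `pytorch-metric-learning`; the encoder, projection head, and dimensions are placeholder assumptions and do not reflect the actual classes in `ann.py`, `lightModel.py`, or `critic.py`.

```python
import torch
from pytorch_metric_learning.losses import NTXentLoss

# Hypothetical encoder/projection-head stand-ins; the real models live in
# ann.py / lightModel.py / critic.py and will differ.
encoder = torch.nn.Sequential(torch.nn.Linear(1000, 256), torch.nn.ReLU())
projector = torch.nn.Linear(256, 64)

# Normalized temperature-scaled cross-entropy (NT-Xent), as used by SimCLR.
loss_fn = NTXentLoss(temperature=0.5)

def simclr_step(view_a, view_b):
    """One contrastive step: embed two augmented views of the same batch and
    treat matching rows as positives (they share a label across the views)."""
    z_a = projector(encoder(view_a))
    z_b = projector(encoder(view_b))
    embeddings = torch.cat([z_a, z_b], dim=0)
    batch = view_a.shape[0]
    labels = torch.arange(batch).repeat(2)  # row i of view_a pairs with row i of view_b
    return loss_fn(embeddings, labels)

loss = simclr_step(torch.randn(32, 1000), torch.randn(32, 1000))
loss.backward()
```

The key convention is that the two augmented views of sample `i` share label `i`; `NTXentLoss` then treats them as a positive pair against every other row in the concatenated batch.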
There is a lot of code in this branch, so it will undoubtedly make sense to split it up. I also need to clean up the scripts: many include functions that I ended up not using, or that were borrowed from other people (e.g. `specTools.py` from Ken) and require proper attribution. If we want to move toward reviewing and merging this PR, we should probably first finish reviewing and merging the preliminary-work PRs #42, #44, #45, and #46.
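As a rough illustration of the `hyperopt`-based tuning mentioned above, the following is a hedged sketch of a TPE search over a few SimCLR hyperparameters; the search space, `max_evals`, and the `train_and_evaluate` objective are hypothetical and not taken from `SSLHyperOpt.py` or `ProjHyperOpt.py`.

```python
from hyperopt import fmin, hp, tpe, Trials

# Hypothetical search space; the actual spaces are defined in
# SSLHyperOpt.py / ProjHyperOpt.py and will differ.
space = {
    "lr": hp.loguniform("lr", -12, -4),            # ~6e-6 to ~1.8e-2
    "temperature": hp.uniform("temperature", 0.05, 1.0),
    "batch_size": hp.choice("batch_size", [64, 128, 256]),
}

def objective(params):
    # Run a (short) contrastive training job with `params` and return a
    # scalar to minimize, e.g. validation NT-Xent loss. `train_and_evaluate`
    # is an assumed user-supplied function, stubbed here for illustration.
    return train_and_evaluate(**params)

trials = Trials()
best = fmin(fn=objective, space=space, algo=tpe.suggest,
            max_evals=50, trials=trials)
print(best)
```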