As a stop-gap until we are able to devote the time to productionize our segmentation pipeline, we should store the scripts that were used in the SSF data generation process in a central location along with the minimum documentation needed to remind ourselves what we did, should we need to reprocess something by hand.
- Create GitHub repository to store scripts
- Get examples of scripts to run:
  - Training of the denoiser
  - Denoising
  - Segmentation
  - Classification
  - Trace extraction
- Write up the process for running these scripts by hand such that another developer could push a single ophys experiment through that pipeline if necessary (see the sketch below).
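As a rough illustration of what the hand-run documentation might cover, the sketch below chains the five stages for a single ophys experiment in order. None of the script names, paths, or flags are taken from the actual SSF scripts; they are placeholders only, assumed for the sake of the example.

```python
"""Hypothetical sketch of pushing one ophys experiment through the
stop-gap pipeline by hand. All script names, paths, and arguments are
placeholders (assumptions), not the real SSF scripts."""
import subprocess
from pathlib import Path

EXPERIMENT_ID = "1234567890"  # placeholder ophys experiment id
WORK_DIR = Path("/scratch/ssf") / EXPERIMENT_ID  # placeholder working directory

# Ordered stages of the pipeline; each entry is (stage name, command).
STAGES = [
    ("train_denoiser", ["python", "train_denoiser.py",
                        "--experiment-id", EXPERIMENT_ID,
                        "--out", str(WORK_DIR / "denoiser_model")]),
    ("denoise",        ["python", "denoise_movie.py",
                        "--experiment-id", EXPERIMENT_ID,
                        "--model", str(WORK_DIR / "denoiser_model")]),
    ("segment",        ["python", "segment_rois.py",
                        "--experiment-id", EXPERIMENT_ID]),
    ("classify",       ["python", "classify_rois.py",
                        "--experiment-id", EXPERIMENT_ID]),
    ("extract_traces", ["python", "extract_traces.py",
                        "--experiment-id", EXPERIMENT_ID]),
]


def run_pipeline() -> None:
    """Run each stage in order, stopping on the first failure."""
    WORK_DIR.mkdir(parents=True, exist_ok=True)
    for name, cmd in STAGES:
        print(f"Running stage: {name}")
        # check=True raises CalledProcessError so a failed stage halts the run.
        subprocess.run(cmd, check=True)


if __name__ == "__main__":
    run_pipeline()
```

The real write-up would replace these placeholders with the actual script locations and any per-experiment configuration each stage needs.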
This work should be limited to no more than 1.5 days of effort, since this is not what we want our final pipeline to look like.