This ticket is being written to refine the work in #374.
To better control the code used for denoising, we want to write our own wrapper around TensorFlow to perform the inference step of denoising (the act of actually applying a trained model to a noisy movie to produce a denoised movie).
- The module should conform to Pika's coding standards.
- The module should be well tested.
- The module should use our typical argschema-driven CLI (see the sketch below).
- The module should run efficiently on CPU nodes (it may have the option of utilizing GPU nodes). As a baseline, we have been able to denoise a 20 GB movie on a CPU node in about four hours using this branch of deepinterpolation: https://github.com/danielsf/deepinterpolation/tree/staging/ophys_etl
The `core_inferrence` class in deepinterpolation can be used as a guide: https://github.com/danielsf/deepinterpolation/blob/staging/ophys_etl/deepinterpolation/inferrence_collection.py#L264
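As a rough illustration of the CLI shape, here is a minimal argschema sketch. The module name, schema fields, and defaults are placeholders, not a settled interface:

```python
import argschema


class InferenceInputSchema(argschema.ArgSchema):
    # Field names here are assumptions, not the final design
    model_path = argschema.fields.InputFile(
        required=True,
        description="path to the trained deepinterpolation model")
    video_path = argschema.fields.InputFile(
        required=True,
        description="path to the noisy movie to denoise")
    output_path = argschema.fields.OutputFile(
        required=True,
        description="path at which to write the denoised movie")
    batch_size = argschema.fields.Int(
        required=False, default=8,
        description="number of frame windows per inference batch")


class DenoisingInferenceModule(argschema.ArgSchemaParser):
    default_schema = InferenceInputSchema

    def run(self):
        self.logger.info(f"denoising {self.args['video_path']}")
        # run_inference is sketched under "Components" below
        # run_inference(self.args['model_path'],
        #               self.args['video_path'],
        #               self.args['output_path'])


if __name__ == "__main__":
    DenoisingInferenceModule().run()
```

Structuring it as an `ArgSchemaParser` module gives us the usual `--input_json` handling and field validation for free, consistent with our other modules.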
Components that will need to be implemented:
- running the model's `predict` method on the data provided by the data iterator (a sketch follows below)
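A minimal sketch of the data iterator plus `predict` call, loosely modeled on `core_inferrence`. The HDF5 layout (a `"data"` dataset), the `(pre_frame, post_frame)` windowing, and the omission of the input normalization deepinterpolation applies are all assumptions/simplifications:

```python
import h5py
import numpy as np
import tensorflow as tf


class MovieIterator(tf.keras.utils.Sequence):
    """Serve windows of the noisy movie as model inputs.

    Reads frames from HDF5 one batch at a time so memory stays flat
    even for the 20 GB movies mentioned above.
    """

    def __init__(self, movie_path, batch_size=8, pre_frame=30, post_frame=30):
        self.movie_path = movie_path
        self.batch_size = batch_size
        self.pre_frame = pre_frame
        self.post_frame = post_frame
        with h5py.File(movie_path, "r") as f:
            n_frames = f["data"].shape[0]
        # only frames with a full window of neighbors are denoised here;
        # real code would also handle the edges of the movie
        self.first = pre_frame
        self.last = n_frames - post_frame

    def __len__(self):
        return int(np.ceil((self.last - self.first) / self.batch_size))

    def __getitem__(self, idx):
        start = self.first + idx * self.batch_size
        stop = min(start + self.batch_size, self.last)
        with h5py.File(self.movie_path, "r") as f:
            batch = []
            for center in range(start, stop):
                # surrounding frames, excluding the center frame itself
                frames = np.concatenate([
                    f["data"][center - self.pre_frame:center],
                    f["data"][center + 1:center + 1 + self.post_frame]])
                # model expects (height, width, n_input_frames)
                batch.append(np.moveaxis(frames, 0, -1))
        return np.stack(batch).astype("float32")


def run_inference(model_path, movie_path, output_path):
    # compile=False avoids needing the custom training loss at inference time
    model = tf.keras.models.load_model(model_path, compile=False)
    denoised = model.predict(MovieIterator(movie_path))
    with h5py.File(output_path, "w") as f:
        # predict returns (n_frames, height, width, 1); drop the channel axis
        f.create_dataset("data", data=denoised.squeeze(-1))
```

Real code would also apply the same normalization the model was trained with, and would likely need to chunk the movie across workers to hit the four-hour CPU baseline.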
Tasks

Validation

Take a sample of the noisy movies from
`/allen/programs/mindscope/workgroups/surround/motion_correction_labeling_2022`,
the corresponding models from
`/allen/programs/mindscope/workgroups/surround/denoising_labeling_2022/bespoke_models`,
and reproduce the corresponding denoised movies from
`/allen/programs/mindscope/workgroups/surround/denoising_labeling_2022/denoised_movies`.
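A possible spot check for the reproduction step. The file names and the `"data"` dataset key are assumptions about how the movies are stored, and bit-identical output is probably too strict a bar across TensorFlow versions and hardware, hence the tolerance:

```python
import h5py
import numpy as np

# Hypothetical file names; compare a movie denoised by the new module
# against the corresponding reference from denoised_movies.
with h5py.File("new_module_denoised.h5", "r") as new_file, \
        h5py.File("reference_denoised.h5", "r") as ref_file:
    new_movie = new_file["data"][()]
    ref_movie = ref_file["data"][()]

assert new_movie.shape == ref_movie.shape
# allow small numerical drift rather than demanding exact equality
np.testing.assert_allclose(new_movie, ref_movie, rtol=1e-4, atol=1e-4)
```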