# MEG_workshop_2024

- Calendar
- Biowulf Processing HPC-On-Demand (v1)
- Biowulf Processing SSH Tunnel (v2)
- Install Code
- Create Data

## Day 1 (11/18/2024)

| Time | Topic | Presenter |
| --- | --- | --- |
| 9:00 - 9:10 | Bagel Config + Coffee Download + Computer Setup | |
| 9:10 - 9:20 | Course Intro | Jeff |
| 9:20 - 10:15 | Intro to MEG and general overview of source localization | Fred |
| 10:15 - 10:45 | MEG Hardware and Signal Generation / Collection | Stephen |
| 10:45 - 11:00 | Break | |
| 11:00 - 11:45 | Stimuli, trigger processing, epochs, evoked data, bad chans, bad epochs, filtering, aberrant signals | Tom + Anna |
| 11:45 - 12:30 | Lunch | |
| 12:30 - 1:00 | Basic Linux/Biowulf Demo (more info: biowulf bash/terminal) | Allison |
| 1:00 - 1:30 | Filtering data, averaging/evoked data, frequency analysis, Hilbert transform, brain rhythms | Allison |
| 1:30 - 2:00 | Lab 1 - Preprocessing | |
| 2:00 - 2:45 | Lab 2 - Frequency Analysis | |
| 2:45 - 3:00 | Break | |
| 3:00 - 3:45 | MRI Processing: placing fiducials, coreg, source model (volume + surface), BEM, Forward Model | Anna + Jeff |
| 3:45 - 4:30 | Lab 3 - MRI Integration | |

## Day 2 (11/19/2024)

| Time | Topic | Presenter |
| --- | --- | --- |
| 9:00 - 9:15 | Bagel Config + Coffee Download + Computer Setup | |
| 9:15 - 10:00 | Using the MEG lab and MEG core services | Anna |
| 10:00 - 11:00 | Source Localization (dipoles, multiple dipoles, MNE, dSPM, beamformer) | Jeff + Tom |
| 11:00 - 11:15 | Break | |
| 11:15 - 12:00 | Lab 4 - Source Localization | |
| 12:00 - 12:45 | Lunch | |
| 12:45 - 1:30 | Single subject to group data | Jeff |
| 1:30 - 2:15 | Lab 5 - Beamforming | |
| 2:15 - 2:45 | Git | Jeff |
| 2:45 - 3:00 | Break - Squid Meet & Greet | |
| 2:00 - 2:30 | Lab 6 - Group Data Analysis | |
| 3:00 - 3:45 | Statistics (parametric / log transform / clusters / BLOBs) | Fred |
| 3:45 - 4:30 | Possibilities of MEG and Course Review | Allison |

## Biowulf users (must be NIH associated)

https://hpcondemand.nih.gov <- **CLICK ON THIS LINK** to get to the webpages shown below.

(Screenshots: HPC login -> provision a session -> open a graphical session)

(Screenshot: Biowulf desktop with a terminal open)

Copy the following lines into your terminal. This will copy the code/notebooks and data into your local folder.

```bash
sinteractive --mem=16G --cpus-per-task=12 --gres=lscratch:10  #Wait for this to start
module use --append /data/MEGmodules/modulefiles  #You can add this to your .bashrc for convenience
module load meg_workshop

get_code   #Copy the code to your current directory
get_data   #Copy and untar the data to /data/${USER}/meg_workshop_data

cd NIMH_MEG_workshop
jupyter lab
```

(Screenshot: Jupyter starting up in the terminal)

## Alternative version using SSH Tunnels -- Biowulf users (must be NIH associated)

Log into biowulf: `ssh -Y [email protected]`

```bash
#You can type tmux before starting sinteractive to keep a persistent session if your wifi disconnects
#Allocate resources for processing
sinteractive --mem=16G --cpus-per-task=12 --gres=lscratch:10 --tunnel --time=08:00:00
```

You will see a line like the one below. Follow the instructions (start a new terminal logged into biowulf and run the printed ssh command), then return to the original terminal for the rest of the commands.
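A minimal sketch of what the printed tunnel command looks like (the port number is assigned per session, so yours will differ):

```bash
#Run the printed command in a NEW terminal on your local machine and leave it running.
#41985 is a made-up example port; use whichever port sinteractive printed for you.
ssh -L 41985:localhost:41985 [email protected]
```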

Copy notebooks to your local folder. Change directories first if you don't want the code/data in your home folder.

```bash
module use --append /data/MEGmodules/modulefiles  #You can add this to your .bashrc for convenience
module load meg_workshop

get_code   #Copy the code to your current directory
get_data   #Copy and untar the data to /data/${USER}/meg_workshop_data
```

```bash
cd NIMH_MEG_workshop
./start_notebook_Day1.sh  #Start the notebook for Day1 - allows for time series scrolling
#OR for Day2 material: ./start_notebook_Day2.sh  - visualize the 3D brain renderings
```

Enter `localhost:<PORT>` into the address bar of your web browser, using the port number from the tunnel command.

(Screenshot: Jupyter login page)

NOTE: If you are asked for a token, copy it from the command line where Jupyter was started.
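If the token has already scrolled out of view, you can list the running servers and their tokens from the Biowulf terminal (standard Jupyter tooling, assuming a recent JupyterLab):

```bash
jupyter server list   #Prints the URLs of running servers, including the ?token=... value
```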

## Install (not required for biowulf users)

The following software is required to run all parts of the coding sections: AFNI + FreeSurfer + git + miniconda(/conda).
To run the majority of the code, only miniconda/conda + git are required.

Miniconda provides the minimum features for the installation:
https://docs.conda.io/projects/miniconda/en/latest/
In a terminal, change to your Downloads folder (typically `cd /home/<USERNAME>/Downloads` on Linux or `cd /Users/<USERNAME>/Downloads` on Mac).
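If you prefer to fetch the installer from the terminal instead of a browser, something like the following works on Linux (substitute the macOS installer name on a Mac):

```bash
#Download the latest Linux x86_64 Miniconda installer into the current directory
wget https://repo.anaconda.com/miniconda/Miniconda3-latest-Linux-x86_64.sh
```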

```bash
chmod +x Miniconda3-latest.....sh   #Make this file executable - fill in the rest of the name (it will include Linux, MacOSX, or Windows)
./Miniconda3-latest...sh            #Run the installer. Open a new terminal after finishing the installation directions
```

Mamba is not required, but it installs packages faster than conda (functionally they are the same).
To install mamba: `conda install --channel=conda-forge --name=base mamba`
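To confirm that mamba is available (it accepts the same subcommands as conda):

```bash
mamba --version   #Should print the mamba and conda versions
```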

### Install (version 1) - requires make, mamba, git

```bash
git clone https://github.com/nih-megcore/NIMH_MEG_workshop.git
cd NIMH_MEG_workshop
make install
```

### Install (version 2)

```bash
#Clone this repository - if you don't have git, just download the zip file from the green button at the top of the page
git clone https://github.com/nih-megcore/NIMH_MEG_workshop.git

#Install MNE - substitute conda for mamba if you hit any errors
mamba create --override-channels --channel=conda-forge --name=MEG_workshop mne==1.5 pip jupyterlab -y
conda activate MEG_workshop
pip install h5io pymatreader

#Install the workshop files
cd NIMH_MEG_workshop
pip install -e .    #Install this code
pip install git+https://github.com/nih-megcore/nih_to_mne.git  #Install some auxiliary NIH code
```
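To verify the environment is usable before the course, a quick sanity check (MNE should report version 1.5.x):

```bash
conda activate MEG_workshop
python -c "import mne; print(mne.__version__)"   #Confirms MNE imports and shows its version
jupyter lab --version                            #Confirms JupyterLab is on the PATH
```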

## Install dataset

This is only needed if you want to create the dataset from scratch; use the link provided by email if you prefer to download it.

```bash
mamba create -n datalad -c conda-forge datalad gdown -y
conda activate datalad
```

This will pull the NIMH_hv OpenNeuro repository and associated files, including the FreeSurfer-processed MRI files:

```bash
./extras/datalad_pull.sh
```
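For reference, the script's core is a standard datalad fetch. A rough sketch of the equivalent manual steps, assuming the NIMH_hv dataset's OpenNeuro accession is ds004215 (check the script for the actual ID and paths):

```bash
#Clone the lightweight dataset skeleton, then fetch the annexed file content
datalad clone https://github.com/OpenNeuroDatasets/ds004215.git   #ds004215 is an assumed ID
cd ds004215
datalad get -r sub-*/ses-01/meg   #Recursively download the MEG session data
```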

## Install auxiliary code (not necessary for course Day 1)

- Install FreeSurfer: https://surfer.nmr.mgh.harvard.edu/fswiki/rel7downloads (see the setup sketch below)
- Install AFNI: https://afni.nimh.nih.gov/pub/dist/doc/htmldoc/background_install/main_toc.html
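After installing FreeSurfer, you also need to source its setup script in each new shell (standard FreeSurfer setup; the install path below is an assumption - point it at wherever you installed FreeSurfer):

```bash
export FREESURFER_HOME=/usr/local/freesurfer   #Assumed install location; adjust to yours
source $FREESURFER_HOME/SetUpFreeSurfer.sh     #Puts recon-all and other tools on your PATH
```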

## Check installs

Run `./check_requirements.sh` to check for FreeSurfer / AFNI / MNE / Jupyter installation.
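If the script reports a problem, these one-liners can help narrow down what is missing (run them with the MEG_workshop environment active):

```bash
which recon-all                 #FreeSurfer on the PATH?
afni -ver                       #AFNI version string
python -c "import mne"          #MNE importable?
jupyter --version               #Jupyter components installed?
```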

## Data Format for code

- Input data is in BIDS format from the NIH HV protocol.
- Data should be located in `/data/${USER}/meg_workshop_data` (the `get_data` command sets this up automatically on biowulf).
- Derivatives data will be in `/data/${USER}/meg_workshop_data/{Day1,Day2}/${bids_id}/ses-01/meg/`.
- Day2 derivatives have pre-calculated bem, fwd, trans, and src files (see the listing below).
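To see what a subject's Day2 derivatives look like, a quick listing (sub-XXXXX is a hypothetical placeholder; substitute a real subject ID from the dataset):

```bash
bids_id=sub-XXXXX   #Hypothetical placeholder; use a real subject ID
ls /data/${USER}/meg_workshop_data/Day2/${bids_id}/ses-01/meg/   #Should show the bem, fwd, trans, and src files
```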