Basic analysis example: checks task
Preprocessing and applying a general linear model using AFNI
This is a very simple visual task, with alternating 12s blocks of flashing checkerboard stimuli in the left and right visual hemifields. Because of the contralateral organization of visual cortex, we can identify the right visual cortex by selecting voxels that prefer stimulation on the left side of visual space, and vice versa. Here, we provide a bare-bones example using the AFNI software package.
Step 1: Download data from XNAT and automatically convert to BIDS format with xnat-tools
In this example, we will use the data from demodat participant 005, session 1. Running the following series of commands on the command line in Oscar will download the data we need, convert it to BIDS format, and run the BIDS validator to check for any issues. We will be using the new xnat-tools Oscar utility script explained here.
First, we need to create a configuration .toml file that contains the information xnat-tools needs to download the correct data and put it where we want. Let's call this file x2b_demodat_config.toml and place it wherever you'd like (simplest would be your home directory). Paste the following into your .toml file, and change mail-user to your email address. The script defaults to placing the downloaded and BIDS-converted data in a folder called "bids-export" in your home directory; if you'd like to change this location, add a new line at the bottom with your desired path, e.g. bids_root="/oscar/home/<yourusername>/xnat-export". Make sure to save this .toml file when you are done editing.
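Below is a minimal sketch of what this file might look like, written as a bash heredoc so you can paste it straight into a terminal. The [slurm-args] and [xnat2bids-args] section names are assumptions based on the xnat-tools documentation linked above; verify them there, and fill in the project/subject/session settings that page describes.

cat > ~/x2b_demodat_config.toml <<'EOF'
[slurm-args]
mail-user = "yourname@brown.edu"    # change to your email address

[xnat2bids-args]
# Identify the demodat project/subject/session here, per the xnat-tools docs.
# Optional override of the default output location (uncomment to use):
# bids_root = "/oscar/home/<yourusername>/xnat-export"
EOF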
To run the xnat-tools export and BIDS conversion, change directory to /oscar/data/bnc/shared/scripts/oscar-scripts/. On the command line, type:
module load anaconda/latest
python run_xnat2bids.py --config ~/x2b_demodat_config.toml
If you named your .toml file differently or placed it somewhere other than your home directory, make sure to include the full path to your file and the correct filename. Enter your XNAT username and password when prompted.
The script will print status output to the terminal as it submits your xnat2bids job.
If you entered your email address, you should receive an email when your xnat2bids job begins, and another when it finishes.
This will create a source data folder for subject 005 within $bids_root/bnc/study-demodat/xnat-export and a BIDS-compatible data directory for subject 005 within $bids_root/bnc/study-demodat/bids/.
✳️ We will call this output BIDS-compatible folder (/oscar/home/<yourusername>/xnat-export/bnc/study-demodat/bids/ if you used the example bids_root above, or the equivalent path under your own $bids_root) $bidsdir for the remainder of the tutorial.
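If you are working in a bash shell, you can define it once so the later commands in this tutorial can be pasted verbatim (adjust the path to match your bids_root):

bidsdir=/oscar/home/<yourusername>/xnat-export/bnc/study-demodat/bids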
Step 2: Extract stimulus timing information from stimulus presentation output files
To make our data BIDS compatible and facilitate future data sharing, we need to create events.tsv files that correspond to each of our functional runs and contain information about each stimulus event of interest (onset time, condition, etc.). First, download the participant's data files (in our case, created by PsychoPy) and place them in the sourcedata subfolder of your BIDS directory, in a subfolder named 'beh'. So, for this participant and session, the full path should be: $bidsdir/sourcedata/sub-005/ses-session1/beh.
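Assuming a bash shell and that the PsychoPy files are sitting in a local download folder (the source path below is a placeholder), the placement might look like:

# create the behavioral sourcedata folder and copy the PsychoPy files into it
mkdir -p $bidsdir/sourcedata/sub-005/ses-session1/beh
cp /path/to/downloaded/psychopy/files/* $bidsdir/sourcedata/sub-005/ses-session1/beh/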
Next, download our example python script make_events.py, and run it from the command line with python make_events.py --bids_dir $bidsdir --subj sub-005 --sess ses-session1. For this script to run, you'll need both numpy and pandas installed in your python environment (if you're doing this on Oscar and you run module load anaconda/latest, you should be all set). This script will create BIDS-formatted events.tsv files corresponding to each functional run in $bidsdir/sub-005/ses-session1/func/.
If you are unable to run this script for any reason, you can download the events.tsv output files here and manually place them in $bidsdir/sub-005/ses-session1/func/.
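For reference, a BIDS events.tsv file is a tab-separated table whose required onset and duration columns are in seconds, plus optional columns such as trial_type. The rows below are purely illustrative (made-up onsets and condition names, not the actual demodat timings):

onset	duration	trial_type
10.0	12.0	checks_left
34.0	12.0	checks_right
58.0	12.0	checks_left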
Step 3: Convert events.tsv files into AFNI stimulus timing files
We needed to make those events.tsv files for BIDS compatibility, but to run our statistical analysis in AFNI, we need to transform them into the .1D text files AFNI requires for specifying stimulus timing information. Instead of one file per run, as we had with the events.tsv files, here we need one file per condition (e.g. left hemifield checks), with one line per run of the task, specifying all the onset times for that condition. We have created an example python script make_afni_stimtimes.py, which you can run from the command line just as you did make_events.py: python make_afni_stimtimes.py --bids_dir $bidsdir --subj sub-005 --sess ses-session1. This will create stimulus timing files in $bidsdir/derivatives/afni/sub-005/ses-session1/stimtimes/.
If you are unable to run this script for any reason, you can download the .1D files here and manually place them in $bidsdir/derivatives/afni/sub-005/ses-session1/stimtimes/.
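For reference, each stimulus timing file is plain text with one row per run, listing that condition's onset times in seconds relative to the start of the run. Illustrative (made-up) contents of checks_left_stimtimes.1D for a three-run session might be:

12 60 108 156
36 84 132 180
12 60 108 156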
Step 4: Use afni_proc.py to create a simple preprocessing stream and run the general linear model for the checks task
✳️ To access AFNI on Oscar, type module load afni/21.2.04.
This basic example of a univariate analysis with AFNI is based on example 6b from the afni_proc.py documentation. The -blocks flag lists the processing blocks that will be executed, in order:
tshift (slice time correction)
align (aligning the EPIs to the anatomical scan)
volreg (motion correction within each functional run)
blur (spatial smoothing with a 4mm FWHM smoothing kernel)
mask (create a "brain" mask from the functional data, restricted by the anatomy)
scale (scale each voxel to have a mean of 100 per run)
regress (build a general linear model and execute with 3dREMLfit)
For the regression, we use -regress_stim_times to provide the checks_left_stimtimes.1D and checks_right_stimtimes.1D files for this participant, -regress_stim_labels to assign those conditions the labels "left" and "right" respectively, -regress_basis to model each stimulus as a block lasting 12 seconds, and -regress_opts_3dD to specify our contrasts. Here, we define a "left_vs_right" contrast to find voxels whose activity is greater for left hemifield stimulation than for right, and a "right_vs_left" contrast that does the opposite (and should yield the same statistical map, but with opposite-signed t-values).
Copy the text in the box below (changing the value of the bidsdir variable to your own location - the path should end in /bids), save it as a file called demodat_afniproc.sh, and then execute it on the command line with bash demodat_afniproc.sh. This demodat_afniproc.sh script will then create a much longer proc.sub-005 tcsh script, which will be executed automatically because we included the -execute flag at the bottom of the script. Looking at the proc.sub-005 script is the best way to gain a deeper understanding of each of AFNI's processing steps.
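The box below is a sketch assembled from the options described above rather than a verbatim copy of the original script, so treat it as a starting point: the BIDS file names passed to -copy_anat and -dsets (the T1w name, the task-checks label, and the run wildcard) are assumptions you should check against your own anat/ and func/ folders.

#!/bin/bash
# demodat_afniproc.sh -- sketch of an afni_proc.py call implementing the
# steps described above. Verify the input file names before running.

bidsdir=/oscar/home/<yourusername>/xnat-export/bnc/study-demodat/bids  # change me; must end in /bids
subj=sub-005
sess=ses-session1
stimdir=${bidsdir}/derivatives/afni/${subj}/${sess}/stimtimes

afni_proc.py \
    -subj_id ${subj} \
    -out_dir ${bidsdir}/derivatives/afni/${subj}/${sess}/${subj}.checks.results \
    -blocks tshift align volreg blur mask scale regress \
    -copy_anat ${bidsdir}/${subj}/${sess}/anat/${subj}_${sess}_T1w.nii.gz \
    -dsets ${bidsdir}/${subj}/${sess}/func/${subj}_${sess}_task-checks_run-*_bold.nii.gz \
    -blur_size 4.0 \
    -regress_stim_times ${stimdir}/checks_left_stimtimes.1D \
                        ${stimdir}/checks_right_stimtimes.1D \
    -regress_stim_labels left right \
    -regress_basis 'BLOCK(12,1)' \
    -regress_reml_exec \
    -regress_opts_3dD -gltsym 'SYM: left -right' -glt_label 1 left_vs_right \
                      -gltsym 'SYM: right -left' -glt_label 2 right_vs_left \
    -execute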
After the demodat_afniproc.sh script executes successfully, a results directory will be created: $bidsdir/derivatives/afni/sub-005/ses-session1/sub-005.checks.results. Start AFNI from within this directory (just type afni on the command line; see the commands below), and set the underlay to anat_final.sub-005 and the overlay to stats.sub-005_REML. In the Define Overlay menu, set the OLay to "#7 left_vs_right#0_Coef" and the Thr to "#8 left_vs_right#0_Tstat", and change the threshold to your desired alpha (here we've used p = 0.001). This left vs. right contrast shows regions of the brain with a stronger BOLD response to left than to right visual hemifield stimulation, so we can easily localize the right visual cortex and the right LGN, as expected.
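In other words (assuming the $bidsdir shell variable defined earlier):

cd $bidsdir/derivatives/afni/sub-005/ses-session1/sub-005.checks.results
afni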