Behavior and Neuroimaging Core User Manual
Basic analysis example: checks task

Preprocessing and applying a general linear model using AFNI

Last updated 1 year ago

This is a very simple visual task, with alternating 12s blocks of flashing checkerboard stimuli in the left and right visual hemifields. Because of the contralateral organization of visual cortex, we can identify the right visual cortex by selecting voxels that prefer stimulation on the left side of visual space, and vice versa. Here, we provide a bare-bones example using the AFNI software package.

Step 1: Download data from XNAT and automatically convert to BIDS format with xnat-tools

In this example, we will use the data from demodat participant 005, session 1. Running the following series of commands on the command line in Oscar will download the data we need, convert it to BIDS format, and run the BIDS validator to check for any issues. We will be using the new xnat-tools Oscar utility script, explained in the Oscar Utility Script section of this manual.

First, we need to create a configuration .toml file that contains the information xnat-tools needs to download the correct data and put it where we want. Let's call this file x2b_demodat_config.toml and place it wherever you'd like (the simplest option is your home directory). Paste the following into your .toml file, and change mail-user to your email address. The script will default to placing the downloaded and BIDS-converted data in a folder called "bids-export" in your home directory; if you'd like to change this location, add a new line at the bottom with your desired path, e.g.: bids_root="/oscar/home/<yourusername>/xnat-export". Make sure to save this .toml file when you are done editing.

# Configuring arguments here will override default parameters.
[slurm-args]
mail-user = "example-user@brown.edu"
mail-type = "ALL"

[xnat2bids-args]
sessions = [
    "XNAT_E00114"
    ]
skipseq=[6]
overwrite=true
verbose=1
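If you prefer, the whole file (with the optional bids_root line appended at the bottom) can be written in one step from the shell; the email address and bids_root path below are placeholders you should replace with your own:

```shell
# Write the example config to the home directory.
# The bids_root line is optional; delete it to use the default ~/bids-export.
cat > "$HOME/x2b_demodat_config.toml" <<'EOF'
# Configuring arguments here will override default parameters.
[slurm-args]
mail-user = "example-user@brown.edu"
mail-type = "ALL"

[xnat2bids-args]
sessions = [
    "XNAT_E00114"
    ]
skipseq=[6]
overwrite=true
verbose=1
bids_root="/oscar/home/<yourusername>/xnat-export"
EOF
```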

To run the xnat-tools export and BIDS conversion, change directory to /oscar/data/bnc/shared/scripts/oscar-scripts/. On the command line, type:

module load anaconda/latest

python run_xnat2bids.py --config ~/x2b_demodat_config.toml

If you named your .toml file differently or placed it somewhere other than your home directory, make sure to include the full path to your file and the correct filename. Enter your XNAT username and password when prompted.

You should receive output that looks like this:

Enter XNAT Username: elorenc1
Enter Password: 
DEBUG: {'message': 'Argument List', 'session': 'XNAT_E00114', 'slurm_param_list': ['--time 04:00:00', '--mem 16000', '--nodes 1', '--cpus-per-task 2', '--job-name xnat2bids', '--mail-user example-user@brown.edu', '--mail-type ALL', '--output /oscar/scratch/elorenc1/logs/%x-XNAT_E00114-%J.txt'], 'x2b_param_list': ['XNAT_E00114', '/users/elorenc1/bids-export/', '--host https://xnat.bnc.brown.edu', '--user elorenc1', '--skipseq 6', '--overwrite', '--verbose']}
DEBUG: {'message': 'Executing xnat2bids', 'session': 'XNAT_E00114', 'command': ['sbatch', '--time', '04:00:00', '--mem', '16000', '--nodes', '1', '--cpus-per-task', '2', '--job-name', 'xnat2bids', '--mail-user', 'example-user@brown.edu', '--mail-type', 'ALL', '--output', '/oscar/scratch/elorenc1/logs/%x-XNAT_E00114-%J.txt', '--wrap', 'apptainer', 'exec', '--no-home', '-B', '/users/elorenc1/bids-export/', '/oscar/data/bnc/simgs/brownbnc/xnat-tools-v1.2.1.sif', 'xnat2bids', '[XNAT_E00114,', '/users/elorenc1/bids-export/,', '--host,', 'https://xnat.bnc.brown.edu,', '--user,', 'elorenc1,', '--skipseq,', '6,', '--overwrite,', '--verbose]']}
INFO: Launched 1 xnat2bids job
INFO: Job ID: 9992280
INFO: Launched bids-validator to check BIDS compliance
INFO: Job ID: 9992281
INFO: Processed Scans Located At: /users/elorenc1/bids-export/

If you entered your email address, you should receive an email when your xnat2bids job begins, and another when it finishes.

This will create a source data folder for subject 005 within $bids_root/bnc/study-demodat/xnat-export and a BIDS-compatible data directory for subject 005 within $bids_root/bnc/study-demodat/bids/.

✳️ We will call this output BIDS-compatible folder $bidsdir for the remainder of the tutorial. If you kept the default $bids_root, it will be /oscar/home/<yourusername>/bids-export/bnc/study-demodat/bids/; otherwise it will sit under whatever $bids_root location you specified.
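The shell commands in the rest of this tutorial reference $bidsdir, so it is convenient to set it once per session. A sketch, assuming the default bids-export location (adjust the path if you set a custom bids_root):

```shell
# Point $bidsdir at the BIDS-converted output for the rest of the tutorial.
# Expected layout after the export job finishes:
#   $bids_root/bnc/study-demodat/xnat-export/  <- raw exported source data
#   $bids_root/bnc/study-demodat/bids/         <- BIDS-compatible data
export bidsdir="$HOME/bids-export/bnc/study-demodat/bids"
echo "$bidsdir"
```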

Step 2: Extract stimulus timing information from stimulus presentation output files.

To make our data BIDS compatible and facilitate future data sharing, we need to create events.tsv files that correspond to each of our functional runs and contain information about each stimulus event of interest (onset time, condition, etc.). First, download the participant's behavioral data files (in our case, created by PsychoPy) and place them in the sourcedata subfolder of your BIDS directory, in a subfolder named 'beh'. So, for this participant and session, the full path should be: $bidsdir/sourcedata/sub-005/ses-session1/beh. Next, download our example python script make_events.py, and run it from the command line with python make_events.py --bids_dir $bidsdir --subj sub-005 --sess ses-session1. For this script to run, you'll need both numpy and pandas installed in your python environment (if you're doing this on Oscar and you run module load anaconda/latest, you should be all set). This script will create BIDS-formatted events.tsv files corresponding to each functional run in $bidsdir/sub-005/ses-session1/func/.
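The events.tsv files follow the standard BIDS events format: tab-separated, with at least onset and duration columns plus any extra columns such as trial_type. A minimal sketch of the format (the onset values and column set here are illustrative, not the real task timing):

```shell
# Build an illustrative BIDS events.tsv (values are made up for demonstration).
printf 'onset\tduration\ttrial_type\n' >  example_events.tsv
printf '0.0\t12.0\tleft\n'             >> example_events.tsv
printf '12.0\t12.0\tright\n'           >> example_events.tsv
cat example_events.tsv
```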

If you are unable to run this script for any reason, you can download the events.tsv output files here (sub-005_ses-01_eventsfiles.zip, attached below), and manually place them in $bidsdir/sub-005/ses-session1/func/ .

Step 3: Convert events.tsv files into AFNI stimulus timing files

We needed to make those events.tsv files for BIDS compatibility, but in order to run our statistical analysis in AFNI, we need to transform them into .1D text files required by AFNI for specifying stimulus timing information. Instead of one file per run, as we had with the events.tsv files, here we need one file per condition (e.g. left hemifield checks), with one line per run of the task, specifying all the onset times for that condition. We have created an example python script make_afni_stimtimes.py, which you can run from the command line just as you did make_events.py: python make_afni_stimtimes.py --bids_dir $bidsdir --subj sub-005 --sess ses-session1 . This will create stimulus timing files in $bidsdir/derivatives/afni/sub-005/ses-session1/stimtimes/ .
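For reference, an AFNI stimulus timing .1D file is plain text with one row per run, each row listing that condition's onset times in seconds. A sketch for one condition across two hypothetical runs (the onsets are illustrative, not the real task timing):

```shell
# Illustrative AFNI stim timing file: one line per run, onsets in seconds.
printf '0 24 48 72 96\n'   >  example_left_stimtimes.1D
printf '12 36 60 84 108\n' >> example_left_stimtimes.1D
cat example_left_stimtimes.1D
```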

If you are unable to run this script for any reason, you can download the .1D files here and manually place them in $bidsdir/derivatives/afni/sub-005/ses-session1/stimtimes/.

Step 4: Use afni_proc.py to create a simple preprocessing stream and run the general linear model for the checks task

✳️ To access AFNI on Oscar, type module load afni/21.2.04.

This basic example of a univariate analysis with AFNI is based on example 6b from the afni_proc.py help. The -blocks flag lists the processing blocks that will be executed, in order:

  1. tshift (slice time correction)

  2. align (aligning the EPIs to the anatomical scan)

  3. volreg (motion correction within each functional run)

  4. blur (spatial smoothing with a 4mm FWHM smoothing kernel)

  5. mask (create a "brain" mask from the functional data, restricted by the anatomy)

  6. scale (scale each voxel to have a mean of 100 per run)

  7. regress (build a general linear model and execute it with 3dREMLfit)

For the regression, we use -regress_stim_times to provide the checks_left_stimtimes.1D and checks_right_stimtimes.1D files for this participant, -regress_stim_labels to assign those conditions the labels of "left" and "right" respectively, -regress_basis to model each stimulus as a block lasting 12 seconds, and -regress_opts_3dD to specify our contrasts. Here, we do a "left_vs_right" contrast to find voxels whose activity is greater for left hemifield stimulation than for right, and a "right_vs_left" contrast that does the opposite (and should yield the same statistical map, but with opposite-signed t-values).

Copy the text in the box below (changing value of the bidsdir variable to your own location - path should end in /bids), save it as a file called demodat_afniproc.sh, and then execute it on the command line with bash demodat_afniproc.sh. This demodat_afniproc.sh script will then create a much longer proc.sub-005 tcsh script, which will be automatically executed because we included the -execute flag at the bottom of the script. Looking at the proc.sub-005 script is the best way to gain a deeper understanding of each of AFNI's processing steps.

demodat_afniproc.sh
#!/bin/bash

bidsdir=/oscar/path/to/bids
subID='sub-005'
sess='ses-session1'
task='checks'

afni_proc.py                                                         \
    -subj_id                  $subID                                 \
    -out_dir                  $bidsdir/derivatives/afni/$subID/$sess/$subID.$task.results  \
    -copy_anat                $bidsdir/$subID/$sess/anat/${subID}_${sess}_acq-memprageRMS_T1w.nii.gz        \
    -anat_has_skull           yes                                     \
    -dsets                    $bidsdir/$subID/$sess/func/*$task*nii*                 \
    -blocks                   tshift align volreg blur mask scale regress                          \
    -tcat_remove_first_trs    0                                      \
    -align_opts_aea           -cost lpc+ZZ                           \
                                -giant_move                            \
                                -check_flip                            \
    -volreg_align_to          MIN_OUTLIER                            \
    -volreg_align_e2a                                                \
    -mask_epi_anat            yes                                    \
    -blur_size                4.0                                    \
    -regress_stim_times       $bidsdir/derivatives/afni/$subID/$sess/stimtimes/${subID}_${task}_left_stimtimes.1D $bidsdir/derivatives/afni/$subID/$sess/stimtimes/${subID}_${task}_right_stimtimes.1D          \
    -regress_stim_labels      left right                                \
    -regress_basis            'BLOCK(12,1)'                          \
    -regress_opts_3dD         -jobs 2                                \
                                -gltsym 'SYM: left -right'                \
                                -glt_label 1 left_vs_right                       \
                                -gltsym 'SYM: right -left'                \
                                -glt_label 2 right_vs_left                       \
    -regress_motion_per_run                                          \
    -regress_censor_motion    0.3                                    \
    -regress_censor_outliers  0.05                                   \
    -regress_reml_exec                                               \
    -regress_compute_fitts                                           \
    -regress_make_ideal_sum   sum_ideal.1D                           \
    -regress_est_blur_epits                                          \
    -regress_est_blur_errts                                          \
    -regress_run_clustsim     yes                                     \
    -html_review_style        pythonic                               \
    -execute

After the demodat_afniproc.sh script executes successfully, a results directory will be created: $bidsdir/derivatives/afni/sub-005/ses-session1/sub-005.checks.results. Start AFNI from within this directory (just type afni on the command line), set the underlay to anat_final.sub-005 and the overlay to stats.sub-005_REML. In the Define Overlay menu, set the OLay to "#7 left_vs_right#0_Coef" and the Thr to "#8 left_vs_right#0_Tstat", and change the threshold to your desired alpha (here we've used p = 0.001). This left vs. right contrast shows regions of the brain that show a stronger BOLD response to left vs. right visual hemifield stimulation, so we can easily localize the right visual cortex and the right LGN, as expected.


Downloads:

  • sub-005_ses-1_beh.zip (6KB archive): Demodat subject 005 session 1 behavioral data. For now, we'll only be using the two datafiles for the hemifield localizer task, with "LRChx" in their filenames.

  • make_events.py (6KB): example python script to read in csv files created by PsychoPy and create the events.tsv files corresponding to each fMRI run

  • sub-005_ses-01_eventsfiles.zip (3KB archive)

  • make_afni_stimtimes.py (2KB): example python script to read in events.tsv files from each functional run and output the .1D stimulus timing files AFNI needs

  • sub-005_stimtimes.zip (2KB archive)

Figure: Results of a general linear test contrasting left vs. right visual hemifield stimulation, in demodat subject 005 session 1.