Keywords
Attention Network Task, ANT, fMRI, Parkinson’s disease, attention
Attention dysfunction is a common symptom of Parkinson’s disease (PD) and has a significant impact on quality of life. Approximately half of all people with PD suffer from attention and/or memory symptoms (Barone et al., 2009).
The data included here are a subset of data from a study (Cholerton et al., 2013) that used the Attention Network Test (ANT) (Fan et al., 2005) to measure three aspects of attention: alerting (achieving and maintaining an alert state), orienting (selecting the spatial location of sensory input), and executive control (resolving conflict). We acquired structural and functional MRI images on two occasions from participants with and without PD, with six randomly ordered repetitions of the ANT task (labeled 1–6) at each occasion. Each numbered run corresponds to the same stimulus list across subjects, although the six runs were presented to each subject in a different order.
Data described in this paper have previously been analyzed in Boord et al. (2017) and Madhyastha et al. (2015), wherein the runs were labeled A-F rather than 1–6.
Procedures were approved by the University of Washington Institutional Review Board (#41304) and subjects provided written informed consent.
The sample includes 25 participants with PD and 21 healthy controls (HC) who participated in two scanning sessions held one to three weeks apart. PD participants were recruited from a larger parent study in which they underwent extensive clinical examination and neuropsychological assessment (Cholerton et al., 2013).
Demographic information is provided in Table 1. PD and HC participants did not differ in age (t(40) = 1, p = 0.2) or years of education (t(40) = 0.6, p = 0.6), but did differ on part III of the Movement Disorder Society-sponsored revision of the Unified Parkinson’s Disease Rating Scale (MDS-UPDRS; Goetz et al., 2007) (t(30) = 10, p < .001). Participants also underwent a battery of neuropsychological tests (Cholerton et al., 2013); results are provided in Table 2. PD and HC participants did not differ on any of the cognitive tests administered to both groups, and HC participants completed only a subset of the measures.
One subject (RC4206) had an acquisition error during the second-session structural scan. Consequently, the structural scan from the first session was copied to the second session to create a valid Brain Imaging Data Structure (BIDS) directory.
At each of the two sessions, we acquired six repetitions of the task and T1-weighted structural images from each subject. Data were acquired using a Philips 3.0T X-Series Achieva MR System (Philips Medical Systems, software version R2.6.3) with a 32-channel SENSE head coil. Each session included functional and structural scans. For task scans, whole-brain axial echo-planar images (43 sequential ascending slices, 3 mm isotropic voxels, field of view = 240 x 240 x 129 mm, repetition time = 2400 ms, echo time = 25 ms, flip angle = 79°, SENSE acceleration factor = 2) were collected parallel to the AC-PC line. Each functional scan was 149 volumes (5.96 min). A sagittal T1-weighted 3D MPRAGE (176 slices, matrix size = 256 x 256, inversion time = 1100 ms, turbo-field echo factor = 225, repetition time = 7.46 ms, echo time = 3.49 ms, flip angle = 7°, shot interval = 2530 ms) with 1 mm isotropic voxels was also acquired for registration and tissue analyses.
In total, 45 subjects completed all six task scans in both sessions. One subject did not complete the second session, and one subject is missing the first four of the six task scans from the second session.
Most scans were available in the Digital Imaging and Communications in Medicine (DICOM) file format and were converted to the Neuroimaging Informatics Technology Initiative (NIfTI) format using the Analysis of Functional NeuroImages (AFNI) program dcm2niix_afni. For subjects with missing DICOMs, Philips PAR/REC files were available and were likewise converted to NIfTI with dcm2niix_afni (Day et al., 2019).
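For reference, a minimal conversion call has the following form; the directories and the -f naming pattern are placeholders, not the exact command used for this dataset.
# Sketch: convert one subject's DICOM (or PAR/REC) series to compressed NIfTI
# with the AFNI-distributed dcm2niix wrapper; adjust paths to the local layout.
dcm2niix_afni -z y -f "%p_%s" -o nifti_out/ dicom_in/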
We used the ANT (Fan et al., 2005; Fan et al., 2002), which combines cues and targets within a single reaction time task to measure the efficiency of the alerting, orienting, and executive attention networks. Each session included six separate task runs. Each run included two buffer trials followed by 36 reaction time trials (36 trials × six runs × two sessions = 432 reaction time trials per subject).
A full description of the ANT can be found in Fan et al. (2005). Briefly, subjects are asked to determine the direction (left or right) of a central arrow, which is flanked by four other arrows. These flanker arrows either point in the same direction as the target arrow (“congruent”) or in the opposite direction (“incongruent”). The row of arrows appears either above or below the center of the screen, and before the arrows are displayed, participants are presented with a) no cue, b) a spatial cue indicating where the arrows will appear, or c) a center cue. A fixation cross is displayed throughout the trial.
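As described in Fan et al. (2005), the three network effects can be derived behaviorally as simple differences in mean reaction time (RT) between these conditions:
- Alerting effect = RT(no cue) − RT(center cue)
- Orienting effect = RT(center cue) − RT(spatial cue)
- Executive (conflict) effect = RT(incongruent) − RT(congruent)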
fMRI data were preprocessed using AFNI (Cox, 1996), version AFNI_17.3.00 (Oct. 12, 2017). Processing steps were generated with afni_proc.py (version 5.18, Sept. 12, 2017), treating each repetition of the ANT task as a single scan (i.e. no concatenation).
afni_proc.py call
The first four parameters are set on a per-subject basis and are represented here with asterisks (*).
afni_proc.py \
-subj_id * \
-dsets * \
-outdir * \
-script * \
-copy_anat T1.nii.gz \
-blocks despike tshift align tlrc volreg blur mask regress \
-align_opts_aea -cost lpc+ZZ \
-tlrc_base MNI152_T1_2009c+tlrc \
-tlrc_NL_warp \
-volreg_warp_dxyz 2 \
-volreg_align_e2a \
-volreg_tlrc_warp \
-volreg_align_to MIN_OUTLIER \
-regress_anaticor \
-regress_est_blur_epits \
-regress_est_blur_errts
We used the following blocks: despike, tshift (default), align, tlrc, volreg (default), blur (default), mask (default), and regress (default). Frames were despiked and slice-timing corrected (tshift). During the align stage, the functional data were aligned to the structural image using the lpc+ZZ cost function. Following structural alignment, the data were aligned to the Montreal Neurological Institute (MNI) 152 standard space (2009c) template, blurred with a 4 mm full-width half-maximum filter, and masked using 3dAutomask. Frames were registered to the minimum-outlier volume and then warped to standard space. We used ANATICOR (Jo et al., 2010) to regress out the white matter signal and remove the effects of motion. The final result of the AFNI processing was converted to NIfTI using AFNI 3dAFNItoNIFTI. All scans completed AFNI processing.
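The exact export command is not listed above; a typical call (dataset names hypothetical) looks like the following.
# Sketch: export the regression output (errts) from AFNI BRIK/HEAD format to
# compressed NIfTI; the input prefix shown here is a hypothetical example.
3dAFNItoNIFTI -prefix errts.sub-01.anaticor.nii.gz errts.sub-01.anaticor+tlrc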
The anatomical scans were defaced using pydeface before being organized in BIDS format. Skull-stripping and registration were performed on the undefaced anatomical scans.
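For reference, defacing with pydeface takes the following form (file names are placeholders).
# Sketch: deface one session's T1-weighted image; pydeface writes
# <input>_defaced.nii.gz by default, and --outfile overrides the output name.
pydeface sub-01_ses-1_T1w.nii.gz --outfile sub-01_ses-1_T1w_defaced.nii.gz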
Data are organized according to the Brain Imaging Data Structure (BIDS) (Gorgolewski et al., 2016). All 47 subjects have two sessions, with corresponding func/ and anat/ directories.
The AFNI-processed data are included in derivatives/, matching the organization of the raw NIfTI data. Also included for convenience are skull-stripped anatomical images, as skull-stripping is known to occasionally fail on defaced images.
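The skull-stripping tool is not named here; within an AFNI-based pipeline a typical call (tool choice and file names are assumptions) would be the following.
# Sketch: skull-strip the undefaced T1-weighted image with AFNI's 3dSkullStrip;
# input and output names are hypothetical.
3dSkullStrip -input sub-01_ses-1_T1w.nii.gz -prefix sub-01_ses-1_T1w_brain.nii.gz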
Finally, individual scans have matching JSON sidecar files in both the raw and derivative datasets, created by dcm2niix_afni. Supplementing these files are higher-level JSON files (following the naming convention task-ANT?_bold.json) that supply the “TaskName” and “SliceTiming” parameters. Slice timing information is required by the BIDS format, and because the pre-processed (“derivatives”) data have already been slice-timing corrected, an array of zeros is provided for this field.
Task timing data are included at the scan level. The “onset” and “duration” columns are in seconds, and the “trial_type” column includes cue events (“CenterCue,” “SpatialCue,” “NoCue”), target events (“Congruent,” “Incongruent”), and cue/target errors (“CueErr,” “TargetErr”). Only correct-response trials are included in the cue and target event types; error events are also generated when the subject responded too early or not at all.
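As a quick check of the event files, the trial_type labels can be tallied directly from the tab-separated events files; the path glob below assumes the BIDS layout described in this paper.
# Sketch: count occurrences of each trial_type label across all session-1
# events files; the column index is read from each file's header row.
awk -F'\t' '
  FNR == 1 { for (i = 1; i <= NF; i++) if ($i == "trial_type") col = i; next }
  { count[$col]++ }
  END { for (t in count) print t, count[t] }
' sub-*/ses-1/func/*_events.tsv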
The processing script (afniscript.sh) and demographic information (demographics.csv) are included at the top level.
OpenNeuro: ANT: Healthy aging and Parkinson’s disease. https://doi.org/10.18112/openneuro.ds001907.v2.0.3 (Day et al., 2019)
This project contains the following underlying data, organized into one directory per subject. Each subject directory contains:
- ses-1/anat (T1w.json and defaced T1w.nii.gz files for session 1)
- ses-1/func (bold.json, bold.nii.gz and events.tsv files for runs 1–6 of session 1)
- ses-2/anat (T1w.json and defaced T1w.nii.gz files for session 2)
- ses-2/func (bold.json, bold.nii.gz and events.tsv files for runs 1–6 of session 2)
OpenNeuro: ANT: Healthy aging and Parkinson’s disease. https://doi.org/10.18112/openneuro.ds001907.v2.0.3 (Day et al., 2019)
This project contains the following extended data:
- .bidsignore (file to suppress BIDS naming warning messages)
- afniscript.sh (processing script)
- dataset_description.json (BIDS dataset parameters)
- demographics.csv (demographic information for participants)
- README (README file, including changelog)
- task-ANT_bold.json (acquisition parameters for task scan)
- derivatives/ (AFNI-processed functional images within func/ directories; skull-stripped anatomical images within anat/)
Data are available under the terms of the Creative Commons Zero "No rights reserved" data waiver (CC0 1.0 Public domain dedication).
- Source code available from: https://github.com/IBIC/UdallANT
- Archived source code at time of publication: https://doi.org/10.5281/zenodo.2847832 (Day, 2019)
- License: MIT
This research was supported by NIH RC4 NS073008 (PI: Grabowski) and P50 NS062684 (PI: Montine). Peter Boord received postdoctoral support under the Ruth L. Kirschstein National Research Service Award, T32AG0000258.
The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.
We are grateful to the participants of the Pacific Udall Center for contributing their time and data to advance Parkinson’s research.
Reviewer 1
Is the rationale for creating the dataset(s) clearly described?
Yes
Are the protocols appropriate and is the work technically sound?
Yes
Are sufficient details of methods and materials provided to allow replication by others?
Partly
Are the datasets clearly presented in a useable and accessible format?
Yes
Competing Interests: No competing interests were disclosed.
Reviewer Expertise: Neuroimaging in neurological diseases
Reviewer 2
Is the rationale for creating the dataset(s) clearly described?
Yes
Are the protocols appropriate and is the work technically sound?
Yes
Are sufficient details of methods and materials provided to allow replication by others?
Yes
Are the datasets clearly presented in a useable and accessible format?
Yes
Competing Interests: No competing interests were disclosed.
Reviewer Expertise: Parkinson's disease, neuroimaging