Usage Notes

Execution and the BIDS format

The main input to ASLPrep is the path of the BIDS dataset that needs processing.

The input dataset is required to be in valid BIDS format, and it must include at least one T1w structural image. We highly recommend that you validate your dataset with the free, online BIDS Validator.


Please note that BIDS expects ASL and M0 data to already be scaled in the raw dataset. In other words, the M0 scans in your dataset should be scaled before you run ASLPrep.

Please see the BIDS starter kit for information about converting ASL data to BIDS.

If your data are not already scaled, you should use the --m0_scale parameter when running ASLPrep.
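For example, to apply a scaling factor on the command line (the value 10 below is purely illustrative; use the factor appropriate to your acquisition):

```shell
# Hypothetical scaling factor; substitute the value for your acquisition.
$ aslprep data/bids_root/ out/ participant --m0_scale 10 -w work/
```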

The exact command to run ASLPrep depends on the Installation method. The common parts of the command follow the BIDS-Apps definition. For example:

aslprep data/bids_root/ out/ participant -w work/

Command-Line Arguments

ASLPrep: ASL PREProcessing workflows v0.5.0

usage: aslprep [-h] [--version] [--skip_bids_validation]
               [--participant-label PARTICIPANT_LABEL [PARTICIPANT_LABEL ...]]
               [--bids-filter-file FILE] [--anat-derivatives PATH]
               [--nprocs NPROCS] [--omp-nthreads OMP_NTHREADS]
               [--mem MEMORY_GB] [--low-mem] [--use-plugin FILE] [--anat-only]
               [--boilerplate_only] [--md-only-boilerplate] [-v]
               [--ignore {fieldmaps,sbref} [{fieldmaps,sbref} ...]]
               [--output-spaces [OUTPUT_SPACES [OUTPUT_SPACES ...]]]
               [--asl2t1w-init {register,header}] [--asl2t1w-dof {6,9,12}]
               [--force-bbr | --force-no-bbr] [--force-ge | --force-no-ge]
               [--m0_scale M0_SCALE] [--random-seed RANDOM_SEED]
               [--dummy-vols DUMMY_VOLS] [--smooth_kernel SMOOTH_KERNEL]
               [--scorescrub] [--basil]
               [--skull-strip-template SKULL_STRIP_TEMPLATE]
               [--skull-strip-t1w {auto,skip,force}] [--fmap-bspline]
               [--fmap-no-demean] [--use-syn-sdc] [--force-syn]
               [--fs-license-file FILE] [-w WORK_DIR] [--clean-workdir]
               [--resource-monitor] [--reports-only] [--run-uuid RUN_UUID]
               [--write-graph] [--stop-on-first-crash] [--notrack] [--sloppy]
               bids_dir output_dir {participant}

Positional Arguments


bids_dir

the root folder of a BIDS valid dataset (sub-XXXXX folders should be found at the top level in this folder).


output_dir

the output path for the outcomes of preprocessing and visual reports


Possible choices: participant

processing stage to be run, only “participant” in the case of ASLPrep (see BIDS-Apps specification).

Named Arguments


--version

show program’s version number and exit

Options for filtering BIDS queries

--skip_bids_validation, --skip-bids-validation

assume the input dataset is BIDS compliant and skip the validation

--participant-label, --participant_label

a space delimited list of participant identifiers or a single identifier (the sub- prefix can be removed)


--bids-filter-file

a JSON file describing custom BIDS input filters using PyBIDS. For further details, please check out


--anat-derivatives

Reuse the anatomical derivatives from another ASLPrep run or calculated with an alternative processing tool (NOT RECOMMENDED).

Options to handle performance

--nprocs, --nthreads, --n_cpus, --n-cpus

maximum number of threads across all processes


--omp-nthreads

maximum number of threads per process

--mem, --mem_mb, --mem-mb

upper bound memory limit for ASLPrep processes


--low-mem

attempt to reduce memory usage (will increase disk usage in working directory)

--use-plugin, --nipype-plugin-file

nipype plugin configuration file
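As a sketch, a Nipype plugin configuration file is a small YAML document naming the plugin and its arguments (the plugin name and resource values below are illustrative, not a recommendation):

```shell
# Write an illustrative Nipype plugin configuration (values are examples only)
$ cat > plugin.yml <<EOF
plugin: MultiProc
plugin_args:
  n_procs: 8
  memory_gb: 12
EOF
$ aslprep data/bids_root/ out/ participant --use-plugin plugin.yml -w work/
```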


--anat-only

run anatomical workflows only


--boilerplate_only

generate boilerplate only


--md-only-boilerplate

skip generation of HTML and LaTeX formatted citation with pandoc

-v, --verbose

increases log verbosity for each occurrence, debug level is -vvv

Workflow configuration


--ignore

Possible choices: fieldmaps, sbref

ignore selected aspects of the input dataset to disable corresponding parts of the workflow (a space delimited list)


treat dataset as longitudinal - may increase runtime


--output-spaces

Standard and non-standard spaces to resample anatomical and functional images to. Standard spaces may be specified by the form <SPACE>[:cohort-<label>][:res-<resolution>][...], where <SPACE> is a keyword designating a spatial reference, and may be followed by optional, colon-separated parameters. Non-standard spaces imply specific orientations and sampling grids. Note that the res-* modifier does not define the resolution used for spatial normalization. To generate no ASL outputs, use this option without specifying any spatial references.
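For instance, to resample outputs to the MNI152NLin2009cAsym template at 2 mm resolution as well as the subject’s anatomical space (a sketch; pick the spatial references your analysis needs):

```shell
$ aslprep data/bids_root/ out/ participant \
    --output-spaces MNI152NLin2009cAsym:res-2 anat \
    -w work/
```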


--asl2t1w-init

Possible choices: register, header

Either “register” (the default) to initialize volumes at center or “header” to use the header information when coregistering ASL to T1w images.


--asl2t1w-dof

Possible choices: 6, 9, 12

Degrees of freedom when registering ASL to T1w images. 6 degrees (rotation and translation) are used by default.


--force-bbr

Always use boundary-based registration (no goodness-of-fit checks)


--force-no-bbr

Do not use boundary-based registration (no goodness-of-fit checks)




--m0_scale

Relative scale between ASL and M0. M0 scans are multiplied by m0_scale before CBF is calculated. Note, however, that BIDS expects ASL and M0 data to be scaled in the raw dataset, so this parameter should only be used if your dataset does not contain pre-scaled data.


--random-seed

Initialize the random seed for the workflow


--dummy-vols

Number of initial volumes to ignore


--smooth_kernel

Smoothing kernel for the M0 image(s)


--scorescrub

apply the SCORE and SCRUB algorithms (Sudipto Dolui) for denoising CBF


--basil

FSL’s CBF computation with spatial regularization and partial volume correction

Specific options for ANTs registrations


--skull-strip-template

select a template for skull-stripping with antsBrainExtraction


do not use a random seed for skull-stripping - will ensure run-to-run replicability when used with --omp-nthreads 1 and matching --random-seed <int>


--skull-strip-t1w

Possible choices: auto, skip, force

determiner for T1-weighted skull stripping (‘force’ ensures skull stripping, ‘skip’ ignores skull stripping, and ‘auto’ applies brain extraction based on the outcome of a heuristic to check whether the brain is already masked).

Specific options for handling fieldmaps


--fmap-bspline

fit a B-Spline field using least-squares (experimental)


--fmap-no-demean

do not remove median (within mask) from fieldmap

Specific options for SyN distortion correction


--use-syn-sdc

EXPERIMENTAL: Use fieldmap-free distortion correction


--force-syn

EXPERIMENTAL/TEMPORARY: Use SyN correction in addition to fieldmap correction, if available

Specific options for FreeSurfer preprocessing


--fs-license-file

Path to FreeSurfer license key file. Get it (for free) by registering at

Other options

-w, --work-dir

path where intermediate results should be stored


--clean-workdir

Clears working directory of contents. Use of this flag is not recommended when running concurrent processes of aslprep.


--resource-monitor

enable Nipype’s resource monitoring to keep track of memory and CPU usage


--reports-only

only generate reports, don’t run workflows. This will only rerun report aggregation, not reportlet generation for specific nodes.


--run-uuid

Specify the UUID of a previous run, to include error logs in the report. No effect without --reports-only.


--write-graph

Write workflow graph.


--stop-on-first-crash

Force stopping on first crash, even if a work directory was specified.


--notrack

Opt out of sending tracking information of this run to the aslprep developers. This information helps to improve aslprep and provides an indicator of real-world usage crucial for obtaining funding.


--sloppy

Use low-quality tools for speed - TESTING ONLY

Running ASLPrep via Docker containers

For every new version of ASLPrep that is released, a corresponding Docker image is generated.

In order to run ASLPrep Docker images, the Docker Engine must be installed.

If you have used ASLPrep via Docker in the past, you might need to pull down a more recent version of the image:

$ docker pull pennlinc/aslprep:<latest-version>

ASLPrep can be run interacting directly with the Docker Engine via the docker run command line, or through a lightweight wrapper that was created for convenience.

Running ASLPrep directly interacting with the Docker Engine

Running containers as a user.

In order to run Docker smoothly, it is best to prevent permission issues associated with the root file system. Running Docker as a regular user on the host ensures proper ownership of the files written during container execution.

ASLPrep requires a significant amount of memory, typically around 12GB per subject. If you use Docker Desktop, you can set this limit in its preferences; you can also set it on the command line.

A docker container can be created using the following command:

$ docker run -ti -m 12GB --rm \
    -v path/to/data:/data:ro \
    -v path/to/output:/out \
    pennlinc/aslprep:<latest-version> \
    /data /out/out \
    participant

For example:

$ docker run -ti -m 12GB --rm \
    -v $HOME/ds000240:/data:ro \
    -v $HOME/ds000240-results:/out:rw \
    -v $HOME/tmp/ds000240-workdir:/work \
    -v ${FREESURFER_HOME}:/fs \
    pennlinc/aslprep:<latest-version> \
    /data /out/aslprep-<latest-version> \
    participant \
    --participant-label '01' \
    --fs-license-file /fs/license.txt \
    -w /work

See Usage for more information.

Running ASLPrep via Singularity containers

Preparing a Singularity Image

Singularity version >= 2.5: If the version of Singularity installed on your HPC system is modern enough you can create a Singularity image directly on the system using the following command:

$ singularity build aslprep-<version>.simg docker://pennlinc/aslprep:<version>

where <version> should be replaced with the desired version of ASLPrep that you want to download.

Running a Singularity Image

If the data to be preprocessed is also on the HPC or a personal computer, you are ready to run ASLPrep.

$ singularity run --cleanenv aslprep.simg \
    path/to/data/dir path/to/output/dir \
    participant \
    --participant-label label

Handling environment variables

By default, Singularity exposes all environment variables from the host inside the container. Host variables could accidentally conflict with the container’s environment. To avoid such a situation, it is recommended that you use the --cleanenv or -e flag. For instance:

$ singularity run --cleanenv aslprep.simg \
    /work/789/asdf/ $WORK/output \
    participant \
    --participant-label 01

Relevant aspects of the $HOME directory within the container. By default, Singularity will bind the user’s $HOME directory on the host into the /home/$USER directory (or equivalent) in the container. Most of the time, it will also redefine the $HOME environment variable and update it to point to the corresponding mount point in /home/$USER. However, these defaults can be overridden on your system. It is recommended that you check your settings with your system’s administrators. If your Singularity installation allows it, you can work around the $HOME specification by combining the bind mounts argument (-B) with the home overwrite argument (--home) as follows:

$ singularity run -B $HOME:/home/aslprep --home /home/aslprep \
    --cleanenv aslprep.simg <aslprep arguments>

The FreeSurfer license

ASLPrep uses FreeSurfer tools, which require a license to run.

To obtain a FreeSurfer license, simply register for free at

When using manually-prepared environments or singularity, FreeSurfer will search for a license key file first using the $FS_LICENSE environment variable and then in the default path to the license key file ($FREESURFER_HOME/license.txt). If using the --cleanenv flag and $FS_LICENSE is set, use --fs-license-file $FS_LICENSE to pass the license file location to ASLPrep.
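For instance, if $FS_LICENSE is set on the host and points to a file under a directory that Singularity binds into the container (such as your $HOME), the following sketch passes it through explicitly (paths are illustrative):

```shell
# $FS_LICENSE expands on the host; the file must be visible inside the container
$ singularity run --cleanenv aslprep.simg \
    path/to/data/dir path/to/output/dir \
    participant \
    --fs-license-file $FS_LICENSE
```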

It is possible to run the docker container pointing the image to a local path where a valid license file is stored. For example, if the license is stored in the $HOME/.licenses/freesurfer/license.txt file on the host system:

$ docker run -ti --rm \
    -v $HOME/fullds005:/data:ro \
    -v $HOME/dockerout:/out \
    -v $HOME/.licenses/freesurfer/license.txt:/opt/freesurfer/license.txt \
    pennlinc/aslprep:latest \
    /data /out/out \
    participant \
    --ignore fieldmaps


Logs and crashfiles are written to the <output dir>/aslprep/sub-<participant_label>/log directory. Information on how to customize and understand these files can be found on the nipype debugging page.

Support and communication

The documentation of this project is found here:

All bugs, concerns and enhancement requests for this software can be submitted here:

If you have a question about using aslprep, please create a new topic on NeuroStars with the “Software Support” category and the “aslprep” tag. The aslprep developers follow NeuroStars, and will be able to answer your question there.