Note

This page is reference documentation. It only explains the function signature, not how to use it. Please refer to the user guide for the big picture.

nilearn.image.concat_imgs

nilearn.image.concat_imgs(niimgs, dtype=<class 'numpy.float32'>, ensure_ndim=None, memory=None, memory_level=0, auto_resample=False, verbose=0)

Concatenate a list of 3D/4D niimgs of varying lengths.

The niimgs list can contain Nifti images or paths to images of varying dimensionality (3D or 4D), as well as images with different 3D shapes and affines; if auto_resample=True, they are resampled to match the first image in the list.

Parameters:
niimgs : iterable of Niimg-like objects or glob pattern

See Input and output: neuroimaging data representation. Niimgs to concatenate.

dtype : numpy dtype, default=np.float32

The dtype of the returned image.

ensure_ndim : int, default=None

Indicate the dimensionality of the expected niimg. An error is raised if the niimg is of another dimensionality.

auto_resample : bool, default=False

Converts all images to the space of the first one.

verbose : int, default=0

Verbosity level (0 means no message).

memory : None, instance of joblib.Memory, str, or pathlib.Path

Used to cache the masking process. By default, no caching is done. If a str is given, it is the path to the caching directory.

memory_level : int, default=0

Rough estimator of the amount of memory used by caching. Higher value means more memory for caching. Zero means no caching.

Returns:
concatenated : nibabel.Nifti1Image

A single image.
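
Example

A minimal usage sketch, not taken from the nilearn documentation: it builds two small synthetic 4D images with nibabel (the random data and identity affine are purely illustrative assumptions) and concatenates them along the fourth (time) axis.

import numpy as np
import nibabel as nib
from nilearn.image import concat_imgs

# Two small synthetic 4D images sharing the same shape and affine
# (random data and an identity affine, for illustration only).
img_a = nib.Nifti1Image(np.random.rand(4, 4, 4, 10).astype("float32"), np.eye(4))
img_b = nib.Nifti1Image(np.random.rand(4, 4, 4, 5).astype("float32"), np.eye(4))

# Concatenate along the fourth (time) axis; the result is a single 4D image.
combined = concat_imgs([img_a, img_b])
print(combined.shape)  # (4, 4, 4, 15)

File paths to images on disk can be passed in place of in-memory images, and auto_resample=True can be used when the inputs do not share the same grid.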

Examples using nilearn.image.concat_imgs

3D and 4D niimgs: handling and visualizing

Visualizing global patterns with a carpet plot

Single-subject data (two runs) in native space

Beta-Series Modeling for Task-Based Functional Connectivity and Decoding