Simple example of NiftiMasker use

Here is a simple example of automatic mask computation with NiftiMasker. The mask is computed and visualized.

Warning

If you are using a Nilearn version older than 0.9.0, you should either upgrade or import maskers from the input_data module instead of the maskers module.

That is, you should manually replace in the following example all occurrences of:

from nilearn.maskers import NiftiMasker

with:

from nilearn.input_data import NiftiMasker

Retrieve the brain development functional dataset

We fetch the dataset and print some basic information about it.

from nilearn.datasets import fetch_development_fmri

dataset = fetch_development_fmri(n_subjects=1)
func_filename = dataset.func[0]

print(f"First functional nifti image (4D) is at: {func_filename}")
[fetch_development_fmri] Dataset found in
/home/runner/nilearn_data/development_fmri
[fetch_development_fmri] Dataset found in
/home/runner/nilearn_data/development_fmri/development_fmri
First functional nifti image (4D) is at: /home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar123_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz

Compute the mask

As the input image is an EPI image, the background is noisy and we cannot rely on the 'background' masking strategy. We need to use the 'epi' one.

from nilearn.maskers import NiftiMasker

masker = NiftiMasker(
    standardize="zscore_sample",
    mask_strategy="epi",
    memory="nilearn_cache",
    memory_level=1,
    smoothing_fwhm=8,
    verbose=1,
)
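To see what the standardize="zscore_sample" option does conceptually, here is a toy numpy illustration (with made-up values, not Nilearn's actual implementation): each voxel's time series is z-scored using the sample standard deviation (ddof=1), which is what the "_sample" suffix refers to.

```python
import numpy as np

# Toy signals: 5 time points x 3 voxels (hypothetical values).
signals = np.array(
    [
        [1.0, 10.0, 0.5],
        [2.0, 12.0, 0.7],
        [3.0, 11.0, 0.6],
        [4.0, 13.0, 0.8],
        [5.0, 14.0, 0.9],
    ]
)

# Z-score each voxel (column) with the sample standard deviation (ddof=1).
standardized = (signals - signals.mean(axis=0)) / signals.std(axis=0, ddof=1)

print(standardized.mean(axis=0))  # ~0 for every voxel
print(standardized.std(axis=0, ddof=1))  # 1 for every voxel
```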

Note

When viewing a Nilearn estimator in a notebook (or more generally on an HTML page such as this one), you get an expandable 'Parameters' section where parameters whose values differ from their defaults are highlighted in orange. If you are using scikit-learn >= 1.8.0, you will also get access to the docstring description of each parameter.

NiftiMasker(mask_strategy='epi', memory='nilearn_cache', smoothing_fwhm=8,
            standardize='zscore_sample', verbose=1)


[NiftiMasker.fit] Loading data from
'/home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar123_task-p
ixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz'
[NiftiMasker.fit] Computing mask
________________________________________________________________________________
[Memory] Calling nilearn.masking.compute_epi_mask...
compute_epi_mask('/home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar123_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz', verbose=0)
_________________________________________________compute_epi_mask - 0.3s, 0.0min
[NiftiMasker.fit] Resampling mask
________________________________________________________________________________
[Memory] Calling nilearn.image.resampling.resample_img...
resample_img(<nibabel.nifti1.Nifti1Image object at 0x7f7f3ede5e40>, target_affine=None, target_shape=None, copy=False, interpolation='nearest')
_____________________________________________________resample_img - 0.0s, 0.0min
[NiftiMasker.fit] Finished fit
NiftiMasker(mask_strategy='epi', memory='nilearn_cache', smoothing_fwhm=8,
            standardize='zscore_sample', verbose=1)


Note

You can also note that, after fitting, the HTML representation of the estimator looks different from before fitting.

NiftiMasker(mask_strategy='epi', memory='nilearn_cache', smoothing_fwhm=8,
            standardize='zscore_sample', verbose=1)


Visualize the mask

We can quickly get an idea about the estimated mask for this functional image by plotting the mask.

We get the estimated mask from the mask_img_ attribute of the masker: the trailing underscore in this attribute name indicates that it was generated by the fit method.

We can then plot it using the plot_roi function with the mean functional image as background.

from nilearn.image.image import mean_img
from nilearn.plotting import plot_roi, show

mask_img = masker.mask_img_

mean_func_img = mean_img(func_filename)

plot_roi(mask_img, mean_func_img, display_mode="y", cut_coords=4, title="Mask")

show()

Visualize the masker report

More information can be obtained about the masker and its mask by generating a masker report. This can be done using the generate_report method.

Note

The generated report can be:

  • displayed in a Notebook,

  • opened in a browser using the .open_in_browser() method,

  • or saved to a file using the .save_as_html(output_filepath) method.

NiftiMasker: Applying a mask to extract time-series from Niimg-like objects. NiftiMasker is useful when preprocessing (detrending, standardization, resampling, etc.) of in-mask voxels is necessary. Use case: working with time series of resting-state or task maps.

No image provided.

This report shows the input Nifti image overlaid with the outlines of the mask (in green). We recommend inspecting the report to check the overlap between the mask and its input image.

The mask includes 24256 voxels (16.4%) of the image.

NiftiMasker(mask_strategy='epi', memory='nilearn_cache', smoothing_fwhm=8,
            standardize='zscore_sample', verbose=1)

This report was generated based on information provided at instantiation and fit time. Note that the masker can potentially perform resampling at transform time.



Preprocess data with the NiftiMasker

We extract the data from the Nifti image and turn it into a numpy array by calling the masker's fit_transform method.

fmri_masked = masker.fit_transform(func_filename)

print(fmri_masked.shape)

________________________________________________________________________________
[Memory] Calling nilearn.maskers.nifti_masker.filter_and_mask...
filter_and_mask('/home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar123_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz',
<nibabel.nifti1.Nifti1Image object at 0x7f7f3ede5e40>, { 'clean_args': None,
  'clean_kwargs': {},
  'cmap': 'gray',
  'detrend': False,
  'dtype': None,
  'high_pass': None,
  'high_variance_confounds': False,
  'low_pass': None,
  'reports': True,
  'runs': None,
  'smoothing_fwhm': 8,
  'standardize': 'zscore_sample',
  'standardize_confounds': True,
  't_r': None,
  'target_affine': None,
  'target_shape': None}, memory_level=1, memory=Memory(location=nilearn_cache/joblib), verbose=1, confounds=None, sample_mask=None, copy=True, dtype=None, sklearn_output_config=None)
[NiftiMasker.wrapped] Loading data from <nibabel.nifti1.Nifti1Image object at
0x7f7f24c1a290>
[NiftiMasker.wrapped] Smoothing images
[NiftiMasker.wrapped] Extracting region signals
[NiftiMasker.wrapped] Cleaning extracted signals
__________________________________________________filter_and_mask - 1.3s, 0.0min
(168, 24256)

fmri_masked is now a 2D numpy array of shape (n_time_points, n_voxels), as the printed shape (168, 24256) shows.
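The masking step itself amounts to boolean indexing: in-mask voxels are pulled out of each 3D volume and stacked. A minimal numpy sketch with hypothetical dimensions (not Nilearn's actual code):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 4D image: a 4 x 4 x 4 volume with 10 time points.
data_4d = rng.standard_normal((4, 4, 4, 10))

# Toy boolean mask selecting a handful of voxels.
mask = np.zeros((4, 4, 4), dtype=bool)
mask[1:3, 1:3, 1:3] = True  # 8 in-mask voxels

# Indexing with the mask yields (n_voxels, n_time_points); transposing
# gives the (n_time_points, n_voxels) layout that NiftiMasker returns.
masked_2d = data_4d[mask].T

print(masked_2d.shape)  # (10, 8)
```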

Run an algorithm and visualize the results

Given that we now have a numpy array, we can pass the data to a wide range of algorithms. Here we will simply run an independent component analysis, turn the extracted components back into images (using inverse_transform), and plot the first component.

from sklearn.decomposition import FastICA

from nilearn.image import index_img
from nilearn.plotting import plot_stat_map, show

ica = FastICA(n_components=10, random_state=42, tol=0.001, max_iter=2000)
components_masked = ica.fit_transform(fmri_masked.T).T

components = masker.inverse_transform(components_masked)

plot_stat_map(
    index_img(components, 0),
    mean_func_img,
    display_mode="y",
    cut_coords=4,
    title="Component 0",
)

show()
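Conceptually, inverse_transform does the reverse of masking: each row of the 2D signals array is written back into the in-mask voxels of an otherwise zero-filled volume. A toy numpy sketch of that idea, with hypothetical dimensions:

```python
import numpy as np

# Toy boolean mask with 8 in-mask voxels in a 4 x 4 x 4 volume.
mask = np.zeros((4, 4, 4), dtype=bool)
mask[1:3, 1:3, 1:3] = True

# Toy component signals: 2 components x 8 in-mask voxels.
rng = np.random.default_rng(0)
components_masked = rng.standard_normal((2, 8))

# Unmasking: scatter each row back into a zero-filled 4D volume, one
# 3D volume per component; out-of-mask voxels stay at zero.
n_components = components_masked.shape[0]
volumes = np.zeros(mask.shape + (n_components,))
volumes[mask, :] = components_masked.T

print(volumes.shape)  # (4, 4, 4, 2)
```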
/home/runner/work/nilearn/nilearn/.tox/doc/lib/python3.10/site-packages/sklearn/decomposition/_fastica.py:127: ConvergenceWarning: FastICA did not converge. Consider increasing tolerance or the maximum number of iterations.
  warnings.warn(
[NiftiMasker.inverse_transform] Computing image from signals
________________________________________________________________________________
[Memory] Calling nilearn.masking.unmask...
unmask(array([[0.244354, ..., 2.145976],
       ...,
       [0.500638, ..., 0.718085]], shape=(10, 24256)),
<nibabel.nifti1.Nifti1Image object at 0x7f7f3ede5e40>)
___________________________________________________________unmask - 0.2s, 0.0min

Total running time of the script: (0 minutes 23.338 seconds)

Estimated memory usage: 668 MB

Gallery generated by Sphinx-Gallery