Basic nilearn example: manipulating and looking at data
A simple example showing how to load an existing Nifti file and use basic nilearn functionalities.
# Let us use a Nifti file that is shipped with nilearn
from nilearn.datasets import MNI152_FILE_PATH
# Note that the variable MNI152_FILE_PATH is just a path to a Nifti file
print(f"Path to MNI152 template: {MNI152_FILE_PATH!r}")
Path to MNI152 template: PosixPath('/home/runner/work/nilearn/nilearn/.tox/doc/lib/python3.10/site-packages/nilearn/datasets/data/mni_icbm152_t1_tal_nlin_sym_09a_converted.nii.gz')
A first step: looking at our data
Let’s quickly plot this file:
from nilearn import plotting
plotting.plot_img(MNI152_FILE_PATH)

<nilearn.plotting.displays._slicers.OrthoSlicer object at 0x7f30a2bffe50>
This is not a very pretty plot: we just used the simplest possible code. There is a whole section of the documentation on making prettier plots.
Exercise: Try plotting one of your own files. In the above, MNI152_FILE_PATH is nothing more than a path pointing to a Nifti image. You can replace it with a string or path pointing to a file on your disk. Note that it should be a 3D volume, not a 4D volume.
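If you are unsure whether a file holds a 3D or a 4D volume, check the shape of the loaded image before plotting (with nilearn, `image.load_img(path).shape`). The helper below is our own illustration operating on a shape tuple, not a nilearn function:

```python
def looks_3d(shape):
    """Return True if a volume with this shape can be plotted as a 3D image.

    A 4D image whose last dimension is 1 holds a single volume, so it is
    effectively 3D as well.
    """
    if len(shape) == 3:
        return True
    return len(shape) == 4 and shape[-1] == 1


# A plain anatomical volume is 3D:
print(looks_3d((197, 233, 189)))  # → True
# A typical fMRI run is 4D (x, y, z, time) and would need indexing first:
print(looks_3d((40, 64, 64, 121)))  # → False
```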
Simple image manipulation: smoothing
Let’s use the image-smoothing function smooth_img from nilearn. Functions whose names contain ‘img’ can take either a filename or an in-memory image as input. Here we pass the image filename and the smoothing value in mm:
from nilearn import image
smooth_anat_img = image.smooth_img(MNI152_FILE_PATH, fwhm=3)
# While we are giving a file name as input, the function returns
# an in-memory object:
smooth_anat_img
<nibabel.nifti1.Nifti1Image object at 0x7f30a2ce75e0>
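The fwhm parameter gives the Gaussian kernel width as the full width at half maximum, in mm. Gaussian filters are more commonly parameterized by their standard deviation, and the two are related by sigma = fwhm / (2 * sqrt(2 * ln 2)) ≈ fwhm / 2.355. A minimal sketch of that conversion (the helper name is ours, not part of nilearn’s API):

```python
import math


def fwhm_to_sigma(fwhm):
    """Convert a Gaussian kernel's FWHM to its standard deviation."""
    return fwhm / (2 * math.sqrt(2 * math.log(2)))


# fwhm=3 mm, as used above, corresponds to sigma ≈ 1.27 mm:
print(round(fwhm_to_sigma(3), 2))  # → 1.27
```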
This is an in-memory object. We can pass it to another nilearn function, for instance to look at it:
plotting.plot_img(smooth_anat_img)

<nilearn.plotting.displays._slicers.OrthoSlicer object at 0x7f30a2a17ee0>
We could also pass it back to the smoothing function:
more_smooth_anat_img = image.smooth_img(smooth_anat_img, fwhm=3)
plotting.plot_img(more_smooth_anat_img)

<nilearn.plotting.displays._slicers.OrthoSlicer object at 0x7f30c714c940>
Globbing over multiple 3D volumes
Nilearn also supports reading multiple volumes at once, using glob-style patterns. For instance, we can smooth volumes from many subjects at once and get a 4D image as output.
First, let’s fetch the Haxby dataset for subjects 1 and 2:
from nilearn import datasets
haxby = datasets.fetch_haxby(subjects=[1, 2])
[fetch_haxby] Dataset found in /home/runner/nilearn_data/haxby2001
[fetch_haxby] Downloading data from http://data.pymvpa.org/datasets/haxby2001/subj1-2010.01.14.tar.gz ...
[fetch_haxby] ...done. (80 seconds, 1 min)
[fetch_haxby] Extracting data from /home/runner/nilearn_data/haxby2001/b2fd65a88d22090da62c3fb828be840e/subj1-2010.01.14.tar.gz...
[fetch_haxby] .. done.
Now we can find the anatomical images from both subjects using the * wildcard:
from pathlib import Path
anats_all_subjects = (
    Path(datasets.get_data_dirs()[0]) / "haxby2001" / "subj*" / "anat*"
)
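The pattern above is expanded with ordinary shell-style globbing, so subj* matches subj1, subj2, and so on. A self-contained sketch of how such a pattern resolves, using a throwaway directory layout (the file names below are made up for illustration):

```python
import glob
import os
import tempfile

# Build a tiny directory tree mimicking the haxby2001 layout.
root = tempfile.mkdtemp()
for subject in ("subj1", "subj2"):
    os.makedirs(os.path.join(root, subject))
    # Touch a fake anatomical file per subject.
    open(os.path.join(root, subject, "anat.nii.gz"), "w").close()

# The same kind of wildcard pattern nilearn accepts:
matches = sorted(glob.glob(os.path.join(root, "subj*", "anat*")))
print(len(matches))  # → 2, one anatomical file per subject
```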
Now we can smooth all the anatomical images at once:
anats_all_subjects_smooth = image.smooth_img(anats_all_subjects, fwhm=5)
This is a 4D image containing one volume per subject:
print(anats_all_subjects_smooth.shape)
(124, 256, 256, 2)
Saving results to a file
We can save any in-memory object as follows:
output_dir = Path.cwd() / "results" / "plot_nilearn_101"
output_dir.mkdir(exist_ok=True, parents=True)
print(f"Output will be saved to: {output_dir}")
anats_all_subjects_smooth.to_filename(
    output_dir / "anats_all_subjects_smooth.nii.gz"
)
Output will be saved to: /home/runner/work/nilearn/nilearn/examples/00_tutorials/results/plot_nilearn_101
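Because the target name ends in .gz, to_filename writes a gzip-compressed NIfTI file; this compression is lossless, a property of the format rather than of nilearn. A minimal stdlib sketch of that round trip (the byte payload is an arbitrary stand-in, not a real NIfTI header):

```python
import gzip
import tempfile
from pathlib import Path

# Any bytes written through gzip come back unchanged on read, which is why
# a .nii.gz file holds exactly the same volume as an uncompressed .nii twin.
payload = b"\x5c\x01\x00\x00" + bytes(344)  # arbitrary stand-in bytes
out = Path(tempfile.mkdtemp()) / "volume.nii.gz"
with gzip.open(out, "wb") as f:
    f.write(payload)
with gzip.open(out, "rb") as f:
    restored = f.read()
print(restored == payload)  # → True
```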
Finally, calling plotting.show() is necessary to display the figures when running as a script outside IPython:
plotting.show()
To recap, all the nilearn tools can take data as filenames or glob-style patterns or in-memory objects, and return brain volumes as in-memory objects. These can be passed on to other nilearn tools, or saved to disk.
Total running time of the script: 1 minute 30.280 seconds
Estimated memory usage: 310 MB