Show stimuli of Haxby et al. dataset

In this script we plot an overview of the stimulus images used in Haxby et al. [1].

from nilearn._utils.helpers import check_matplotlib

check_matplotlib()

import matplotlib.pyplot as plt
from nilearn import datasets
from nilearn.plotting import show

# Fetch only the stimulus images; subjects=[] skips the (large) fMRI data.
haxby_dataset = datasets.fetch_haxby(subjects=[], fetch_stimuli=True)
stimulus_information = haxby_dataset.stimuli
[get_dataset_dir] Dataset found in /home/runner/nilearn_data/haxby2001
[fetch_single_file] Downloading data from http://data.pymvpa.org/datasets/haxby2001/stimuli-2010.01.14.tar.gz ...
[fetch_single_file] ...done. (1 seconds, 0 min)
[uncompress_file] Extracting data from /home/runner/nilearn_data/haxby2001/ee9e0d5a40146477e9197f0d13da9b32/stimuli-2010.01.14.tar.gz...
[uncompress_file] .. done.
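The returned `haxby_dataset.stimuli` behaves like a dictionary mapping each stimulus category name to a collection of image file paths. A minimal sketch of that layout, using a hypothetical plain dict (the paths below are made up, not the downloaded files) and the same "skip controls" filtering as the plotting loop that follows:

```python
# Hypothetical stand-in for haxby_dataset.stimuli: category -> image paths.
stimuli_example = {
    "faces": ["stimuli/faces/face_1.jpg", "stimuli/faces/face_2.jpg"],
    "houses": ["stimuli/houses/house_1.jpg"],
    "controls": ["stimuli/controls/ctrl_1.jpg"],
}

# Same filtering as the plotting loop: drop the "controls" category.
plotted = [category for category in stimuli_example if category != "controls"]
print(plotted)
```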
for stim_type in stimulus_information:
    # skip control images, there are too many
    if stim_type != "controls":
        file_names = stimulus_information[stim_type]

        fig, axes = plt.subplots(6, 8)
        fig.suptitle(stim_type)

        for img_path, ax in zip(file_names, axes.ravel()):
            ax.imshow(plt.imread(img_path), cmap=plt.cm.gray)

        for ax in axes.ravel():
            ax.axis("off")

show()
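The 6 x 8 grid above is sized by hand for this dataset (48 images per category). If the number of images varied, the grid shape could be computed instead; a small sketch (the helper name `grid_shape` is ours, not part of nilearn):

```python
import math

def grid_shape(n_images, n_cols=8):
    """Return (n_rows, n_cols) needed to show n_images in a grid n_cols wide."""
    n_rows = math.ceil(n_images / n_cols)
    return n_rows, n_cols

print(grid_shape(48))  # the Haxby case: 6 rows of 8
```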
[Seven figures are produced, one grid of thumbnails per stimulus category: shoes, cats, houses, chairs, scissors, faces, bottles.]

References

[1] Haxby, J. V., Gobbini, M. I., Furey, M. L., Ishai, A., Schouten, J. L., & Pietrini, P. (2001). Distributed and overlapping representations of faces and objects in ventral temporal cortex. Science, 293(5539), 2425-2430.
