.. only:: html

    .. note::
        :class: sphx-glr-download-link-note

        Click :ref:`here <sphx_glr_download_auto_examples_02_decoding_plot_haxby_stimuli.py>` to download the full example code or to run this example in your browser via Binder
.. rst-class:: sphx-glr-example-title
.. _sphx_glr_auto_examples_02_decoding_plot_haxby_stimuli.py:
Show stimuli of Haxby et al. dataset
===============================================================================
In this script we plot an overview of the stimuli used in "Distributed
and Overlapping Representations of Faces and Objects in Ventral Temporal
Cortex" (Haxby et al., Science 2001).
.. rst-class:: sphx-glr-horizontal


    *

      .. image:: /auto_examples/02_decoding/images/sphx_glr_plot_haxby_stimuli_001.png
         :alt: shoes
         :class: sphx-glr-multi-img

    *

      .. image:: /auto_examples/02_decoding/images/sphx_glr_plot_haxby_stimuli_002.png
         :alt: scissors
         :class: sphx-glr-multi-img

    *

      .. image:: /auto_examples/02_decoding/images/sphx_glr_plot_haxby_stimuli_003.png
         :alt: houses
         :class: sphx-glr-multi-img

    *

      .. image:: /auto_examples/02_decoding/images/sphx_glr_plot_haxby_stimuli_004.png
         :alt: faces
         :class: sphx-glr-multi-img

    *

      .. image:: /auto_examples/02_decoding/images/sphx_glr_plot_haxby_stimuli_005.png
         :alt: chairs
         :class: sphx-glr-multi-img

    *

      .. image:: /auto_examples/02_decoding/images/sphx_glr_plot_haxby_stimuli_006.png
         :alt: cats
         :class: sphx-glr-multi-img

    *

      .. image:: /auto_examples/02_decoding/images/sphx_glr_plot_haxby_stimuli_007.png
         :alt: bottles
         :class: sphx-glr-multi-img
.. rst-class:: sphx-glr-script-out

 Out:

 .. code-block:: none

    Downloading data from http://data.pymvpa.org/datasets/haxby2001/stimuli-2010.01.14.tar.gz ...
    ...done. (1 seconds, 0 min)
    Extracting data from /home/varoquau/nilearn_data/haxby2001/5cd78c74b711572c7f41a5bddb69abca/stimuli-2010.01.14.tar.gz..... done.
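
The stimuli archive is downloaded only once: on later runs the cached copy
under ``~/nilearn_data`` (as in the log above) is reused. As a minimal sketch,
assuming the standard ``data_dir`` argument of nilearn fetchers and a purely
illustrative path, the cache location can also be chosen explicitly:

.. code-block:: python

    from nilearn import datasets

    # Illustrative cache directory (assumption); if data_dir is omitted,
    # nilearn falls back to ~/nilearn_data, as seen in the output above.
    haxby_dataset = datasets.fetch_haxby(
        subjects=[],
        fetch_stimuli=True,
        data_dir="/tmp/nilearn_cache",
    )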
|
.. code-block:: default


    import matplotlib.pyplot as plt

    from nilearn import datasets
    from nilearn.plotting import show

    # fetch only the stimulus images of the Haxby dataset (no fMRI data)
    haxby_dataset = datasets.fetch_haxby(subjects=[], fetch_stimuli=True)
    stimulus_information = haxby_dataset.stimuli

    for stim_type in stimulus_information:
        # skip control images, there are too many
        if stim_type != 'controls':
            file_names = stimulus_information[stim_type]

            # one figure per category, images laid out on a 6 x 8 grid
            fig, axes = plt.subplots(6, 8)
            fig.suptitle(stim_type)

            for img_path, ax in zip(file_names, axes.ravel()):
                ax.imshow(plt.imread(img_path), cmap=plt.cm.gray)
            for ax in axes.ravel():
                ax.axis("off")

    show()
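
``haxby_dataset.stimuli`` behaves like a dictionary mapping each category name
to a list of image file paths. A small sketch, building on the
``stimulus_information`` object fetched above, to list how many images each
category contains (the ``'controls'`` entry is skipped here as well):

.. code-block:: python

    # Count the images available per stimulus category.
    for stim_type, file_names in stimulus_information.items():
        if stim_type == 'controls':
            continue  # skipped in the plotting loop above as well
        print(f"{stim_type}: {len(file_names)} images, "
              f"first file: {file_names[0]}")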
.. rst-class:: sphx-glr-timing
**Total running time of the script:** ( 0 minutes 8.435 seconds)
.. _sphx_glr_download_auto_examples_02_decoding_plot_haxby_stimuli.py:


.. only:: html

  .. container:: sphx-glr-footer
     :class: sphx-glr-footer-example

     .. container:: binder-badge

        .. image:: https://mybinder.org/badge_logo.svg
           :target: https://mybinder.org/v2/gh/nilearn/nilearn.github.io/master?filepath=examples/auto_examples/02_decoding/plot_haxby_stimuli.ipynb
           :width: 150 px

     .. container:: sphx-glr-download sphx-glr-download-python

        :download:`Download Python source code: plot_haxby_stimuli.py <plot_haxby_stimuli.py>`

     .. container:: sphx-glr-download sphx-glr-download-jupyter

        :download:`Download Jupyter notebook: plot_haxby_stimuli.ipynb <plot_haxby_stimuli.ipynb>`
.. only:: html

  .. rst-class:: sphx-glr-signature

     `Gallery generated by Sphinx-Gallery <https://sphinx-gallery.github.io>`_