.. DO NOT EDIT.
.. THIS FILE WAS AUTOMATICALLY GENERATED BY SPHINX-GALLERY.
.. TO MAKE CHANGES, EDIT THE SOURCE PYTHON FILE:
.. "auto_examples/05_glm_second_level/plot_proportion_activated_voxels.py"
.. LINE NUMBERS ARE GIVEN BELOW.

.. only:: html

    .. note::
        :class: sphx-glr-download-link-note

        :ref:`Go to the end <sphx_glr_download_auto_examples_05_glm_second_level_plot_proportion_activated_voxels.py>`
        to download the full example code or to run this example in your browser via Binder

.. rst-class:: sphx-glr-example-title

.. _sphx_glr_auto_examples_05_glm_second_level_plot_proportion_activated_voxels.py:


Second-level fMRI model: true positive proportion in clusters
==============================================================

This script showcases the so-called "All resolution inference" procedure
(:footcite:t:`Rosenblatt2018`), in which the proportion of true discoveries in
arbitrary clusters is estimated. The clusters can be defined from the input
image, i.e. in a circular way, as the error control accounts for arbitrary
cluster selection.

.. GENERATED FROM PYTHON SOURCE LINES 14-20

Fetch dataset
--------------
We download a list of left vs right button press contrasts from a localizer
dataset. Note that we fetch individual t-maps that represent the :term:`BOLD`
activity estimate divided by the uncertainty about this estimate.

.. GENERATED FROM PYTHON SOURCE LINES 20-28

.. code-block:: Python

    from nilearn.datasets import fetch_localizer_contrasts

    n_subjects = 16
    data = fetch_localizer_contrasts(
        ["left vs right button press"],
        n_subjects,
        legacy_format=False,
    )

.. GENERATED FROM PYTHON SOURCE LINES 29-33

Estimate second level model
---------------------------
We define the input maps and the design matrix for the second level model and
fit it.

.. GENERATED FROM PYTHON SOURCE LINES 33-40

.. code-block:: Python

    import pandas as pd

    second_level_input = data["cmaps"]
    design_matrix = pd.DataFrame(
        [1] * len(second_level_input), columns=["intercept"]
    )

.. GENERATED FROM PYTHON SOURCE LINES 41-42

Model specification and fit.

.. GENERATED FROM PYTHON SOURCE LINES 42-49

.. code-block:: Python

    from nilearn.glm.second_level import SecondLevelModel

    second_level_model = SecondLevelModel(smoothing_fwhm=8.0, n_jobs=2)
    second_level_model = second_level_model.fit(
        second_level_input, design_matrix=design_matrix
    )

.. GENERATED FROM PYTHON SOURCE LINES 50-52

Estimating the :term:`contrast` is very simple: we can just provide the column
name of the design matrix.

.. GENERATED FROM PYTHON SOURCE LINES 52-54

.. code-block:: Python

    z_map = second_level_model.compute_contrast(output_type="z_score")

.. GENERATED FROM PYTHON SOURCE LINES 55-57

We threshold the second-level :term:`contrast` at uncorrected p < 0.001 and
plot it.

.. GENERATED FROM PYTHON SOURCE LINES 57-89

.. code-block:: Python

    from scipy.stats import norm

    p_val = 0.001
    p001_uncorrected = norm.isf(p_val)

    from nilearn.glm import cluster_level_inference

    proportion_true_discoveries_img = cluster_level_inference(
        z_map, threshold=[3, 4, 5], alpha=0.05
    )

    from nilearn import plotting

    plotting.plot_stat_map(
        proportion_true_discoveries_img,
        threshold=0.0,
        display_mode="z",
        vmax=1,
        colorbar=True,
        title="group left-right button press, proportion true positives",
    )

    plotting.plot_stat_map(
        z_map,
        threshold=p001_uncorrected,
        colorbar=True,
        display_mode="z",
        title="group left-right button press (uncorrected p < 0.001)",
    )

    plotting.show()

.. rst-class:: sphx-glr-horizontal


    *

      .. image-sg:: /auto_examples/05_glm_second_level/images/sphx_glr_plot_proportion_activated_voxels_001.png
         :alt: plot proportion activated voxels
         :srcset: /auto_examples/05_glm_second_level/images/sphx_glr_plot_proportion_activated_voxels_001.png
         :class: sphx-glr-multi-img

    *

      .. image-sg:: /auto_examples/05_glm_second_level/images/sphx_glr_plot_proportion_activated_voxels_002.png
         :alt: plot proportion activated voxels
         :srcset: /auto_examples/05_glm_second_level/images/sphx_glr_plot_proportion_activated_voxels_002.png
         :class: sphx-glr-multi-img
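If you want to inspect the proportion map and the z-map outside of this
example, both can be written to disk. The snippet below is a minimal optional
sketch: the output directory name is an assumption made for illustration, and
it relies only on the standard ``to_filename`` method of the returned Nifti
images.

.. code-block:: Python

    from pathlib import Path

    # Hypothetical output location, chosen for this sketch only.
    output_dir = Path.cwd() / "results" / "plot_proportion_activated_voxels"
    output_dir.mkdir(exist_ok=True, parents=True)

    # Both results are Nifti images, so they can be saved directly.
    proportion_true_discoveries_img.to_filename(
        output_dir / "proportion_true_discoveries.nii.gz"
    )
    z_map.to_filename(output_dir / "z_map.nii.gz")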
.. GENERATED FROM PYTHON SOURCE LINES 90-94

References
----------

.. footbibliography::


.. rst-class:: sphx-glr-timing

   **Total running time of the script:** (0 minutes 6.859 seconds)

**Estimated memory usage:** 9 MB


.. _sphx_glr_download_auto_examples_05_glm_second_level_plot_proportion_activated_voxels.py:

.. only:: html

  .. container:: sphx-glr-footer sphx-glr-footer-example

    .. container:: binder-badge

      .. image:: images/binder_badge_logo.svg
        :target: https://mybinder.org/v2/gh/nilearn/nilearn/0.10.4?urlpath=lab/tree/notebooks/auto_examples/05_glm_second_level/plot_proportion_activated_voxels.ipynb
        :alt: Launch binder
        :width: 150 px

    .. container:: sphx-glr-download sphx-glr-download-jupyter

      :download:`Download Jupyter notebook: plot_proportion_activated_voxels.ipynb <plot_proportion_activated_voxels.ipynb>`

    .. container:: sphx-glr-download sphx-glr-download-python

      :download:`Download Python source code: plot_proportion_activated_voxels.py <plot_proportion_activated_voxels.py>`

.. only:: html

  .. rst-class:: sphx-glr-signature

    `Gallery generated by Sphinx-Gallery <https://sphinx-gallery.github.io>`_