.. DO NOT EDIT.
.. THIS FILE WAS AUTOMATICALLY GENERATED BY SPHINX-GALLERY.
.. TO MAKE CHANGES, EDIT THE SOURCE PYTHON FILE:
.. "auto_examples/05_glm_second_level/plot_second_level_two_sample_test.py"
.. LINE NUMBERS ARE GIVEN BELOW.

.. only:: html

    .. note::
        :class: sphx-glr-download-link-note

        :ref:`Go to the end <sphx_glr_download_auto_examples_05_glm_second_level_plot_second_level_two_sample_test.py>`
        to download the full example code or to run this example in your browser via Binder.

.. rst-class:: sphx-glr-example-title

.. _sphx_glr_auto_examples_05_glm_second_level_plot_second_level_two_sample_test.py:


Second-level fMRI model: two-sample test, unpaired and paired
=============================================================

Full step-by-step example of fitting a :term:`GLM` to perform a second-level
analysis on experimental data and visualizing the results.

More specifically:

1. A sample of n=16 visual activity fMRIs is downloaded.

2. An unpaired, two-sample t-test is applied to the brain maps in order to
   see the effect of the contrast difference across subjects.

3. A paired, two-sample t-test is applied to the brain maps in order to see
   the effect of the contrast difference across subjects, considering
   subject intercepts.

The contrast is between responses to retinotopically distinct vertical
versus horizontal checkerboards. At the individual level, these stimuli are
sometimes used to map the borders of primary visual areas. At the group
level, such a mapping is not possible. Yet, we may observe some significant
effects in these areas.

.. GENERATED FROM PYTHON SOURCE LINES 29-34

.. code-block:: Python

    import pandas as pd

    from nilearn.datasets import fetch_localizer_contrasts
    from nilearn.plotting import plot_design_matrix, plot_glass_brain, show

.. GENERATED FROM PYTHON SOURCE LINES 35-39

Fetch dataset
-------------

We download contrasts for vertical and horizontal checkerboard stimuli
from a localizer dataset.

.. GENERATED FROM PYTHON SOURCE LINES 39-53

.. code-block:: Python

    n_subjects = 16
    sample_vertical = fetch_localizer_contrasts(
        ["vertical checkerboard"],
        n_subjects,
    )
    sample_horizontal = fetch_localizer_contrasts(
        ["horizontal checkerboard"],
        n_subjects,
    )

    # Implicitly, there is a one-to-one correspondence between the two samples:
    # the first image of both samples comes from subject S1,
    # the second from subject S2, etc.

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    [fetch_localizer_contrasts] Dataset found in /home/runner/nilearn_data/brainomics_localizer
    [fetch_localizer_contrasts] Downloading data from https://osf.io/download/5d27c2c41c5b4a001d9f4e7e/ ...
    [fetch_localizer_contrasts] ...done. (2 seconds, 0 min)
    [fetch_localizer_contrasts] Downloading data from https://osf.io/download/5d27d3c3114a42001804500a/ ...
    [fetch_localizer_contrasts] ...done. (2 seconds, 0 min)
    [fetch_localizer_contrasts] Downloading data from https://osf.io/download/5d27e5fa1c5b4a001aa09681/ ...
    [fetch_localizer_contrasts] ...done. (2 seconds, 0 min)
    [fetch_localizer_contrasts] Downloading data from https://osf.io/download/5d27f18945253a00193cb2dd/ ...
    [fetch_localizer_contrasts] ...done. (2 seconds, 0 min)
    [fetch_localizer_contrasts] Downloading data from https://osf.io/download/5d2808401c5b4a001d9f83b2/ ...
    [fetch_localizer_contrasts] ...done. (2 seconds, 0 min)
    [fetch_localizer_contrasts] Downloading data from https://osf.io/download/5d2811fba26b340017085492/ ...
    [fetch_localizer_contrasts] ...done. (2 seconds, 0 min)
    [fetch_localizer_contrasts] Downloading data from https://osf.io/download/5d282b2345253a001c3e7d09/ ...
    [fetch_localizer_contrasts] ...done. (2 seconds, 0 min)
    [fetch_localizer_contrasts] Downloading data from https://osf.io/download/5d28318445253a00193ce6d7/ ...
    [fetch_localizer_contrasts] ...done. (2 seconds, 0 min)
    [fetch_localizer_contrasts] Downloading data from https://osf.io/download/5d2848581c5b4a001aa10aac/ ...
    [fetch_localizer_contrasts] ...done. (2 seconds, 0 min)
    [fetch_localizer_contrasts] Downloading data from https://osf.io/download/5d28545ca26b340018089ba7/ ...
    [fetch_localizer_contrasts] ...done. (2 seconds, 0 min)
    [fetch_localizer_contrasts] Downloading data from https://osf.io/download/5d285cd945253a001a3c8509/ ...
    [fetch_localizer_contrasts] ...done. (2 seconds, 0 min)
    [fetch_localizer_contrasts] Downloading data from https://osf.io/download/5d286e49114a42001904ab90/ ...
    [fetch_localizer_contrasts] ...done. (2 seconds, 0 min)
    [fetch_localizer_contrasts] Downloading data from https://osf.io/download/5d288af11c5b4a001d9ff0cb/ ...
    [fetch_localizer_contrasts] ...done. (2 seconds, 0 min)
    [fetch_localizer_contrasts] Downloading data from https://osf.io/download/5d289be945253a001c3ef5e2/ ...
    [fetch_localizer_contrasts] ...done. (2 seconds, 0 min)
    [fetch_localizer_contrasts] Downloading data from https://osf.io/download/5d28a1c91c5b4a001da00bd9/ ...
    [fetch_localizer_contrasts] ...done. (2 seconds, 0 min)
    [fetch_localizer_contrasts] Downloading data from https://osf.io/download/5d28bb90a26b3400190925d2/ ...
    [fetch_localizer_contrasts] ...done. (2 seconds, 0 min)
    [fetch_localizer_contrasts] Dataset found in /home/runner/nilearn_data/brainomics_localizer
    [fetch_localizer_contrasts] Downloading data from https://osf.io/download/5d27ccde1c5b4a001d9f5602/ ...
    [fetch_localizer_contrasts] ...done. (3 seconds, 0 min)
    [fetch_localizer_contrasts] Downloading data from https://osf.io/download/5d27d9c6114a420019045370/ ...
    [fetch_localizer_contrasts] ...done. (2 seconds, 0 min)
    [fetch_localizer_contrasts] Downloading data from https://osf.io/download/5d27de38a26b340016099771/ ...
    [fetch_localizer_contrasts] ...done. (2 seconds, 0 min)
    [fetch_localizer_contrasts] Downloading data from https://osf.io/download/5d27fb651c5b4a001d9f7938/ ...
    [fetch_localizer_contrasts] ...done. (2 seconds, 0 min)
    [fetch_localizer_contrasts] Downloading data from https://osf.io/download/5d280057a26b340019089965/ ...
    [fetch_localizer_contrasts] ...done. (2 seconds, 0 min)
    [fetch_localizer_contrasts] Downloading data from https://osf.io/download/5d2814d145253a001c3e6404/ ...
    [fetch_localizer_contrasts] ...done. (2 seconds, 0 min)
    [fetch_localizer_contrasts] Downloading data from https://osf.io/download/5d28244745253a001b3c4afa/ ...
    [fetch_localizer_contrasts] ...done. (2 seconds, 0 min)
    [fetch_localizer_contrasts] Downloading data from https://osf.io/download/5d28309645253a001a3c6a8d/ ...
    [fetch_localizer_contrasts] ...done. (2 seconds, 0 min)
    [fetch_localizer_contrasts] Downloading data from https://osf.io/download/5d284a3445253a001c3ea2d1/ ...
    [fetch_localizer_contrasts] ...done. (2 seconds, 0 min)
    [fetch_localizer_contrasts] Downloading data from https://osf.io/download/5d28564b1c5b4a001d9fc9d6/ ...
    [fetch_localizer_contrasts] ...done. (2 seconds, 0 min)
    [fetch_localizer_contrasts] Downloading data from https://osf.io/download/5d285b6c1c5b4a001c9edada/ ...
    [fetch_localizer_contrasts] ...done. (2 seconds, 0 min)
    [fetch_localizer_contrasts] Downloading data from https://osf.io/download/5d28765645253a001b3c8106/ ...
    [fetch_localizer_contrasts] ...done. (3 seconds, 0 min)
    [fetch_localizer_contrasts] Downloading data from https://osf.io/download/5d287eeb45253a001c3ed1ba/ ...
    [fetch_localizer_contrasts] ...done. (2 seconds, 0 min)
    [fetch_localizer_contrasts] Downloading data from https://osf.io/download/5d2896fb45253a001a3cabe0/ ...
    [fetch_localizer_contrasts] ...done. (2 seconds, 0 min)
    [fetch_localizer_contrasts] Downloading data from https://osf.io/download/5d28af541c5b4a001da01caa/ ...
    [fetch_localizer_contrasts] ...done. (2 seconds, 0 min)
    [fetch_localizer_contrasts] Downloading data from https://osf.io/download/5d28b9af45253a001a3ccb85/ ...
    [fetch_localizer_contrasts] ...done. (2 seconds, 0 min)
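Before fitting the second-level models on the real maps, the statistical intuition behind the unpaired versus paired comparison can be sketched on synthetic numbers. The sketch below is not part of the example above; the offsets, effect size, noise level, and random seed are all made up for illustration. It shows that when two measurements per subject share a large per-subject offset, a paired t-test is far more sensitive than an unpaired one.

```python
# Toy sketch (synthetic numbers, not the downloaded contrast maps):
# paired vs unpaired two-sample t-tests when measurements share a
# per-subject offset.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_subjects = 16

# Large between-subject variability, small condition effect (both made up).
subject_offset = rng.normal(0, 5.0, n_subjects)
true_effect = 1.0
vertical = subject_offset + true_effect + rng.normal(0, 1.0, n_subjects)
horizontal = subject_offset + rng.normal(0, 1.0, n_subjects)

# Unpaired test: ignores the subject correspondence.
t_unpaired, p_unpaired = stats.ttest_ind(vertical, horizontal)
# Paired test: works on within-subject differences, removing the offsets.
t_paired, p_paired = stats.ttest_rel(vertical, horizontal)

print(f"unpaired: t={t_unpaired:.2f}, p={p_unpaired:.3f}")
print(f"paired:   t={t_paired:.2f}, p={p_paired:.3f}")
```

With these settings the paired p-value is typically orders of magnitude smaller than the unpaired one, which is exactly the phenomenon the two second-level designs below will exhibit on real maps.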
.. GENERATED FROM PYTHON SOURCE LINES 54-58

Estimate second level models
----------------------------

We define the input maps and the design matrix for the second-level model
and fit it.

.. GENERATED FROM PYTHON SOURCE LINES 58-60

.. code-block:: Python

    second_level_input = sample_vertical["cmaps"] + sample_horizontal["cmaps"]

.. GENERATED FROM PYTHON SOURCE LINES 61-62

Next, we model the effect of conditions (sample 1 vs sample 2).

.. GENERATED FROM PYTHON SOURCE LINES 62-66

.. code-block:: Python

    import numpy as np

    condition_effect = np.hstack(([1] * n_subjects, [0] * n_subjects))

.. GENERATED FROM PYTHON SOURCE LINES 67-69

The design matrix for the unpaired test needs an intercept.
For the paired test, we include an intercept for each subject.

.. GENERATED FROM PYTHON SOURCE LINES 69-72

.. code-block:: Python

    subject_effect = np.vstack((np.eye(n_subjects), np.eye(n_subjects)))
    subjects = [f"S{i:02d}" for i in range(1, n_subjects + 1)]

.. GENERATED FROM PYTHON SOURCE LINES 73-74

We then assemble those into design matrices

.. GENERATED FROM PYTHON SOURCE LINES 74-86

.. code-block:: Python

    unpaired_design_matrix = pd.DataFrame(
        {
            "vertical vs horizontal": condition_effect,
            "intercept": 1,
        }
    )

    paired_design_matrix = pd.DataFrame(
        np.hstack((condition_effect[:, np.newaxis], subject_effect)),
        columns=["vertical vs horizontal", *subjects],
    )

.. GENERATED FROM PYTHON SOURCE LINES 87-88

and plot the designs.

.. GENERATED FROM PYTHON SOURCE LINES 88-105

.. code-block:: Python

    import matplotlib.pyplot as plt

    _, (ax_unpaired, ax_paired) = plt.subplots(
        1,
        2,
        gridspec_kw={"width_ratios": [1, 17]},
        constrained_layout=True,
    )
    plot_design_matrix(unpaired_design_matrix, rescale=False, axes=ax_unpaired)
    plot_design_matrix(paired_design_matrix, rescale=False, axes=ax_paired)
    ax_unpaired.set_title("unpaired design", fontsize=12)
    ax_paired.set_title("paired design", fontsize=12)
    show()

.. image-sg:: /auto_examples/05_glm_second_level/images/sphx_glr_plot_second_level_two_sample_test_001.png
   :alt: unpaired design, paired design
   :srcset: /auto_examples/05_glm_second_level/images/sphx_glr_plot_second_level_two_sample_test_001.png
   :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 106-107

We specify the analysis models and fit them.

.. GENERATED FROM PYTHON SOURCE LINES 107-117

.. code-block:: Python

    from nilearn.glm.second_level import SecondLevelModel

    second_level_model_unpaired = SecondLevelModel(n_jobs=2).fit(
        second_level_input, design_matrix=unpaired_design_matrix
    )

    second_level_model_paired = SecondLevelModel(n_jobs=2).fit(
        second_level_input, design_matrix=paired_design_matrix
    )

.. GENERATED FROM PYTHON SOURCE LINES 118-122

Estimating the :term:`contrast` is simple. To do so, we provide the column
name of the design matrix. The argument ``output_type`` is set to return all
available outputs so that we can compare differences in the effect size,
variance, and z-score.

.. GENERATED FROM PYTHON SOURCE LINES 122-130

.. code-block:: Python

    stat_maps_unpaired = second_level_model_unpaired.compute_contrast(
        "vertical vs horizontal", output_type="all"
    )

    stat_maps_paired = second_level_model_paired.compute_contrast(
        "vertical vs horizontal", output_type="all"
    )

.. GENERATED FROM PYTHON SOURCE LINES 131-135

Plot the results
----------------

The two ``effect_size`` images are essentially identical.

.. GENERATED FROM PYTHON SOURCE LINES 135-140

.. code-block:: Python

    (
        stat_maps_unpaired["effect_size"].get_fdata()
        - stat_maps_paired["effect_size"].get_fdata()
    ).max()

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    2.220446049250313e-15

.. GENERATED FROM PYTHON SOURCE LINES 141-142

But the variance in the unpaired image is larger.
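This pair of observations (identical effect sizes, but a smaller variance for the paired design) can be reproduced with plain ordinary least squares. The sketch below uses the same two design matrices as above on one synthetic "voxel"; the subject offsets, effect size, noise level, and seed are made up for illustration and are not taken from the fitted models.

```python
# Sketch (synthetic numbers, not the fitted maps): the unpaired and paired
# designs yield the same contrast estimate, but the paired design absorbs
# the per-subject offsets, leaving a much smaller residual variance.
import numpy as np

rng = np.random.default_rng(42)
n_subjects = 16

condition_effect = np.hstack(([1] * n_subjects, [0] * n_subjects))
subject_effect = np.vstack((np.eye(n_subjects), np.eye(n_subjects)))

# Unpaired: condition column plus a common intercept.
X_unpaired = np.column_stack([condition_effect, np.ones(2 * n_subjects)])
# Paired: condition column plus one intercept per subject.
X_paired = np.column_stack([condition_effect, subject_effect])

# One synthetic "voxel": subject offsets + condition effect + noise.
offsets = rng.normal(0, 5.0, n_subjects)
y = np.concatenate([
    offsets + 1.0 + rng.normal(0, 1.0, n_subjects),  # "vertical" rows
    offsets + rng.normal(0, 1.0, n_subjects),        # "horizontal" rows
])

def ols(X, y):
    """Return the condition-contrast estimate and the residual variance."""
    beta, _, _, _ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    dof = len(y) - np.linalg.matrix_rank(X)
    return beta[0], resid @ resid / dof

beta_unpaired, var_unpaired = ols(X_unpaired, y)
beta_paired, var_paired = ols(X_paired, y)

print(beta_unpaired, beta_paired)  # effectively identical estimates
print(var_unpaired, var_paired)    # unpaired variance is much larger
```

In this balanced design, both estimates equal the mean difference between the two conditions, which is why the effect-size images above agree to machine precision, while only the residual variances differ.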
.. GENERATED FROM PYTHON SOURCE LINES 142-160

.. code-block:: Python

    plot_glass_brain(
        stat_maps_unpaired["effect_variance"],
        vmin=0,
        vmax=6,
        cmap="inferno",
        title="vertical vs horizontal effect variance, unpaired",
    )

    plot_glass_brain(
        stat_maps_paired["effect_variance"],
        vmin=0,
        vmax=6,
        cmap="inferno",
        title="vertical vs horizontal effect variance, paired",
    )

    show()

.. rst-class:: sphx-glr-horizontal

    * .. image-sg:: /auto_examples/05_glm_second_level/images/sphx_glr_plot_second_level_two_sample_test_002.png
         :alt: plot second level two sample test
         :srcset: /auto_examples/05_glm_second_level/images/sphx_glr_plot_second_level_two_sample_test_002.png
         :class: sphx-glr-multi-img

    * .. image-sg:: /auto_examples/05_glm_second_level/images/sphx_glr_plot_second_level_two_sample_test_003.png
         :alt: plot second level two sample test
         :srcset: /auto_examples/05_glm_second_level/images/sphx_glr_plot_second_level_two_sample_test_003.png
         :class: sphx-glr-multi-img

.. GENERATED FROM PYTHON SOURCE LINES 161-163

Together, this makes the z-scores from the paired test larger. We threshold
the second-level :term:`contrast` and plot it.

.. GENERATED FROM PYTHON SOURCE LINES 163-182

.. code-block:: Python

    threshold = 3.1  # corresponds to p < .001, uncorrected

    plot_glass_brain(
        stat_maps_unpaired["z_score"],
        threshold=threshold,
        plot_abs=False,
        vmax=5.8,
        title="vertical vs horizontal (unc p<0.001), unpaired",
    )

    plot_glass_brain(
        stat_maps_paired["z_score"],
        threshold=threshold,
        plot_abs=False,
        vmax=5.8,
        title="vertical vs horizontal (unc p<0.001), paired",
    )

    show()

.. rst-class:: sphx-glr-horizontal

    * .. image-sg:: /auto_examples/05_glm_second_level/images/sphx_glr_plot_second_level_two_sample_test_004.png
         :alt: plot second level two sample test
         :srcset: /auto_examples/05_glm_second_level/images/sphx_glr_plot_second_level_two_sample_test_004.png
         :class: sphx-glr-multi-img

    * .. image-sg:: /auto_examples/05_glm_second_level/images/sphx_glr_plot_second_level_two_sample_test_005.png
         :alt: plot second level two sample test
         :srcset: /auto_examples/05_glm_second_level/images/sphx_glr_plot_second_level_two_sample_test_005.png
         :class: sphx-glr-multi-img

.. GENERATED FROM PYTHON SOURCE LINES 183-185

Unsurprisingly, we see activity in the primary visual cortex, both positive
and negative.

.. rst-class:: sphx-glr-timing

   **Total running time of the script:** (1 minute 15.974 seconds)

**Estimated memory usage:** 108 MB

.. _sphx_glr_download_auto_examples_05_glm_second_level_plot_second_level_two_sample_test.py:

.. only:: html

  .. container:: sphx-glr-footer sphx-glr-footer-example

    .. container:: binder-badge

      .. image:: images/binder_badge_logo.svg
        :target: https://mybinder.org/v2/gh/nilearn/nilearn/0.12.0?urlpath=lab/tree/notebooks/auto_examples/05_glm_second_level/plot_second_level_two_sample_test.ipynb
        :alt: Launch binder
        :width: 150 px

    .. container:: sphx-glr-download sphx-glr-download-jupyter

      :download:`Download Jupyter notebook: plot_second_level_two_sample_test.ipynb <plot_second_level_two_sample_test.ipynb>`

    .. container:: sphx-glr-download sphx-glr-download-python

      :download:`Download Python source code: plot_second_level_two_sample_test.py <plot_second_level_two_sample_test.py>`

    .. container:: sphx-glr-download sphx-glr-download-zip

      :download:`Download zipped: plot_second_level_two_sample_test.zip <plot_second_level_two_sample_test.zip>`

.. only:: html

 .. rst-class:: sphx-glr-signature

    `Gallery generated by Sphinx-Gallery <https://sphinx-gallery.github.io>`_