.. DO NOT EDIT.
.. THIS FILE WAS AUTOMATICALLY GENERATED BY SPHINX-GALLERY.
.. TO MAKE CHANGES, EDIT THE SOURCE PYTHON FILE:
.. "auto_examples/07_advanced/plot_localizer_mass_univariate_methods.py"
.. LINE NUMBERS ARE GIVEN BELOW.

.. only:: html

    .. note::
        :class: sphx-glr-download-link-note

        :ref:`Go to the end <sphx_glr_download_auto_examples_07_advanced_plot_localizer_mass_univariate_methods.py>`
        to download the full example code or to run this example in your
        browser via Binder.

.. rst-class:: sphx-glr-example-title

.. _sphx_glr_auto_examples_07_advanced_plot_localizer_mass_univariate_methods.py:

Massively univariate analysis of a motor task from the Localizer dataset
========================================================================

This example shows the results obtained in a massively univariate analysis
performed at the inter-subject level with various methods. We use the
[left button press (auditory cue)] task from the Localizer dataset and seek
an association between the contrast values and a variate that measures the
speed of pseudo-word reading. No confounding variate is included in the model.

1. A standard :term:`ANOVA` is performed. Data smoothed at 5 :term:`voxels`
   :term:`FWHM` are used.

2. A permuted Ordinary Least Squares algorithm is run at each :term:`voxel`.
   Data smoothed at 5 :term:`voxels` :term:`FWHM` are used.

.. include:: ../../../examples/masker_note.rst

.. GENERATED FROM PYTHON SOURCE LINES 20-25

.. code-block:: Python

    from nilearn._utils.helpers import check_matplotlib

    check_matplotlib()

.. GENERATED FROM PYTHON SOURCE LINES 26-32

.. code-block:: Python

    import numpy as np

    from nilearn import datasets
    from nilearn.maskers import NiftiMasker
    from nilearn.mass_univariate import permuted_ols

.. GENERATED FROM PYTHON SOURCE LINES 33-34

Load Localizer contrast

.. GENERATED FROM PYTHON SOURCE LINES 34-57

..
.. code-block:: Python

    n_samples = 94
    localizer_dataset = datasets.fetch_localizer_contrasts(
        ["left button press (auditory cue)"],
        n_subjects=n_samples,
    )

    # print basic information on the dataset
    print(
        "First contrast nifti image (3D) is located "
        f"at: {localizer_dataset.cmaps[0]}"
    )

    tested_var = localizer_dataset.ext_vars["pseudo"]

    # Quality check / Remove subjects with bad tested variate
    mask_quality_check = np.where(np.logical_not(np.isnan(tested_var)))[0]
    n_samples = mask_quality_check.size
    contrast_map_filenames = [
        localizer_dataset.cmaps[i] for i in mask_quality_check
    ]
    tested_var = tested_var[mask_quality_check].to_numpy().reshape((-1, 1))
    print(f"Actual number of subjects after quality check: {int(n_samples)}")

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    [fetch_localizer_contrasts] Dataset found in /home/runner/nilearn_data/brainomics_localizer
    First contrast nifti image (3D) is located at: /home/runner/nilearn_data/brainomics_localizer/brainomics_data/S01/cmaps_LeftAuditoryClick.nii.gz
    Actual number of subjects after quality check: 89

.. GENERATED FROM PYTHON SOURCE LINES 58-59

Mask data

.. GENERATED FROM PYTHON SOURCE LINES 59-65

.. code-block:: Python

    nifti_masker = NiftiMasker(
        smoothing_fwhm=5, memory="nilearn_cache", memory_level=1, verbose=1
    )
    fmri_masked = nifti_masker.fit_transform(contrast_map_filenames)

.. rst-class:: sphx-glr-script-out

..
.. code-block:: none

    [NiftiMasker.wrapped] Loading data from ['/home/runner/nilearn_data/brainomics_localizer/brainomics_data/S01/cmaps_LeftAuditoryClick.nii.gz',
    '/home/runner/nilearn_data/brainomics_localizer/brainomics_data/S02/cmaps_LeftAuditoryClick.nii.gz',
    '/home/runner/nilearn_data/brainomics_localizer/brainomics_data/S03/cmaps_LeftAuditoryClick.nii.gz',
    ...
    '/home/runner/nilearn_data/brainomics_localizer/brainomics_data/S93/cmaps_LeftAuditoryClick.nii.gz',
    '/home/runner/nilearn_data/brainomics_localizer/brainomics_data/S94/cmaps_LeftAuditoryClick.nii.gz']
    [NiftiMasker.wrapped] Computing mask
    ________________________________________________________________________________
    [Memory] Calling nilearn.masking.compute_background_mask...
    compute_background_mask(['/home/runner/nilearn_data/brainomics_localizer/brainomics_data/S01/cmaps_LeftAuditoryClick.nii.gz',
    '/home/runner/nilearn_data/brainomics_localizer/brainomics_data/S02/cmaps_LeftAuditoryClick.nii.gz',
    '/home/runner/nilearn_data/brainomics_localizer/brainomics_data/S03/cmaps_LeftAuditoryClick.nii.gz',
    '/home/runner/nilearn_data/brainomics_localizer/brainomics_data/S04/cmaps_LeftAuditoryClick.nii.gz',
    '/home/runner/nilearn_data/brainomics_localizer/brainomics_data/S05/cmaps_LeftAuditoryClick.nii.gz',
    '/home/runner/nilearn_data/brainomics_localizer/brainomics_data/S06/cmaps_LeftAuditoryClick.nii.gz',
    '/home/runner/nilearn_data/brainomics_localizer/brainomics_data/S07/cmaps_LeftAu..., verbose=0)
    __________________________________________compute_background_mask - 0.6s, 0.0min
    [NiftiMasker.wrapped] Resampling mask
    ________________________________________________________________________________
    [Memory] Calling nilearn.image.resampling.resample_img...
    resample_img(, target_affine=None, target_shape=None, copy=False, interpolation='nearest')
    _____________________________________________________resample_img - 0.0s, 0.0min
    [NiftiMasker.wrapped] Finished fit
    ________________________________________________________________________________
    [Memory] Calling nilearn.maskers.nifti_masker.filter_and_mask...
    filter_and_mask(['/home/runner/nilearn_data/brainomics_localizer/brainomics_data/S01/cmaps_LeftAuditoryClick.nii.gz',
    '/home/runner/nilearn_data/brainomics_localizer/brainomics_data/S02/cmaps_LeftAuditoryClick.nii.gz',
    '/home/runner/nilearn_data/brainomics_localizer/brainomics_data/S03/cmaps_LeftAuditoryClick.nii.gz',
    '/home/runner/nilearn_data/brainomics_localizer/brainomics_data/S04/cmaps_LeftAuditoryClick.nii.gz',
    '/home/runner/nilearn_data/brainomics_localizer/brainomics_data/S05/cmaps_LeftAuditoryClick.nii.gz',
    '/home/runner/nilearn_data/brainomics_localizer/brainomics_data/S06/cmaps_LeftAuditoryClick.nii.gz',
    '/home/runner/nilearn_data/brainomics_localizer/brainomics_data/S07/cmaps_LeftAu..., ,
    { 'clean_args': None, 'clean_kwargs': {}, 'cmap': 'gray', 'detrend': False,
      'dtype': None, 'high_pass': None, 'high_variance_confounds': False,
      'low_pass': None, 'reports': True, 'runs': None, 'smoothing_fwhm': 5,
      'standardize': False, 'standardize_confounds': True, 't_r': None,
      'target_affine': None, 'target_shape': None },
    memory_level=1, memory=Memory(location=nilearn_cache/joblib), verbose=1,
    confounds=None, sample_mask=None, copy=True, dtype=None,
    sklearn_output_config=None)
    [NiftiMasker.wrapped] Loading data from
    [NiftiMasker.wrapped] Smoothing images
    [NiftiMasker.wrapped] Extracting region signals
    [NiftiMasker.wrapped] Cleaning extracted signals
    __________________________________________________filter_and_mask - 1.3s, 0.0min

.. GENERATED FROM PYTHON SOURCE LINES 66-67

Anova (parametric F-scores)

.. GENERATED FROM PYTHON SOURCE LINES 67-78

.. code-block:: Python

    from sklearn.feature_selection import f_regression

    _, pvals_anova = f_regression(fmri_masked, tested_var.ravel(), center=True)
    pvals_anova *= fmri_masked.shape[1]
    pvals_anova[np.isnan(pvals_anova)] = 1
    pvals_anova[pvals_anova > 1] = 1
    neg_log_pvals_anova = -np.log10(pvals_anova)
    neg_log_pvals_anova_unmasked = nifti_masker.inverse_transform(
        neg_log_pvals_anova
    )

.. rst-class:: sphx-glr-script-out

..
.. code-block:: none

    [NiftiMasker.inverse_transform] Computing image from signals
    ________________________________________________________________________________
    [Memory] Calling nilearn.masking.unmask...
    unmask(array([-0., ..., -0.], shape=(41852,)), )
    ___________________________________________________________unmask - 0.2s, 0.0min

.. GENERATED FROM PYTHON SOURCE LINES 79-89

Perform massively univariate analysis with permuted OLS

This method will produce both voxel-level FWE-corrected -log10 p-values and
:term:`TFCE`-based FWE-corrected -log10 p-values.

.. note::
    :func:`~nilearn.mass_univariate.permuted_ols` can support a wide range
    of analysis designs, depending on the ``tested_var``.
    For example, if you wished to perform a one-sample test, you could
    simply provide an array of ones (e.g., ``np.ones(n_samples)``).

.. GENERATED FROM PYTHON SOURCE LINES 89-107

.. code-block:: Python

    ols_outputs = permuted_ols(
        tested_var,  # this is equivalent to the design matrix, in array form
        fmri_masked,
        model_intercept=True,
        masker=nifti_masker,
        tfce=True,
        n_perm=100,  # 100 for the sake of time. Ideally, this should be 10000.
        verbose=1,  # display progress bar
        n_jobs=2,  # can be changed to use more CPUs
    )
    neg_log_pvals_permuted_ols_unmasked = nifti_masker.inverse_transform(
        ols_outputs["logp_max_t"][0, :]  # select first regressor
    )
    neg_log_pvals_tfce_unmasked = nifti_masker.inverse_transform(
        ols_outputs["logp_max_tfce"][0, :]  # select first regressor
    )

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    [NiftiMasker.inverse_transform] Computing image from signals
    ________________________________________________________________________________
    [Memory] Calling nilearn.masking.unmask...
    unmask(array([[ 1.604273, ..., -0.864518]], shape=(1, 41852)), )
    ___________________________________________________________unmask - 0.2s, 0.0min
    [Parallel(n_jobs=2)]: Using backend LokyBackend with 2 concurrent workers.
    [Parallel(n_jobs=2)]: Done 2 out of 2 | elapsed: 25.1s finished
    [NiftiMasker.inverse_transform] Computing image from signals
    ________________________________________________________________________________
    [Memory] Calling nilearn.masking.unmask...
    unmask(array([-0., ..., -0.], shape=(41852,)), )
    ___________________________________________________________unmask - 0.2s, 0.0min
    [NiftiMasker.inverse_transform] Computing image from signals
    ________________________________________________________________________________
    [Memory] Calling nilearn.masking.unmask...
    unmask(array([ 0.031194, ..., -0. ], shape=(41852,)), )
    ___________________________________________________________unmask - 0.2s, 0.0min

.. GENERATED FROM PYTHON SOURCE LINES 108-109

Visualization

.. GENERATED FROM PYTHON SOURCE LINES 109-157

.. code-block:: Python

    import matplotlib.pyplot as plt

    from nilearn import plotting
    from nilearn.image import get_data

    threshold = -np.log10(0.1)  # 10% corrected

    vmax = max(
        np.amax(ols_outputs["logp_max_t"]),
        np.amax(neg_log_pvals_anova),
        np.amax(ols_outputs["logp_max_tfce"]),
    )

    images_to_plot = {
        "Parametric Test\n(Bonferroni FWE)": neg_log_pvals_anova_unmasked,
        "Permutation Test\n(Max t-statistic FWE)": (
            neg_log_pvals_permuted_ols_unmasked
        ),
        "Permutation Test\n(Max TFCE FWE)": neg_log_pvals_tfce_unmasked,
    }

    fig, axes = plt.subplots(figsize=(10, 4), ncols=3)
    for i_col, (title, img) in enumerate(images_to_plot.items()):
        ax = axes[i_col]
        n_detections = (get_data(img) > threshold).sum()
        new_title = f"{title}\n{n_detections} sig. voxels"

        plotting.plot_glass_brain(
            img,
            vmax=vmax,
            display_mode="z",
            threshold=threshold,
            vmin=threshold,
            cmap="inferno",
            figure=fig,
            axes=ax,
        )
        ax.set_title(new_title)

    fig.suptitle(
        "Group left button press ($-\\log_{10}$ p-values)",
        y=1,
        fontsize=16,
    )
    fig.subplots_adjust(top=0.75, wspace=0.5)

    plotting.show()

..
.. image-sg:: /auto_examples/07_advanced/images/sphx_glr_plot_localizer_mass_univariate_methods_001.png
   :alt: Group left button press ($-\log_{10}$ p-values), Parametric Test (Bonferroni FWE) 3 sig. voxels, Permutation Test (Max t-statistic FWE) 20 sig. voxels, Permutation Test (Max TFCE FWE) 1139 sig. voxels
   :srcset: /auto_examples/07_advanced/images/sphx_glr_plot_localizer_mass_univariate_methods_001.png
   :class: sphx-glr-single-img

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    /home/runner/work/nilearn/nilearn/examples/07_advanced/plot_localizer_mass_univariate_methods.py:156: UserWarning:
    You are using the 'agg' matplotlib backend that is non-interactive.
    No figure will be plotted when calling matplotlib.pyplot.show() or nilearn.plotting.show().
    You can fix this by installing a different backend: for example via pip install PyQt6

.. rst-class:: sphx-glr-timing

**Total running time of the script:** (0 minutes 35.909 seconds)

**Estimated memory usage:** 212 MB

.. _sphx_glr_download_auto_examples_07_advanced_plot_localizer_mass_univariate_methods.py:

.. only:: html

  .. container:: sphx-glr-footer sphx-glr-footer-example

    .. container:: binder-badge

      .. image:: images/binder_badge_logo.svg
        :target: https://mybinder.org/v2/gh/nilearn/nilearn/0.13.1?urlpath=lab/tree/notebooks/auto_examples/07_advanced/plot_localizer_mass_univariate_methods.ipynb
        :alt: Launch binder
        :width: 150 px

    .. container:: sphx-glr-download sphx-glr-download-jupyter

      :download:`Download Jupyter notebook: plot_localizer_mass_univariate_methods.ipynb <plot_localizer_mass_univariate_methods.ipynb>`

    .. container:: sphx-glr-download sphx-glr-download-python

      :download:`Download Python source code: plot_localizer_mass_univariate_methods.py <plot_localizer_mass_univariate_methods.py>`

    .. container:: sphx-glr-download sphx-glr-download-zip

      :download:`Download zipped: plot_localizer_mass_univariate_methods.zip <plot_localizer_mass_univariate_methods.zip>`

.. only:: html

 .. rst-class:: sphx-glr-signature

    `Gallery generated by Sphinx-Gallery <https://sphinx-gallery.github.io>`_
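.. admonition:: Aside: the Bonferroni arithmetic, in miniature
   :class: note

   The parametric step of this example applies a Bonferroni family-wise error
   correction by hand: each uncorrected p-value is multiplied by the number of
   tests (``fmri_masked.shape[1]``, the number of voxels) and then clipped at 1.
   The following standalone sketch reproduces that arithmetic on five
   hypothetical p-values (illustrative values only, not taken from the
   dataset), so the correction can be checked independently of any imaging
   data:

   .. code-block:: Python

       import numpy as np

       # Hypothetical uncorrected p-values for five "voxels" (illustrative only)
       pvals = np.array([1e-6, 0.001, 0.01, 0.2, 0.9])
       n_tests = pvals.size

       # Bonferroni FWE correction: scale by the number of tests, clip at 1.
       # Same arithmetic as `pvals_anova *= fmri_masked.shape[1]` followed by
       # `pvals_anova[pvals_anova > 1] = 1` in the example above.
       pvals_corrected = np.minimum(pvals * n_tests, 1.0)

       # Convert to the -log10 scale used for plotting in the example
       neg_log_pvals = -np.log10(pvals_corrected)

       print(pvals_corrected)  # [5.e-06 5.e-03 5.e-02 1.e+00 1.e+00]

   With a 10% corrected threshold (``-np.log10(0.1)``), only the first two
   values would survive, which is why the Bonferroni panel in the figure above
   detects far fewer voxels than the permutation-based corrections.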