.. DO NOT EDIT.
.. THIS FILE WAS AUTOMATICALLY GENERATED BY SPHINX-GALLERY.
.. TO MAKE CHANGES, EDIT THE SOURCE PYTHON FILE:
.. "auto_examples/03_connectivity/plot_compare_decomposition.py"
.. LINE NUMBERS ARE GIVEN BELOW.

.. only:: html

    .. note::
        :class: sphx-glr-download-link-note

        :ref:`Go to the end ` to download the full example code,
        or to run this example in your browser via Binder.

.. rst-class:: sphx-glr-example-title

.. _sphx_glr_auto_examples_03_connectivity_plot_compare_decomposition.py:

Deriving spatial maps from group fMRI data using ICA and Dictionary Learning
============================================================================

Various approaches exist to derive spatial maps or networks from
group :term:`fMRI` data. These methods extract distributed brain regions that
exhibit similar :term:`BOLD` fluctuations over time. Decomposition methods
generate many independent maps simultaneously, without requiring a priori
information (e.g., seeds or priors).

This example applies two popular decomposition methods, :term:`ICA` and
:term:`Dictionary learning`, to :term:`fMRI` data measured while children and
young adults watch movies. The resulting maps are visualized using atlas
plotting tools.

:term:`CanICA` is an :term:`ICA` method for group-level analysis of
:term:`fMRI` data. Compared to other strategies, it brings a well-controlled
group model, as well as a thresholding algorithm controlling for specificity
and sensitivity with an explicit model of the signal.
The reference paper is :footcite:t:`Varoquaux2010c`.

.. GENERATED FROM PYTHON SOURCE LINES 27-29

Load brain development :term:`fMRI` dataset
-------------------------------------------

.. GENERATED FROM PYTHON SOURCE LINES 29-38

.. code-block:: Python

    from nilearn.datasets import fetch_development_fmri

    rest_dataset = fetch_development_fmri(n_subjects=30)
    func_filenames = rest_dataset.func  # list of 4D nifti files for each subject

    # print basic information on the dataset
    print(f"First functional nifti image (4D) is at: {rest_dataset.func[0]}")

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    [fetch_development_fmri] Dataset found in /home/runner/nilearn_data/development_fmri
    [fetch_development_fmri] Dataset found in /home/runner/nilearn_data/development_fmri/development_fmri
    [fetch_development_fmri] Dataset found in /home/runner/nilearn_data/development_fmri/development_fmri
    First functional nifti image (4D) is at: /home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar128_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz
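The dataset object also carries per-run confound files. A quick, minimal
check of what the fetcher returned (the ``confounds`` attribute is part of
the fetcher's return value):

.. code-block:: Python

    # Basic sanity checks on the downloaded dataset
    print(f"Number of functional images: {len(func_filenames)}")
    # Each 4D functional image has a matching confounds file
    print(f"First confounds file: {rest_dataset.confounds[0]}")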
.. GENERATED FROM PYTHON SOURCE LINES 39-44

Apply :term:`CanICA` on the data
--------------------------------

We use "whole-brain-template" as a strategy to compute the mask, as this
leads to slightly faster and more reproducible results. However, the images
need to be in :term:`MNI` template space.

.. GENERATED FROM PYTHON SOURCE LINES 44-72

.. code-block:: Python

    from nilearn.decomposition import CanICA

    canica = CanICA(
        n_components=20,
        memory="nilearn_cache",
        memory_level=2,
        verbose=10,
        mask_strategy="whole-brain-template",
        random_state=0,
        standardize="zscore_sample",
        n_jobs=2,
    )
    canica.fit(func_filenames)

    # Retrieve the independent components in brain space. Directly
    # accessible through attribute `components_img_`.
    canica_components_img = canica.components_img_

    # components_img is a Nifti Image object, and can be saved to a file with
    # the following lines:
    from pathlib import Path

    output_dir = Path.cwd() / "results" / "plot_compare_decomposition"
    output_dir.mkdir(exist_ok=True, parents=True)
    print(f"Output will be saved to: {output_dir}")
    canica_components_img.to_filename(output_dir / "canica_resting_state.nii.gz")

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    [CanICA.fit] Loading data from
    [/home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar128_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz,
     /home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar126_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz,
     /home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar125_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz,
     /home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar124_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz,
     /home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar123_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz,
     /home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar127_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz,
     /home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar024_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz,
     /home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar023_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz,
     /home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar022_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz,
     /home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar021_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz,
     /home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar020_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz,
     /home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar019_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz,
     /home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar018_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz,
     /home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar017_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz,
     /home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar016_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz,
     /home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar001_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz,
     /home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar013_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz,
     /home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar012_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz,
     /home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar011_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz,
     /home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar010_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz,
     /home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar009_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz,
     /home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar008_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz,
     /home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar007_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz,
     /home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar006_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz,
     /home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar005_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz,
     /home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar004_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz,
     /home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar003_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz,
     /home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar002_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz,
     /home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar014_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz,
     /home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar015_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz].
    [CanICA.fit] Computing mask
    /home/runner/work/nilearn/nilearn/.tox/doc/lib/python3.9/site-packages/joblib/memory.py:632: UserWarning: Cannot inspect object functools.partial(, mask_type='whole-brain'), ignore list will not work.
      return hashing.hash(filter_args(self.func, self.ignore, args, kwargs),
    [CanICA.fit] Template whole-brain mask computation
    /home/runner/work/nilearn/nilearn/.tox/doc/lib/python3.9/site-packages/joblib/memory.py:810: UserWarning: Cannot inspect object functools.partial(, mask_type='whole-brain'), ignore list will not work.
      argument_dict = filter_args(self.func, self.ignore,
    /home/runner/work/nilearn/nilearn/.tox/doc/lib/python3.9/site-packages/joblib/memory.py:632: UserWarning: Cannot inspect object functools.partial(, mask_type='whole-brain'), ignore list will not work.
      return hashing.hash(filter_args(self.func, self.ignore, args, kwargs),
    [CanICA.fit] Resampling mask
    [CanICA.fit] Loading data
    ________________________________________________________________________________
    [Memory] Calling sklearn.utils.extmath.randomized_svd...
    randomized_svd(array([[ 0.001034, ..., -0.011761],
           ...,
           [ 0.007108, ..., -0.005026]]), n_components=20, transpose=True,
    random_state=0, n_iter=3)
    ___________________________________________________randomized_svd - 0.8s, 0.0min
    [Parallel(n_jobs=2)]: Using backend LokyBackend with 2 concurrent workers.
    [Parallel(n_jobs=2)]: Done   1 tasks      | elapsed:    0.5s
    [Parallel(n_jobs=2)]: Done   4 tasks      | elapsed:    1.8s
    [Parallel(n_jobs=2)]: Done  10 out of  10 | elapsed:    3.9s finished
    Output will be saved to: /home/runner/work/nilearn/nilearn/examples/03_connectivity/results/plot_compare_decomposition
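Beyond group-level maps, the fitted estimator can project individual
subjects onto the components. A minimal sketch, assuming ``CanICA.transform``
follows the nilearn decomposition API and returns one loadings array per
subject:

.. code-block:: Python

    # Per-subject temporal loadings on the group components
    # (one array per subject, shape: n_timepoints x n_components)
    subject_loadings = canica.transform(func_filenames[:2])  # two subjects, to keep it cheap
    print(subject_loadings[0].shape)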
.. GENERATED FROM PYTHON SOURCE LINES 73-74

To visualize, we plot the outlines of all components on one figure.

.. GENERATED FROM PYTHON SOURCE LINES 74-80

.. code-block:: Python

    from nilearn.plotting import plot_prob_atlas

    # Plot all ICA components together
    plot_prob_atlas(canica_components_img, title="All ICA components")

.. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_001.png
   :alt: plot compare decomposition
   :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_001.png
   :class: sphx-glr-single-img

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    /home/runner/work/nilearn/nilearn/.tox/doc/lib/python3.9/site-packages/nilearn/plotting/displays/_axes.py:94: UserWarning: No contour levels were found within the data range.
      im = getattr(ax, type)(
    /home/runner/work/nilearn/nilearn/.tox/doc/lib/python3.9/site-packages/nilearn/plotting/displays/_axes.py:94: UserWarning: linewidths is ignored by contourf
      im = getattr(ax, type)(
    /home/runner/work/nilearn/nilearn/.tox/doc/lib/python3.9/site-packages/numpy/ma/core.py:2826: UserWarning: Warning: converting a masked element to nan.
      _data = np.array(data, dtype=dtype, copy=copy,
    /home/runner/work/nilearn/nilearn/.tox/doc/lib/python3.9/site-packages/matplotlib/contour.py:1494: UserWarning: Warning: converting a masked element to nan.
      self.zmax = float(z.max())
    /home/runner/work/nilearn/nilearn/.tox/doc/lib/python3.9/site-packages/matplotlib/contour.py:1495: UserWarning: Warning: converting a masked element to nan.
      self.zmin = float(z.min())
    /home/runner/work/nilearn/nilearn/.tox/doc/lib/python3.9/site-packages/nilearn/plotting/displays/_slicers.py:730: MatplotlibDeprecationWarning: Adding an axes using the same arguments as a previous axes currently reuses the earlier instance. In a future version, a new instance will always be created and returned. Meanwhile, this warning can be suppressed, and the future behavior ensured, by passing a unique label to each axes instance.
      self._colorbar_ax = figure.add_axes(lt_wid_top_ht)
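By default the atlas is drawn as outlines. For a denser rendering, the same
call can fill the contours instead; a small variation (the ``view_type``
values come from nilearn's plotting API):

.. code-block:: Python

    # Same atlas figure, with filled contours instead of outlines
    plot_prob_atlas(
        canica_components_img,
        title="All ICA components (filled contours)",
        view_type="filled_contours",
    )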
.. GENERATED FROM PYTHON SOURCE LINES 81-82

Finally, we plot the map for each :term:`ICA` component separately.

.. GENERATED FROM PYTHON SOURCE LINES 82-99

.. code-block:: Python

    from nilearn.image import iter_img
    from nilearn.plotting import plot_stat_map, show

    for i, cur_img in enumerate(iter_img(canica_components_img)):
        plot_stat_map(
            cur_img,
            display_mode="z",
            title=f"IC {int(i)}",
            cut_coords=1,
            vmax=0.05,
            vmin=-0.05,
            colorbar=False,
        )

    show()

.. rst-class:: sphx-glr-horizontal

* .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_002.png
     :alt: plot compare decomposition
     :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_002.png
     :class: sphx-glr-multi-img

* .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_003.png
     :alt: plot compare decomposition
     :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_003.png
     :class: sphx-glr-multi-img

* .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_004.png
     :alt: plot compare decomposition
     :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_004.png
     :class: sphx-glr-multi-img

* .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_005.png
     :alt: plot compare decomposition
     :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_005.png
     :class: sphx-glr-multi-img

* .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_006.png
     :alt: plot compare decomposition
     :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_006.png
     :class: sphx-glr-multi-img

* .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_007.png
     :alt: plot compare decomposition
     :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_007.png
     :class: sphx-glr-multi-img
* .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_008.png
     :alt: plot compare decomposition
     :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_008.png
     :class: sphx-glr-multi-img

* .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_009.png
     :alt: plot compare decomposition
     :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_009.png
     :class: sphx-glr-multi-img

* .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_010.png
     :alt: plot compare decomposition
     :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_010.png
     :class: sphx-glr-multi-img

* .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_011.png
     :alt: plot compare decomposition
     :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_011.png
     :class: sphx-glr-multi-img

* .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_012.png
     :alt: plot compare decomposition
     :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_012.png
     :class: sphx-glr-multi-img

* .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_013.png
     :alt: plot compare decomposition
     :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_013.png
     :class: sphx-glr-multi-img

* .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_014.png
     :alt: plot compare decomposition
     :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_014.png
     :class: sphx-glr-multi-img

* .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_015.png
     :alt: plot compare decomposition
     :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_015.png
     :class: sphx-glr-multi-img

* .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_016.png
     :alt: plot compare decomposition
     :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_016.png
     :class: sphx-glr-multi-img

* .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_017.png
     :alt: plot compare decomposition
     :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_017.png
     :class: sphx-glr-multi-img

* .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_018.png
     :alt: plot compare decomposition
     :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_018.png
     :class: sphx-glr-multi-img

* .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_019.png
     :alt: plot compare decomposition
     :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_019.png
     :class: sphx-glr-multi-img

* .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_020.png
     :alt: plot compare decomposition
     :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_020.png
     :class: sphx-glr-multi-img

* .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_021.png
     :alt: plot compare decomposition
     :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_021.png
     :class: sphx-glr-multi-img

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    /home/runner/work/nilearn/nilearn/.tox/doc/lib/python3.9/site-packages/nilearn/plotting/displays/_slicers.py:1674: MatplotlibDeprecationWarning: Adding an axes using the same arguments as a previous axes currently reuses the earlier instance. In a future version, a new instance will always be created and returned. Meanwhile, this warning can be suppressed, and the future behavior ensured, by passing a unique label to each axes instance.
      ax = fh.add_axes(
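The fixed ``cut_coords=1`` above lets nilearn pick one axial slice per map.
Alternatively, the cut can be centered on a component's peak; a minimal
sketch reusing the imports from the blocks above (component 3 is an
arbitrary pick, for illustration only):

.. code-block:: Python

    from nilearn.image import index_img
    from nilearn.plotting import find_xyz_cut_coords

    # Center the axial cut on the peak of one component
    component = index_img(canica_components_img, 3)
    x, y, z = find_xyz_cut_coords(component)
    plot_stat_map(
        component,
        display_mode="z",
        cut_coords=[z],
        title="IC 3, cut at its peak",
        colorbar=False,
    )
    show()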
.. GENERATED FROM PYTHON SOURCE LINES 100-109

Compare :term:`CanICA` to dictionary learning
---------------------------------------------

:term:`Dictionary learning` is a sparsity-based decomposition method for
extracting spatial maps. It extracts maps that are naturally sparse and
usually cleaner than those obtained with :term:`ICA`. Here, we will compare
networks built with :term:`CanICA` to networks built with
:term:`Dictionary learning`. For more details, see :footcite:t:`Mensch2016`.
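At a glance, the idea can be sketched as follows. Writing the group data as
a matrix :math:`X` (time points × voxels), dictionary learning looks for
temporal loadings :math:`A` and sparse spatial maps :math:`D`. A schematic
form of the objective (our notation; nilearn's internal formulation differs
in details such as normalization constraints):

.. math::

    \min_{A, D} \; \frac{1}{2} \lVert X - A D \rVert_2^2
    + \alpha \lVert D \rVert_1

where :math:`\alpha` sets the trade-off between reconstruction error and the
sparsity of the extracted maps.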
.. GENERATED FROM PYTHON SOURCE LINES 112-113

Create a dictionary learning estimator.

.. GENERATED FROM PYTHON SOURCE LINES 113-139

.. code-block:: Python

    from nilearn.decomposition import DictLearning

    dict_learning = DictLearning(
        n_components=20,
        memory="nilearn_cache",
        memory_level=2,
        verbose=1,
        random_state=0,
        n_epochs=1,
        mask_strategy="whole-brain-template",
        standardize="zscore_sample",
        n_jobs=2,
    )

    print("[Example] Fitting dictionary learning model")
    dict_learning.fit(func_filenames)
    print("[Example] Saving results")

    # Grab the extracted components, unmasked back to a Nifti image.
    # Note: for nilearn versions older than 0.4.1, `components_img_`
    # is not implemented.
    dictlearning_components_img = dict_learning.components_img_
    dictlearning_components_img.to_filename(
        output_dir / "dictionary_learning_resting_state.nii.gz"
    )

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    [Example] Fitting dictionary learning model
    [DictLearning.fit] Loading data from
    [/home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar128_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz,
     /home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar126_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz,
     /home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar125_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz,
     /home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar124_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz,
     /home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar123_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz,
     /home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar127_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz,
     /home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar024_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz,
     /home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar023_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz,
     /home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar022_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz,
     /home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar021_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz,
     /home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar020_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz,
     /home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar019_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz,
     /home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar018_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz,
     /home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar017_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz,
     /home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar016_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz,
     /home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar001_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz,
     /home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar013_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz,
     /home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar012_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz,
     /home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar011_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz,
     /home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar010_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz,
     /home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar009_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz,
     /home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar008_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz,
     /home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar007_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz,
     /home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar006_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz,
     /home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar005_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz,
     /home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar004_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz,
     /home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar003_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz,
     /home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar002_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz,
     /home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar014_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz,
     /home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar015_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz].
    [DictLearning.fit] Computing mask
    /home/runner/work/nilearn/nilearn/.tox/doc/lib/python3.9/site-packages/joblib/memory.py:632: UserWarning: Cannot inspect object functools.partial(, mask_type='whole-brain'), ignore list will not work.
      return hashing.hash(filter_args(self.func, self.ignore, args, kwargs),
    /home/runner/work/nilearn/nilearn/.tox/doc/lib/python3.9/site-packages/joblib/memory.py:810: UserWarning: Cannot inspect object functools.partial(, mask_type='whole-brain'), ignore list will not work.
      argument_dict = filter_args(self.func, self.ignore,
    /home/runner/work/nilearn/nilearn/.tox/doc/lib/python3.9/site-packages/joblib/memory.py:632: UserWarning: Cannot inspect object functools.partial(, mask_type='whole-brain'), ignore list will not work.
      return hashing.hash(filter_args(self.func, self.ignore, args, kwargs),
    [DictLearning.fit] Resampling mask
    [DictLearning.fit] Loading data
    [DictLearning.fit] Learning initial components
    ________________________________________________________________________________
    [Memory] Calling sklearn.utils.extmath.randomized_svd...
    randomized_svd(array([[-4.725221e-04, ..., -5.596845e-03],
           ...,
           [ 1.390502e-03, ..., -8.438412e-05]]), n_components=20, transpose=True,
    random_state=0, n_iter=3)
    ___________________________________________________randomized_svd - 0.6s, 0.0min
    [Parallel(n_jobs=2)]: Using backend LokyBackend with 2 concurrent workers.
    [Parallel(n_jobs=2)]: Done   1 out of   1 | elapsed:    0.4s finished
    [DictLearning.fit] Computing initial loadings
    ________________________________________________________________________________
    [Memory] Calling nilearn.decomposition.dict_learning._compute_loadings...
    _compute_loadings(array([[ 0.003026, ..., -0.000948],
           ...,
           [ 0.001213, ...,  0.014412]]),
    array([[-0.280625, ...,  0.825802],
           ...,
           [-0.997198, ..., -0.015035]]))
    _________________________________________________compute_loadings - 0.2s, 0.0min
    [DictLearning.fit] Learning dictionary
    ________________________________________________________________________________
    [Memory] Calling sklearn.decomposition._dict_learning.dict_learning_online...
    dict_learning_online(array([[-0.280625, ..., -0.997198],
           ...,
           [ 0.825802, ..., -0.015035]]),
    20, alpha=10, batch_size=20, method='cd',
    dict_init=array([[-0.168973, ...,  0.081441],
           ...,
           [ 0.00847 , ..., -0.496571]]), verbose=0, random_state=0,
    return_code=True, shuffle=True, n_jobs=1, max_iter=1090)
    _____________________________________________dict_learning_online - 1.4s, 0.0min
    [Example] Saving results
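Before plotting, a quick quantitative comparison can be informative. Since
both decompositions were fit on the same data and their saved maps share the
same grid, the maps can be correlated voxelwise; a rough sketch in plain
NumPy (matching components by maximum absolute correlation is our
simplification, not part of the original example):

.. code-block:: Python

    import numpy as np

    from nilearn.image import load_img

    ica_maps = load_img(output_dir / "canica_resting_state.nii.gz").get_fdata()
    dict_maps = load_img(
        output_dir / "dictionary_learning_resting_state.nii.gz"
    ).get_fdata()

    # Flatten each 4D image to (n_components, n_voxels)
    ica_flat = ica_maps.reshape(-1, ica_maps.shape[-1]).T
    dict_flat = dict_maps.reshape(-1, dict_maps.shape[-1]).T

    # Cross-correlation between every ICA map and every dictionary map
    cross_corr = np.corrcoef(np.vstack([ica_flat, dict_flat]))[:20, 20:]

    # For each dictionary component, the index of the closest ICA component
    print(np.abs(cross_corr).argmax(axis=0))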
.. GENERATED FROM PYTHON SOURCE LINES 140-143

Visualize the results.

First, plot all DictLearning components together.

.. GENERATED FROM PYTHON SOURCE LINES 143-148

.. code-block:: Python

    plot_prob_atlas(
        dictlearning_components_img, title="All DictLearning components"
    )

.. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_022.png
   :alt: plot compare decomposition
   :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_022.png
   :class: sphx-glr-single-img

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    /home/runner/work/nilearn/nilearn/.tox/doc/lib/python3.9/site-packages/nilearn/plotting/displays/_axes.py:94: UserWarning: linewidths is ignored by contourf
      im = getattr(ax, type)(
    /home/runner/work/nilearn/nilearn/.tox/doc/lib/python3.9/site-packages/nilearn/plotting/displays/_axes.py:94: UserWarning: No contour levels were found within the data range.
      im = getattr(ax, type)(
    /home/runner/work/nilearn/nilearn/.tox/doc/lib/python3.9/site-packages/numpy/ma/core.py:2826: UserWarning: Warning: converting a masked element to nan.
      _data = np.array(data, dtype=dtype, copy=copy,
    /home/runner/work/nilearn/nilearn/.tox/doc/lib/python3.9/site-packages/matplotlib/contour.py:1494: UserWarning: Warning: converting a masked element to nan.
      self.zmax = float(z.max())
    /home/runner/work/nilearn/nilearn/.tox/doc/lib/python3.9/site-packages/matplotlib/contour.py:1495: UserWarning: Warning: converting a masked element to nan.
      self.zmin = float(z.min())
    /home/runner/work/nilearn/nilearn/.tox/doc/lib/python3.9/site-packages/nilearn/plotting/displays/_slicers.py:730: MatplotlibDeprecationWarning: Adding an axes using the same arguments as a previous axes currently reuses the earlier instance. In a future version, a new instance will always be created and returned. Meanwhile, this warning can be suppressed, and the future behavior ensured, by passing a unique label to each axes instance.
      self._colorbar_ax = figure.add_axes(lt_wid_top_ht)

.. GENERATED FROM PYTHON SOURCE LINES 149-150

One plot of each component.

.. GENERATED FROM PYTHON SOURCE LINES 150-162

.. code-block:: Python

    for i, cur_img in enumerate(iter_img(dictlearning_components_img)):
        plot_stat_map(
            cur_img,
            display_mode="z",
            title=f"Comp {int(i)}",
            cut_coords=1,
            vmax=0.1,
            vmin=-0.1,
            colorbar=False,
        )

.. rst-class:: sphx-glr-horizontal

* .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_023.png
     :alt: plot compare decomposition
     :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_023.png
     :class: sphx-glr-multi-img

* .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_024.png
     :alt: plot compare decomposition
     :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_024.png
     :class: sphx-glr-multi-img

* .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_025.png
     :alt: plot compare decomposition
     :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_025.png
     :class: sphx-glr-multi-img

* .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_026.png
     :alt: plot compare decomposition
     :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_026.png
     :class: sphx-glr-multi-img

* .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_027.png
     :alt: plot compare decomposition
     :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_027.png
     :class: sphx-glr-multi-img
* .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_028.png
     :alt: plot compare decomposition
     :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_028.png
     :class: sphx-glr-multi-img

* .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_029.png
     :alt: plot compare decomposition
     :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_029.png
     :class: sphx-glr-multi-img

* .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_030.png
     :alt: plot compare decomposition
     :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_030.png
     :class: sphx-glr-multi-img

* .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_031.png
     :alt: plot compare decomposition
     :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_031.png
     :class: sphx-glr-multi-img

* .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_032.png
     :alt: plot compare decomposition
     :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_032.png
     :class: sphx-glr-multi-img

* .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_033.png
     :alt: plot compare decomposition
     :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_033.png
     :class: sphx-glr-multi-img

* .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_034.png
     :alt: plot compare decomposition
     :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_034.png
     :class: sphx-glr-multi-img

* .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_035.png
     :alt: plot compare decomposition
     :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_035.png
     :class: sphx-glr-multi-img

* .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_036.png
     :alt: plot compare decomposition
     :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_036.png
     :class: sphx-glr-multi-img

* .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_037.png
     :alt: plot compare decomposition
     :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_037.png
     :class: sphx-glr-multi-img

* .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_038.png
     :alt: plot compare decomposition
     :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_038.png
     :class: sphx-glr-multi-img

* .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_039.png
     :alt: plot compare decomposition
     :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_039.png
     :class: sphx-glr-multi-img

* .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_040.png
     :alt: plot compare decomposition
     :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_040.png
     :class: sphx-glr-multi-img

* .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_041.png
     :alt: plot compare decomposition
     :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_041.png
     :class: sphx-glr-multi-img
* .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_042.png
     :alt: plot compare decomposition
     :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_042.png
     :class: sphx-glr-multi-img

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    /home/runner/work/nilearn/nilearn/.tox/doc/lib/python3.9/site-packages/nilearn/plotting/displays/_slicers.py:1674: MatplotlibDeprecationWarning: Adding an axes using the same arguments as a previous axes currently reuses the earlier instance. In a future version, a new instance will always be created and returned. Meanwhile, this warning can be suppressed, and the future behavior ensured, by passing a unique label to each axes instance.
      ax = fh.add_axes(

.. GENERATED FROM PYTHON SOURCE LINES 163-167

Estimate explained variance per component and plot it using matplotlib.

The fitted object ``dict_learning`` can be used to calculate the score per
component.

.. GENERATED FROM PYTHON SOURCE LINES 167-185

.. code-block:: Python

    scores = dict_learning.score(func_filenames, per_component=True)

    # Plot the scores
    import numpy as np
    from matplotlib import pyplot as plt
    from matplotlib.ticker import FormatStrFormatter

    plt.figure(figsize=(4, 4), constrained_layout=True)

    positions = np.arange(len(scores))
    plt.barh(positions, scores)
    plt.ylabel("Component #", size=12)
    plt.xlabel("Explained variance", size=12)
    plt.yticks(np.arange(20))
    plt.gca().xaxis.set_major_formatter(FormatStrFormatter("%.3f"))

    show()

.. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_043.png
   :alt: plot compare decomposition
   :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_043.png
   :class: sphx-glr-single-img

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    ________________________________________________________________________________
    [Memory] Calling nilearn.decomposition._base._explained_variance...
    _explained_variance(array([[-2.806378e-01, ...,  8.257976e-01],
           ...,
           [-1.977336e-15, ..., -3.897215e-16]]),
    array([[ 0., ...,  0.],
           ...,
           [-0., ...,  0.]]), per_component=True)
    /home/runner/work/nilearn/nilearn/.tox/doc/lib/python3.9/site-packages/nilearn/decomposition/_base.py:653: UserWarning: Persisting input arguments took 1.31s to run. If this happens often in your code, it can cause performance problems (results will be correct in all cases). The reason for this is probably some large input arguments for a wrapped function (e.g. large strings). THIS IS A JOBLIB ISSUE. If you can, kindly provide the joblib's team with an example so that they can fix the problem.
      return self._cache(_explained_variance)(
    ______________________________________________explained_variance - 14.2s, 0.2min
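The per-component scores can also be summarized; a short follow-up sketch on
the ``scores`` array computed above (reusing ``numpy`` imported there):

.. code-block:: Python

    # Aggregate view of the per-component scores
    print(f"Total explained variance: {np.sum(scores):.3f}")
    print(f"Components ranked by score: {np.argsort(scores)[::-1]}")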
.. GENERATED FROM PYTHON SOURCE LINES 186-193

.. note::

    To see how to extract subject-level time series from regions created
    using :term:`Dictionary learning`, see
    :ref:`example Regions extraction using dictionary learning and functional connectomes `.

.. GENERATED FROM PYTHON SOURCE LINES 195-199

References
----------

.. footbibliography::


.. rst-class:: sphx-glr-timing

**Total running time of the script:** (2 minutes 33.759 seconds)

**Estimated memory usage:** 6314 MB


.. _sphx_glr_download_auto_examples_03_connectivity_plot_compare_decomposition.py:

.. only:: html

  .. container:: sphx-glr-footer sphx-glr-footer-example

    .. container:: binder-badge

      .. image:: images/binder_badge_logo.svg
         :target: https://mybinder.org/v2/gh/nilearn/nilearn/0.12.0?urlpath=lab/tree/notebooks/auto_examples/03_connectivity/plot_compare_decomposition.ipynb
         :alt: Launch binder
         :width: 150 px

    .. container:: sphx-glr-download sphx-glr-download-jupyter

      :download:`Download Jupyter notebook: plot_compare_decomposition.ipynb `

    .. container:: sphx-glr-download sphx-glr-download-python

      :download:`Download Python source code: plot_compare_decomposition.py `

    .. container:: sphx-glr-download sphx-glr-download-zip

      :download:`Download zipped: plot_compare_decomposition.zip `

.. only:: html

  .. rst-class:: sphx-glr-signature

    `Gallery generated by Sphinx-Gallery `_