.. DO NOT EDIT.
.. THIS FILE WAS AUTOMATICALLY GENERATED BY SPHINX-GALLERY.
.. TO MAKE CHANGES, EDIT THE SOURCE PYTHON FILE:
.. "auto_examples/03_connectivity/plot_compare_decomposition.py"
.. LINE NUMBERS ARE GIVEN BELOW.

.. only:: html

    .. note::
        :class: sphx-glr-download-link-note

        :ref:`Go to the end ` to download the full example code or to run
        this example in your browser via Binder.

.. rst-class:: sphx-glr-example-title

.. _sphx_glr_auto_examples_03_connectivity_plot_compare_decomposition.py:

Deriving spatial maps from group fMRI data using ICA and Dictionary Learning
============================================================================

Various approaches exist to derive spatial maps or networks from group
fMRI data. These methods extract distributed brain regions that exhibit
similar :term:`BOLD` fluctuations over time. Decomposition methods allow
for the generation of many independent maps simultaneously without the
need to provide a priori information (e.g., seeds or priors).

This example applies two popular decomposition methods, :term:`ICA` and
:term:`Dictionary learning`, to :term:`fMRI` data measured while children
and young adults watch movies. The resulting maps are visualized using
atlas plotting tools.

:term:`CanICA` is an :term:`ICA` method for group-level analysis of
:term:`fMRI` data. Compared to other strategies, it brings a well-controlled
group model, as well as a thresholding algorithm controlling for specificity
and sensitivity with an explicit model of the signal. The reference paper is
:footcite:t:`Varoquaux2010c`.

.. GENERATED FROM PYTHON SOURCE LINES 27-29

Load brain development :term:`fMRI` dataset
-------------------------------------------

.. GENERATED FROM PYTHON SOURCE LINES 29-38

..
code-block:: Python from nilearn.datasets import fetch_development_fmri rest_dataset = fetch_development_fmri(n_subjects=30) func_filenames = rest_dataset.func # list of 4D nifti files for each subject # print basic information on the dataset print(f"First functional nifti image (4D) is at: {rest_dataset.func[0]}") .. rst-class:: sphx-glr-script-out .. code-block:: none [fetch_development_fmri] Dataset found in /home/runner/nilearn_data/development_fmri [fetch_development_fmri] Dataset found in /home/runner/nilearn_data/development_fmri/development_fmri [fetch_development_fmri] Dataset found in /home/runner/nilearn_data/development_fmri/development_fmri First functional nifti image (4D) is at: /home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar128_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz .. GENERATED FROM PYTHON SOURCE LINES 39-44 Apply :term:`CanICA` on the data -------------------------------- We use "whole-brain-template" as a strategy to compute the mask, as this leads to slightly faster and more reproducible results. However, the images need to be in :term:`MNI` template space. .. GENERATED FROM PYTHON SOURCE LINES 44-80 .. code-block:: Python import warnings from sklearn.exceptions import ConvergenceWarning from nilearn.decomposition import CanICA canica = CanICA( n_components=20, memory="nilearn_cache", memory_level=1, verbose=1, random_state=0, standardize="zscore_sample", mask_strategy="whole-brain-template", n_jobs=2, ) with warnings.catch_warnings(): # silence warnings about ICA not converging # Consider increasing tolerance or the maximum number of iterations. warnings.filterwarnings(action="ignore", category=ConvergenceWarning) canica.fit(func_filenames) # Retrieve the independent components in brain space. Directly # accessible through attribute `components_img_`. 
canica_components_img = canica.components_img_ # components_img is a Nifti Image object, and can be saved to a file with # the following lines: from pathlib import Path output_dir = Path.cwd() / "results" / "plot_compare_decomposition" output_dir.mkdir(exist_ok=True, parents=True) print(f"Output will be saved to: {output_dir}") canica_components_img.to_filename(output_dir / "canica_resting_state.nii.gz") .. rst-class:: sphx-glr-script-out .. code-block:: none [CanICA.fit] Loading data from ['/home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar128_task- pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz', '/home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar126_task-p ixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz', '/home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar125_task-p ixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz', '/home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar124_task-p ixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz', '/home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar123_task-p ixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz', '/home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar127_task-p ixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz', '/home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar024_task-p ixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz', '/home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar023_task-p ixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz', '/home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar022_task-p ixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz', '/home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar021_task-p ixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz', '/home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar020_task-p 
ixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz', '/home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar019_task-p ixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz', '/home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar018_task-p ixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz', '/home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar017_task-p ixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz', '/home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar016_task-p ixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz', '/home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar001_task-p ixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz', '/home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar013_task-p ixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz', '/home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar012_task-p ixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz', '/home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar011_task-p ixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz', '/home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar010_task-p ixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz', '/home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar009_task-p ixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz', '/home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar008_task-p ixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz', '/home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar007_task-p ixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz', '/home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar006_task-p ixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz', '/home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar005_task-p 
ixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz', '/home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar004_task-p ixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz', '/home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar003_task-p ixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz', '/home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar002_task-p ixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz', '/home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar014_task-p ixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz', '/home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar015_task-p ixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz'] [CanICA.fit] Computing mask [CanICA.fit] Resampling mask [CanICA.fit] Finished fit [CanICA.fit] Loading data [CanICA.fit] Computing image from signals [Parallel(n_jobs=2)]: Using backend LokyBackend with 2 concurrent workers. [Parallel(n_jobs=2)]: Done 10 out of 10 | elapsed: 13.9s finished [CanICA.fit] Computing image from signals Output will be saved to: /home/runner/work/nilearn/nilearn/examples/03_connectivity/results/plot_compare_decomposition .. GENERATED FROM PYTHON SOURCE LINES 81-82 To visualize we plot the outline of all components on one figure .. GENERATED FROM PYTHON SOURCE LINES 82-88 .. code-block:: Python from nilearn.plotting import plot_prob_atlas # Plot all ICA components together plot_prob_atlas(canica_components_img, title="All ICA components") .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_001.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_001.png :class: sphx-glr-single-img .. rst-class:: sphx-glr-script-out .. code-block:: none /home/runner/work/nilearn/nilearn/.tox/doc/lib/python3.10/site-packages/numpy/ma/core.py:2892: UserWarning: Warning: converting a masked element to nan. .. 
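Under the hood, :term:`CanICA` runs an :term:`ICA` step on the pooled group data. As a minimal, self-contained sketch of that core idea, here is an unmixing of two synthetic signals with scikit-learn's ``FastICA`` rather than nilearn's group model; all variable names are illustrative, not part of this example:

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.RandomState(0)
time = np.linspace(0, 8, 200)

# Two independent sources: a sine wave and a square wave
sources = np.c_[np.sin(2 * time), np.sign(np.sin(3 * time))]
mixing = rng.rand(2, 2)        # unknown mixing matrix
observed = sources @ mixing.T  # what the "scanner" would record

# ICA recovers the independent sources from the mixtures
ica = FastICA(n_components=2, random_state=0)
recovered = ica.fit_transform(observed)  # shape: (n_timepoints, n_sources)
print(recovered.shape, ica.mixing_.shape)
```

CanICA additionally handles masking, subject-level dimension reduction, and thresholding, which this toy sketch leaves out.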
GENERATED FROM PYTHON SOURCE LINES 89-90 Finally, we plot the map for each :term:`ICA` component separately .. GENERATED FROM PYTHON SOURCE LINES 90-107 .. code-block:: Python from nilearn.image import iter_img from nilearn.plotting import plot_stat_map, show for i, cur_img in enumerate(iter_img(canica_components_img)): plot_stat_map( cur_img, display_mode="z", title=f"IC {int(i)}", cut_coords=1, vmax=0.05, vmin=-0.05, colorbar=False, ) show() .. rst-class:: sphx-glr-horizontal * .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_002.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_002.png :class: sphx-glr-multi-img * .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_003.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_003.png :class: sphx-glr-multi-img * .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_004.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_004.png :class: sphx-glr-multi-img * .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_005.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_005.png :class: sphx-glr-multi-img * .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_006.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_006.png :class: sphx-glr-multi-img * .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_007.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_007.png :class: sphx-glr-multi-img * .. 
image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_008.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_008.png :class: sphx-glr-multi-img * .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_009.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_009.png :class: sphx-glr-multi-img * .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_010.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_010.png :class: sphx-glr-multi-img * .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_011.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_011.png :class: sphx-glr-multi-img * .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_012.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_012.png :class: sphx-glr-multi-img * .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_013.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_013.png :class: sphx-glr-multi-img * .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_014.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_014.png :class: sphx-glr-multi-img * .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_015.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_015.png :class: sphx-glr-multi-img * .. 
image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_016.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_016.png :class: sphx-glr-multi-img * .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_017.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_017.png :class: sphx-glr-multi-img * .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_018.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_018.png :class: sphx-glr-multi-img * .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_019.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_019.png :class: sphx-glr-multi-img * .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_020.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_020.png :class: sphx-glr-multi-img * .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_021.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_021.png :class: sphx-glr-multi-img .. rst-class:: sphx-glr-script-out .. code-block:: none /home/runner/work/nilearn/nilearn/examples/03_connectivity/plot_compare_decomposition.py:105: UserWarning: You are using the 'agg' matplotlib backend that is non-interactive. No figure will be plotted when calling matplotlib.pyplot.show() or nilearn.plotting.show(). You can fix this by installing a different backend: for example via pip install PyQt6 .. 
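The ``vmax``/``vmin`` passed to ``plot_stat_map`` above are fixed by hand. One possible alternative, shown here purely as an assumption and not what this example does, is to derive a symmetric display limit from the data itself, e.g. a high percentile of the absolute voxel values (``component`` below is a synthetic stand-in for one component map):

```python
import numpy as np

rng = np.random.RandomState(0)
component = rng.normal(scale=0.02, size=10_000)  # stand-in for one component map

# Symmetric display limit from the 99.9th percentile of absolute values
vmax = float(np.percentile(np.abs(component), 99.9))
print(round(vmax, 4))
# One could then pass vmax=vmax, vmin=-vmax to plot_stat_map
```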
GENERATED FROM PYTHON SOURCE LINES 108-117

Compare :term:`CanICA` to dictionary learning
---------------------------------------------

:term:`Dictionary learning` is a sparsity-based decomposition method for
extracting spatial maps. It extracts maps that are naturally sparse and
usually cleaner than :term:`ICA`. Here, we will compare networks built
with :term:`CanICA` to networks built with :term:`Dictionary learning`.
For more details, see :footcite:t:`Mensch2016`.

.. GENERATED FROM PYTHON SOURCE LINES 120-121

Create a dictionary learning estimator

.. GENERATED FROM PYTHON SOURCE LINES 121-147

.. code-block:: Python

    from nilearn.decomposition import DictLearning

    dict_learning = DictLearning(
        n_components=20,
        memory="nilearn_cache",
        memory_level=1,
        verbose=1,
        random_state=0,
        n_epochs=1,
        mask_strategy="whole-brain-template",
        standardize="zscore_sample",
        n_jobs=2,
    )

    print("[Example] Fitting dictionary learning model")
    dict_learning.fit(func_filenames)
    print("[Example] Saving results")

    # Grab the extracted components, unmasked back to a Nifti image.
    # Note: for nilearn versions older than 0.4.1, components_img_
    # is not implemented.
    dictlearning_components_img = dict_learning.components_img_
    dictlearning_components_img.to_filename(
        output_dir / "dictionary_learning_resting_state.nii.gz"
    )

.. rst-class:: sphx-glr-script-out

..
code-block:: none [Example] Fitting dictionary learning model [DictLearning.fit] Loading data from ['/home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar128_task- pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz', '/home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar126_task-p ixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz', '/home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar125_task-p ixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz', '/home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar124_task-p ixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz', '/home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar123_task-p ixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz', '/home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar127_task-p ixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz', '/home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar024_task-p ixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz', '/home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar023_task-p ixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz', '/home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar022_task-p ixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz', '/home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar021_task-p ixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz', '/home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar020_task-p ixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz', '/home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar019_task-p ixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz', '/home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar018_task-p ixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz', '/home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar017_task-p 
ixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz', '/home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar016_task-p ixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz', '/home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar001_task-p ixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz', '/home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar013_task-p ixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz', '/home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar012_task-p ixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz', '/home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar011_task-p ixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz', '/home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar010_task-p ixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz', '/home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar009_task-p ixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz', '/home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar008_task-p ixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz', '/home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar007_task-p ixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz', '/home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar006_task-p ixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz', '/home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar005_task-p ixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz', '/home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar004_task-p ixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz', '/home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar003_task-p ixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz', '/home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar002_task-p 
ixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz', '/home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar014_task-p ixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz', '/home/runner/nilearn_data/development_fmri/development_fmri/sub-pixar015_task-p ixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz'] [DictLearning.fit] Computing mask [DictLearning.fit] Resampling mask [DictLearning.fit] Finished fit [DictLearning.fit] Loading data [DictLearning.fit] Learning initial components [Parallel(n_jobs=2)]: Using backend LokyBackend with 2 concurrent workers. [DictLearning.fit] Computing initial loadings ________________________________________________________________________________ [Memory] Calling nilearn.decomposition.dict_learning._compute_loadings... _compute_loadings(array([[-0.005308, ..., -0.003948], ..., [-0.000253, ..., 0.003836]], shape=(20, 21781)), array([[-0.280625, ..., 0.825802], ..., [-0.997198, ..., -0.015035]], shape=(600, 21781))) _________________________________________________compute_loadings - 0.1s, 0.0min [DictLearning.fit] Learning dictionary ________________________________________________________________________________ [Memory] Calling sklearn.decomposition._dict_learning.dict_learning_online... dict_learning_online(array([[-0.280625, ..., -0.997198], ..., [ 0.825802, ..., -0.015035]], shape=(21781, 600)), 20, alpha=10, batch_size=20, method='cd', dict_init=array([[-0.294655, ..., -0.01288 ], ..., [-0.29801 , ..., -0.313942]], shape=(20, 600)), verbose=0, random_state=0, return_code=True, shuffle=True, n_jobs=1, max_iter=1090) _____________________________________________dict_learning_online - 1.0s, 0.0min [DictLearning.fit] Computing image from signals [Example] Saving results .. GENERATED FROM PYTHON SOURCE LINES 148-151 Visualize the results First plot all DictLearning components together .. GENERATED FROM PYTHON SOURCE LINES 151-156 .. 
code-block:: Python plot_prob_atlas( dictlearning_components_img, title="All DictLearning components" ) .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_022.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_022.png :class: sphx-glr-single-img .. rst-class:: sphx-glr-script-out .. code-block:: none /home/runner/work/nilearn/nilearn/.tox/doc/lib/python3.10/site-packages/numpy/ma/core.py:2892: UserWarning: Warning: converting a masked element to nan. .. GENERATED FROM PYTHON SOURCE LINES 157-158 One plot of each component .. GENERATED FROM PYTHON SOURCE LINES 158-170 .. code-block:: Python for i, cur_img in enumerate(iter_img(dictlearning_components_img)): plot_stat_map( cur_img, display_mode="z", title=f"Comp {int(i)}", cut_coords=1, vmax=0.1, vmin=-0.1, colorbar=False, ) .. rst-class:: sphx-glr-horizontal * .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_023.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_023.png :class: sphx-glr-multi-img * .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_024.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_024.png :class: sphx-glr-multi-img * .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_025.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_025.png :class: sphx-glr-multi-img * .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_026.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_026.png :class: sphx-glr-multi-img * .. 
image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_027.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_027.png :class: sphx-glr-multi-img * .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_028.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_028.png :class: sphx-glr-multi-img * .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_029.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_029.png :class: sphx-glr-multi-img * .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_030.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_030.png :class: sphx-glr-multi-img * .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_031.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_031.png :class: sphx-glr-multi-img * .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_032.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_032.png :class: sphx-glr-multi-img * .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_033.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_033.png :class: sphx-glr-multi-img * .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_034.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_034.png :class: sphx-glr-multi-img * .. 
image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_035.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_035.png :class: sphx-glr-multi-img * .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_036.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_036.png :class: sphx-glr-multi-img * .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_037.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_037.png :class: sphx-glr-multi-img * .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_038.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_038.png :class: sphx-glr-multi-img * .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_039.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_039.png :class: sphx-glr-multi-img * .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_040.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_040.png :class: sphx-glr-multi-img * .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_041.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_041.png :class: sphx-glr-multi-img * .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_042.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_042.png :class: sphx-glr-multi-img .. 
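As the fit log above shows, nilearn's ``DictLearning`` delegates to scikit-learn's online dictionary learning. The sparsity of the resulting loadings, which is what makes these maps cleaner than ICA's, can be illustrated with a toy sketch on random data; this is not the fMRI pipeline, and shapes and parameters here are purely illustrative:

```python
import numpy as np
from sklearn.decomposition import MiniBatchDictionaryLearning

rng = np.random.RandomState(0)
X = rng.normal(size=(100, 30))  # toy "samples x features" matrix

dico = MiniBatchDictionaryLearning(
    n_components=5, alpha=1.0, batch_size=10, random_state=0
)
codes = dico.fit_transform(X)  # sparse loadings: many entries are exactly zero

sparsity = float(np.mean(codes == 0))
print(codes.shape, round(sparsity, 2))
```

In the fMRI setting, the sparsity penalty acts on the spatial maps, so each map concentrates on a few compact regions instead of spreading over the whole brain.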
GENERATED FROM PYTHON SOURCE LINES 171-175

Estimate explained variance per component and plot using matplotlib.

The fitted object `dict_learning` can be used to calculate the score per
component.

.. GENERATED FROM PYTHON SOURCE LINES 175-193

.. code-block:: Python

    scores = dict_learning.score(func_filenames, per_component=True)

    # Plot the scores
    import numpy as np
    from matplotlib import pyplot as plt
    from matplotlib.ticker import FormatStrFormatter

    plt.figure(figsize=(4, 4), constrained_layout=True)

    positions = np.arange(len(scores))
    plt.barh(positions, scores)
    plt.ylabel("Component #", size=12)
    plt.xlabel("Explained variance", size=12)
    plt.yticks(np.arange(20))
    plt.gca().xaxis.set_major_formatter(FormatStrFormatter("%.3f"))
    show()

.. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_043.png
   :alt: plot compare decomposition
   :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_043.png
   :class: sphx-glr-single-img

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    ________________________________________________________________________________
    [Memory] Calling nilearn.decomposition._base._explained_variance...
    _explained_variance(array([[-2.806378e-01, ..., 8.257976e-01],
           ...,
           [-2.074250e-15, ..., -4.247761e-16]], shape=(5040, 21781)),
    array([[0.      , ..., 0.002218],
           ...,
           [0.      , ..., 0.      ]], shape=(20, 21781)), per_component=True)
    ______________________________________________explained_variance - 17.3s, 0.3min
    /home/runner/work/nilearn/nilearn/examples/03_connectivity/plot_compare_decomposition.py:191: UserWarning: You are using the 'agg' matplotlib backend that is non-interactive. No figure will be plotted when calling matplotlib.pyplot.show() or nilearn.plotting.show(). You can fix this by installing a different backend: for example via pip install PyQt6

.. GENERATED FROM PYTHON SOURCE LINES 194-201

..
note::

    To see how to extract subject-level time series from regions created
    using :term:`Dictionary learning`, see
    :ref:`example Regions extraction using dictionary learning and functional connectomes `.

.. GENERATED FROM PYTHON SOURCE LINES 203-207

References
----------

.. footbibliography::


.. rst-class:: sphx-glr-timing

**Total running time of the script:** (2 minutes 45.054 seconds)

**Estimated memory usage:** 2658 MB


.. _sphx_glr_download_auto_examples_03_connectivity_plot_compare_decomposition.py:

.. only:: html

  .. container:: sphx-glr-footer sphx-glr-footer-example

    .. container:: binder-badge

      .. image:: images/binder_badge_logo.svg
        :target: https://mybinder.org/v2/gh/nilearn/nilearn/0.13.1?urlpath=lab/tree/notebooks/auto_examples/03_connectivity/plot_compare_decomposition.ipynb
        :alt: Launch binder
        :width: 150 px

    .. container:: sphx-glr-download sphx-glr-download-jupyter

      :download:`Download Jupyter notebook: plot_compare_decomposition.ipynb `

    .. container:: sphx-glr-download sphx-glr-download-python

      :download:`Download Python source code: plot_compare_decomposition.py `

    .. container:: sphx-glr-download sphx-glr-download-zip

      :download:`Download zipped: plot_compare_decomposition.zip `

.. only:: html

 .. rst-class:: sphx-glr-signature

    `Gallery generated by Sphinx-Gallery `_