.. DO NOT EDIT. .. THIS FILE WAS AUTOMATICALLY GENERATED BY SPHINX-GALLERY. .. TO MAKE CHANGES, EDIT THE SOURCE PYTHON FILE: .. "auto_examples/03_connectivity/plot_compare_decomposition.py" .. LINE NUMBERS ARE GIVEN BELOW. .. only:: html .. note:: :class: sphx-glr-download-link-note :ref:`Go to the end ` to download the full example code or to run this example in your browser via Binder .. rst-class:: sphx-glr-example-title .. _sphx_glr_auto_examples_03_connectivity_plot_compare_decomposition.py: Deriving spatial maps from group fMRI data using ICA and Dictionary Learning ============================================================================ Various approaches exist to derive spatial maps or networks from group fMRI data. These methods extract distributed brain regions that exhibit similar :term:`BOLD` fluctuations over time. Decomposition methods allow for the generation of many independent maps simultaneously, without the need to provide a priori information (e.g. seeds or priors). This example will apply two popular decomposition methods, :term:`ICA` and :term:`Dictionary learning`, to :term:`fMRI` data measured while children and young adults watch movies. The resulting maps will be visualized using atlas plotting tools. :term:`CanICA` is an :term:`ICA` method for group-level analysis of :term:`fMRI` data. Compared to other strategies, it brings a well-controlled group model, as well as a thresholding algorithm controlling for specificity and sensitivity with an explicit model of the signal. The reference paper is :footcite:t:`Varoquaux2010c`. .. GENERATED FROM PYTHON SOURCE LINES 27-29 Load brain development :term:`fMRI` dataset ------------------------------------------- .. GENERATED FROM PYTHON SOURCE LINES 29-38 .. 
code-block:: Python from nilearn import datasets rest_dataset = datasets.fetch_development_fmri(n_subjects=30) func_filenames = rest_dataset.func # list of 4D nifti files for each subject # print basic information on the dataset print(f"First functional nifti image (4D) is at: {rest_dataset.func[0]}") .. rst-class:: sphx-glr-script-out .. code-block:: none First functional nifti image (4D) is at: /home/himanshu/nilearn_data/development_fmri/development_fmri/sub-pixar123_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz .. GENERATED FROM PYTHON SOURCE LINES 39-44 Apply :term:`CanICA` on the data -------------------------------- We use "whole-brain-template" as a strategy to compute the mask, as this leads to slightly faster and more reproducible results. However, the images need to be in :term:`MNI` template space. .. GENERATED FROM PYTHON SOURCE LINES 44-72 .. code-block:: Python from nilearn.decomposition import CanICA canica = CanICA( n_components=20, memory="nilearn_cache", memory_level=2, verbose=10, mask_strategy="whole-brain-template", random_state=0, standardize="zscore_sample", n_jobs=2, ) canica.fit(func_filenames) # Retrieve the independent components in brain space. Directly # accessible through attribute `components_img_`. canica_components_img = canica.components_img_ # components_img is a Nifti Image object, and can be saved to a file with # the following lines: from pathlib import Path output_dir = Path.cwd() / "results" / "plot_compare_decomposition" output_dir.mkdir(exist_ok=True, parents=True) print(f"Output will be saved to: {output_dir}") canica_components_img.to_filename(output_dir / "canica_resting_state.nii.gz") .. rst-class:: sphx-glr-script-out .. 
code-block:: none [MultiNiftiMasker.fit] Loading data from [/home/himanshu/nilearn_data/development_fmri/development_fmri/sub-pixar123_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz, /home/himanshu/nilearn_data/development_fmri/development_fmri/sub-pixar124_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz, /home/himanshu/nilearn_data/development_fmri/development_fmri/sub-pixar125_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz, /home/himanshu/nilearn_data/development_fmri/development_fmri/sub-pixar126_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz, /home/himanshu/nilearn_data/development_fmri/development_fmri/sub-pixar127_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz, /home/himanshu/nilearn_data/development_fmri/development_fmri/sub-pixar128_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz, /home/himanshu/nilearn_data/development_fmri/development_fmri/sub-pixar001_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz, /home/himanshu/nilearn_data/development_fmri/development_fmri/sub-pixar002_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz, /home/himanshu/nilearn_data/development_fmri/development_fmri/sub-pixar003_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz, /home/himanshu/nilearn_data/development_fmri/development_fmri/sub-pixar004_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz, /home/himanshu/nilearn_data/development_fmri/development_fmri/sub-pixar005_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz, /home/himanshu/nilearn_data/development_fmri/development_fmri/sub-pixar006_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz, /home/himanshu/nilearn_data/development_fmri/development_fmri/sub-pixar007_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz, /home/himanshu/nilearn_data/development_fmri/development_fmri/sub-pixar008_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz, 
/home/himanshu/nilearn_data/development_fmri/development_fmri/sub-pixar009_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz, /home/himanshu/nilearn_data/development_fmri/development_fmri/sub-pixar010_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz, /home/himanshu/nilearn_data/development_fmri/development_fmri/sub-pixar011_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz, /home/himanshu/nilearn_data/development_fmri/development_fmri/sub-pixar012_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz, /home/himanshu/nilearn_data/development_fmri/development_fmri/sub-pixar013_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz, /home/himanshu/nilearn_data/development_fmri/development_fmri/sub-pixar014_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz, /home/himanshu/nilearn_data/development_fmri/development_fmri/sub-pixar015_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz, /home/himanshu/nilearn_data/development_fmri/development_fmri/sub-pixar016_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz, /home/himanshu/nilearn_data/development_fmri/development_fmri/sub-pixar017_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz, /home/himanshu/nilearn_data/development_fmri/development_fmri/sub-pixar018_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz, /home/himanshu/nilearn_data/development_fmri/development_fmri/sub-pixar019_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz, /home/himanshu/nilearn_data/development_fmri/development_fmri/sub-pixar020_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz, /home/himanshu/nilearn_data/development_fmri/development_fmri/sub-pixar021_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz, /home/himanshu/nilearn_data/development_fmri/development_fmri/sub-pixar022_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz, 
/home/himanshu/nilearn_data/development_fmri/development_fmri/sub-pixar023_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz, /home/himanshu/nilearn_data/development_fmri/development_fmri/sub-pixar024_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz]. [{self.__class__.__name__}.fit] Computing mask /home/himanshu/.local/miniconda3/envs/nilearnpy/lib/python3.12/site-packages/joblib/memory.py:693: UserWarning: Cannot inspect object functools.partial(, mask_type='whole-brain'), ignore list will not work. return hashing.hash(filter_args(self.func, self.ignore, args, kwargs), Template whole-brain mask computation /home/himanshu/.local/miniconda3/envs/nilearnpy/lib/python3.12/site-packages/joblib/memory.py:887: UserWarning: Cannot inspect object functools.partial(, mask_type='whole-brain'), ignore list will not work. argument_dict = filter_args(self.func, self.ignore, /home/himanshu/.local/miniconda3/envs/nilearnpy/lib/python3.12/site-packages/joblib/memory.py:693: UserWarning: Cannot inspect object functools.partial(, mask_type='whole-brain'), ignore list will not work. return hashing.hash(filter_args(self.func, self.ignore, args, kwargs), [MultiNiftiMasker.transform] Resampling mask [CanICA] Loading data ________________________________________________________________________________ [Memory] Calling sklearn.utils.extmath.randomized_svd... randomized_svd(array([[0.003659, ..., 0.013254], ..., [0.012477, ..., 0.002881]]), n_components=20, transpose=True, random_state=0, n_iter=3) ___________________________________________________randomized_svd - 1.4s, 0.0min [Parallel(n_jobs=2)]: Using backend LokyBackend with 2 concurrent workers. [Parallel(n_jobs=2)]: Done 1 tasks | elapsed: 16.2s [Parallel(n_jobs=2)]: Done 4 tasks | elapsed: 29.0s [Parallel(n_jobs=2)]: Done 10 out of 10 | elapsed: 1.0min finished Output will be saved to: /home/himanshu/Desktop/nilearn_work/nilearn/examples/03_connectivity/results/plot_compare_decomposition .. 
GENERATED FROM PYTHON SOURCE LINES 73-74 To visualize, we plot the outlines of all components on one figure .. GENERATED FROM PYTHON SOURCE LINES 74-80 .. code-block:: Python from nilearn.plotting import plot_prob_atlas # Plot all ICA components together plot_prob_atlas(canica_components_img, title="All ICA components") .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_001.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_001.png :class: sphx-glr-single-img .. rst-class:: sphx-glr-script-out .. code-block:: none /home/himanshu/.local/miniconda3/envs/nilearnpy/lib/python3.12/site-packages/numpy/ma/core.py:2820: UserWarning: Warning: converting a masked element to nan. _data = np.array(data, dtype=dtype, copy=copy, /home/himanshu/Desktop/nilearn_work/nilearn/nilearn/plotting/displays/_axes.py:74: UserWarning: linewidths is ignored by contourf im = getattr(ax, type)( .. GENERATED FROM PYTHON SOURCE LINES 81-82 Finally, we plot the map for each :term:`ICA` component separately .. GENERATED FROM PYTHON SOURCE LINES 82-95 .. code-block:: Python from nilearn.image import iter_img from nilearn.plotting import plot_stat_map, show for i, cur_img in enumerate(iter_img(canica_components_img)): plot_stat_map( cur_img, display_mode="z", title=f"IC {int(i)}", cut_coords=1, colorbar=False, ) .. rst-class:: sphx-glr-horizontal * .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_002.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_002.png :class: sphx-glr-multi-img * .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_003.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_003.png :class: sphx-glr-multi-img * .. 
image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_004.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_004.png :class: sphx-glr-multi-img * .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_005.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_005.png :class: sphx-glr-multi-img * .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_006.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_006.png :class: sphx-glr-multi-img * .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_007.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_007.png :class: sphx-glr-multi-img * .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_008.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_008.png :class: sphx-glr-multi-img * .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_009.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_009.png :class: sphx-glr-multi-img * .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_010.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_010.png :class: sphx-glr-multi-img * .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_011.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_011.png :class: sphx-glr-multi-img * .. 
image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_012.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_012.png :class: sphx-glr-multi-img * .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_013.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_013.png :class: sphx-glr-multi-img * .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_014.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_014.png :class: sphx-glr-multi-img * .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_015.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_015.png :class: sphx-glr-multi-img * .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_016.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_016.png :class: sphx-glr-multi-img * .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_017.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_017.png :class: sphx-glr-multi-img * .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_018.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_018.png :class: sphx-glr-multi-img * .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_019.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_019.png :class: sphx-glr-multi-img * .. 
image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_020.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_020.png :class: sphx-glr-multi-img * .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_021.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_021.png :class: sphx-glr-multi-img .. GENERATED FROM PYTHON SOURCE LINES 96-105 Compare :term:`CanICA` to dictionary learning --------------------------------------------- :term:`Dictionary learning` is a sparsity-based decomposition method for extracting spatial maps. It extracts maps that are naturally sparse and usually cleaner than those from :term:`ICA`. Here, we will compare networks built with :term:`CanICA` to networks built with :term:`Dictionary learning`. For more details, see :footcite:t:`Mensch2016`. .. GENERATED FROM PYTHON SOURCE LINES 108-109 Create a dictionary learning estimator .. GENERATED FROM PYTHON SOURCE LINES 109-135 .. code-block:: Python from nilearn.decomposition import DictLearning dict_learning = DictLearning( n_components=20, memory="nilearn_cache", memory_level=2, verbose=1, random_state=0, n_epochs=1, mask_strategy="whole-brain-template", standardize="zscore_sample", n_jobs=2, ) print("[Example] Fitting dictionary learning model") dict_learning.fit(func_filenames) print("[Example] Saving results") # Retrieve the extracted components, unmasked back into a Nifti image. # Note: for nilearn versions older than 0.4.1, `components_img_` # is not implemented. dictlearning_components_img = dict_learning.components_img_ dictlearning_components_img.to_filename( output_dir / "dictionary_learning_resting_state.nii.gz" ) .. rst-class:: sphx-glr-script-out .. 
code-block:: none [Example] Fitting dictionary learning model [MultiNiftiMasker.fit] Loading data from [/home/himanshu/nilearn_data/development_fmri/development_fmri/sub-pixar123_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz, /home/himanshu/nilearn_data/development_fmri/development_fmri/sub-pixar124_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz, /home/himanshu/nilearn_data/development_fmri/development_fmri/sub-pixar125_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz, /home/himanshu/nilearn_data/development_fmri/development_fmri/sub-pixar126_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz, /home/himanshu/nilearn_data/development_fmri/development_fmri/sub-pixar127_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz, /home/himanshu/nilearn_data/development_fmri/development_fmri/sub-pixar128_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz, /home/himanshu/nilearn_data/development_fmri/development_fmri/sub-pixar001_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz, /home/himanshu/nilearn_data/development_fmri/development_fmri/sub-pixar002_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz, /home/himanshu/nilearn_data/development_fmri/development_fmri/sub-pixar003_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz, /home/himanshu/nilearn_data/development_fmri/development_fmri/sub-pixar004_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz, /home/himanshu/nilearn_data/development_fmri/development_fmri/sub-pixar005_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz, /home/himanshu/nilearn_data/development_fmri/development_fmri/sub-pixar006_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz, /home/himanshu/nilearn_data/development_fmri/development_fmri/sub-pixar007_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz, 
/home/himanshu/nilearn_data/development_fmri/development_fmri/sub-pixar008_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz, /home/himanshu/nilearn_data/development_fmri/development_fmri/sub-pixar009_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz, /home/himanshu/nilearn_data/development_fmri/development_fmri/sub-pixar010_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz, /home/himanshu/nilearn_data/development_fmri/development_fmri/sub-pixar011_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz, /home/himanshu/nilearn_data/development_fmri/development_fmri/sub-pixar012_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz, /home/himanshu/nilearn_data/development_fmri/development_fmri/sub-pixar013_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz, /home/himanshu/nilearn_data/development_fmri/development_fmri/sub-pixar014_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz, /home/himanshu/nilearn_data/development_fmri/development_fmri/sub-pixar015_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz, /home/himanshu/nilearn_data/development_fmri/development_fmri/sub-pixar016_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz, /home/himanshu/nilearn_data/development_fmri/development_fmri/sub-pixar017_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz, /home/himanshu/nilearn_data/development_fmri/development_fmri/sub-pixar018_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz, /home/himanshu/nilearn_data/development_fmri/development_fmri/sub-pixar019_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz, /home/himanshu/nilearn_data/development_fmri/development_fmri/sub-pixar020_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz, /home/himanshu/nilearn_data/development_fmri/development_fmri/sub-pixar021_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz, 
/home/himanshu/nilearn_data/development_fmri/development_fmri/sub-pixar022_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz, /home/himanshu/nilearn_data/development_fmri/development_fmri/sub-pixar023_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz, /home/himanshu/nilearn_data/development_fmri/development_fmri/sub-pixar024_task-pixar_space-MNI152NLin2009cAsym_desc-preproc_bold.nii.gz]. [{self.__class__.__name__}.fit] Computing mask /home/himanshu/.local/miniconda3/envs/nilearnpy/lib/python3.12/site-packages/joblib/memory.py:693: UserWarning: Cannot inspect object functools.partial(, mask_type='whole-brain'), ignore list will not work. return hashing.hash(filter_args(self.func, self.ignore, args, kwargs), /home/himanshu/.local/miniconda3/envs/nilearnpy/lib/python3.12/site-packages/joblib/memory.py:887: UserWarning: Cannot inspect object functools.partial(, mask_type='whole-brain'), ignore list will not work. argument_dict = filter_args(self.func, self.ignore, /home/himanshu/.local/miniconda3/envs/nilearnpy/lib/python3.12/site-packages/joblib/memory.py:693: UserWarning: Cannot inspect object functools.partial(, mask_type='whole-brain'), ignore list will not work. return hashing.hash(filter_args(self.func, self.ignore, args, kwargs), [MultiNiftiMasker.transform] Resampling mask [DictLearning] Loading data [DictLearning] Learning initial components ________________________________________________________________________________ [Memory] Calling sklearn.utils.extmath.randomized_svd... randomized_svd(array([[-0.001315, ..., 0.004387], ..., [ 0.011243, ..., 0.004194]]), n_components=20, transpose=True, random_state=0, n_iter=3) ___________________________________________________randomized_svd - 0.7s, 0.0min [Parallel(n_jobs=2)]: Using backend LokyBackend with 2 concurrent workers. 
[DictLearning] Computing initial loadings ________________________________________________________________________________ [Memory] Calling nilearn.decomposition.dict_learning._compute_loadings... _compute_loadings(array([[0.002488, ..., 0.003911], ..., [0.0079 , ..., 0.004339]]), array([[-0.622651, ..., 5.322742], ..., [ 0.777205, ..., 0.743122]])) _________________________________________________compute_loadings - 0.3s, 0.0min [DictLearning] Learning dictionary ________________________________________________________________________________ [Memory] Calling sklearn.decomposition._dict_learning.dict_learning_online... dict_learning_online(array([[-0.622651, ..., 0.777205], ..., [ 5.322742, ..., 0.743122]]), 20, alpha=10, batch_size=20, method='cd', dict_init=array([[-0.1227 , ..., -0.136644], ..., [-0.074334, ..., 0.035289]]), verbose=0, random_state=0, return_code=True, shuffle=True, n_jobs=1, max_iter=1090) _____________________________________________dict_learning_online - 4.4s, 0.1min [Example] Saving results .. GENERATED FROM PYTHON SOURCE LINES 136-139 Visualize the results First plot all DictLearning components together .. GENERATED FROM PYTHON SOURCE LINES 139-144 .. code-block:: Python plot_prob_atlas( dictlearning_components_img, title="All DictLearning components" ) .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_022.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_022.png :class: sphx-glr-single-img .. rst-class:: sphx-glr-script-out .. code-block:: none /home/himanshu/Desktop/nilearn_work/nilearn/nilearn/plotting/displays/_axes.py:74: UserWarning: linewidths is ignored by contourf im = getattr(ax, type)( /home/himanshu/.local/miniconda3/envs/nilearnpy/lib/python3.12/site-packages/numpy/ma/core.py:2820: UserWarning: Warning: converting a masked element to nan. _data = np.array(data, dtype=dtype, copy=copy, .. 
GENERATED FROM PYTHON SOURCE LINES 145-146 One plot of each component .. GENERATED FROM PYTHON SOURCE LINES 146-156 .. code-block:: Python for i, cur_img in enumerate(iter_img(dictlearning_components_img)): plot_stat_map( cur_img, display_mode="z", title=f"Comp {int(i)}", cut_coords=1, colorbar=False, ) .. rst-class:: sphx-glr-horizontal * .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_023.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_023.png :class: sphx-glr-multi-img * .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_024.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_024.png :class: sphx-glr-multi-img * .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_025.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_025.png :class: sphx-glr-multi-img * .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_026.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_026.png :class: sphx-glr-multi-img * .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_027.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_027.png :class: sphx-glr-multi-img * .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_028.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_028.png :class: sphx-glr-multi-img * .. 
image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_029.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_029.png :class: sphx-glr-multi-img * .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_030.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_030.png :class: sphx-glr-multi-img * .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_031.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_031.png :class: sphx-glr-multi-img * .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_032.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_032.png :class: sphx-glr-multi-img * .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_033.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_033.png :class: sphx-glr-multi-img * .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_034.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_034.png :class: sphx-glr-multi-img * .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_035.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_035.png :class: sphx-glr-multi-img * .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_036.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_036.png :class: sphx-glr-multi-img * .. 
image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_037.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_037.png :class: sphx-glr-multi-img * .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_038.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_038.png :class: sphx-glr-multi-img * .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_039.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_039.png :class: sphx-glr-multi-img * .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_040.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_040.png :class: sphx-glr-multi-img * .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_041.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_041.png :class: sphx-glr-multi-img * .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_042.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_042.png :class: sphx-glr-multi-img .. GENERATED FROM PYTHON SOURCE LINES 157-161 Estimate explained variance per component and plot using matplotlib The fitted object `dict_learning` can be used to calculate the score per component .. GENERATED FROM PYTHON SOURCE LINES 161-179 .. 
code-block:: Python scores = dict_learning.score(func_filenames, per_component=True) # Plot the scores import numpy as np from matplotlib import pyplot as plt from matplotlib.ticker import FormatStrFormatter plt.figure(figsize=(4, 4)) positions = np.arange(len(scores)) plt.barh(positions, scores) plt.ylabel("Component #", size=12) plt.xlabel("Explained variance", size=12) plt.yticks(np.arange(20)) plt.gca().xaxis.set_major_formatter(FormatStrFormatter("%.3f")) plt.tight_layout() show() .. image-sg:: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_043.png :alt: plot compare decomposition :srcset: /auto_examples/03_connectivity/images/sphx_glr_plot_compare_decomposition_043.png :class: sphx-glr-single-img .. rst-class:: sphx-glr-script-out .. code-block:: none ________________________________________________________________________________ [Memory] Calling nilearn.decomposition._base._explained_variance... _explained_variance(array([[-6.227098e-01, ..., 5.322679e+00], ..., [-1.768479e-16, ..., 2.222713e-16]]), array([[0., ..., 0.], ..., [0., ..., 0.]]), per_component=True) /home/himanshu/Desktop/nilearn_work/nilearn/nilearn/decomposition/_base.py:539: UserWarning: Persisting input arguments took 1.32s to run.If this happens often in your code, it can cause performance problems (results will be correct in all cases). The reason for this is probably some large input arguments for a wrapped function. return self._cache(_explained_variance)( ______________________________________________explained_variance - 97.3s, 1.6min .. GENERATED FROM PYTHON SOURCE LINES 180-186 .. note:: To see how to extract subject-level time series from regions created using :term:`Dictionary learning`, see :ref:`example Regions extraction using dictionary learning and functional connectomes `. .. GENERATED FROM PYTHON SOURCE LINES 188-192 References ---------- .. footbibliography:: .. 
rst-class:: sphx-glr-timing **Total running time of the script:** (7 minutes 30.450 seconds) **Estimated memory usage:** 2967 MB .. _sphx_glr_download_auto_examples_03_connectivity_plot_compare_decomposition.py: .. only:: html .. container:: sphx-glr-footer sphx-glr-footer-example .. container:: binder-badge .. image:: images/binder_badge_logo.svg :target: https://mybinder.org/v2/gh/nilearn/nilearn/0.10.4?urlpath=lab/tree/notebooks/auto_examples/03_connectivity/plot_compare_decomposition.ipynb :alt: Launch binder :width: 150 px .. container:: sphx-glr-download sphx-glr-download-jupyter :download:`Download Jupyter notebook: plot_compare_decomposition.ipynb ` .. container:: sphx-glr-download sphx-glr-download-python :download:`Download Python source code: plot_compare_decomposition.py ` .. only:: html .. rst-class:: sphx-glr-signature `Gallery generated by Sphinx-Gallery `_