The Resolution Matrix for Visualizing Functional Network Connectivity
Keith Dillon
September 8, 2020
Abstract
The resolution matrix is a mathematical tool for analyzing inverse problems such as computational imaging systems. When network connectivity estimation is treated as an inverse problem, the resolution matrix describes the degree to which network nodes and edges can be resolved. This is useful both for quantifying the robustness of the network estimate and for identifying correlated activity. In this report we analyze the resolution matrix for functional MRI data from the Human Connectome Project. We find that common metrics of the resolution matrix can be used to identify networked activity, though with a new twist on the relationship between the default mode network and the frontoparietal attention network.
Background
Functional connectivity in the brain, i.e. the set of functional brain networks, is increasingly believed to be important to both healthy brain function and mental illness [1, 2]. However, the ability to monitor brain activity in vivo is severely limited in humans, requiring indirect methods such as functional magnetic resonance imaging (fMRI). Due to the relatively poor resolution and signal-to-noise ratio of these functional imaging methods, the determination of network nodes and edges is a difficult and unsolved problem [3].

Resolution [4] is an imaging concept that might shed some light on this problem. The resolution of an imaging system is the size of the smallest region whose pixel value can be determined independently of its neighbors'. In an inverse problem, this may be generalized to the size of the most compact region for which a unique solution may be determined [5]. The resolution matrix [6] describes the smallest resolvable regions, or resolution cells, that can be estimated at each point. For an ideal imaging system, the resolution cells would be individual points or small tiles evenly spaced across the image. A more general case is depicted in Fig. 1, where the resolution varies across the (one-dimensional) image and the resolution cells grow correspondingly larger.
Figure 1: (Left) Data matrix for a one-dimensional simulation, where each column represents the signal received from each location; the columns become increasingly blurred together from left to right. (Right) Corresponding resolution matrix, where each column represents the best-resolved pixel that can be formed at the corresponding location.

The resolution matrix does not describe the image content directly (i.e., the reflectivity or brightness parameters at points in space), but rather the size of the region over which parameter estimates are too blurred together to separate. Fig. 2 gives the diagonal of the resolution matrix for this simulation. The value on the diagonal is a common metric used in resolution matrix analysis because it can be efficiently computed [7] and relates roughly to the spread of the resolution cell (shown via its inverse in Fig. 2).
Figure 2: (Left) Plot of the value of the resolution matrix on the diagonal versus its inverse. (Right) Plots of a single resolution cell (a resolution matrix column) compared to the original blurring kernel used at this point, the corresponding column of the correlation matrix for this data matrix, and the corresponding precision matrix column (not visible because it is identical to the resolution matrix column in this case).

Network connectivity estimation may be viewed as an inverse problem in terms of neighborhood selection, where a regression problem is solved for each node to find its connectivity to its neighbors [8]. In [9] we demonstrated the application of resolution to regression problems with functional imaging data, finding resolution cells suggestive of brain modularity or parcellation. In [3] we provided a technique for individual brain parcellation by applying clustering to the resolution matrix of the neighborhood selection problem. In [10] we found a close relationship between the resolution matrix and the partial correlation estimates defining Gaussian graphical models. This is depicted in Fig. 2, which shows that resolvability differs from simple univariate correlation between points. Correlation describes the similarity between signals collected from different points, which would conceptually be similar to a resolution cell describing the blurring of points together. However, the act of computationally reconstructing the image unmixes this blurring to the degree possible; hence resolution (and partial correlation) provide a kind of "sharpened" estimate compared to univariate correlation. While the goal of resolution estimation in imaging is to identify which regions are unresolvable due to physical limitations of the imaging system, partial correlation is used to suggest network relations such as causality. In application to networks, therefore, resolution provides a combination of both kinds of information.
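The contrast between correlation and resolution can be sketched numerically. The following is a minimal NumPy sketch in the spirit of the Fig. 1 simulation, not the paper's actual code: it builds a toy one-dimensional data matrix with position-dependent Gaussian blur, forms the resolution matrix R = A†A with a truncated pseudoinverse, and checks its projector structure. The sizes, blur widths, and truncation cutoff are illustrative assumptions.

```python
import numpy as np

# Toy stand-in for the Fig. 1 simulation (sizes and blur widths are
# illustrative assumptions, not the paper's actual parameters).
n = 200                                    # number of 1-D locations
x = np.arange(n, dtype=float)
widths = 1.0 + 9.0 * x / (n - 1)           # blur grows from left to right
A = np.exp(-0.5 * ((x[:, None] - x[None, :]) / widths[None, :]) ** 2)

# Univariate similarity: correlation between signals from pairs of locations.
C = np.corrcoef(A, rowvar=False)

# Resolution matrix: reconstruction "unmixes" the blur where possible.
# pinv truncates small singular values, so R is a regularized projector
# onto the row space of A rather than the identity.
R = np.linalg.pinv(A, rcond=1e-6) @ A

# R inherits the projector structure of A†A: symmetric and idempotent,
# with diagonal entries between 0 and 1 (the resolution metric above).
assert np.allclose(R, R.T, atol=1e-8)
assert np.allclose(R @ R, R, atol=1e-6)
```

In this toy, a column of C on the heavily blurred right side is much broader than the corresponding column of R, mirroring the "sharpened" behavior of resolution relative to univariate correlation described above.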
In this paper we consider the use of the resolution matrix as a visualization tool to describe these facets of a brain network. By extracting metrics of resolution for each point, we see both how sharply a node may be defined and which distant regions are functionally similar.
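Both per-point quantities, how sharply a node is defined (the diagonal metric) and which distant regions it blurs into (a resolution cell, i.e. a matrix column), can be computed without forming the full resolution matrix. The following NumPy sketch applies the truncated-SVD identities derived in Appendix A to a small random stand-in for the data matrix; the sizes, random data, 50% cutoff, and example point index are illustrative assumptions.

```python
import numpy as np

# Toy stand-in for the 1200 x 64984 cortical data matrix (sizes and the
# random data are illustrative assumptions).
rng = np.random.default_rng(0)
m, n = 120, 500
A = rng.standard_normal((m, n))

# Truncated SVD: keep singular values >= 50% of the largest (Appendix A).
U, s, Vt = np.linalg.svd(A, full_matrices=False)
r = int(np.sum(s >= 0.5 * s[0]))
Vr = Vt[:r]                               # r x n leading right singular vectors

# Diagonal metric for every point at once: (R_r)_kk = sum_i (v_i)_k^2.
diag_metric = np.sum(Vr ** 2, axis=0)

# A single resolution cell, column k of R_r: sum_i v_i (v_i)_k.
k = 42                                    # arbitrary example point
cell_k = Vr.T @ Vr[:, k]

# Sanity check against the explicit R_r = A_r^dagger A at this toy size.
Ar_pinv = Vr.T @ np.diag(1.0 / s[:r]) @ U[:, :r].T
R_r = Ar_pinv @ A
assert np.allclose(np.diag(R_r), diag_metric)
assert np.allclose(R_r[:, k], cell_k)
```

Only the r leading right singular vectors are needed, so memory scales as r × n rather than n × n, which is what makes the diagonal metric feasible at the 64984-point scale used below.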
Results
We used the ‘10 unrelated subjects’ dataset from the Human Connectome Project (HCP), which contains multiple fMRI scans of each subject. We used resting-state scans which have been preprocessed [11] to remove artifacts, extract cortical and subcortical surfaces, and align to a standard coordinate system. We used the Connectome Workbench [12] to perform spatial smoothing with a 5 mm kernel. Finally, we extracted the signals for the cortical surfaces into a data matrix A of size 1200 × 64984. For each subject we computed the resolution matrix R_r = A_r^\dagger A, where A_r^\dagger is the regularized pseudoinverse using a cutoff of 50 percent of singular values (see Appendix A). This would produce a 64984 × 64984 matrix, whose k-th column (i.e., of size 64984 × 1) describes the resolvability of the k-th point on the surface versus all other points on the surface. This is too large for conventional memory; however, we can directly compute the diagonal of the resolution matrix, giving a metric for the resolution at each point as described in the previous section, by computing the sum of squared singular vector elements (see Appendix A). In Fig. 3 we give this resolution metric averaged over all ten subjects for both lateral and medial views of both hemispheres.

Figure 3: Average resolution metric over 10 subjects, lateral and medial views of both cortical hemispheres. Darker areas mean a higher metric (more compact resolution cell), and lighter areas mean a lower metric (more spread-out resolution cell, and hence worse resolution).

We find that the resolution is high near the central sulcus and Sylvian fissure, regions of primary cortex. On the medial cortex, resolution is also high on the cingulate gyrus and especially near the entorhinal cortex, where memory and emotion systems are located.

A high resolution metric in the above regions means the signal there is largely independent of signals at other points in the brain. This is demonstrated in Figs. 4, 5, and 6, which show the full resolution cells for three different points on the cortex (Fig. 3 was an image of the peak of such resolution cells for every point). The compact dark region around the selected point implies that nearby points are the only ones which are strongly dependent, due either to modularity of activity or to resolution limits (including spatial smoothing).

Figure 4: Resolution cell for a point in the right auditory cortex. A compact region around the selected point dominates the resolution cell.

Figure 5: Resolution cell for a point in the right inferior temporal gyrus. A larger region around the selected point dominates the resolution cell.

Figure 6: Resolution cell for a point in the right cingulate cortex.
A compact region around the selected point dominates the resolution cell.

As the above examples demonstrate, in the case of high-resolution areas the resolution cells are relatively simple, dominated by a large compact region. The resolution metric itself is potentially sufficient to identify and compare such areas. Such compact resolution suggests that the activity at these points is highly specialized or modular, with less reliance on brain-wide networks. On the other hand, regions of low resolution are perhaps more interesting, as they are suggestive of highly networked processing which cannot be disentangled. The average inverse resolution metric over subjects is given in Fig. 7 to identify regions with such points. The inverse of the resolution metric appears noticeably sparser, dominated largely by the lateral occipital and inferior parietal regions of association cortex.

Figure 7: Inverse of the resolution metric averaged over 10 subjects, lateral and medial views of both cortical hemispheres.

Figs. 8 and 9 show interesting resolution cells resulting from two points of very high inverse metric (i.e., very low resolution metric). Figs. 10 and 11 show resolution cells for points with more moderate inverse metrics, exhibiting large regions of sensorimotor cortex and visual association cortex, respectively.
Figure 8: Resolution cell for a point in the right inferior parietal lobe, involving an asymmetric pattern containing bilateral parietal regions, bilateral regions in the precuneus, and the right superior frontal and middle temporal gyri; exhibiting strong similarity to both the frontoparietal attention network and the default mode network.

Figure 9: Resolution cell for a point in the right superior parietal lobe, involving a symmetric pattern of isolated lateral regions in the parietal, frontal, and temporal lobes.

Figure 10: Resolution cell for a point in the post-central gyrus, involving the bilateral pre-central and post-central gyri, and a region of the occipital lobe.

Figure 11: Resolution cell for a point in the right cuneus, involving bilateral association regions of the medial occipital lobes.

Interestingly, the frontoparietal network [13] is often separated into a pair of unilateral networks [14], while the default mode network [15] is considered symmetric [16]. Here, we find that the resolution cell of Fig. 8, most similar to the default mode network, contains symmetric medial regions but asymmetric lateral regions. Meanwhile, the resolution cell of Fig. 9 contains symmetric lateral regions but essentially no medial regions.

Discussion

The resolution metric (and its inverse) can be used to identify points whose activity is essentially independent of other points beyond a compact region, as well as points which are involved in long-range networks of activity. Probing these networked points more deeply, we found patterns reminiscent of well-known networks such as the default mode and frontoparietal networks. These two networks in particular were originally defined as being positively correlated within themselves while negatively correlated with each other [15]. Further, independent component analysis generally separates the frontoparietal network into a unilateral pair [14].
However, from the perspective of resolution cells we find a different mix of regions and asymmetry: a symmetric network of lateral regions, and an asymmetric network with symmetric medial regions and asymmetric lateral regions.

Appendix A: Resolution matrix derivations
In the neighborhood selection problem [8], we solve the linear regression

    A x_k = a_k,    (1)

where A is the matrix whose k-th column a_k contains the time series describing the fMRI activity of the k-th cortical point. The predictor x_k can be viewed as the k-th column of a weighted adjacency matrix X which relates points to each other. The least-squares solution is \hat{x}_k = A^\dagger a_k = A^\dagger A x_k = R x_k, where A^\dagger is the pseudoinverse of A, and we have defined the resolution matrix,

    R = A^\dagger A.    (2)

R describes the difference between the least-squares solution \hat{x}_k and the true predictor x_k. The more R differs from the identity matrix, the more information loss there is, and the worse our estimate of the true connectivity will be.

In regression, as with other supervised machine learning techniques, there is a risk of overfitting when there is limited data and many parameters. In a single-subject scan from the HCP data, we have 1200 time samples for each of 64984 cortical surface points. An adjacency matrix relating every point to every other point would require 64984 × 64984 parameters, far more than the available samples, so we regularize using the truncated singular value decomposition (TSVD) [17]. The rank-r TSVD approximation of A is A_r = \sum_{i=1}^{r} \sigma_i u_i v_i^T, where u_i and v_i are left and right singular vectors, and \sigma_i are the r largest singular values. So the TSVD-regularized solution is

    \hat{x}_k = \sum_{i=1}^{r} \sigma_i^{-1} v_i u_i^T a_k = A_r^\dagger a_k,    (3)

where we have defined the TSVD-regularized pseudoinverse A_r^\dagger. The resulting TSVD-regularized resolution matrix is

    R_r = A_r^\dagger A_r = A_r^\dagger A.    (4)

To form an efficient estimate of the diagonal of R_r for use as a metric, we first note that R_r can be written as

    R_r = A_r^\dagger A = \left( \sum_{i=1}^{r} \sigma_i^{-1} v_i u_i^T \right) \left( \sum_{i=1}^{m} \sigma_i u_i v_i^T \right) = \sum_{i=1}^{r} v_i v_i^T.    (5)

So the k-th diagonal element of R_r is

    (R_r)_{k,k} = \sum_{i=1}^{r} (v_i)_k^2,    (6)

the sum of squares of the k-th elements of the first r singular vectors (i.e.
those corresponding to the r largest singular values).

We can also efficiently compute a single (i.e., the k-th) resolution cell directly as the k-th column of R_r:

    (R_r)_k = A_r^\dagger a_k = \sum_{i=1}^{r} v_i (v_i)_k.    (7)

Appendix B: Detailed data

Fig. 12 provides visualizations of the resolution metric for each subject, demonstrating that the pattern appears generally the same for each individual. Figs. 13 and 14 give the relative contribution of different regions to the resolution metric and its inverse.

Figure 12: Resolution metric images for 10 subjects, right lateral view of cortex.
Figure 13: Relative contribution of regions (labeled by hemisphere and cortical parcel) to the average metric.
Figure 14: Relative contribution of regions (labeled by hemisphere and cortical parcel) to the average inverse metric.

References

[1] O. Sporns, G. Tononi, and R. Kötter, “The Human Connectome: A Structural Description of the Human Brain,”
PLOS Computational Biology, vol. 1, p. e42, Sept. 2005.

[2] A. Fornito, A. Zalesky, and E. Bullmore, Fundamentals of Brain Network Analysis. Academic Press, Mar. 2016.

[3] K. Dillon and Y.-P. Wang, “Resolution-based spectral clustering for brain parcellation using functional MRI,” Journal of Neuroscience Methods, vol. 335, p. 108628, Apr. 2020.

[4] A. J. den Dekker and A. van den Bos, “Resolution: a survey,” Journal of the Optical Society of America A, vol. 14, p. 547, Mar. 1997.

[5] K. Dillon and Y. Fainman, “Element-wise uniqueness, prior knowledge, and data-dependent resolution,” Signal, Image and Video Processing, pp. 1–8, Apr. 2016.

[6] D. D. Jackson, “Interpretation of Inaccurate, Insufficient and Inconsistent Data,” Geophysical Journal International, vol. 28, pp. 97–109, June 1972.

[7] J. K. MacCarthy, B. Borchers, and R. C. Aster, “Efficient stochastic estimation of the model resolution matrix diagonal and generalized cross-validation for large geophysical inverse problems,” Journal of Geophysical Research: Solid Earth, vol. 116, no. B10, 2011.

[8] N. Meinshausen and P. Bühlmann, “High-dimensional graphs and variable selection with the Lasso,” The Annals of Statistics, vol. 34, pp. 1436–1462, June 2006.

[9] K. Dillon and Y.-P. Wang, “An image resolution perspective on functional activity mapping,” in Engineering in Medicine and Biology Society (EMBC), 2016 IEEE 38th Annual International Conference of the, pp. 1139–1142, IEEE, 2016.

[10] K. Dillon, “On the Computation and Applications of Large Dense Partial Correlation Networks,” arXiv:1903.07181 [cs, stat], Mar. 2019.

[11] M. F. Glasser, S. N. Sotiropoulos, J. A. Wilson, T. S. Coalson, B. Fischl, J. L. Andersson, J. Xu, S. Jbabdi, M. Webster, J. R. Polimeni, D. C. Van Essen, and M. Jenkinson, “The Minimal Preprocessing Pipelines for the Human Connectome Project,” NeuroImage, vol. 80, pp. 105–124, Oct. 2013.

[12] D. S. Marcus, M. P. Harms, A. Z. Snyder, M. Jenkinson, J. A. Wilson, M. F. Glasser, D. M. Barch, K. A. Archie, G. C. Burgess, M. Ramaratnam, M. Hodge, W. Horton, R. Herrick, T. Olsen, M. McKay, M. House, M. Hileman, E. Reid, J. Harwell, T. Coalson, J. Schindler, J. S. Elam, S. W. Curtiss, and D. C. Van Essen, “Human Connectome Project informatics: Quality control, database services, and data visualization,” NeuroImage, vol. 80, pp. 202–219, Oct. 2013.

[13] S. Marek and N. U. F. Dosenbach, “The frontoparietal network: function, electrophysiology, and importance of individual precision mapping,” Dialogues in Clinical Neuroscience, vol. 20, pp. 133–140, June 2018.

[14] J. Cabral, M. L. Kringelbach, and G. Deco, “Exploring the network dynamics underlying brain activity during rest,” Progress in Neurobiology, vol. 114, pp. 102–131, Mar. 2014.

[15] M. D. Fox, A. Z. Snyder, J. L. Vincent, M. Corbetta, D. C. Van Essen, and M. E. Raichle, “The human brain is intrinsically organized into dynamic, anticorrelated functional networks,” Proceedings of the National Academy of Sciences of the United States of America, vol. 102, pp. 9673–9678, July 2005.

[16] M. E. Raichle, “The brain’s default mode network,” Annual Review of Neuroscience, vol. 38, pp. 433–447, July 2015.

[17] P. C. Hansen, “The truncated SVD as a method for regularization,”