Canonical correlation
In statistics, canonical-correlation analysis (CCA), also called canonical variates analysis, is a way of
inferring information from cross-covariance matrices. If we have two vectors $X = (X_1, \dots, X_n)$ and
$Y = (Y_1, \dots, Y_m)$ of random variables, and there are correlations among the variables, then canonical-
correlation analysis will find linear combinations of X and Y which have maximum correlation with each
other.[1] T. R. Knapp notes that "virtually all of the commonly encountered parametric tests of significance
can be treated as special cases of canonical-correlation analysis, which is the general procedure for
investigating the relationships between two sets of variables."[2] The method was first introduced by Harold
Hotelling in 1936,[3] although in the context of angles between flats the mathematical concept was
published by Jordan in 1875.[4]
Definition
Given two column vectors $X = (X_1, \dots, X_n)^T$ and $Y = (Y_1, \dots, Y_m)^T$ of random variables with finite
second moments, one may define the cross-covariance $\Sigma_{XY} = \operatorname{cov}(X, Y)$ to be the $n \times m$ matrix whose
$(i, j)$ entry is the covariance $\operatorname{cov}(X_i, Y_j)$. In practice, we would estimate the covariance matrix based on
sampled data from $X$ and $Y$ (i.e. from a pair of data matrices).

Canonical-correlation analysis seeks vectors $a$ ($a \in \mathbb{R}^n$) and $b$ ($b \in \mathbb{R}^m$) such that the random variables
$a^T X$ and $b^T Y$ maximize the correlation $\rho = \operatorname{corr}(a^T X, b^T Y)$. The (scalar) random variables $U = a^T X$
and $V = b^T Y$ are the first pair of canonical variables. Then one seeks vectors maximizing the same
correlation subject to the constraint that they are to be uncorrelated with the first pair of canonical variables;
this gives the second pair of canonical variables. This procedure may be continued up to $\min\{m, n\}$
times.
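To make the estimation step concrete, the following sketch (hypothetical variable names, assuming NumPy) builds the sample covariance blocks $\widehat\Sigma_{XX}$, $\widehat\Sigma_{YY}$ and $\widehat\Sigma_{XY}$ from a pair of data matrices; later sketches reuse these arrays.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two data matrices: N samples of an n-dimensional X and an m-dimensional Y.
N, n, m = 500, 4, 3
X = rng.standard_normal((N, n))
Y = X[:, :m] + 0.5 * rng.standard_normal((N, m))   # make X and Y correlated

# Sample estimates of the covariance blocks used throughout the article.
Xc = X - X.mean(axis=0)
Yc = Y - Y.mean(axis=0)
Sigma_XX = Xc.T @ Xc / (N - 1)
Sigma_YY = Yc.T @ Yc / (N - 1)
Sigma_XY = Xc.T @ Yc / (N - 1)      # the (n x m) cross-covariance matrix
```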
Computation
Derivation
Let $\Sigma_{XY}$ be the cross-covariance matrix for any pair of (vector-shaped) random variables $X$ and $Y$. The
target function to maximize is

$$\rho = \frac{a^T \Sigma_{XY} b}{\sqrt{a^T \Sigma_{XX} a}\,\sqrt{b^T \Sigma_{YY} b}}.$$

The first step is a change of basis: define $c = \Sigma_{XX}^{1/2} a$ and $d = \Sigma_{YY}^{1/2} b$, so that

$$\rho = \frac{c^T \Sigma_{XX}^{-1/2} \Sigma_{XY} \Sigma_{YY}^{-1/2} d}{\sqrt{c^T c}\,\sqrt{d^T d}}.$$

By the Cauchy–Schwarz inequality this is bounded above by

$$\frac{\left( c^T \Sigma_{XX}^{-1/2} \Sigma_{XY} \Sigma_{YY}^{-1} \Sigma_{YX} \Sigma_{XX}^{-1/2} c \right)^{1/2}}{\sqrt{c^T c}}.$$

There is equality if the vectors $d$ and $\Sigma_{YY}^{-1/2} \Sigma_{YX} \Sigma_{XX}^{-1/2} c$ are collinear. In addition, the maximum of
correlation is attained if $c$ is the eigenvector with the maximum eigenvalue for the matrix
$\Sigma_{XX}^{-1/2} \Sigma_{XY} \Sigma_{YY}^{-1} \Sigma_{YX} \Sigma_{XX}^{-1/2}$ (see Rayleigh quotient). The subsequent pairs are found by using eigenvalues
of decreasing magnitudes. Orthogonality is guaranteed by the symmetry of the correlation matrices.

Another way of viewing this computation is that $c$ and $d$ are the left and right singular vectors of the
correlation matrix $\Sigma_{XX}^{-1/2} \Sigma_{XY} \Sigma_{YY}^{-1/2}$ of $X$ and $Y$ corresponding to the highest singular value.
Solution
The solution is therefore:

$c$ is an eigenvector of $\Sigma_{XX}^{-1/2} \Sigma_{XY} \Sigma_{YY}^{-1} \Sigma_{YX} \Sigma_{XX}^{-1/2}$
$d$ is proportional to $\Sigma_{YY}^{-1/2} \Sigma_{YX} \Sigma_{XX}^{-1/2} c$

Reciprocally, there is also:

$d$ is an eigenvector of $\Sigma_{YY}^{-1/2} \Sigma_{YX} \Sigma_{XX}^{-1} \Sigma_{XY} \Sigma_{YY}^{-1/2}$,
$c$ is proportional to $\Sigma_{XX}^{-1/2} \Sigma_{XY} \Sigma_{YY}^{-1/2} d$

Reversing the change of coordinates, we have that:

$a$ is an eigenvector of $\Sigma_{XX}^{-1} \Sigma_{XY} \Sigma_{YY}^{-1} \Sigma_{YX}$,
$b$ is proportional to $\Sigma_{YY}^{-1} \Sigma_{YX} a$;
$b$ is an eigenvector of $\Sigma_{YY}^{-1} \Sigma_{YX} \Sigma_{XX}^{-1} \Sigma_{XY}$,
$a$ is proportional to $\Sigma_{XX}^{-1} \Sigma_{XY} b$.
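This solution translates directly into code. The following sketch (a hypothetical helper, assuming NumPy and the covariance blocks estimated earlier) uses the whitened singular value decomposition, which is equivalent to the eigenvector characterization above: the singular values are the canonical correlations and the columns of A and B are the vectors $a_i$ and $b_i$.

```python
import numpy as np

def _inv_sqrt(S):
    # Inverse symmetric square root of an SPD matrix via eigendecomposition.
    w, V = np.linalg.eigh(S)
    return V @ np.diag(1.0 / np.sqrt(w)) @ V.T

def cca_from_covariances(Sigma_XX, Sigma_YY, Sigma_XY):
    """Canonical correlations and weight vectors from the covariance blocks."""
    # Change of basis: work with c = Sigma_XX^{1/2} a and d = Sigma_YY^{1/2} b.
    Kx = _inv_sqrt(Sigma_XX)
    Ky = _inv_sqrt(Sigma_YY)
    M = Kx @ Sigma_XY @ Ky

    # c and d are the left/right singular vectors of M; the singular values
    # are the canonical correlations (equivalently, the square roots of the
    # eigenvalues of the matrices listed above).
    U, s, Vt = np.linalg.svd(M, full_matrices=False)

    A = Kx @ U        # columns a_i = Sigma_XX^{-1/2} c_i
    B = Ky @ Vt.T     # columns b_i = Sigma_YY^{-1/2} d_i
    return s, A, B

# Using the sample covariances estimated in the earlier sketch:
# rho, A, B = cca_from_covariances(Sigma_XX, Sigma_YY, Sigma_XY)
```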
CCA can be computed using a singular value decomposition on a correlation matrix.[5] It is available as a
function in many statistical packages, for example canoncorr in MATLAB, cancor in R, and CCA in Python's scikit-learn.[6]
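As a hedged illustration (assuming scikit-learn and the data matrices X and Y from the earlier sketch), the library estimator can be used as follows; scikit-learn fits the variates iteratively rather than via an explicit SVD, but the correlation of each pair of scores is the corresponding canonical correlation.

```python
import numpy as np
from sklearn.cross_decomposition import CCA

# X and Y are the (N x n) and (N x m) data matrices from the earlier sketch.
cca = CCA(n_components=2)
U, V = cca.fit_transform(X, Y)          # canonical variates (scores)

# Correlation of each canonical pair (the first is the largest).
rho = [np.corrcoef(U[:, k], V[:, k])[0, 1] for k in range(U.shape[1])]
```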
CCA computation using singular value decomposition on a correlation matrix is related to the cosine of the
angles between flats. The cosine function is ill-conditioned for small angles, leading to very inaccurate
computation of highly correlated principal vectors in finite-precision computer arithmetic. To address this
problem, alternative algorithms[7] are available, for example scipy.linalg.subspace_angles in SciPy.
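A minimal sketch of the angle-based route (assuming SciPy and the centered data matrices Xc and Yc from above): the canonical correlations are the cosines of the principal angles between the column spaces of the two data matrices, and computing the angles directly sidesteps the ill-conditioning described above.

```python
import numpy as np
from scipy.linalg import subspace_angles

# Principal angles between the column spaces of the centered data matrices,
# returned in descending order (radians).
angles = subspace_angles(Xc, Yc)

# Cosines of the principal angles are the sample canonical correlations;
# reverse so the largest correlation comes first.
canonical_correlations = np.cos(angles)[::-1]
```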
Hypothesis testing
Each row can be tested for significance with the following method. Since the correlations are sorted, saying
that row $i$ is zero implies all further correlations are also zero. If we have $p$ independent observations in a
sample and $\widehat{\rho}_i$ is the estimated correlation for $i = 1, \dots, \min\{m, n\}$, then for the $i$th row the test statistic is

$$\chi^2 = -\left(p - 1 - \tfrac{1}{2}(m + n + 1)\right) \ln \prod_{j = i}^{\min\{m, n\}} \left(1 - \widehat{\rho}_j^{\,2}\right),$$

which is asymptotically distributed as a chi-squared with $(m - i + 1)(n - i + 1)$ degrees of freedom for large $p$.[8]
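A hedged sketch of this test (a hypothetical helper, assuming NumPy and SciPy), where rho holds the estimated canonical correlations in decreasing order:

```python
import numpy as np
from scipy.stats import chi2

def bartlett_cca_test(rho, p, n, m):
    """Chi-squared statistic, degrees of freedom and p-value for each row.

    rho : estimated canonical correlations, sorted in decreasing order
    p   : number of independent observations
    n,m : dimensions of X and Y
    """
    rho = np.asarray(rho)
    results = []
    for i in range(len(rho)):
        stat = -(p - 1 - (m + n + 1) / 2.0) * np.log(np.prod(1 - rho[i:] ** 2))
        df = (m - i) * (n - i)   # equals (m - i + 1)(n - i + 1) for 1-based i
        results.append((stat, df, chi2.sf(stat, df)))
    return results
```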
Practical uses
A typical use for canonical correlation in the experimental context is to take two sets of variables and see
what is common among the two sets.[10] For example, in psychological testing, one could take two well
established multidimensional personality tests such as the Minnesota Multiphasic Personality Inventory
(MMPI-2) and the NEO. By seeing how the MMPI-2 factors relate to the NEO factors, one could gain
insight into what dimensions were common between the tests and how much variance was shared. For
example, one might find that an extraversion or neuroticism dimension accounted for a substantial amount
of shared variance between the two tests.
One can also use canonical-correlation analysis to produce a model equation which relates two sets of
variables, for example a set of performance measures and a set of explanatory variables, or a set of outputs
and set of inputs. Constraint restrictions can be imposed on such a model to ensure it reflects theoretical
requirements or intuitively obvious conditions. This type of model is known as a maximum correlation
model.[11]
Visualization of the results of canonical correlation is usually through bar plots of the coefficients of the two
sets of variables for the pairs of canonical variates showing significant correlation. Some authors suggest
that they are best visualized by plotting them as heliographs, a circular format with ray-like bars, with each
half representing the two sets of variables.[12]
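As an illustration of the bar-plot convention (a minimal sketch assuming Matplotlib and the weight matrices A and B from the earlier computation sketch; variable names are hypothetical):

```python
import numpy as np
import matplotlib.pyplot as plt

# Bar plots of the coefficients of the first canonical pair, one panel per
# variable set. A and B come from the earlier cca_from_covariances sketch.
fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(8, 3))
ax1.bar(np.arange(A.shape[0]), A[:, 0])
ax1.set_title("X-set coefficients, pair 1")
ax2.bar(np.arange(B.shape[0]), B[:, 0])
ax2.set_title("Y-set coefficients, pair 1")
plt.tight_layout()
plt.show()
```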
Examples
Let $X = x$ and $Y = y$ with zero expected value, i.e., $\operatorname{E}(X) = \operatorname{E}(Y) = 0$.

1. If $X = Y$, i.e., $X$ and $Y$ are perfectly correlated, then, e.g., $a = 1$ and $b = 1$, so that the first
(and only, in this example) pair of canonical variables is $U = X$ and $V = Y = X$.

2. If $X = -Y$, i.e., $X$ and $Y$ are perfectly anticorrelated, then, e.g., $a = 1$ and $b = -1$, so that
the first (and only, in this example) pair of canonical variables is $U = X$ and $V = -Y = X$.

We notice that in both cases $U = V$, which illustrates that the canonical-correlation analysis treats
correlated and anticorrelated variables similarly.
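A small numerical check of the anticorrelated case (a sketch assuming NumPy; the sample x is hypothetical):

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.standard_normal(1000)
x = x - x.mean()                 # zero mean, as in the example

X, Y = x, -x                     # perfectly anticorrelated case (case 2)
a, b = 1.0, -1.0                 # canonical weights from case 2
U, V = a * X, b * Y              # U = X, V = -Y = X

print(np.corrcoef(U, V)[0, 1])   # ~ 1.0: CCA reports a correlation of one
```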
See also
Generalized canonical correlation
RV coefficient
Angles between flats
Principal component analysis
Linear discriminant analysis
Regularized canonical correlation analysis
Singular value decomposition
Partial least squares regression
References
1. Härdle, Wolfgang; Simar, Léopold (2007). "Canonical Correlation Analysis". Applied
Multivariate Statistical Analysis. pp. 321–330. CiteSeerX 10.1.1.324.403 (https://citeseerx.ist.
psu.edu/viewdoc/summary?doi=10.1.1.324.403). doi:10.1007/978-3-540-72244-1_14 (http
s://doi.org/10.1007%2F978-3-540-72244-1_14). ISBN 978-3-540-72243-4.
2. Knapp, T. R. (1978). "Canonical correlation analysis: A general parametric significance-
testing system". Psychological Bulletin. 85 (2): 410–416. doi:10.1037/0033-2909.85.2.410 (h
ttps://doi.org/10.1037%2F0033-2909.85.2.410).
3. Hotelling, H. (1936). "Relations Between Two Sets of Variates". Biometrika. 28 (3–4): 321–
377. doi:10.1093/biomet/28.3-4.321 (https://doi.org/10.1093%2Fbiomet%2F28.3-4.321).
JSTOR 2333955 (https://www.jstor.org/stable/2333955).
4. Jordan, C. (1875). "Essai sur la géométrie à n dimensions" (http://www.numdam.org/item?id
=BSMF_1875__3__103_2). Bull. Soc. Math. France. 3: 103.
5. Hsu, D.; Kakade, S. M.; Zhang, T. (2012). "A spectral algorithm for learning Hidden Markov
Models" (http://www.cs.mcgill.ca/~colt2009/papers/011.pdf) (PDF). Journal of Computer and
System Sciences. 78 (5): 1460. arXiv:0811.4413 (https://arxiv.org/abs/0811.4413).
doi:10.1016/j.jcss.2011.12.025 (https://doi.org/10.1016%2Fj.jcss.2011.12.025).
S2CID 220740158 (https://api.semanticscholar.org/CorpusID:220740158).
6. Huang, S. Y.; Lee, M. H.; Hsiao, C. K. (2009). "Nonlinear measures of association with
kernel canonical correlation analysis and applications" (http://www.stat.sinica.edu.tw/syhuan
g/papersdownload/KCCA-080906.pdf) (PDF). Journal of Statistical Planning and Inference.
139 (7): 2162. doi:10.1016/j.jspi.2008.10.011 (https://doi.org/10.1016%2Fj.jspi.2008.10.011).
7. Knyazev, A.V.; Argentati, M.E. (2002), "Principal Angles between Subspaces in an A-Based
Scalar Product: Algorithms and Perturbation Estimates", SIAM Journal on Scientific
Computing, 23 (6): 2009–2041, Bibcode:2002SJSC...23.2008K (https://ui.adsabs.harvard.ed
u/abs/2002SJSC...23.2008K), CiteSeerX 10.1.1.73.2914 (https://citeseerx.ist.psu.edu/viewd
oc/summary?doi=10.1.1.73.2914), doi:10.1137/S1064827500377332 (https://doi.org/10.113
7%2FS1064827500377332)
8. Kanti V. Mardia, J. T. Kent and J. M. Bibby (1979). Multivariate Analysis. Academic Press.
9. Yang Song, Peter J. Schreier, David Ramírez, and Tanuj Hasija. "Canonical correlation
analysis of high-dimensional data with very small sample support". arXiv:1604.02047 (https://
arxiv.org/abs/1604.02047)
10. Sieranoja, S.; Sahidullah, Md; Kinnunen, T.; Komulainen, J.; Hadid, A. (July 2018).
"Audiovisual Synchrony Detection with Optimized Audio Features" (http://cs.joensuu.fi/page
s/tkinnu/webpage/pdf/audiovisual_synchrony_2018.pdf) (PDF). 2018 IEEE 3rd International
Conference on Signal and Image Processing (ICSIP) (http://urn.fi/urn:nbn:fi-fe20200414153
45). IEEE 3rd Int. Conference on Signal and Image Processing (ICSIP 2018). pp. 377–381.
doi:10.1109/SIPROCESS.2018.8600424 (https://doi.org/10.1109%2FSIPROCESS.2018.86
00424). ISBN 978-1-5386-6396-7. S2CID 51682024 (https://api.semanticscholar.org/Corpus
ID:51682024).
11. Tofallis, C. (1999). "Model Building with Multiple Dependent Variables and Constraints".
Journal of the Royal Statistical Society, Series D. 48 (3): 371–378. arXiv:1109.0725 (https://a
rxiv.org/abs/1109.0725). doi:10.1111/1467-9884.00195 (https://doi.org/10.1111%2F1467-98
84.00195). S2CID 8942357 (https://api.semanticscholar.org/CorpusID:8942357).
12. Degani, A.; Shafto, M.; Olson, L. (2006). "Canonical Correlation Analysis: Use of Composite
Heliographs for Representing Multiple Patterns" (http://ti.arc.nasa.gov/m/profile/adegani/Co
mposite_Heliographs.pdf) (PDF). Diagrammatic Representation and Inference. Lecture
Notes in Computer Science. Vol. 4045. p. 93. CiteSeerX 10.1.1.538.5217 (https://citeseerx.is
t.psu.edu/viewdoc/summary?doi=10.1.1.538.5217). doi:10.1007/11783183_11 (https://doi.or
g/10.1007%2F11783183_11). ISBN 978-3-540-35623-3.
13. Jendoubi, T.; Strimmer, K. (2018). "A whitening approach to probabilistic canonical
correlation analysis for omics data integration" (https://www.ncbi.nlm.nih.gov/pmc/articles/P
MC6327589). BMC Bioinformatics. 20 (1): 15. arXiv:1802.03490 (https://arxiv.org/abs/1802.0
3490). doi:10.1186/s12859-018-2572-9 (https://doi.org/10.1186%2Fs12859-018-2572-9).
PMC 6327589 (https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6327589). PMID 30626338
(https://pubmed.ncbi.nlm.nih.gov/30626338).
External links
Discriminant Correlation Analysis (DCA) (https://github.com/mhaghighat/dcaFuse)[1]
(MATLAB)
Hardoon, D. R.; Szedmak, S.; Shawe-Taylor, J. (2004). "Canonical Correlation Analysis: An
Overview with Application to Learning Methods". Neural Computation. 16 (12): 2639–2664.
CiteSeerX 10.1.1.14.6452 (https://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.14.64
52). doi:10.1162/0899766042321814 (https://doi.org/10.1162%2F0899766042321814).
PMID 15516276 (https://pubmed.ncbi.nlm.nih.gov/15516276). S2CID 202473 (https://api.se
manticscholar.org/CorpusID:202473).
A note on the ordinal canonical-correlation analysis of two sets of ranking scores (http://mpr
a.ub.uni-muenchen.de/12796/) (Also provides a FORTRAN program), in Journal of
Quantitative Economics 7(2), 2009, pp. 173–199
Representation-Constrained Canonical Correlation Analysis: A Hybridization of Canonical
Correlation and Principal Component Analyses (http://ssrn.com/abstract=1331886) (Also
provides a FORTRAN program), in Journal of Applied Economic Sciences 4(1), 2009,
pp. 115–124
1. Haghighat, Mohammad; Abdel-Mottaleb, Mohamed; Alhalabi, Wadee (2016). "Discriminant
Correlation Analysis: Real-Time Feature Level Fusion for Multimodal Biometric Recognition"
(https://zenodo.org/record/889881). IEEE Transactions on Information Forensics and
Security. 11 (9): 1984–1996. doi:10.1109/TIFS.2016.2569061 (https://doi.org/10.1109%2FTI
FS.2016.2569061). S2CID 15624506 (https://api.semanticscholar.org/CorpusID:15624506).