Digest - Dimensionality Reduction of Neural Spike Train Data Using Factor Analysis

Factor analysis is more appropriate than PCA for analyzing neural spike train data as it allows for distinct noise levels in each neuron. Factor analysis models each data point as generated from a low-dimensional latent variable plus noise. It generalizes PCA by allowing the noise covariance to be diagonal rather than a multiple of the identity matrix. Gaussian process factor analysis extends this static model to time series analysis by using a parametric Gaussian process covariance function to model the temporal evolution of the latent variables.

Copyright: © Attribution Non-Commercial (BY-NC)

Dimensionality Reduction of Neural Spike Train Data Using Factor Analysis, Including a Comparative Review of Linear Gaussian Models

Jie Fu, https://sites.google.com/site/bigaidream/

If we can extract low-dimensional features from neural signals by dimensionality reduction, pattern-classification tasks become easier. [Key idea] Factor Analysis (FA), a generalization of PCA, handles distinct noise variability in each neuron, making it more appropriate for neural data. What is PCA? It finds projections in the high-dimensional space and throws out the less informative ones.
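A small simulation can make the FA-vs-PCA distinction concrete. This is a minimal sketch using scikit-learn (the notes do not specify an implementation), with made-up per-neuron noise levels:

```python
import numpy as np
from sklearn.decomposition import PCA, FactorAnalysis

rng = np.random.default_rng(0)
# Simulate 5 "neurons" driven by a 2-D latent state, with distinct noise per neuron.
n_trials, n_neurons, n_latents = 500, 5, 2
Z = rng.standard_normal((n_trials, n_latents))       # latent state
C = rng.standard_normal((n_latents, n_neurons))      # loading matrix
noise_sd = np.array([0.1, 0.2, 0.5, 1.0, 2.0])       # hypothetical noise levels
X = Z @ C + rng.standard_normal((n_trials, n_neurons)) * noise_sd

fa = FactorAnalysis(n_components=n_latents).fit(X)
ppca_like = PCA(n_components=n_latents).fit(X)

# FA estimates one noise variance per neuron (a diagonal covariance).
print(fa.noise_variance_)        # roughly tracks noise_sd**2, one entry per neuron
# PCA/PPCA admits only a single shared noise level (a multiple of the identity).
print(ppca_like.noise_variance_)
```

FA recovers the "loud" neurons' large noise variances without letting them dominate the latent fit, which is exactly the property the digest argues matters for neural data.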

PCA
PCA finds the dimensions with maximum variance of projection; equivalently, it finds the dimensions that minimize projection error. Both formulations lead to a solution based on eigenvector decomposition of the data covariance.

Probabilistic PCA (PPCA)
[Key idea] PCA can also be expressed as maximum-likelihood estimation of a probabilistic continuous latent-variable model [Pattern Recognition and Machine Learning, Bishop, Section 12.2]. Each data point x_n is generated from a latent variable z_n in the latent space Z.

Factor Analysis (FA)
[Key idea] FA is an extension of PPCA in which the covariance structure of the noise is less constrained. In PPCA, the noise covariance must be a multiple of the identity matrix, sigma^2 * I, where sigma^2 is a global noise level; in the limit sigma^2 -> 0, PPCA reduces to vanilla PCA. In FA, the noise covariance is diagonal, so FA retains a distinct noise variance in each dimension.

Equivalence of FA and Mixture Models
A Gaussian factor model with k factors is equivalent to some mixture model with k+1 clusters, in the sense that the two models have the same means and covariance [Shalizi]. Hence factor analysis cannot really tell us whether we have k continuous hidden causal variables or one discrete hidden variable taking k+1 values [Shalizi].

[KEY] Analysis of Neural Data
Assumptions:
1. Each neuron is a noisy sensor reflecting the temporal evolution of a low-dimensional internal process.
2. Firing rate is sufficient to characterize activity.
Challenges:
1. Neurons are highly variable, so FA is necessary: it allows distinct noise models, especially to normalize for loud vs. quiet neurons.
2. Binned firing rates must be smoothed. The traditional approach is a two-step process (first smooth, then fit); the authors combine the two steps into Gaussian Process Factor Analysis (GPFA).

Neural Time Series Analysis - Steps
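The traditional two-step pipeline (smooth each neuron's binned counts, then fit a static dimensionality-reduction model) can be sketched as follows; the bin count, kernel width, and output dimensionality are illustrative choices, not the paper's settings:

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
# Simulated binned spike counts: 100 neurons x 50 time bins (Poisson noise).
rates = np.abs(np.sin(np.linspace(0, 3, 50)))[None, :] * rng.uniform(5, 20, (100, 1))
counts = rng.poisson(rates).astype(float)

# Step 1: smooth every neuron with the SAME Gaussian kernel (one shared timescale).
t = np.arange(-10, 11)
kernel = np.exp(-t**2 / (2 * 3.0**2))
kernel /= kernel.sum()
smoothed = np.array([np.convolve(c, kernel, mode="same") for c in counts])

# Step 2: fit PCA to the smoothed rates to extract a low-dimensional trajectory.
traj = PCA(n_components=3).fit_transform(smoothed.T)   # 50 time bins x 3 dims
```

Note that the shared kernel in step 1 and the noise-free PCA in step 2 are precisely the choices criticized next.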

The goal is to determine the trajectory from a single trial.

[KEY] Problems with the two-step/traditional approach:
1. The kernel smoothing technique is ad hoc.
2. Using the same kernel for each neuron implies a single timescale, probably erroneously.
3. PCA has no noise model, making it difficult to isolate noise from data.
4. There is no interaction between the two phases of the process.

Gaussian Process Factor Analysis
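GPFA places a parametric Gaussian-process prior over each latent dimension's evolution in time, typically via a squared-exponential covariance. A sketch of that standard kernel form (the timescale and variance values here are illustrative, and the paper's exact parameterization may differ):

```python
import numpy as np

def sq_exp_kernel(times, signal_var=1.0, timescale=50.0, noise_var=1e-3):
    """Squared-exponential GP covariance over a grid of time points (ms)."""
    d = times[:, None] - times[None, :]
    return signal_var * np.exp(-d**2 / (2.0 * timescale**2)) + noise_var * np.eye(len(times))

times = np.arange(0, 500, 20.0)   # 20 ms bins over a 500 ms trial
K = sq_exp_kernel(times)

# Sample one smooth latent trajectory x(t) ~ GP(0, K).
rng = np.random.default_rng(1)
x = np.linalg.cholesky(K) @ rng.standard_normal(len(times))
```

Because the timescale is a fitted parameter per latent dimension, GPFA learns the smoothing jointly with the FA loadings instead of fixing one shared kernel in advance.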

Conclusions and Future Work
GPFA offers a powerful tool with several advantages:
1. FA allows an explicit noise model for each neuron.
2. GPFA extends the static model to time-series analysis.
3. Use of a parametric GP covariance permits extensive exploratory modeling.
Future work:
1. Richer GP models of neural state evolution.
2. Non-stationary kernels.
3. Non-linear manifolds and point-process likelihoods.

References
- Single-trial analysis of neural population activity during motor preparation.
- Dimensionality reduction of neural spike train data using factor analysis, including a comparative review of linear Gaussian models (PPT for CS 545, Machine Learning).
