Pattern Recognition, International Conference on

Abstract

Principal component analysis (PCA) is one of the most important and popular tools for recognizing and compressing patterns. The principal components are the basis functions that minimize the mean-squared reconstruction error, and they can be used as matched filters. In many PCA applications it has been observed that, when the vectors of the underlying stochastic process have only non-negative entries, the eigenvector belonging to the largest eigenvalue also has only non-negative entries. This observation has been used to show that the coordinate vectors in PCA all lie in a cone, which in turn can be used to construct invariants for tracking or compression. In this paper we show how this empirical observation can be rigorously proved. For patterns described by finite-dimensional vectors we use the Perron-Frobenius theory of non-negative matrices to investigate stochastic processes whose patterns assume only non-negative values. We show that such processes always have a first eigenvector with only non-negative entries, and we describe the conditions under which this first eigenvector is strictly positive. For stochastic processes of patterns in Hilbert (or Banach) spaces we use versions of the Krein-Rutman theory to prove the non-negativity of the first eigenfunction. In contrast to the finite-dimensional case, this formulation gives a more direct connection to the conical structure of the underlying pattern space. As a concrete example we sketch how these results can be used in multi-spectral color processing.
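The empirical observation at the heart of the abstract is easy to reproduce numerically. The following sketch (our own illustration, not code from the paper; all names and parameters are assumptions) draws non-negative sample vectors, forms the entrywise non-negative second-moment matrix, and checks that the eigenvector of the largest eigenvalue can be chosen entrywise non-negative, as Perron-Frobenius theory predicts:

```python
import numpy as np

# Illustrative sketch of the Perron-Frobenius observation:
# if the sample vectors x are entrywise non-negative, then the
# second-moment matrix C = E[x x^T] is an entrywise non-negative
# matrix, so its leading eigenvector can be chosen non-negative.

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(1000, 8))  # 1000 non-negative sample vectors

C = X.T @ X / X.shape[0]                   # entrywise non-negative moment matrix

eigvals, eigvecs = np.linalg.eigh(C)       # eigenvalues in ascending order
v1 = eigvecs[:, -1]                        # eigenvector of the largest eigenvalue

# eigh fixes the overall sign arbitrarily; flip so the vector points
# into the non-negative orthant.
if v1.sum() < 0:
    v1 = -v1

print(np.all(v1 >= 0))                     # the first eigenvector is non-negative
```

Because the uniform samples make every entry of C strictly positive, the moment matrix is irreducible and the first eigenvector is in fact strictly positive here, matching the paper's stronger positivity condition for the finite-dimensional case.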
