Page 399 - Understanding Machine Learning


              C.2 EIGENVALUES AND EIGENVECTORS
              Let A ∈ R^{d,d} be a matrix. A nonzero vector u is an eigenvector of A with a
              corresponding eigenvalue λ if

                                              Au = λu.
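As a quick numerical illustration of this definition (the matrix and eigenpair below are examples chosen here, not taken from the text), one can check Au = λu directly:

```python
import numpy as np

# u = (1, 1) is an eigenvector of A with eigenvalue λ = 3, since Au = 3u.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
u = np.array([1.0, 1.0])

print(np.allclose(A @ u, 3 * u))  # True
```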

              Theorem C.1 (Spectral Decomposition). If A ∈ R^{d,d} is a symmetric matrix of
              rank k, then there exists an orthonormal basis of R^d, u_1,...,u_d, such that
              each u_i is an eigenvector of A. Furthermore, A can be written as
              A = ∑_{i=1}^{d} λ_i u_i u_i^T, where each λ_i is the eigenvalue corresponding
              to the eigenvector u_i. This can be written equivalently as A = UDU^T, where
              the columns of U are the vectors u_1,...,u_d, and D is a diagonal matrix with
              D_{i,i} = λ_i and D_{i,j} = 0 for i ≠ j. Finally, the number of λ_i which are
              nonzero is the rank of the matrix, the eigenvectors which correspond to the
              nonzero eigenvalues span the range of A, and the eigenvectors which correspond
              to zero eigenvalues span the null space of A.
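The spectral decomposition can be verified numerically; below is a minimal sketch using NumPy's `eigh` routine for symmetric matrices (the example matrix is an illustrative choice, not from the text):

```python
import numpy as np

# A symmetric example matrix; eigh returns real eigenvalues (ascending)
# and an orthonormal matrix U whose columns are eigenvectors.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
eigvals, U = np.linalg.eigh(A)
D = np.diag(eigvals)

# Reconstruct A from the decomposition A = U D U^T.
A_rec = U @ D @ U.T

print(np.allclose(A_rec, A))             # True
print(np.allclose(U.T @ U, np.eye(2)))   # True: columns of U are orthonormal
```

Since all eigenvalues here are nonzero, the rank of A equals 2, matching the last claim of the theorem.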
              C.3 POSITIVE DEFINITE MATRICES

              A symmetric matrix A ∈ R^{d,d} is positive definite if all its eigenvalues are
              positive. A is positive semidefinite if all its eigenvalues are nonnegative.

              Theorem C.2. Let A ∈ R^{d,d} be a symmetric matrix. Then, the following are
              equivalent definitions of positive semidefiniteness of A:

                1. All the eigenvalues of A are nonnegative.
                2. For every vector u, ⟨u, Au⟩ ≥ 0.
                3. There exists a matrix B such that A = BB^T.
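The three characterizations can be checked on a concrete example; in the sketch below the matrix is an illustrative assumption, and the Cholesky factor serves as one possible choice of B (it exists whenever A is positive definite):

```python
import numpy as np

A = np.array([[ 2.0, -1.0],
              [-1.0,  2.0]])

# (1) All eigenvalues of A are nonnegative.
eigvals = np.linalg.eigvalsh(A)
print(np.all(eigvals >= 0))  # True

# (2) <u, Au> >= 0 for every u -- spot-check with random vectors.
rng = np.random.default_rng(0)
print(all(u @ A @ u >= 0 for u in rng.normal(size=(100, 2))))  # True

# (3) A = B B^T for some B; the Cholesky factor is one such B.
B = np.linalg.cholesky(A)
print(np.allclose(B @ B.T, A))  # True
```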




              C.4 SINGULAR VALUE DECOMPOSITION (SVD)
              Let A ∈ R^{m,n} be a matrix of rank r. When m ≠ n, the eigenvalue
              decomposition given in Theorem C.1 cannot be applied. We will describe another
              decomposition of A, which is called Singular Value Decomposition, or SVD for
              short.
                 Unit vectors v ∈ R^n and u ∈ R^m are called right and left singular
              vectors of A with corresponding singular value σ > 0 if

                                       Av = σu and A^T u = σv.
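These defining relations can be checked against the singular vectors that a numerical SVD returns; a sketch with an assumed example matrix (m ≠ n):

```python
import numpy as np

# A in R^{2,3}, rank 2: an illustrative example, not from the text.
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])

# full_matrices=False gives the reduced SVD: A = U diag(sigmas) Vt.
U, sigmas, Vt = np.linalg.svd(A, full_matrices=False)

# Each (u_i, v_i, sigma_i) triplet satisfies both defining relations.
for i, sigma in enumerate(sigmas):
    u, v = U[:, i], Vt[i, :]
    assert np.allclose(A @ v, sigma * u)    # Av = sigma * u
    assert np.allclose(A.T @ u, sigma * v)  # A^T u = sigma * v
print("all singular-vector relations hold")
```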
                 We first show that if we can find r orthonormal singular vectors with
              positive singular values, then we can decompose A = UDV^T, with the columns
              of U and V containing the left and right singular vectors, and D being an
              r × r diagonal matrix with the singular values on its diagonal.
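A sketch of assembling this decomposition from the top-r singular triplets, using an assumed rank-deficient example matrix:

```python
import numpy as np

# A in R^{3,2} with linearly dependent first columns, so rank(A) = 2 here
# only after adding the (0, 1) row; an illustrative example matrix.
A = np.array([[1.0, 2.0],
              [2.0, 4.0],
              [0.0, 1.0]])

U_full, sigmas, Vt_full = np.linalg.svd(A, full_matrices=False)
r = int(np.sum(sigmas > 1e-12))  # numerical rank: count positive singular values

# Keep only the r singular vectors with positive singular values.
U = U_full[:, :r]
D = np.diag(sigmas[:r])
V = Vt_full[:r, :].T

print(r)                            # 2
print(np.allclose(U @ D @ V.T, A))  # True: A = U D V^T
```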
              Lemma C.3. Let A ∈ R^{m,n} be a matrix of rank r. Assume that v_1,...,v_r is
              an orthonormal set of right singular vectors of A, u_1,...,u_r is an
              orthonormal set of corresponding left singular vectors of A, and σ_1,...,σ_r
              are the corresponding singular values.