1 Algebraic and Analytic Methods

§1.3 Determinants, Linear Operators, and Spectral Expansions

Contents
  1. §1.3(i) Determinants: Elementary Properties
  2. §1.3(ii) Special Determinants
  3. §1.3(iii) Infinite Determinants
  4. §1.3(iv) Matrices as Linear Operators

§1.3(i) Determinants: Elementary Properties

Formal Calculation of Determinants

The notation is that of (1.2.58). For n=2:

1.3.1 $\det[a_{jk}] = \begin{vmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{vmatrix} = a_{11}a_{22} - a_{12}a_{21}$.

For n=3:

1.3.2 $\det[a_{jk}] = \begin{vmatrix} a_{11} & a_{12} & a_{13} \\ a_{21} & a_{22} & a_{23} \\ a_{31} & a_{32} & a_{33} \end{vmatrix} = a_{11}\begin{vmatrix} a_{22} & a_{23} \\ a_{32} & a_{33} \end{vmatrix} - a_{12}\begin{vmatrix} a_{21} & a_{23} \\ a_{31} & a_{33} \end{vmatrix} + a_{13}\begin{vmatrix} a_{21} & a_{22} \\ a_{31} & a_{32} \end{vmatrix} = a_{11}a_{22}a_{33} - a_{11}a_{23}a_{32} - a_{12}a_{21}a_{33} + a_{12}a_{23}a_{31} + a_{13}a_{21}a_{32} - a_{13}a_{22}a_{31}$.

Higher-order determinants are natural generalizations. The minor $M_{jk}$ of the entry $a_{jk}$ in the $n$th-order determinant $\det[a_{jk}]$ is the $(n-1)$th-order determinant derived from $\det[a_{jk}]$ by deleting the $j$th row and the $k$th column. The cofactor $A_{jk}$ of $a_{jk}$ is

1.3.3 $A_{jk} = (-1)^{j+k} M_{jk}$.

An nth-order determinant expanded by its jth row is given by

1.3.4 $\det[a_{jk}] = \sum_{\ell=1}^{n} a_{j\ell} A_{j\ell}$.

If two rows (or columns) of a determinant are interchanged, then the determinant changes sign. If two rows (columns) of a determinant are identical, then the determinant is zero. If all the elements of a row (column) of a determinant are multiplied by an arbitrary factor μ, then the result is a determinant which is μ times the original. If μ times a row (column) of a determinant is added to another row (column), then the value of the determinant is unchanged.
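As a numerical illustration (a minimal sketch in pure Python; the matrix entries are arbitrary example values), the cofactor expansion (1.3.4) along the first row can be implemented recursively, and the sign-change rule under a row interchange can be checked directly:

```python
def det(a):
    """Determinant by cofactor expansion along the first row (1.3.4)."""
    n = len(a)
    if n == 1:
        return a[0][0]
    total = 0
    for k in range(n):
        # Minor M_{1,k+1}: delete row 1 and column k+1 of the matrix.
        minor = [row[:k] + row[k + 1:] for row in a[1:]]
        # Cofactor A_{1,k+1} = (-1)^{1+(k+1)} M_{1,k+1} = (-1)^k M_{1,k+1}.
        total += (-1) ** k * a[0][k] * det(minor)
    return total

A = [[2, 1, 3],
     [0, 4, 1],
     [5, 2, 2]]
print(det(A))                       # -> -43
A_swapped = [A[1], A[0], A[2]]      # interchange rows 1 and 2
print(det(A_swapped))               # -> 43 (sign flips)
```

The recursion costs $O(n!)$ operations, so in practice LU factorization is preferred for anything beyond small $n$; the expansion is shown here only to mirror (1.3.4).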

Relationships Between Determinants

1.3.5 det(𝐀T) =det(𝐀),
1.3.6 det(𝐀1) =1det(𝐀),
1.3.7 det(𝐀𝐁) =det(𝐀)det(𝐁).

Determinants of Upper/Lower Triangular and Diagonal Matrices

The determinant of an upper or lower triangular, or diagonal, square matrix $\mathbf{A}$ is the product of the diagonal elements: $\det(\mathbf{A}) = \prod_{i=1}^{n} a_{ii}$.
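A quick check of the triangular case (a sketch assuming NumPy is available; the entries are arbitrary):

```python
import numpy as np

# np.triu zeroes the entries below the diagonal, leaving an upper
# triangular matrix with diagonal 2, 3, 5.
U = np.triu(np.array([[2.0, 1.0, 4.0],
                      [3.0, 3.0, 5.0],
                      [7.0, 8.0, 5.0]]))
prod = U.diagonal().prod()          # 2 * 3 * 5 = 30
assert np.isclose(np.linalg.det(U), prod)
```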

Hadamard’s Inequality

For real-valued $a_{jk}$,

1.3.8 $\begin{vmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{vmatrix}^2 \le (a_{11}^2 + a_{12}^2)(a_{21}^2 + a_{22}^2)$,
1.3.9 $(\det[a_{jk}])^2 \le \left(\sum_{k=1}^{n} a_{1k}^2\right)\left(\sum_{k=1}^{n} a_{2k}^2\right)\cdots\left(\sum_{k=1}^{n} a_{nk}^2\right)$.

Compare also (1.3.7) for the left-hand side. Equality holds iff

1.3.10 $a_{j1}a_{k1} + a_{j2}a_{k2} + \cdots + a_{jn}a_{kn} = 0$

for every distinct pair of $j, k$, or when one of the factors $\sum_{k=1}^{n} a_{jk}^2$ vanishes.
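Hadamard's inequality (1.3.9) and its equality condition can be illustrated numerically (a sketch assuming NumPy; the random matrix is just a generic example):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
lhs = np.linalg.det(A) ** 2
rhs = np.prod((A ** 2).sum(axis=1))     # product of squared row norms
assert lhs <= rhs                       # Hadamard's inequality (1.3.9)

# Equality holds when the rows are mutually orthogonal (1.3.10),
# e.g. for the identity matrix:
I = np.eye(5)
assert np.isclose(np.linalg.det(I) ** 2, np.prod((I ** 2).sum(axis=1)))
```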

§1.3(ii) Special Determinants

An alternant is a determinant function of n variables which changes sign when two of the variables are interchanged. Examples:

1.3.11 $\det[f_k(x_j)]$,
$j = 1, \dots, n$; $k = 1, \dots, n$,
1.3.12 $\det[f(x_j, y_k)]$,
$j = 1, \dots, n$; $k = 1, \dots, n$.

Vandermonde Determinant or Vandermondian

1.3.13 $\begin{vmatrix} 1 & x_1 & x_1^2 & \cdots & x_1^{n-1} \\ 1 & x_2 & x_2^2 & \cdots & x_2^{n-1} \\ \vdots & \vdots & \vdots & & \vdots \\ 1 & x_n & x_n^2 & \cdots & x_n^{n-1} \end{vmatrix} = \prod_{1 \le j < k \le n} (x_k - x_j)$.
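The Vandermonde formula (1.3.13) is easy to verify numerically (a sketch assuming NumPy; the nodes $x_j$ are arbitrary distinct values):

```python
import numpy as np
from itertools import combinations

x = np.array([1.0, 2.0, 4.0, 7.0])
n = len(x)
# Row j of V is (1, x_j, x_j^2, ..., x_j^{n-1}), matching (1.3.13).
V = np.vander(x, increasing=True)
prod = np.prod([x[k] - x[j] for j, k in combinations(range(n), 2)])
assert np.isclose(np.linalg.det(V), prod)   # both equal 540 here
```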

Cauchy Determinant

1.3.14 $\det\left[\dfrac{1}{a_j - b_k}\right] = (-1)^{n(n-1)/2} \prod_{1 \le j < k \le n} (a_k - a_j)(b_k - b_j) \Big/ \prod_{j,k=1}^{n} (a_j - b_k)$.
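A numerical check of the Cauchy determinant (1.3.14), again a sketch assuming NumPy; the values of $a_j$ and $b_k$ are arbitrary, chosen only so that every $a_j - b_k$ is nonzero:

```python
import numpy as np
from itertools import combinations

a = np.array([1.0, 2.0, 4.0])
b = np.array([-1.0, -3.0, -6.0])
n = len(a)
C = 1.0 / (a[:, None] - b[None, :])          # C[j, k] = 1/(a_j - b_k)
num = np.prod([(a[k] - a[j]) * (b[k] - b[j])
               for j, k in combinations(range(n), 2)])
den = np.prod([aj - bk for aj in a for bk in b])
rhs = (-1) ** (n * (n - 1) // 2) * num / den
assert np.isclose(np.linalg.det(C), rhs)
```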

Circulant

1.3.15 $\begin{vmatrix} a_1 & a_2 & \cdots & a_n \\ a_n & a_1 & \cdots & a_{n-1} \\ \vdots & \vdots & & \vdots \\ a_2 & a_3 & \cdots & a_1 \end{vmatrix} = \prod_{k=1}^{n} \left(a_1 + a_2\omega_k + a_3\omega_k^2 + \cdots + a_n\omega_k^{n-1}\right)$,

where ω1,ω2,,ωn are the nth roots of unity (1.11.21).
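The circulant factorization (1.3.15) can be checked numerically (a sketch assuming NumPy; the first row is an arbitrary example):

```python
import numpy as np

a = np.array([2.0, 5.0, 1.0, 3.0])      # first row: a_1, ..., a_n
n = len(a)
# Each subsequent row is the previous one shifted one place to the right.
C = np.array([np.roll(a, j) for j in range(n)])
omega = np.exp(2j * np.pi * np.arange(n) / n)    # the n-th roots of unity
rhs = np.prod([np.sum(a * w ** np.arange(n)) for w in omega])
# det(C) is real; the imaginary part of rhs is only rounding noise.
assert np.isclose(np.linalg.det(C), rhs.real)
```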

Krattenthaler’s Formula

For

1.3.16 $t_{jk} = (x_j + a_n)(x_j + a_{n-1}) \cdots (x_j + a_{k+1})(x_j + b_k)(x_j + b_{k-1}) \cdots (x_j + b_2)$,
1.3.17 $\det[t_{jk}] = \prod_{1 \le j < k \le n} (x_j - x_k) \prod_{2 \le j \le k \le n} (b_j - a_k)$.
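Krattenthaler's formula can also be spot-checked numerically for small $n$ (a sketch assuming NumPy; the values of $x_j$, $a_m$, $b_m$ are arbitrary, with indices 0 and 1 of the parameter arrays unused padding so that subscripts match (1.3.16)):

```python
import numpy as np
from itertools import combinations

n = 3
x = np.array([1.0, 2.0, 5.0])
a = np.array([0.0, 0.0, 0.5, -0.25])     # a[2], a[3] used
b = np.array([0.0, 0.0, 1.5, 2.5])       # b[2], b[3] used

def t(j, k):
    # t_{jk} = (x_j + a_n)...(x_j + a_{k+1}) * (x_j + b_k)...(x_j + b_2)
    left = np.prod([x[j - 1] + a[m] for m in range(k + 1, n + 1)])
    right = np.prod([x[j - 1] + b[m] for m in range(2, k + 1)])
    return left * right

T = np.array([[t(j, k) for k in range(1, n + 1)] for j in range(1, n + 1)])
lhs = np.linalg.det(T)
rhs = (np.prod([x[j] - x[k] for j, k in combinations(range(n), 2)])
       * np.prod([b[j] - a[k]
                  for j in range(2, n + 1) for k in range(j, n + 1)]))
assert np.isclose(lhs, rhs)              # both equal -57.75 here
```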

§1.3(iii) Infinite Determinants

Let $a_{j,k}$ be defined for all integer values of $j$ and $k$, and $D_n[a_{j,k}]$ denote the $(2n+1) \times (2n+1)$ determinant

1.3.18 $D_n[a_{j,k}] = \begin{vmatrix} a_{-n,-n} & a_{-n,-n+1} & \cdots & a_{-n,n} \\ a_{-n+1,-n} & a_{-n+1,-n+1} & \cdots & a_{-n+1,n} \\ \vdots & \vdots & & \vdots \\ a_{n,-n} & a_{n,-n+1} & \cdots & a_{n,n} \end{vmatrix}$.

If $D_n[a_{j,k}]$ tends to a limit $L$ as $n \to \infty$, then we say that the infinite determinant $D_\infty[a_{j,k}]$ converges and $D_\infty[a_{j,k}] = L$.

Of importance for special functions are infinite determinants of Hill’s type. These have the property that the double series

1.3.19 $\sum_{j,k=-\infty}^{\infty} |a_{j,k} - \delta_{j,k}|$

converges (§1.9(vii)). Here $\delta_{j,k}$ is the Kronecker delta. Hill-type determinants always converge.
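The convergence of a Hill-type determinant can be observed by computing the truncations $D_n$ for increasing $n$ (a sketch assuming NumPy; the entries $a_{j,k} = \delta_{j,k} + \varepsilon\,2^{-(|j|+|k|)}$ are an illustrative example, for which the double series (1.3.19) sums to $9\varepsilon$):

```python
import numpy as np

def D(n, eps=0.1):
    """Truncated determinant D_n for a_{j,k} = delta_{j,k} + eps*2^{-(|j|+|k|)}."""
    idx = np.arange(-n, n + 1)
    pert = 2.0 ** (-(np.abs(idx)[:, None] + np.abs(idx)[None, :]))
    return np.linalg.det(np.eye(2 * n + 1) + eps * pert)

vals = [D(n) for n in (2, 4, 8, 16)]
diffs = [abs(vals[i + 1] - vals[i]) for i in range(3)]
# Successive truncations settle down rapidly, as Hill-type convergence predicts.
assert diffs[2] < diffs[0]
```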

For further information see Whittaker and Watson (1927, pp. 36–40) and Magnus and Winkler (1966, §2.3).

§1.3(iv) Matrices as Linear Operators

Linear Operators in Finite Dimensional Vector Spaces

Square matrices can be seen as linear operators because $\mathbf{A}(\alpha\mathbf{a} + \beta\mathbf{b}) = \alpha\mathbf{A}\mathbf{a} + \beta\mathbf{A}\mathbf{b}$ for all scalars $\alpha, \beta$ and $\mathbf{a}, \mathbf{b} \in \mathbf{E}^n$, the space of all $n$-dimensional vectors.

Self-Adjoint Operators on $\mathbf{E}^n$

The adjoint of a matrix $\mathbf{A}$ is the matrix $\mathbf{A}^*$ such that $\langle \mathbf{A}\mathbf{a}, \mathbf{b} \rangle = \langle \mathbf{a}, \mathbf{A}^*\mathbf{b} \rangle$ for all $\mathbf{a}, \mathbf{b} \in \mathbf{E}^n$. In the case of a real matrix $\mathbf{A}^* = \mathbf{A}^{\mathrm{T}}$, and in the complex case $\mathbf{A}^* = \mathbf{A}^{\mathrm{H}}$.

Real symmetric ($\mathbf{A} = \mathbf{A}^{\mathrm{T}}$) and Hermitian ($\mathbf{A} = \mathbf{A}^{\mathrm{H}}$) matrices are self-adjoint operators on $\mathbf{E}^n$. The spectrum of such self-adjoint operators consists of their eigenvalues $\lambda_i$, $i = 1, 2, \dots, n$, and all $\lambda_i \in \mathbb{R}$. The corresponding eigenvectors $\mathbf{a}_1, \dots, \mathbf{a}_n$ can be chosen such that they form a complete orthonormal basis in $\mathbf{E}^n$.

Let the columns of the matrix $\mathbf{S}$ be these eigenvectors $\mathbf{a}_1, \dots, \mathbf{a}_n$; then $\mathbf{S}^{-1} = \mathbf{S}^{\mathrm{H}}$, and the similarity transformation (1.2.73) is now of the form $\mathbf{S}^{\mathrm{H}} \mathbf{A} \mathbf{S} = [\lambda_i \delta_{i,j}]$. For Hermitian matrices $\mathbf{S}$ is unitary, and for real symmetric matrices $\mathbf{S}$ is an orthogonal transformation.
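This diagonalization is illustrated below for a real symmetric matrix (a sketch assuming NumPy, whose `np.linalg.eigh` routine returns the real eigenvalues and an orthonormal set of eigenvectors of a self-adjoint matrix):

```python
import numpy as np

rng = np.random.default_rng(2)
B = rng.standard_normal((4, 4))
A = B + B.T                       # real symmetric, hence self-adjoint
lam, S = np.linalg.eigh(A)        # real eigenvalues, orthonormal eigenvectors

# S is orthogonal (S^{-1} = S^T), and S^T A S is the diagonal
# matrix of eigenvalues [lambda_i delta_{i,j}].
assert np.allclose(S.T @ S, np.eye(4))
assert np.allclose(S.T @ A @ S, np.diag(lam))
```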

For self-adjoint $\mathbf{A}$ and $\mathbf{B}$, if $[\mathbf{A}, \mathbf{B}] = \mathbf{0}$ (see (1.2.66)), then simultaneous eigenvectors of $\mathbf{A}$ and $\mathbf{B}$ always exist.

Orthonormal Expansions

Assuming $\{\mathbf{a}_i\}$ is an orthonormal basis in $\mathbf{E}^n$, any vector $\mathbf{u}$ may be expanded as

1.3.20 $\mathbf{u} = \sum_{i=1}^{n} c_i \mathbf{a}_i$,
$c_i = \langle \mathbf{u}, \mathbf{a}_i \rangle$.

Taking $\ell^2$ norms,

1.3.21 $\|\mathbf{u}\|^2 = \sum_{i=1}^{n} |c_i|^2$,

which is Parseval’s equality.
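The expansion (1.3.20) and Parseval's equality (1.3.21) can be verified numerically using the eigenvector basis of a real symmetric matrix (a sketch assuming NumPy; the vector and matrix are arbitrary examples):

```python
import numpy as np

rng = np.random.default_rng(3)
B = rng.standard_normal((4, 4))
A = B + B.T
_, S = np.linalg.eigh(A)      # columns of S: orthonormal basis {a_i}

u = rng.standard_normal(4)
c = S.T @ u                   # c_i = <u, a_i> (real inner product)
assert np.allclose(u, S @ c)                       # u = sum_i c_i a_i
assert np.isclose(np.sum(c ** 2), np.sum(u ** 2))  # Parseval's equality
```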