•Three great branches of science:
Theory, Experiment, and Computation
•The purpose of computing is insight, not numbers (Hamming, 1961).
•Numerical Analysis is
Study of Algorithms for Problems of Continuous Mathematics.
Ex. Newton’s method, Lagrange interpolation polynomial, Gaussian
elimination, Euler’s method, ...
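As a taste of such algorithms, here is a minimal sketch of Newton’s method; the function f(x) = x² − 2 and starting point are illustrative choices (not from the slides), with sqrt(2) as the positive root.

```python
# Minimal sketch of Newton's method: iterate x <- x - f(x)/f'(x).
# Example (hypothetical): find sqrt(2) as the positive root of x**2 - 2.
def newton(f, fprime, x0, tol=1e-12, max_iter=50):
    """Newton iteration until |f(x)| < tol or max_iter is reached."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            break
        x -= fx / fprime(x)
    return x

root = newton(lambda x: x * x - 2.0, lambda x: 2.0 * x, x0=1.0)
print(root)  # converges quadratically toward 1.41421356...
```

Note how few iterations are needed: each step roughly doubles the number of correct digits, which is exactly the "converge fast to accurate approximate answers" theme above.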
•Computational mathematics is mainly based on two ideas
(an extreme simplification): Taylor series and linear algebra.
•Role of Computers in Numerical Computing
Computers certainly play a part in numerical computing, but even if rounding
error vanished, 95% of numerical analysis would remain.
Most mathematical problems cannot be solved
by a finite sequence of elementary operations.
Need: fast algorithms that converge to approximate answers accurate to
many digits of precision, for science and engineering applications.
Different Types of Problems in Numerical Computing
•Problem F: can be solved in a finite sequence of elementary operations
•Roots of a polynomial of degree 4: closed-form formula exists (Ferrari)
•Solving linear equations
•Problem I: cannot be solved in a finite sequence of elementary operations
•Roots of polynomials of degree 5 and higher: no closed-form formula
exists (Ruffini and Abel, around 1800)
•Finding eigenvalues of an n×n matrix with n ≥ 5 (the eigenvalues are the
roots of the characteristic polynomial, which has degree n)
•Minimize a function of several variables
•Evaluate an integral
•solve an ODE
•solve a PDE
Problem F is not necessarily easier than Problem I
When the problem dimension is very high, one often gives up on the exact
solution and uses fast, approximate methods instead.
*** World’s largest matrix computation as of April 2007:
Google’s PageRank - eigenvector of a matrix of order 2.7 billion
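The PageRank computation mentioned above is, at heart, a power iteration on an enormous matrix. A toy sketch of the idea, assuming a hypothetical 4-page link structure and the commonly used damping factor 0.85:

```python
import numpy as np

# Toy sketch of PageRank: the ranking vector is the leading eigenvector
# of a damped, column-stochastic link matrix, found by power iteration.
# The 4-page link structure below is hypothetical.
links = np.array([[0, 1, 1, 0],
                  [1, 0, 0, 1],
                  [0, 1, 0, 1],
                  [1, 0, 0, 0]], dtype=float)  # links[i, j] = 1 if page j links to page i
P = links / links.sum(axis=0)                  # column-stochastic transition matrix
n = P.shape[0]
d = 0.85                                       # damping factor
G = d * P + (1 - d) / n * np.ones((n, n))      # "Google matrix"

r = np.full(n, 1.0 / n)
for _ in range(200):                           # power iteration: r <- G r
    r = G @ r
    r /= r.sum()                               # keep scores normalized
print(r)  # PageRank scores, summing to 1
```

At web scale one of course never forms G explicitly; only sparse matrix–vector products with the link structure are used.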
Gauss (1777-1855) and Numerical Computing
• least squares data fitting (1795)
• systems of linear equations (1809)
•numerical quadrature (1814)
•fast Fourier transform (1805) - not well known until it was rediscovered by
Cooley and Tukey (1965)
Numerical Linear Algebra
•square linear system solving
•least squares problems
Often, Algorithms = Matrix Factorizations
Square Linear System Solving
•When does the solution exist?
•When is it easy to solve?
•Make it triangular: A = LU (L lower triangular, U upper triangular)
•Gaussian elimination: reduce the problem to triangular system solving
May break down: not every matrix has an LU factorization.
Even for matrices with an LU factorization, Gaussian elimination can be
numerically unstable.
•Pivoting: by interchanging rows, stability can be restored.
Gaussian elimination with partial pivoting: PA = LU.
Discovery of pivoting was easy, but its theoretical analysis has been hard.
For most matrices it is stable, but around 1960 Wilkinson and others found
that for certain exceptional matrices, Gaussian elimination with pivoting is
unstable.
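A minimal sketch of Gaussian elimination with partial pivoting; the 2×2 test system (with a tiny leading pivot that would sink the unpivoted algorithm) is a hypothetical illustration:

```python
import numpy as np

# Sketch of Gaussian elimination with partial pivoting: at each step,
# swap in the row with the largest available pivot so that every
# multiplier satisfies |m| <= 1, then finish with back substitution.
def solve_gepp(A, b):
    A = A.astype(float).copy()
    b = b.astype(float).copy()
    n = len(b)
    for k in range(n - 1):
        p = k + np.argmax(np.abs(A[k:, k]))   # partial pivoting: pick largest pivot
        A[[k, p]] = A[[p, k]]                 # swap rows k and p
        b[[k, p]] = b[[p, k]]
        for i in range(k + 1, n):
            m = A[i, k] / A[k, k]             # multiplier, |m| <= 1 by construction
            A[i, k:] -= m * A[k, k:]
            b[i] -= m * b[k]
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):            # back substitution on the triangle
        x[i] = (b[i] - A[i, i + 1:] @ x[i + 1:]) / A[i, i]
    return x

A = np.array([[1e-12, 1.0], [1.0, 1.0]])      # tiny pivot: unpivoted GE loses accuracy
b = np.array([1.0, 2.0])
print(solve_gepp(A, b))  # close to [1, 1]
```

Without the row swap, the multiplier 1/1e-12 = 1e12 would amplify rounding error catastrophically; with it, all multipliers stay bounded by 1.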
QR Decomposition (QRD)
•A = QR, where Q is orthogonal (has orthonormal columns) and R is upper
triangular. Reduced QRD: A = Q1 R1, where Q1 (m×n) has orthonormal
columns and R1 (n×n) is upper triangular.
•Gram(1883)-Schmidt(1907) orthogonalization: the columns of Q are obtained
one at a time, and R comes out as a by-product of this process of triangular
orthogonalization
•Modified Gram-Schmidt (Laplace 1816, Rice 1966)
•Householder method (1958, Householder reflector; Turnbull
1932): A is reduced to an upper triangular matrix R via orthogonal
operations (orthogonal triangularization). More stable numerically,
because orthogonal operations preserve the 2-norm and Frobenius norm
and thus do not amplify the rounding errors introduced at each step.
•Givens method: based on 2×2 plane rotations
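The Householder approach above can be sketched as follows; the 3×3 test matrix is an arbitrary illustration, and the code assumes no column is entirely zero:

```python
import numpy as np

# Sketch of Householder QR: each reflector H = I - 2 v v^T (with ||v|| = 1)
# zeroes out one column below the diagonal; the product of reflectors is
# orthogonal, so norms and rounding errors are not amplified.
def householder_qr(A):
    A = A.astype(float).copy()
    m, n = A.shape
    Q = np.eye(m)
    for k in range(n):
        x = A[k:, k]
        v = x.copy()
        s = 1.0 if x[0] >= 0 else -1.0        # sign choice avoids cancellation
        v[0] += s * np.linalg.norm(x)
        v /= np.linalg.norm(v)                # assumes the column is nonzero
        A[k:, k:] -= 2.0 * np.outer(v, v @ A[k:, k:])   # apply H on the left
        Q[:, k:] -= 2.0 * np.outer(Q[:, k:] @ v, v)     # accumulate Q = H1 H2 ...
    return Q, np.triu(A)

A = np.array([[12., -51., 4.], [6., 167., -68.], [-4., 24., -29.]])
Q, R = householder_qr(A)
print(np.allclose(Q @ R, A))  # True
```

Each reflector is applied as a rank-one update rather than by forming H explicitly, which is the standard trick that keeps the cost at O(mn²).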
Important matrix computation algorithms in 1960s
Based on the QR factorization:
•to solve the least squares problem
•construct orthonormal bases
•used at the core of other algorithms especially in EVD and SVD algorithms
•Overdetermined system solving
•Solve via the normal equations A^T A x = A^T b (a square system, which we
know how to solve), or via the QRD
•Reduced QR Decomposition: distance preserving dimension reduction method
•QRD: efficient updating and downdating methods exist
•Rank deficiency: the columns of Q are not a basis for range(A) if rank(A)
is not full
•Pivoted QR decomposition, rank-revealing QR decomposition, or the SVD is
needed if rank(A) is not full
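A sketch of solving a small overdetermined system via the reduced QRD; the line-fitting data below are made up for illustration:

```python
import numpy as np

# Sketch of least squares via reduced QR: for min ||Ax - b||, A = QR gives
# R x = Q^T b. This avoids forming the normal equations A^T A x = A^T b,
# whose condition number is the square of A's.
A = np.array([[1., 1.], [1., 2.], [1., 3.], [1., 4.]])  # fit a line to 4 points
b = np.array([1.1, 1.9, 3.2, 3.9])

Q, R = np.linalg.qr(A, mode='reduced')   # Q: 4x2 orthonormal columns, R: 2x2 upper triangular
x = np.linalg.solve(R, Q.T @ b)          # [intercept, slope]
print(x)
```

The same answer comes from `np.linalg.lstsq`, which uses an SVD-based solver and also handles rank-deficient A.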
Singular Value Decomposition (SVD)
•Beltrami, Jordan, and Sylvester in the late 19th century;
made well known by Golub (1965)
•The SVD: any m×n matrix A (assume m ≥ n, but this is not necessary) can
be decomposed as A = UΣV^T, where U and V are orthogonal (unitary in the
complex case), and Σ is diagonal with entries σ1 ≥ σ2 ≥ ... ≥ σn ≥ 0
•A_p = σ1 u1 v1^T + ... + σp up vp^T (keeping the p largest singular
triplets) is the best rank-p approximation of A
•The singular values of A are the nonnegative square roots of
the eigenvalues of A^T A
•Latent Semantic Indexing: lower rank approximation of the term-document matrix
•Principal Component Analysis: let the SVD of the centered data matrix be
A = UΣV^T. Then the leading k singular vectors are the PCA solutions,
i.e. the k principal vectors.
•But SVD is expensive.
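The best rank-p approximation property (and the cost motivating cheaper alternatives) can be checked numerically; the random 6×5 matrix below is purely illustrative:

```python
import numpy as np

# Sketch of SVD truncation: keeping the top p singular triplets gives the
# best rank-p approximation of A, with 2-norm error equal to sigma_{p+1}.
np.random.seed(0)                           # hypothetical 6x5 test matrix
A = np.random.rand(6, 5)
U, s, Vt = np.linalg.svd(A, full_matrices=False)

p = 2
A_p = U[:, :p] * s[:p] @ Vt[:p, :]          # rank-p truncation of the SVD
print(np.linalg.matrix_rank(A_p))           # 2
print(np.linalg.norm(A - A_p, 2))           # equals s[p], the (p+1)-st singular value
```

This truncation is exactly what lower-rank approximation schemes such as Latent Semantic Indexing compute on the term-document matrix.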
Eigenvalue Problems
•Symmetric vs. non-symmetric matrices
•For which matrices are the eigenvalues easy to find?
•What transformations are allowed? Similarity transformations. Two matrices
A and B are similar if B = X⁻¹AX for a nonsingular matrix X. Then the
characteristic polynomials of A and B are the same.
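This invariance is easy to verify numerically; the random 4×4 matrices below are illustrative (X is nonsingular for generic random entries):

```python
import numpy as np

# Quick check that similar matrices share eigenvalues: B = X^{-1} A X has
# the same characteristic polynomial, hence the same spectrum, as A.
np.random.seed(1)
A = np.random.rand(4, 4)                    # hypothetical test matrix
X = np.random.rand(4, 4)                    # generically nonsingular
B = np.linalg.solve(X, A) @ X               # B = X^{-1} A X without forming X^{-1}

eig_A = np.sort_complex(np.linalg.eigvals(A))
eig_B = np.sort_complex(np.linalg.eigvals(B))
print(np.allclose(eig_A, eig_B))  # True
```

This is why eigenvalue algorithms work by applying well-chosen (preferably orthogonal) similarity transformations that drive A toward a form whose eigenvalues can be read off.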
Algorithms for Symmetric Eigenvalue Problems
•QR algorithm (1960)
•One of the matrix factorization algorithms with the greatest impact
•Francis, Kublanovskaya, Wilkinson
•Iterative; in the symmetric case it typically converges cubically
•EISPACK, LAPACK, NAG, IMSL, Numerical Recipes, MATLAB, Maple, Mathematica
•Power method: to find the leading eigenvalue and eigenvector (e.g. PageRank)
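A minimal sketch of the power method with a Rayleigh-quotient eigenvalue estimate; the 2×2 symmetric test matrix is an illustrative choice:

```python
import numpy as np

# Sketch of the power method: repeated multiplication by A makes the
# component along the dominant eigenvector grow fastest; the Rayleigh
# quotient v^T A v (for unit v) then estimates the leading eigenvalue.
def power_method(A, iters=200):
    v = np.ones(A.shape[0])
    for _ in range(iters):
        v = A @ v
        v /= np.linalg.norm(v)              # renormalize to avoid overflow
    return v @ A @ v, v                     # (Rayleigh quotient, eigenvector)

A = np.array([[2., 1.], [1., 3.]])          # hypothetical symmetric test matrix
lam, v = power_method(A)
print(lam)  # leading eigenvalue (5 + sqrt(5)) / 2 ≈ 3.618
```

Convergence is linear with rate |λ₂/λ₁|, so the method shines when the leading eigenvalue is well separated, as in the damped PageRank matrix.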