Example 1.10. There are six different 3 × 3 permutation matrices, namely
$$
\begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{pmatrix},\quad
\begin{pmatrix} 0 & 1 & 0 \\ 0 & 0 & 1 \\ 1 & 0 & 0 \end{pmatrix},\quad
\begin{pmatrix} 0 & 0 & 1 \\ 1 & 0 & 0 \\ 0 & 1 & 0 \end{pmatrix},\quad
\begin{pmatrix} 0 & 1 & 0 \\ 1 & 0 & 0 \\ 0 & 0 & 1 \end{pmatrix},\quad
\begin{pmatrix} 0 & 0 & 1 \\ 0 & 1 & 0 \\ 1 & 0 & 0 \end{pmatrix},\quad
\begin{pmatrix} 1 & 0 & 0 \\ 0 & 0 & 1 \\ 0 & 1 & 0 \end{pmatrix}.
\tag{1.30}
$$
These have the following effects: if A is a matrix with row vectors r1, r2, r3, then multiplication on the left by each of the six permutation matrices produces, respectively,
$$
\begin{pmatrix} r_1 \\ r_2 \\ r_3 \end{pmatrix},\quad
\begin{pmatrix} r_2 \\ r_3 \\ r_1 \end{pmatrix},\quad
\begin{pmatrix} r_3 \\ r_1 \\ r_2 \end{pmatrix},\quad
\begin{pmatrix} r_2 \\ r_1 \\ r_3 \end{pmatrix},\quad
\begin{pmatrix} r_3 \\ r_2 \\ r_1 \end{pmatrix},\quad
\begin{pmatrix} r_1 \\ r_3 \\ r_2 \end{pmatrix}.
\tag{1.31}
$$
Thus, the first permutation matrix, which is the identity, does nothing: it represents the identity permutation. The fourth, fifth, and sixth represent row interchanges. The second and third are non-elementary permutations; each can be realized by a pair of successive row interchanges.
In general, any rearrangement of a finite ordered collection of objects is called a permutation. Thus, the 6 permutation matrices (1.30) produce the 6 possible permutations (1.31) of the rows of a 3 × 3 matrix. In general, if a permutation π rearranges the integers (1, . . . , n) to form (π(1), . . . , π(n)), then the corresponding permutation matrix P = Pπ, which places row rπ(i) of A in the i-th row of Pπ A, has 1's in positions (i, π(i)) for i = 1, . . . , n and zeros everywhere else. For example, the second permutation matrix in (1.30) corresponds to the permutation with π(1) = 2, π(2) = 3, π(3) = 1. Keep in mind that π(1), . . . , π(n) is merely a rearrangement of the integers 1, . . . , n, so that 1 ≤ π(i) ≤ n and π(i) ≠ π(j) when i ≠ j.
An elementary combinatorial argument proves that there is a total of
n! = n (n − 1) (n − 2) · · · 3 · 2 · 1 (1.32)
different permutations of (1, . . . , n), and hence the same number of permutation matrices
of size n × n. Moreover, the product P = P1 P2 of any two permutation matrices is also a
permutation matrix, and corresponds to the composition of the two permutations, meaning
one permutes according to P2 and then permutes the result according to P1 . An important
point is that multiplication of permutation matrices is noncommutative — the order in
which one permutes makes a difference. Switching the first and second rows, and then
switching the second and third rows, does not have the same effect as first switching the
second and third rows and then switching the first and second rows!
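To make the correspondence between permutations and permutation matrices concrete, here is a minimal NumPy sketch; the function name permutation_matrix and the use of 0-based indices are our own choices, not notation from the text. It reproduces the six matrices in (1.30) and confirms the noncommutativity just described.

```python
import numpy as np
from itertools import permutations

def permutation_matrix(pi):
    """Permutation matrix with 1's in positions (i, pi[i]) (0-based indices),
    so that row i of P @ A equals row pi[i] of A."""
    n = len(pi)
    P = np.zeros((n, n), dtype=int)
    for i, j in enumerate(pi):
        P[i, j] = 1
    return P

# The 3! = 6 permutation matrices of size 3 x 3, as listed in (1.30).
for pi in permutations(range(3)):
    print(pi)
    print(permutation_matrix(pi))

# Noncommutativity: swapping rows 1 and 2 and then rows 2 and 3 is not the
# same as performing the two interchanges in the opposite order.
P12 = permutation_matrix((1, 0, 2))   # interchanges the first and second rows
P23 = permutation_matrix((0, 2, 1))   # interchanges the second and third rows
print(np.array_equal(P23 @ P12, P12 @ P23))   # prints False
```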
Exercises
1.4.10. Write down the elementary 4 × 4 permutation matrix (a) P1 that permutes the second
and fourth rows, and (b) P2 that permutes the first and fourth rows. (c) Do P1 and P2
commute? (d) Explain what the matrix products P1 P2 and P2 P1 do to a 4 × 4 matrix.
1.4.11. Write down the permutation matrix P such that
$$
\text{(a)}\ P\begin{pmatrix} u \\ v \\ w \end{pmatrix} = \begin{pmatrix} v \\ w \\ u \end{pmatrix},\qquad
\text{(b)}\ P\begin{pmatrix} a \\ b \\ c \\ d \end{pmatrix} = \begin{pmatrix} d \\ c \\ a \\ b \end{pmatrix},\qquad
\text{(c)}\ P\begin{pmatrix} a \\ b \\ c \\ d \end{pmatrix} = \begin{pmatrix} b \\ a \\ d \\ c \end{pmatrix},\qquad
\text{(d)}\ P\begin{pmatrix} x_1 \\ x_2 \\ x_3 \\ x_4 \\ x_5 \end{pmatrix} = \begin{pmatrix} x_4 \\ x_1 \\ x_3 \\ x_2 \\ x_5 \end{pmatrix}.
$$
1.4.12. Construct a multiplication table that shows all possible products of the 3 × 3
permutation matrices (1.30). List all pairs that commute.
1.4.13. Write down all 4 × 4 permutation matrices that (a) fix the third row of a 4 × 4 matrix
A; (b) take the third row to the fourth row; (c) interchange the second and third rows.
1.4.14. True or false: (a) Every elementary permutation matrix satisfies P² = I. (b) Every permutation matrix satisfies P² = I. (c) A matrix that satisfies P² = I is necessarily a permutation matrix.
1.4.15. (a) Let P and Q be n × n permutation matrices and v ∈ ℝⁿ a vector. Under what
conditions does the equation P v = Q v imply that P = Q? (b) Answer the same question
when P A = Q A, where A is an n × k matrix.
1.4.16. Let P be the 3 × 3 permutation matrix such that the product P A permutes the first
and third rows of the 3 × 3 matrix A. (a) Write down P . (b) True or false: The product
A P is obtained by permuting the first and third columns of A.
(c) Does the same conclusion hold for every permutation matrix: is the effect of P A on the
rows of a square matrix A the same as the effect of A P on the columns of A?
♥ 1.4.17. A common notation for a permutation π of the integers {1, . . . , m} is as a 2 × m matrix
$$
\begin{pmatrix} 1 & 2 & 3 & \cdots & m \\ \pi(1) & \pi(2) & \pi(3) & \cdots & \pi(m) \end{pmatrix},
$$
indicating that π takes i to π(i). (a) Show that such a permutation corresponds to the permutation matrix with 1's in positions (π(j), j) for j = 1, . . . , m. (b) Write down the permutation matrices corresponding to the following permutations:
$$
(i)\ \begin{pmatrix} 1 & 2 & 3 \\ 2 & 1 & 3 \end{pmatrix},\quad
(ii)\ \begin{pmatrix} 1 & 2 & 3 & 4 \\ 4 & 2 & 3 & 1 \end{pmatrix},\quad
(iii)\ \begin{pmatrix} 1 & 2 & 3 & 4 \\ 1 & 4 & 2 & 3 \end{pmatrix},\quad
(iv)\ \begin{pmatrix} 1 & 2 & 3 & 4 & 5 \\ 5 & 4 & 3 & 2 & 1 \end{pmatrix}.
$$
Which are elementary matrices? (c) Write down, using the preceding notation, the permutations corresponding to the following permutation matrices:
$$
(i)\ \begin{pmatrix} 0 & 0 & 1 \\ 1 & 0 & 0 \\ 0 & 1 & 0 \end{pmatrix},\quad
(ii)\ \begin{pmatrix} 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \\ 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \end{pmatrix},\quad
(iii)\ \begin{pmatrix} 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \\ 1 & 0 & 0 & 0 \end{pmatrix},\quad
(iv)\ \begin{pmatrix} 0 & 0 & 0 & 1 & 0 \\ 1 & 0 & 0 & 0 & 0 \\ 0 & 0 & 1 & 0 & 0 \\ 0 & 0 & 0 & 0 & 1 \\ 0 & 1 & 0 & 0 & 0 \end{pmatrix}.
$$
♦ 1.4.18. Justify the statement that there are n ! different n × n permutation matrices.
1.4.19. Consider the following combination of elementary row operations of type #1: (i) Add row i to row j. (ii) Subtract row j from row i. (iii) Add row i to row j again. Prove that the net effect is to interchange the two rows, up to a sign: the new row i equals −1 times the old row j, while the new row j equals the old row i. Thus, we can almost produce an elementary row operation of type #2 by a combination of elementary row operations of type #1. Lest you be tempted to try, Exercise 1.9.16 proves that one cannot produce a bona fide row interchange by a combination of elementary row operations of type #1.
1.4.20. What is the effect on a linear system of permuting the columns of its coefficient matrix?
The Permuted L U Factorization
As we now know, every nonsingular matrix A can be reduced to upper triangular form
by elementary row operations of types #1 and #2. The row interchanges merely reorder
the equations. If one performs all of the required row interchanges in advance, then the
elimination algorithm can proceed without requiring any further pivoting. Thus, the matrix
obtained by permuting the rows of A in the prescribed manner is regular. In other words,
if A is a nonsingular matrix, then there is a permutation matrix P such that the product
P A is regular, and hence admits an L U factorization. As a result, we deduce the general
permuted L U factorization
P A = L U, (1.33)
where P is a permutation matrix, L is lower unitriangular, and U is upper triangular with
the pivots on the diagonal. For instance, in the preceding example, we permuted the first
and second rows, and hence equation (1.33) has the explicit form
$$
\begin{pmatrix} 0 & 1 & 0 \\ 1 & 0 & 0 \\ 0 & 0 & 1 \end{pmatrix}
\begin{pmatrix} 0 & 2 & 1 \\ 2 & 6 & 1 \\ 1 & 1 & 4 \end{pmatrix}
=
\begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ \tfrac{1}{2} & -1 & 1 \end{pmatrix}
\begin{pmatrix} 2 & 6 & 1 \\ 0 & 2 & 1 \\ 0 & 0 & \tfrac{9}{2} \end{pmatrix}.
$$
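As a quick numerical check (a throwaway NumPy snippet of ours, not part of the text), multiplying out the four matrices confirms that P A = L U:

```python
import numpy as np

P = np.array([[0, 1, 0], [1, 0, 0], [0, 0, 1]])
A = np.array([[0., 2., 1.], [2., 6., 1.], [1., 1., 4.]])
L = np.array([[1., 0., 0.], [0., 1., 0.], [0.5, -1., 1.]])
U = np.array([[2., 6., 1.], [0., 2., 1.], [0., 0., 4.5]])

print(np.allclose(P @ A, L @ U))   # True: P A = L U
```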
We have now established the following generalization of Theorem 1.3.
Theorem 1.11. Let A be an n × n matrix. Then the following conditions are equivalent:
(i ) A is nonsingular.
(ii ) A has n nonzero pivots.
(iii ) A admits a permuted L U factorization: P A = L U .
A practical method to construct a permuted L U factorization of a given matrix A would
proceed as follows. First set up P = L = I as n × n identity matrices. The matrix P
will keep track of the permutations performed during the Gaussian Elimination process,
while the entries of L below the diagonal are gradually replaced by the negatives of the
multiples used in the corresponding row operations of type #1. Each time two rows of A are
interchanged, the same two rows of P will be interchanged. Moreover, any pair of entries
that both lie below the diagonal in these same two rows of L must also be interchanged,
while entries lying on or above the diagonal stay in place. At a successful conclusion to the procedure, A will have been converted into the upper triangular matrix U, while L and P will assume their final form. A short computational sketch and an illustrative example follow.
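The bookkeeping just described can be mirrored directly in code. The following NumPy sketch is ours, not from the text: the name permuted_lu is illustrative, and the exact test for a zero pivot follows the procedure above (a practical solver would typically pivot on the largest available entry for numerical stability).

```python
import numpy as np

def permuted_lu(A):
    """Permuted LU factorization P A = L U, following the procedure above:
    start with P = L = I, store the type #1 multipliers below the diagonal
    of L, and on a zero pivot interchange the rows of A and P together with
    the below-diagonal entries of the same rows of L."""
    A = np.array(A, dtype=float)
    n = A.shape[0]
    L = np.eye(n)
    P = np.eye(n)
    for k in range(n):
        if A[k, k] == 0:
            # Find a lower row with a nonzero entry in column k and swap.
            nz = np.nonzero(A[k + 1:, k])[0]
            if nz.size == 0:
                raise ValueError("matrix is singular")
            m = k + 1 + nz[0]
            A[[k, m]] = A[[m, k]]
            P[[k, m]] = P[[m, k]]
            L[[k, m], :k] = L[[m, k], :k]   # only below-diagonal entries move
        for i in range(k + 1, n):
            # Type #1 operation: subtract a multiple of row k from row i.
            L[i, k] = A[i, k] / A[k, k]
            A[i, k:] -= L[i, k] * A[k, k:]
    return P, L, A   # the reduced A is the upper triangular factor U
```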
Example 1.12. Our goal is to produce a permuted L U factorization of the matrix
$$
A = \begin{pmatrix} 1 & 2 & -1 & 0 \\ 2 & 4 & -2 & -1 \\ -3 & -5 & 6 & 1 \\ -1 & 2 & 8 & -2 \end{pmatrix}.
$$
To begin the procedure, we apply row operations of type #1 to eliminate the entries below
the first pivot. The updated matrices† are
$$
A = \begin{pmatrix} 1 & 2 & -1 & 0 \\ 0 & 0 & 0 & -1 \\ 0 & 1 & 3 & 1 \\ 0 & 4 & 7 & -2 \end{pmatrix},\qquad
L = \begin{pmatrix} 1 & 0 & 0 & 0 \\ 2 & 1 & 0 & 0 \\ -3 & 0 & 1 & 0 \\ -1 & 0 & 0 & 1 \end{pmatrix},\qquad
P = \begin{pmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{pmatrix},
$$
where L keeps track of the row operations, and we initialize P to be the identity matrix.
The (2, 2) entry of the new A is zero, and so we interchange its second and third rows,
leading to
$$
A = \begin{pmatrix} 1 & 2 & -1 & 0 \\ 0 & 1 & 3 & 1 \\ 0 & 0 & 0 & -1 \\ 0 & 4 & 7 & -2 \end{pmatrix},\qquad
L = \begin{pmatrix} 1 & 0 & 0 & 0 \\ -3 & 1 & 0 & 0 \\ 2 & 0 & 1 & 0 \\ -1 & 0 & 0 & 1 \end{pmatrix},\qquad
P = \begin{pmatrix} 1 & 0 & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 0 & 1 \end{pmatrix}.
$$
† Here, we are adopting computer programming conventions, where updates of a matrix are all given the same name.
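Running the permuted_lu sketch from above on this matrix carries the procedure through to completion; the output recorded in the comments is our own computation, not a quotation from the text.

```python
import numpy as np

A = np.array([[1, 2, -1, 0],
              [2, 4, -2, -1],
              [-3, -5, 6, 1],
              [-1, 2, 8, -2]], dtype=float)
P, L, U = permuted_lu(A)     # assumes the sketch above is in scope
print(np.allclose(P @ A, L @ U))   # True
# U = [[1, 2, -1, 0],
#      [0, 1, 3, 1],
#      [0, 0, -5, -6],
#      [0, 0, 0, -1]]
# P interchanges rows 2 and 3, and then rows 3 and 4, of the identity.
```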