Section 4.5 Markov chains and Google's PageRank algorithm

Activity 4.5.0.1.

Suppose that a rental car company rents from two locations \(P\) and \(Q\text{.}\) We find that 80% of the cars rented from location \(P\) are returned to \(P\) while the other 20% are returned to \(Q\text{.}\) For cars rented from location \(Q\text{,}\) 60% are returned to \(Q\) and 40% to \(P\text{.}\)

We will use \(P_k\) and \(Q_k\) to denote the number of cars at the two locations on day \(k\text{.}\) The following day, the number of cars at \(P\) equals 80% of \(P_k\) plus 40% of \(Q_k\text{.}\) This shows that
\begin{equation*}
P_{k+1} = 0.8P_k + 0.4Q_k, \qquad Q_{k+1} = 0.2P_k + 0.6Q_k\text{.}
\end{equation*}

(a)

If we use the vector \(\xvec_k =
\twovec{P_k}{Q_k}\) to represent the distribution of cars on day \(k\text{,}\) find a matrix \(A\) such that \(\xvec_{k+1} = A\xvec_k\text{.}\)

(b)

Find the eigenvalues and associated eigenvectors of \(A\text{.}\)

(c)

Suppose that there are initially 1500 cars, all of which are at location \(P\text{.}\) Write the vector \(\xvec_0\) as a linear combination of eigenvectors of \(A\text{.}\)

(d)

Write the vectors \(\xvec_k\) as a linear combination of eigenvectors of \(A\text{.}\)

(e)

What happens to the distribution of cars after a long time?
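The long-run behavior asked about in this activity can be checked numerically. The sketch below uses NumPy rather than the book's Sage cells; the entries of \(A\) come directly from the rental percentages stated in the activity.

```python
import numpy as np

# Columns record where cars from P and Q end up the next day:
# 80% of P's cars return to P, 20% go to Q;
# 40% of Q's cars come to P, 60% stay at Q.
A = np.array([[0.8, 0.4],
              [0.2, 0.6]])

# Eigenvalues of A (1.0 and 0.4 for this matrix).
vals, vecs = np.linalg.eig(A)

# Initially all 1500 cars are at location P.
x = np.array([1500.0, 0.0])

# Iterate x_{k+1} = A x_k for many days.
for _ in range(50):
    x = A @ x

print(x)  # the distribution settles toward [1000, 500]
```

Because the second eigenvalue is \(0.4\text{,}\) the component of \(\xvec_0\) in that eigenvector's direction shrinks by a factor of \(0.4\) each day, which is why the iteration converges so quickly.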

Definition 4.5.0.1.

A vector whose entries are nonnegative and add to 1 is called a probability vector. A square matrix whose columns are probability vectors is called a stochastic matrix.
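These two conditions are easy to test by machine. The helper functions below are illustrative (they are not part of the book's Sage cells); they simply encode the definition with a small numerical tolerance.

```python
import numpy as np

def is_probability_vector(v, tol=1e-12):
    """Entries are nonnegative and sum to 1."""
    v = np.asarray(v, dtype=float)
    return bool(np.all(v >= -tol)) and abs(v.sum() - 1.0) < tol

def is_stochastic_matrix(A, tol=1e-12):
    """Square matrix each of whose columns is a probability vector."""
    A = np.asarray(A, dtype=float)
    if A.shape[0] != A.shape[1]:
        return False
    return all(is_probability_vector(A[:, j], tol) for j in range(A.shape[1]))

# The rental-car matrix from the first activity is stochastic:
A = np.array([[0.8, 0.4],
              [0.2, 0.6]])
print(is_stochastic_matrix(A))  # True
```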

Activity 4.5.0.2.

Suppose you live in a country with three political parties \(P\text{,}\) \(Q\text{,}\) and \(R\text{.}\) We use \(P_k\text{,}\) \(Q_k\text{,}\) and \(R_k\) to denote the percentage of voters voting for that party in election \(k\text{.}\)

Voters will change parties from one election to the next as shown in the figure. We see that 60% of voters stay with the same party. However, 40% of those who vote for party \(P\) will vote for party \(Q\) in the next election.

(a)

Write expressions for \(P_{k+1}\text{,}\) \(Q_{k+1}\text{,}\) and \(R_{k+1}\) in terms of \(P_k\text{,}\) \(Q_k\text{,}\) and \(R_k\text{.}\)

(b)

If we write \(\xvec_k =
\threevec{P_k}{Q_k}{R_k}\text{,}\) find the matrix \(A\) such that \(\xvec_{k+1} = A\xvec_k\text{.}\)

(c)

Explain why \(A\) is a stochastic matrix.

(d)

Suppose that initially 40% of citizens vote for party \(P\text{,}\) 30% vote for party \(Q\text{,}\) and 30% vote for party \(R\text{.}\) Form the vector \(\xvec_0\) and explain why \(\xvec_0\) is a probability vector.

(e)

Find \(\xvec_1\text{,}\) the percentages who vote for the three parties in the next election. Verify that \(\xvec_1\) is also a probability vector and explain why \(\xvec_k\) will be a probability vector for every \(k\text{.}\)
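The general reason behind the last part is worth recording: if \(A\) is a stochastic matrix and \(\xvec\) is a probability vector, then the entries of \(A\xvec\) are nonnegative, and
\begin{equation*}
\sum_i (A\xvec)_i = \sum_i \sum_j A_{ij} x_j = \sum_j x_j \sum_i A_{ij} = \sum_j x_j = 1
\end{equation*}
since each column of \(A\) sums to 1. So \(A\xvec\) is again a probability vector, and by induction every term of the Markov chain is one.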

Activity 4.5.0.3.

Use the matrix \(A\) you found in the previous activity.

(a)

Find the eigenvalues of the matrix \(A\) and explain why the eigenspace \(E_1\) is a one-dimensional subspace of \(\real^3\text{.}\) Then verify that \(\vvec=\threevec{1}{2}{2}\) is a basis vector for \(E_1\text{.}\)

(b)

As every vector in \(E_1\) is a scalar multiple of \(\vvec\text{,}\) find a probability vector in \(E_1\) and explain why it is the only probability vector in \(E_1\text{.}\)

(c)

Describe what happens to \(\xvec_k\) after a very long time.
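Part (b) can be carried out in one line: since every vector in \(E_1\) is a scalar multiple of \(\vvec\text{,}\) the only probability vector in \(E_1\) is \(\vvec\) rescaled so that its entries sum to 1. A quick NumPy check (outside the book's Sage cells):

```python
import numpy as np

# Basis vector for the eigenspace E_1 found in the activity.
v = np.array([1.0, 2.0, 2.0])

# The unique probability vector in E_1: scale v so its entries sum to 1.
# Any other multiple cv has entries summing to 5c, which is 1 only for c = 1/5.
q = v / v.sum()
print(q)  # [0.2 0.4 0.4]
```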

Definition 4.5.0.2.

If \(A\) is a stochastic matrix, we say that a probability vector \(\qvec\) is a steady-state or stationary vector if \(A\qvec = \qvec\text{.}\)

We will form the Markov chain beginning with the vector \(\xvec_0 = \twovec{1}{0}\) and defining \(\xvec_{k+1} = A\xvec_k\text{.}\) The Sage cell below constructs the first \(N\) terms of the Markov chain with the command markov_chain(A, x0, N). Define the matrix \(A\) and vector \(x0\) and evaluate the cell to find the first 10 terms of the Markov chain.
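The `markov_chain(A, x0, N)` command is supplied by the book's Sage cell. Assuming it simply iterates \(\xvec_{k+1} = A\xvec_k\) and collects the terms, a plain-Python sketch of the same idea looks like this (the \(2\times2\) matrix below is the rental-car matrix from the first activity, used as an illustrative stand-in for whatever \(A\) the activity defines):

```python
import numpy as np

def markov_chain(A, x0, N):
    """Return the first N terms x_1, ..., x_N of the Markov chain
    defined by x_{k+1} = A x_k, starting from x_0."""
    A = np.asarray(A, dtype=float)
    x = np.asarray(x0, dtype=float)
    terms = []
    for _ in range(N):
        x = A @ x
        terms.append(x.copy())
    return terms

A = np.array([[0.8, 0.4],
              [0.2, 0.6]])
x0 = np.array([1.0, 0.0])
for x in markov_chain(A, x0, 10):
    print(x)  # terms approach the steady-state vector [2/3, 1/3]
```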

(e)

What do you notice about the Markov chain? Does it converge to the steady-state vector for \(A\text{?}\)

(a)

Find the eigenvalues of \(B\) along with a steady-state vector for \(B\text{.}\)

(b)

As before, find the first 10 terms in the Markov chain beginning with \(\xvec_0 = \twovec{1}{0}\) and \(\xvec_{k+1} = B\xvec_k\text{.}\)

(c)

What do you notice about the Markov chain? Does it converge to the steady-state vector for \(B\text{?}\)

Activity 4.5.0.6.

What condition on the eigenvalues of a stochastic matrix will guarantee that a Markov chain will converge to a steady-state vector?

Definition 4.5.0.4.

We say that a matrix \(A\) is positive if either \(A\) or some power \(A^k\) has all positive entries.

Theorem 4.5.0.5. Perron-Frobenius.

If \(A\) is a positive stochastic matrix, then the eigenvalues satisfy \(\lambda_1=1\) and \(|\lambda_j| \lt
1\) for \(j\gt 1\text{.}\) This means that \(A\) has a unique positive, steady-state vector \(\qvec\) and that every Markov chain defined by \(A\) will converge to \(\qvec\text{.}\)
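The theorem can be observed numerically. The rental-car matrix from the first activity has all positive entries, so it is a positive stochastic matrix and Perron-Frobenius applies; the NumPy sketch below (not one of the book's Sage cells) checks both conclusions.

```python
import numpy as np

# A positive stochastic matrix: the rental-car matrix from the first activity.
A = np.array([[0.8, 0.4],
              [0.2, 0.6]])

# Eigenvalues: lambda_1 = 1 and |lambda_2| = 0.4 < 1, as the theorem asserts.
vals, vecs = np.linalg.eig(A)

# The steady-state vector: the eigenvector for lambda = 1,
# scaled so its entries sum to 1.
i = int(np.argmin(np.abs(vals - 1.0)))
q = np.real(vecs[:, i])
q = q / q.sum()
print(q)  # approximately [2/3, 1/3]

# Every Markov chain defined by A converges to q,
# regardless of the initial probability vector.
x = np.array([0.5, 0.5])
for _ in range(60):
    x = A @ x
print(np.allclose(x, q))  # True
```

Changing the initial vector `x` to any other probability vector leaves the limit unchanged, which is exactly the uniqueness claim in the theorem.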

Activity 4.5.0.7.

We will explore the meaning of the Perron-Frobenius theorem in this activity.

(a)

Show that \(C\) is a positive matrix by checking powers \(C^k\text{.}\)

(b)

Find the eigenvectors of \(C\) and verify there is a unique steady-state vector.

(c)

Using the Sage cell below, construct the Markov chain with initial vector \(\xvec_0= \twovec{1}{0}\) and describe what happens to \(\xvec_k\) as \(k\) becomes large.

(d)

Construct another Markov chain with initial vector \(\xvec_0=\twovec{0.2}{0.8}\) and describe what happens to \(\xvec_k\) as \(k\) becomes large.