# TBIL Activities for Understanding Linear Algebra

## Section 4.5 Markov chains and Google's PageRank algorithm

### Activity 4.5.0.1.

Suppose that a rental car company rents from two locations $$P$$ and $$Q\text{.}$$ We find that 80% of the cars rented from location $$P$$ are returned to $$P$$ while the other 20% are returned to $$Q\text{.}$$ For cars rented from location $$Q\text{,}$$ 60% are returned to $$Q$$ and 40% to $$P\text{.}$$
We will use $$P_k$$ and $$Q_k$$ to denote the number of cars at the two locations on day $$k\text{.}$$ The following day, the number of cars at $$P$$ equals 80% of $$P_k$$ plus 40% of $$Q_k\text{.}$$ This shows that
\begin{equation*} \begin{aligned} P_{k+1} & = 0.8 P_k + 0.4 Q_k \\ Q_{k+1} & = 0.2 P_k + 0.6 Q_k\text{.} \end{aligned} \end{equation*}
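For concreteness, the recurrence above can be iterated numerically. The sketch below is plain Python with NumPy (not one of this section's Sage cells) and starts from the 1500-car distribution used later in this activity:

```python
import numpy as np

# Transition matrix for the rental-car system: the first column says
# where cars at P end up, the second where cars at Q end up.
A = np.array([[0.8, 0.4],
              [0.2, 0.6]])

x = np.array([1500.0, 0.0])  # day 0: all 1500 cars at location P

for day in range(20):
    x = A @ x  # x_{k+1} = A x_k

print(np.round(x, 2))  # distribution of cars after 20 days
```

Note that the total number of cars is preserved from one day to the next, since each column of the matrix sums to 1.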

#### (a)

If we use the vector $$\xvec_k = \twovec{P_k}{Q_k}$$ to represent the distribution of cars on day $$k\text{,}$$ find a matrix $$A$$ such that $$\xvec_{k+1} = A\xvec_k\text{.}$$

#### (b)

Find the eigenvalues and associated eigenvectors of $$A\text{.}$$

#### (c)

Suppose that there are initially 1500 cars, all of which are at location $$P\text{.}$$ Write the vector $$\xvec_0$$ as a linear combination of eigenvectors of $$A\text{.}$$

#### (d)

Write the vector $$\xvec_k$$ as a linear combination of eigenvectors of $$A\text{.}$$

#### (e)

What happens to the distribution of cars after a long time?

### Definition 4.5.0.1.

A vector whose entries are nonnegative and add to 1 is called a probability vector. A square matrix whose columns are probability vectors is called a stochastic matrix.
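These two definitions are easy to check numerically. The helper names `is_probability_vector` and `is_stochastic_matrix` below are illustrative choices, not from the text:

```python
import numpy as np

def is_probability_vector(v, tol=1e-12):
    """Entries are nonnegative and add to 1."""
    v = np.asarray(v, dtype=float)
    return bool(np.all(v >= -tol)) and abs(v.sum() - 1.0) <= tol

def is_stochastic_matrix(A, tol=1e-12):
    """Every column is a probability vector."""
    A = np.asarray(A, dtype=float)
    return all(is_probability_vector(A[:, j], tol) for j in range(A.shape[1]))

# The rental-car matrix from the previous activity is stochastic:
A = np.array([[0.8, 0.4],
              [0.2, 0.6]])
print(is_stochastic_matrix(A))  # True
```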

### Activity 4.5.0.2.

Suppose you live in a country with three political parties $$P\text{,}$$ $$Q\text{,}$$ and $$R\text{.}$$ We use $$P_k\text{,}$$ $$Q_k\text{,}$$ and $$R_k$$ to denote the percentage of voters voting for that party in election $$k\text{.}$$
Voters will change parties from one election to the next as shown in the figure. We see that 60% of voters stay with the same party. However, 40% of those who vote for party $$P$$ will vote for party $$Q$$ in the next election.

#### (a)

Write expressions for $$P_{k+1}\text{,}$$ $$Q_{k+1}\text{,}$$ and $$R_{k+1}$$ in terms of $$P_k\text{,}$$ $$Q_k\text{,}$$ and $$R_k\text{.}$$

#### (b)

If we write $$\xvec_k = \threevec{P_k}{Q_k}{R_k}\text{,}$$ find the matrix $$A$$ such that $$\xvec_{k+1} = A\xvec_k\text{.}$$

#### (c)

Explain why $$A$$ is a stochastic matrix.

#### (d)

Suppose that initially 40% of citizens vote for party $$P\text{,}$$ 30% vote for party $$Q\text{,}$$ and 30% vote for party $$R\text{.}$$ Form the vector $$\xvec_0$$ and explain why $$\xvec_0$$ is a probability vector.

#### (e)

Find $$\xvec_1\text{,}$$ the vector of percentages who vote for the three parties in the next election. Verify that $$\xvec_1$$ is also a probability vector and explain why $$\xvec_k$$ will be a probability vector for every $$k\text{.}$$



### Activity 4.5.0.3.

Use the matrix $$A$$ you found in the previous activity.

#### (a)

Find the eigenvalues of the matrix $$A$$ and explain why the eigenspace $$E_1$$ is a one-dimensional subspace of $$\real^3\text{.}$$ Then verify that $$\vvec=\threevec{1}{2}{2}$$ is a basis vector for $$E_1\text{.}$$

#### (b)

As every vector in $$E_1$$ is a scalar multiple of $$\vvec\text{,}$$ find a probability vector in $$E_1$$ and explain why it is the only probability vector in $$E_1\text{.}$$

#### (c)

Describe what happens to $$\xvec_k$$ after a very long time.

### Definition 4.5.0.2.

If $$A$$ is a stochastic matrix, we say that a probability vector $$\qvec$$ is a steady-state or stationary vector if $$A\qvec = \qvec\text{.}$$
If $$A$$ is a stochastic matrix and $$\xvec_k$$ is a Markov chain defined by $$\xvec_{k+1} = A\xvec_k\text{,}$$ does $$\xvec_k$$ converge to a steady-state vector?
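One way to look for a steady-state vector numerically is to extract an eigenvector associated with the eigenvalue $$1$$ and rescale it into a probability vector. The sketch below is one possible implementation, not a prescribed method, illustrated with the rental-car matrix from the first activity:

```python
import numpy as np

def steady_state(A):
    """Return a probability vector q with A q = q, found by rescaling
    an eigenvector associated with the eigenvalue closest to 1."""
    w, V = np.linalg.eig(np.asarray(A, dtype=float))
    j = np.argmin(np.abs(w - 1.0))   # index of the eigenvalue nearest 1
    q = np.real(V[:, j])
    return q / q.sum()               # rescale so the entries add to 1

# Rental-car matrix from the first activity:
A = np.array([[0.8, 0.4],
              [0.2, 0.6]])
print(steady_state(A))  # approximately [2/3, 1/3]
```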

### Activity 4.5.0.4.

Consider the matrix
\begin{equation*} A=\left[\begin{array}{rr} 0 & 1 \\ 1 & 0 \end{array}\right]\text{.} \end{equation*}

#### (a)

Verify that $$A$$ is a stochastic matrix.

#### (b)

Find the eigenvalues of $$A\text{.}$$

#### (c)

Find a steady-state vector for $$A\text{.}$$

#### (d)

We will form the Markov chain beginning with the vector $$\xvec_0 = \twovec{1}{0}$$ and defining $$\xvec_{k+1} = A\xvec_k\text{.}$$ The Sage cell below constructs the first $$N$$ terms of the Markov chain with the command `markov_chain(A, x0, N)`. Define the matrix $$A$$ and vector $$x0$$ and evaluate the cell to find the first 10 terms of the Markov chain.

```
def markov_chain(A, x0, N):
    for i in range(N):
        x0 = A*x0
        print(x0)

## define the matrix A and x0
A = matrix(2, 2, [0, 1, 1, 0])
x0 = vector([1, 0])
markov_chain(A, x0, 10)
```


#### (e)

What do you notice about the Markov chain? Does it converge to the steady-state vector for $$A\text{?}$$

### Activity 4.5.0.5.

Consider the matrix
\begin{equation*} B=\left[\begin{array}{rr} 0.4 & 0.3 \\ 0.6 & 0.7 \end{array}\right]\text{.} \end{equation*}

#### (a)

Find the eigenvalues of $$B$$ along with a steady-state vector for $$B\text{.}$$

#### (b)

As before, find the first 10 terms in the Markov chain beginning with $$\xvec_0 = \twovec{1}{0}$$ and $$\xvec_{k+1} = B\xvec_k\text{.}$$
```
def markov_chain(A, x0, N):
    for i in range(N):
        x0 = A*x0
        print(x0)

## define the matrix B and x0
B = matrix(2, 2, [0.4, 0.3, 0.6, 0.7])
x0 = vector([1, 0])
markov_chain(B, x0, 10)
```


#### (c)

What do you notice about the Markov chain? Does it converge to the steady-state vector for $$B\text{?}$$

### Activity 4.5.0.6.

What condition on the eigenvalues of a stochastic matrix will guarantee that a Markov chain will converge to a steady-state vector?
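As one piece of evidence for this question, it may help to compare the eigenvalues of the two matrices from the previous activities, one of whose chains oscillated while the other converged:

```python
import numpy as np

# The two stochastic matrices from the previous activities:
A = np.array([[0.0, 1.0],
              [1.0, 0.0]])   # Activity 4.5.0.4: the chain did not settle down
B = np.array([[0.4, 0.3],
              [0.6, 0.7]])   # Activity 4.5.0.5: the chain converged

print(np.linalg.eigvals(A))  # eigenvalues 1 and -1
print(np.linalg.eigvals(B))  # eigenvalues 1 and 0.1
```

Notice how the absolute values of the eigenvalues other than $$1$$ differ between the two matrices.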

### Definition 4.5.0.4.

We say that a matrix $$A$$ is positive if either $$A$$ or some power $$A^k$$ has all positive entries.
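This definition can be checked by brute force. The helper below is a hypothetical sketch (the name `is_positive` and the cutoff `max_power` are choices made here, not from the text), illustrated with the matrix $$C$$ considered in the next activity:

```python
import numpy as np

def is_positive(A, max_power=50):
    """Check whether A or some power A^k (k up to max_power)
    has all positive entries, per the definition above."""
    A = np.asarray(A, dtype=float)
    P = A.copy()
    for k in range(max_power):
        if np.all(P > 0):
            return True
        P = P @ A
    return False

# C itself has a zero entry, but C^2 has all positive entries,
# so C is a positive matrix under this definition.
C = np.array([[0.0, 0.5],
              [1.0, 0.5]])
print(is_positive(C))  # True
```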

### Activity 4.5.0.7.

We will explore the meaning of the Perron-Frobenius theorem in this activity.
Consider the matrix $$C = \left[\begin{array}{rr} 0 & 0.5 \\ 1 & 0.5 \end{array}\right]\text{.}$$

#### (a)

Show $$C$$ is a positive matrix by checking powers $$C^k\text{.}$$

#### (b)

Find the eigenvectors of $$C$$ and verify there is a unique steady-state vector.

#### (c)

Using the Sage cell below, construct the Markov chain with initial vector $$\xvec_0= \twovec{1}{0}$$ and describe what happens to $$\xvec_k$$ as $$k$$ becomes large.
```
def markov_chain(A, x0, N):
    for i in range(N):
        x0 = A*x0
        print(x0)

## define the matrix C and x0
C = matrix(2, 2, [0, 0.5, 1, 0.5])
x0 = vector([1, 0])
markov_chain(C, x0, 10)
```


#### (d)

Construct another Markov chain with initial vector $$\xvec_0=\twovec{0.2}{0.8}$$ and describe what happens to $$\xvec_k$$ as $$k$$ becomes large.

### Activity 4.5.0.8.

Consider the matrix $$D = \left[\begin{array}{rrr} 0 & 0.5 & 0 \\ 1 & 0.5 & 0 \\ 0 & 0 & 1 \end{array}\right]\text{.}$$

#### (a)

Compute several powers of $$D$$ using Sage, and determine whether $$D$$ is a positive matrix.
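If Sage is not at hand, one possible sketch of this computation in plain Python with NumPy is the following; pay attention to which entries of the powers are zero:

```python
import numpy as np

D = np.array([[0.0, 0.5, 0.0],
              [1.0, 0.5, 0.0],
              [0.0, 0.0, 1.0]])

# Print a few powers of D to inspect their entries.
for k in [2, 5, 10]:
    Dk = np.linalg.matrix_power(D, k)
    print(f"D^{k}:")
    print(np.round(Dk, 4))
```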



#### (b)

Find the eigenvalues of $$D$$ and then find the steady-state vectors. Is there a unique steady-state vector?

#### (c)

What happens to the Markov chain defined by $$D$$ with initial vector $$\xvec_0 =\threevec{1}{0}{0}\text{?}$$

#### (d)

What happens to the Markov chain with initial vector $$\xvec_0=\threevec{0}{0}{1}\text{?}$$

### Activity 4.5.0.9.

Explain how the matrices $$C$$ and $$D\text{,}$$ which we have considered in previous activities, relate to the Perron-Frobenius theorem.
1. The P-F Theorem tells us $$C$$ and $$D$$ both converge to a unique steady-state vector.
2. The P-F Theorem tells us $$C$$ converges to a unique steady-state vector, but $$D$$ cannot converge to a unique steady-state vector.
3. The P-F Theorem tells us $$C$$ converges to a unique steady-state vector, but it tells us nothing about $$D\text{.}$$
4. The P-F Theorem tells us nothing about $$C$$ and $$D\text{.}$$ Neither needs to converge to a unique steady-state vector.
Do [Activity 4.5.5](https://davidaustinm.github.io/ula/sec-stochastic.html#activity-58) and [Activity 4.5.6](https://davidaustinm.github.io/ula/sec-stochastic.html#activity-59) on Google's PageRank algorithm directly in *Understanding Linear Algebra*.