7.13. Linear Algebra Review#

7.13.1. 🎯 Learning Objectives#

By the end of this notebook, you will be able to:

  1. Create and manipulate vectors and matrices β€” Use NumPy arrays for efficient numerical computing

  2. Perform matrix operations β€” Execute element-wise, scalar, and matrix multiplication with the @ operator

  3. Apply transpose and inverse β€” Understand when and how to use these fundamental operations

  4. Connect linear algebra to finance β€” Express portfolio returns using dot products and matrix multiplication

7.13.2. πŸ“‹ Table of Contents#

  1. Setup

  2. Vectors

  3. Matrices

  4. Matrix Multiplication

  5. Transpose

  6. Identity Matrix

  7. Inverse

  8. Application: Portfolio Returns

  9. Exercises

  10. Key Takeaways

#@title πŸ› οΈ Setup: Run this cell first <a id="setup"></a>

# Uncomment the line below if running in Google Colab
# !pip install numpy matplotlib

import numpy as np
import matplotlib.pyplot as plt
%matplotlib inline

# Set consistent plot style
plt.style.use('seaborn-v0_8-whitegrid')
plt.rcParams['figure.figsize'] = [10, 6]

7.13.3. Vectors #

A vector is a collection of \(N\) numbers stored together:

\[\begin{split}x = \begin{bmatrix} x_1 \\ x_2 \\ \vdots \\ x_N \end{bmatrix}\end{split}\]

In NumPy, a vector is a 1-dimensional array.

# Creating vectors
x = np.array([1, 2, 3])
y = np.array([4, 5, 6])

print(f"x = {x}")
print(f"y = {y}")
print(f"Shape of x: {x.shape}")
x = [1 2 3]
y = [4 5 6]
Shape of x: (3,)

7.13.3.1. Element-wise Operations#

Operations are applied element by element:

\[\begin{split}z = x \circ y = \begin{bmatrix} x_1 \circ y_1 \\ x_2 \circ y_2 \\ x_3 \circ y_3 \end{bmatrix}\end{split}\]
print(f"Addition:       x + y = {x + y}")
print(f"Subtraction:    x - y = {x - y}")
print(f"Multiplication: x * y = {x * y}")
print(f"Division:       x / y = {x / y}")
Addition:       x + y = [5 7 9]
Subtraction:    x - y = [-3 -3 -3]
Multiplication: x * y = [ 4 10 18]
Division:       x / y = [0.25 0.4  0.5 ]

7.13.3.2. Scalar Operations#

A scalar operates on every element:

\[\begin{split}w = a \circ x = \begin{bmatrix} a \circ x_1 \\ a \circ x_2 \\ a \circ x_3 \end{bmatrix}\end{split}\]
print(f"3 + x = {3 + x}")
print(f"3 * x = {3 * x}")
print(f"x / 2 = {x / 2}")
3 + x = [4 5 6]
3 * x = [3 6 9]
x / 2 = [0.5 1.  1.5]

7.13.3.3. The Dot Product#

The dot product between vectors \(x\) and \(y\) is:

\[x \cdot y = \sum_{i=1}^N x_i y_i\]

🐍 Python Insight: The @ operator

The @ operator performs dot products (and matrix multiplication):

result = x @ y  # Dot product of vectors
result = A @ B  # Matrix multiplication

This is cleaner than np.dot(x, y) and matches mathematical notation!

print(f"x = {x}")
print(f"y = {y}")
print(f"x Β· y = {x @ y}")
print(f"Manual: 1*4 + 2*5 + 3*6 = {1*4 + 2*5 + 3*6}")
x = [1 2 3]
y = [4 5 6]
x Β· y = 32
Manual: 1*4 + 2*5 + 3*6 = 32

7.13.4. Matrices #

An \(N \times M\) matrix is a collection of \(M\) column vectors, each of length \(N\), stacked side by side:

\[\begin{split}A = \begin{bmatrix} a_{11} & a_{12} & \dots & a_{1M} \\ a_{21} & \ddots & & a_{2M} \\ \vdots & & \ddots & \vdots \\ a_{N1} & a_{N2} & \dots & a_{NM} \end{bmatrix}\end{split}\]

In NumPy, a matrix is a 2-dimensional array.

# Creating matrices
A = np.array([[1, 2, 3],
              [4, 5, 6]])

B = np.ones((2, 3))  # 2x3 matrix of ones

print("Matrix A:")
print(A)
print(f"\nShape: {A.shape} (2 rows, 3 columns)")

print("Matrix B:")
print(B)
print(f"\nShape: {B.shape} (2 rows, 3 columns)")
Matrix A:
[[1 2 3]
 [4 5 6]]

Shape: (2, 3) (2 rows, 3 columns)
Matrix B:
[[1. 1. 1.]
 [1. 1. 1.]]

Shape: (2, 3) (2 rows, 3 columns)

Element-wise and scalar operations work the same way as with vectors:

print("A + B =")
print(A + B)

print("\n2 * A =")
print(2 * A)
A + B =
[[2. 3. 4.]
 [5. 6. 7.]]

2 * A =
[[ 2  4  6]
 [ 8 10 12]]

7.13.5. Matrix Multiplication #

Matrix multiplication generalizes the dot product:

\[C_{ij} = \sum_{k=1}^M A_{ik} B_{kj}\]
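To make the summation concrete, here is a sketch of matrix multiplication written as explicit loops (`matmul_loops` is an illustrative helper, not a NumPy function), checked against the `@` operator:

```python
import numpy as np

def matmul_loops(A, B):
    """Naive matrix multiplication: C[i, j] = sum over k of A[i, k] * B[k, j]."""
    N, M = A.shape
    M2, K = B.shape
    assert M == M2, "inner dimensions must match"
    C = np.zeros((N, K))
    for i in range(N):
        for j in range(K):
            for k in range(M):
                C[i, j] += A[i, k] * B[k, j]
    return C

A = np.array([[1, 2], [3, 4], [5, 6]])   # 3x2
B = np.array([[1, 2, 3], [4, 5, 6]])     # 2x3

print(np.allclose(matmul_loops(A, B), A @ B))  # True
```

The triple loop is exactly what `@` computes, but NumPy does it in optimized compiled code rather than Python loops.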


⚠️ Caution:

Matrix multiplication requires compatible shapes:

  • If \(A\) is \(N \times M\), then \(B\) must be \(M \times K\)

  • Result is \(N \times K\)

The β€œinner dimensions” must match!

# Matrix multiplication examples
X = np.array([[1, 2],
              [3, 4],
              [5, 6]])  # 3x2

Y = np.array([[1, 2, 3],
              [4, 5, 6]])  # 2x3

print(f"X shape: {X.shape}")
print(f"Y shape: {Y.shape}")
print(f"X @ Y shape: {(X @ Y).shape}")
print("\nX @ Y =")
print(X @ Y)
X shape: (3, 2)
Y shape: (2, 3)
X @ Y shape: (3, 3)

X @ Y =
[[ 9 12 15]
 [19 26 33]
 [29 40 51]]
# Matrix-vector multiplication
v = np.array([1, 2])
print(f"X shape: {X.shape}")
print(f"v shape: {v.shape}")
print(f"X @ v = {X @ v}")
X shape: (3, 2)
v shape: (2,)
X @ v = [ 5 11 17]

7.13.6. Transpose #

The transpose flips a matrix along its diagonal:

\[A^T_{ij} = A_{ji}\]

If \(A\) is \(N \times M\), then \(A^T\) is \(M \times N\).

A = np.array([[1, 2, 3],
              [4, 5, 6],
              [7, 8, 9]])

print("Original A:")
print(A)
print("\nTranspose A.T:")
print(A.T)
Original A:
[[1 2 3]
 [4 5 6]
 [7 8 9]]

Transpose A.T:
[[1 4 7]
 [2 5 8]
 [3 6 9]]

πŸ“Œ Remember:

Two ways to transpose in NumPy:

  • A.T β€” Shorthand (most common)

  • A.transpose() β€” Explicit method
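A quick sketch verifying that both forms agree, and that transposing twice recovers the original matrix:

```python
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6]])

print(np.array_equal(A.T, A.transpose()))  # True: both forms agree
print(np.array_equal(A.T.T, A))            # True: transposing twice is a no-op
print(A.shape, A.T.shape)                  # (2, 3) (3, 2)
```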


7.13.7. Identity Matrix #

The identity matrix \(I\) has 1s on the diagonal and 0s elsewhere:

\[\begin{split}I = \begin{bmatrix} 1 & 0 & \dots & 0 \\ 0 & 1 & \dots & 0 \\ \vdots & & \ddots & \vdots \\ 0 & 0 & \dots & 1 \end{bmatrix}\end{split}\]

It acts like 1 in matrix multiplication: \(AI = IA = A\)

I = np.eye(3)  # 3x3 identity matrix
print("Identity matrix:")
print(I)

print("\nA @ I = A:")
print(A @ I)
Identity matrix:
[[1. 0. 0.]
 [0. 1. 0.]
 [0. 0. 1.]]

A @ I = A:
[[1. 2. 3.]
 [4. 5. 6.]
 [7. 8. 9.]]

7.13.8. Inverse #

The inverse of matrix \(A\), written \(A^{-1}\), satisfies:

\[A A^{-1} = A^{-1} A = I\]

This lets us β€œsolve” matrix equations like \(Ax = b\):

\[x = A^{-1}b\]

⚠️ Caution:

Not all matrices have an inverse! The matrix must be:

  • Square (\(N \times N\))

  • Non-singular (determinant β‰  0)
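A minimal sketch of what goes wrong with a singular matrix (here the second row is twice the first, so the determinant is zero and `np.linalg.inv` raises an error):

```python
import numpy as np

S = np.array([[1.0, 2.0],
              [2.0, 4.0]])  # second row = 2 * first row, so S is singular

print(f"det(S) = {np.linalg.det(S):.1f}")  # effectively zero

try:
    np.linalg.inv(S)
except np.linalg.LinAlgError as e:
    print(f"inv(S) failed: {e}")
```

Checking `np.linalg.det` (or catching `LinAlgError`) is a quick way to guard against non-invertible inputs.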

# Invertible matrix
A = np.array([[1, 2, 0],
              [3, 1, 0],
              [0, 1, 2]])

A_inv = np.linalg.inv(A)

print("A inverse:")
print(A_inv.round(3))

print("\nA @ A_inv (should be I):")
print((A @ A_inv).round(10))
A inverse:
[[-0.2  0.4  0. ]
 [ 0.6 -0.2  0. ]
 [-0.3  0.1  0.5]]

A @ A_inv (should be I):
[[1. 0. 0.]
 [0. 1. 0.]
 [0. 0. 1.]]

7.13.9. Application: Portfolio Returns #

Linear algebra makes portfolio calculations elegant and efficient.

7.13.9.1. Example: Computing Portfolio Value#

Consider a portfolio with:

  • 4 dollars in Asset A (return: 3%)

  • 2.50 dollars in Asset B (return: 5%)

  • 8 dollars in Asset C (return: -1.1%)

# The slow way: manual calculation
value_manual = 4.0 * (1 + 0.03) + 2.5 * (1 + 0.05) + 8 * (1 - 0.011)
print(f"Manual calculation: ${value_manual:.4f}")
Manual calculation: $14.6570
# The fast way: dot product
positions = np.array([4.0, 2.5, 8.0])
returns = np.array([0.03, 0.05, -0.011])

value_dot = positions @ (1 + returns)
print(f"Dot product: ${value_dot:.4f}")
Dot product: $14.6570

7.13.9.2. Portfolio Weights and Returns#

The portfolio return is the weighted average of asset returns:

\[r_p = \sum_{i=1}^N w_i r_i = w \cdot r\]
# Compute weights
port_value = np.sum(positions)
weights = positions / port_value

print(f"Total portfolio value: ${port_value:.2f}")
print(f"Weights: {weights}")
print(f"Weights sum to: {weights.sum():.4f}")
Total portfolio value: $14.50
Weights: [0.27586207 0.17241379 0.55172414]
Weights sum to: 1.0000
# Portfolio return
portfolio_return = weights @ returns
print(f"Portfolio return: {portfolio_return:.4%}")

# Verify: ending value = starting value Γ— (1 + return)
ending_value = port_value * (1 + portfolio_return)
print(f"Ending value: ${ending_value:.4f}")
Portfolio return: 1.0828%
Ending value: $14.6570

πŸ’‘ Key Insight:

Matrix algebra scales effortlessly:

  • 3 assets? One line of code.

  • 3,000 assets? Same one line of code!

No loops needed.
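A small sketch of that claim, using randomly generated positions and returns (illustrative numbers from a seeded generator, not real market data):

```python
import numpy as np

rng = np.random.default_rng(0)  # seeded for reproducibility

n_assets = 3_000
positions = rng.uniform(1, 100, size=n_assets)    # hypothetical dollar positions
returns = rng.normal(0.001, 0.02, size=n_assets)  # hypothetical asset returns

weights = positions / positions.sum()
portfolio_return = weights @ returns  # the same one line as the 3-asset case

print(f"{n_assets} assets, portfolio return: {portfolio_return:.4%}")
```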


7.13.10. πŸ“ Exercises #

7.13.10.1. Exercise 1: Warm-up β€” Vector Operations#

πŸ”§ Exercise:

Given vectors a = [2, 4, 6] and b = [1, 3, 5]:

  1. Compute a + b

  2. Compute a * b (element-wise)

  3. Compute the dot product a Β· b

  4. Verify that a Β· b = sum(a * b)

# Your code here
a = np.array([2, 4, 6])
b = np.array([1, 3, 5])
πŸ’‘ Click to see solution
a = np.array([2, 4, 6])
b = np.array([1, 3, 5])

print(f"a + b = {a + b}")
print(f"a * b = {a * b}")
print(f"a Β· b = {a @ b}")
print(f"sum(a * b) = {np.sum(a * b)}")

7.13.10.2. Exercise 2: Extension β€” Matrix Shapes#

πŸ€” Think and Code:

Given matrices:

  • A is 3Γ—4

  • B is 4Γ—2

  • C is 2Γ—3

  1. What is the shape of A @ B?

  2. What is the shape of B @ C?

  3. Can you compute A @ C? Why or why not?

  4. Create these matrices and verify your answers

# Your code here
πŸ’‘ Click to see solution
A = np.ones((3, 4))
B = np.ones((4, 2))
C = np.ones((2, 3))

print(f"A @ B shape: {(A @ B).shape}")  # 3Γ—2
print(f"B @ C shape: {(B @ C).shape}")  # 4Γ—3

# A @ C would be 3Γ—4 @ 2Γ—3 β€” inner dimensions don't match!
# This will raise an error
try:
    A @ C
except ValueError as e:
    print(f"A @ C error: {e}")

7.13.10.3. Exercise 3: Open-ended β€” Solving a System#

πŸ€” Think and Code:

Solve the system of equations:

\[2x + 3y = 8\]

\[4x - y = 2\]

  1. Write this as a matrix equation \(Ax = b\)

  2. Use np.linalg.inv() to find \(x = A^{-1}b\)

  3. Verify your solution by computing \(Ax\)

# Your code here
πŸ’‘ Click to see solution
# System: Ax = b
A = np.array([[2, 3],
              [4, -1]])
b = np.array([8, 2])

# Solve using inverse
x = np.linalg.inv(A) @ b
print(f"Solution: x = {x[0]:.4f}, y = {x[1]:.4f}")

# Verify
print(f"A @ x = {A @ x}")
print(f"b = {b}")

7.13.11. 🧠 Key Takeaways #

  1. Vectors are 1D arrays, matrices are 2D arrays β€” NumPy handles both with np.array()

  2. The @ operator is your friend β€” Use it for dot products and matrix multiplication

  3. Shape compatibility matters β€” For \(A @ B\), columns of \(A\) must equal rows of \(B\)

  4. Key operations summary:

| Operation | Syntax | Result |
|---|---|---|
| Dot product | x @ y | Scalar |
| Matrix multiply | A @ B | Matrix |
| Transpose | A.T | Flipped matrix |
| Identity | np.eye(n) | \(n \times n\) identity |
| Inverse | np.linalg.inv(A) | \(A^{-1}\) |

  5. Finance application: Portfolio returns are just dot products: \(r_p = w \cdot r\)