Linear Algebra with Python

Pratik Sharma
Nov 12, 2020 · 7 min read

This article is a beginner’s introduction to linear algebra with Python, and it has no prerequisites. If you already have a basic understanding of all the modules, you can skip ahead to the Linear Regression with Linear Algebra section. If you are a complete beginner, just start a new Jupyter notebook or Google Colab. We will mostly discuss NumPy.

There is also SciPy, a library that builds on NumPy to provide more mathematical functions. SciPy uses NumPy arrays as its basic data structure and comes with modules for various commonly used tasks in scientific programming, including linear algebra, integration (calculus), ordinary differential equation solving, and signal processing.
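For instance, here is a minimal sketch (assuming SciPy is installed) showing that scipy.linalg mirrors much of the numpy.linalg functionality we will use below:

import numpy as np
from scipy import linalg   # SciPy's linear algebra module

A = np.array([[1, 2], [3, 4]])
print(linalg.det(A))   # determinant, analogous to np.linalg.det
print(linalg.inv(A))   # inverse, analogous to np.linalg.inv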


We will also perform linear regression with linear algebra in python.

Vector

A vector is a list of numbers (arranged in a row or a column). A vector of length n is just a sequence of n numbers. In NumPy, we have the np.array function, which takes an object (such as a list) as a parameter.

import numpy as np

Vr = np.array([[1, 2, 3]])       # Row vector
Vc = np.array([[1], [2], [3]])   # Column vector

# To find out the shape of a vector
Vr.shape # output => (1, 3)
Vc.shape # output => (3, 1)
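Note that NumPy also has plain one-dimensional arrays, which have no row/column orientation; several of the shorter snippets below use these. A minimal sketch:

V = np.array([1, 2, 3])   # 1-D array: neither a row nor a column
V.shape                   # output => (3,)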

Vector Operations

1. Addition of Vectors

To add two vectors, we add them element by element.

x = np.ones(3)           # Vector of three ones
y = np.array((5, 4, 6))  # Converts tuple (5, 4, 6) into an array
x + y  # output => array([6., 5., 7.])

# Similarly, subtraction of two vectors
y - x  # output => array([4., 3., 5.])

2. Scalar Multiplication of a Vector

Multiply each element of a vector by a scalar

4 * x  # output => array([4., 4., 4.])
5 * y  # output => array([25, 20, 30])

3. Dot Product

The dot product of two vectors is a scalar: multiply the vectors element-wise and sum the results. Matrix multiplication is built out of dot products between rows and columns, as we’ll see below.

# Dot product of two vectors
X = np.array([1, 2, 3])
Y = np.array([4, 5, 6])
np.dot(X, Y)   # output => 32  (1*4 + 2*5 + 3*6)
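Since Python 3.5, the @ operator computes the same thing and often reads better:

X @ Y   # output => 32, same as np.dot(X, Y)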

4. Hadamard Product

  • Element-wise multiplication of two vectors; it outputs a vector.
X = np.array([2, 3, 4])
Y = np.array([5, 6, 7])
H = X * Y      # 2*5, 3*6, 4*7
print(H)       # output => [10 18 28]

Matrices

Matrices are lists of vectors, represented as two-dimensional NumPy arrays. An n * k matrix is a rectangular array A of numbers with n rows and k columns.

Note: A vector is just the special case of a matrix where n = 1 or k = 1.

M = np.array([[1, 2], [3, 7], [-1, 5]])   # A 3-row * 2-column matrix

Matrix multiplication

Matrix multiplication relies on the dot product to multiply the various combinations of rows and columns.

Rules for matrix multiplication

  1. The number of columns of the 1st matrix must equal the number of rows of the 2nd.
  2. The product of an M x N matrix and an N x K matrix is an M x K matrix. The new matrix takes the rows of the 1st and columns of the 2nd.
A = np.array([[1, 7], [2, 4]])
B = np.array([[3, 3], [5, 2]])
C = np.dot(A, B)
print(C)
#output
[[38 17]
[26 14]]
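To see rule 1 in action, here is a minimal sketch (the shapes are made up for illustration): a 2 x 3 matrix times another 2 x 3 matrix fails, because 3 ≠ 2, while transposing the second operand makes the shapes line up.

P = np.ones((2, 3))
Q = np.ones((2, 3))
# np.dot(P, Q) would raise ValueError: shapes (2,3) and (2,3) not aligned
np.dot(P, Q.T).shape   # works: (2,3) times (3,2) => (2, 2)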
  • Note that using * for multiplication results in the Hadamard product, i.e. an element-wise operation.
a = np.array([[2, 3],
              [2, 3]])
b = np.array([[3, 4],
              [5, 6]])

# Uses Python's multiply operator
print(a * b)
# output
# [[ 6 12]
#  [10 18]]

Matrix Power

There is a special function in NumPy, matrix_power, that can be used to raise a matrix to a power.

from numpy.linalg import matrix_power

M = np.array([[1, 4], [5, 2]])
print(M)
M2 = matrix_power(M, 2)   # M*M, i.e. M to the power 2
print(M2)                 # [[21 12] [15 24]]
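As a side note, matrix_power(M, 0) returns the identity matrix, and for an invertible M a negative exponent returns a power of the inverse; a small sketch:

print(matrix_power(M, 0))    # identity matrix [[1 0] [0 1]]
print(matrix_power(M, -1))   # inverse of M, same as np.linalg.inv(M)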

Transpose

There is a very good animation of the transpose at mathsisfun.com.

The NumPy method .transpose() can be applied to any NumPy array to get the transposed matrix.
A = np.array([[1, 2], [3, 4], [5, 6]])
AT = A.transpose()   # Gives the transpose

# Shorter notation
A.T                  # Gives the transpose
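One property worth checking numerically (a minimal sketch reusing the 3 x 2 matrix A above): the transpose of a product reverses the order, i.e. (AB)^T = B^T A^T.

B = np.array([[1, 0], [2, 1]])           # a 2 x 2 matrix
np.array_equal((A @ B).T, B.T @ A.T)     # output => True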

Determinant

The determinant of a square matrix can be viewed as the volume scaling factor of the linear transformation described by the matrix. It is denoted det(A), det A, or |A|.

Systems of linear equations don’t always have a single solution; the other possibilities are no solution or an infinite set of solutions. The determinant can be used to determine whether a system of equations has a single solution: it does exactly when the determinant of the coefficient matrix is non-zero.

More on how to calculate the determinant by pen and paper: here.

a = np.array([[1, 2], [3, 4]])
dtr = np.linalg.det(a)   # 1*4 - 2*3
print(dtr)               # output => -2.0 (up to floating-point error)
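For example (a hedged sketch with a made-up singular matrix), a determinant of zero signals that the matrix has no inverse and the corresponding system has no unique solution:

s = np.array([[1, 2], [2, 4]])   # second row is twice the first
np.linalg.det(s)                 # output => 0.0, so s is singular
# np.linalg.inv(s) would raise LinAlgError: Singular matrix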

Inverse

NumPy method: np.linalg.inv(matrix)

The inverse of A is A^(-1) only when:

A * A^(-1) = A^(-1) * A = I

where I is the identity matrix.

More on how to calculate the inverse by pen and paper: here.

C = np.array([[1, 3], [4, 5]])
C_inverse = np.linalg.inv(C)
print(C_inverse)
# output => [[-0.71428571  0.42857143]
#            [ 0.57142857 -0.14285714]]
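A quick sanity check (a minimal sketch): multiplying C by its inverse should give back the identity matrix, up to floating-point error.

np.allclose(np.dot(C, C_inverse), np.eye(2))   # output => True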


Other Ways to create Matrices and Vectors

NumPy provides a number of ways to create matrices and vectors:

V = np.zeros((2, 2))          # Creates an array of all zeros
print(V)
X = np.ones((2, 3))           # Creates an array of all ones
print(X)
Y = np.full((2, 4), 7)        # Creates an array filled with the constant 7
print(Y)
Z = np.eye(3)                 # Creates a 3 x 3 identity matrix
print(Z)
R = np.random.random((2, 2))  # Creates a random 2 x 2 array
print(R)

Performing Linear Regression with Linear Algebra

Least Squares Linear regression using matrices.

  • For points (x1, y1), (x2, y2), …, (xn, yn), the least squares regression line is given by f(x) = a0 + a1*x,
  • which minimizes the sum of the squared errors, the errors in using the regression function f(x) to estimate the true y values:

SSE = e_1^2 + e_2^2 + … + e_n^2

where

e_i = y_i - f(x_i)

is the error approximating y_i.

Using our points (x1, y1), (x2, y2), …, (xn, yn), we would have the following system of equations:

y_1 = a0 + a1*x_1 + e_1
y_2 = a0 + a1*x_2 + e_2
…
y_n = a0 + a1*x_n + e_n

  • Now, let’s set this up as a matrix equation Y = XA + E, where Y is the column vector of the y_i, X is the matrix whose rows are (1, x_i), A = (a0, a1)^T, and E is the column vector of the errors e_i.
  • The solution of the least squares regression equation Y = XA + E is given by:

A = (X^T X)^(-1) X^T Y

  • The sum of squared errors (SSE) is given by:

SSE = E^T E

Python code

Let’s take some regression data and try to solve it with NumPy.

Question: Determine the least squares regression line using matrices, where the price is $x and y is the monthly sales. Then find the sum of the squared errors.


import numpy as np               # linear algebra
import pandas as pd              # dataframes
import matplotlib.pyplot as plt  # graphs

X = np.array([[1, 49],           # first column of ones is for the intercept a0
              [1, 69],
              [1, 89],
              [1, 99],
              [1, 109]])
Y = np.array([[124],
              [95],
              [71],
              [45],
              [18]])

Breaking the solution A = (X^T X)^(-1) X^T Y into three steps:

  1. a = X^T X
XT = X.transpose()
a = np.dot(XT, X)

2. b = inverse matrix of a

b = np.linalg.inv(a)

3. c = X^T Y

c = np.dot(XT, Y)

And then, finally, we have A = bc:

A = np.dot(b,c)
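As a sanity check (a hedged sketch, not part of the original walkthrough), NumPy’s built-in least squares solver np.linalg.lstsq should recover the same A:

A_check, residuals, rank, sv = np.linalg.lstsq(X, Y, rcond=None)
print(A_check)   # should match A up to floating-point error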

Slope of the Regression line

slope = round(float(A[1, 0]), 2)         # -1.69

Y-intercept of the Regression line

y_intercept = round(float(A[0, 0]), 2)   # 211.27

Using pandas to find the SSE and the predictions f(x) of the regression line:

X = np.array([[49], [69], [89], [99], [109]])  # Column matrix of X
df = pd.DataFrame(data=X, columns=["X"])
print(df)                     # prints our X elements in a table
df["Y"] = Y.flatten()         # flatten the (5, 1) array into one column
fx = df["X"] * slope + 211    # 211 is the (rounded) y-intercept from above
df["fx"] = fx
# Difference between the actual Y and the predicted Y_i = f(x)
df["E_i"] = df["Y"] - df["fx"]

The sum of squared errors is given by SSE = E^T E:

E_i = [[x] for x in df["E_i"]]   # List comprehension: build a column vector
E_i = np.array(E_i)
ET = E_i.transpose()
SSE = np.dot(ET, E_i)

SSE is equal to 205.2765, which is quite large, but let’s also find the mean squared error:

N = 5                               # Number of data points
Mean_square_error = SSE/N

Mean_square_error = 41.0553
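As a quick cross-check (a minimal sketch using the DataFrame built above), the same numbers fall straight out of pandas without the matrix detour:

print((df["E_i"] ** 2).sum())    # SSE again => 205.2765
print((df["E_i"] ** 2).mean())   # MSE again => 41.0553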

Let’s plot our Sample data points

plt.plot(df["X"], df["Y"])
plt.show()
The Graph of sample points

Regression line

plt.plot(df["X"], df["fx"])
plt.show()
Regression Line

Wait, all of this can be done with NumPy’s polyfit function:

m, b = np.polyfit(df["X"], df["Y"], 1)

m = slope of the regression line

b = y-intercept

Let’s plot it:

plt.plot(df["X"], m * df["X"] + b)
plt.show()
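Putting it all together (a hedged sketch combining the pieces above; the axis labels are my own), we can draw the sample points and the regression line in one figure:

plt.scatter(df["X"], df["Y"], label="sample points")
plt.plot(df["X"], m * df["X"] + b, label="regression line")
plt.xlabel("Price ($)")
plt.ylabel("Monthly sales")
plt.legend()
plt.show()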

Next steps: here are some articles on linear algebra with Python you should check out. I also took some references from these:

  1. Quantitative Economics with Python: here
  2. Linear Algebra with SciPy: here

If (you liked the article) {

clap!

} else {

critical constructive comment

}

Both are appreciated !!
