MA2333 Mathematical Foundations of Machine Learning

This course develops the mathematical foundations necessary for work in machine learning. Topics covered include elements of linear algebra, calculus, probability theory, and a rudimentary introduction to gradient-based optimization. Concepts are illustrated through laboratory exercises using the Python programming language and the PyTorch machine learning framework.

Prerequisite

Basic Python programming

Lecture Hours

4

Lab Hours

1

Course Learning Outcomes

A student who successfully completes this course will be able to:

• Recognize and understand basic linear algebra concepts and constructs that arise in artificial intelligence applications, including vectors and matrices, linear transformations, and matrix operations.

• Recognize and understand basic concepts from calculus that form the foundation of artificial intelligence applications, including differentiation of univariate and multivariate functions, integration of univariate functions, Taylor's theorem, the fundamental theorem of calculus, and elementary optimization theory.

• Recognize and understand basic concepts from probability theory that are foundational to artificial intelligence applications, including probability measures, continuous and discrete distribution functions, conditional probabilities and Bayes's theorem, random variables, expectation, variance, covariance, and simple linear regression.

• Demonstrate a basic understanding of how to work with the constructs developed in the course using the PyTorch machine learning framework.
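As a taste of the laboratory component, the sketch below shows two of the constructs named in the outcomes in PyTorch: a matrix applied to a vector as a linear transformation, and automatic differentiation of a univariate function. It assumes a standard PyTorch installation; the particular matrix and function are illustrative, not part of the course materials.

```python
import torch

# A linear transformation applied to a vector: y = A x
A = torch.tensor([[2.0, 0.0],
                  [1.0, 3.0]])
x = torch.tensor([1.0, 2.0])
y = A @ x  # matrix-vector product: tensor([2., 7.])

# Differentiation with autograd: f(t) = t^3, so f'(t) = 3 t^2
t = torch.tensor(2.0, requires_grad=True)
f = t ** 3
f.backward()  # populates t.grad with df/dt evaluated at t = 2

print(y)      # tensor([2., 7.])
print(t.grad) # tensor(12.)
```

Autograd is the mechanism by which gradient-based optimization, the last topic in the description, is carried out in practice.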
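The probability outcomes connect directly to the regression topic: the least-squares slope of a simple linear regression is cov(x, y) / var(x). The following sketch computes expectation, variance, and covariance for a toy data set (made up for illustration, again assuming PyTorch) and recovers the line the data were generated from.

```python
import torch

# Toy data lying exactly on the line y = 2x + 1
x = torch.tensor([0.0, 1.0, 2.0, 3.0])
y = 2.0 * x + 1.0

mean_x, mean_y = x.mean(), y.mean()            # expectations
var_x = ((x - mean_x) ** 2).mean()             # variance of x
cov_xy = ((x - mean_x) * (y - mean_y)).mean()  # covariance of x and y

slope = cov_xy / var_x                # least-squares slope
intercept = mean_y - slope * mean_x   # least-squares intercept

print(slope.item(), intercept.item())  # 2.0 1.0
```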