Maths for Machine Learning: Multivariate Calculus

Many important machine learning approaches have calculus at their very core. The aim of this module is for you to see the connection between the maths and the meaning. There are six sections:

1. What is calculus?
2. Multivariate calculus
3. Multivariate chain rule and its applications
4. Taylor series and linearisation
5. Intro to optimisation
6. Regression

The course will start by laying down the foundations of calculus through the use of intuition-building animations, before generalising these concepts to multi-dimensional systems, where you will discover how to navigate mountain ranges in the dark. Next, you will learn exactly how calculus is used to train neural networks, and work through how to write code to do this in the Python programming language. You will also have the chance to put all of your multivariate calculus skills into action by writing code to fit complicated functions to real data.

What are the learning outcomes?

What is calculus:

• Recall the definition of differentiation
• Apply differentiation to simple functions
• Describe the utility of time-saving rules
• Apply the sum, product and chain rules
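As a taster of the kind of exercise involved, the product rule can be checked numerically with a central finite difference. This is a minimal illustrative sketch, not part of the official course materials:

```python
# Illustrative sketch (not course material): verifying the product rule
# numerically with a central finite-difference approximation.

def derivative(f, x, h=1e-6):
    """Approximate f'(x) with a central difference."""
    return (f(x + h) - f(x - h)) / (2 * h)

f = lambda t: t ** 2        # f'(t) = 2t
g = lambda t: t ** 3        # g'(t) = 3t^2

x = 1.5
lhs = derivative(lambda t: f(t) * g(t), x)               # d/dx [f*g]
rhs = derivative(f, x) * g(x) + f(x) * derivative(g, x)  # f'g + fg'
print(abs(lhs - rhs) < 1e-4)  # prints True: the product rule holds
```

The same finite-difference helper can be reused to sanity-check the sum and chain rules on other simple functions.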

Multivariate calculus:

• Recognise that differentiation can be applied to multiple variables in an equation
• Use multivariate calculus tools on example equations
• Recognise the utility of vector/matrix structures in multivariate calculus
• Examine two dimensional problems using the Jacobian matrix
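To give a flavour of the Jacobian outcome above, the matrix of partial derivatives for a two-dimensional vector function can be approximated numerically. A minimal illustrative sketch (the function names are hypothetical, not from the course):

```python
# Illustrative sketch: approximating the Jacobian [df_i/dx_j] of a
# two-dimensional vector function with central finite differences.

def jacobian(funcs, point, h=1e-6):
    """Return the Jacobian matrix at `point` as nested lists."""
    J = []
    for f in funcs:
        row = []
        for j in range(len(point)):
            up = list(point); up[j] += h
            down = list(point); down[j] -= h
            row.append((f(up) - f(down)) / (2 * h))
        J.append(row)
    return J

# u(x, y) = x^2 * y and v(x, y) = x + y^2; analytically the Jacobian at
# (1, 2) is [[2xy, x^2], [1, 2y]] = [[4, 1], [1, 4]].
J = jacobian([lambda p: p[0] ** 2 * p[1],
              lambda p: p[0] + p[1] ** 2], [1.0, 2.0])
```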

Multivariate chain rule and its applications:

• Apply the multivariate chain rule to differentiate nested functions
• Explain the structure and function of a neural net
• Apply multivariate calculus tools to relate network parameters to outputs
• Implement backpropagation on a small neural network
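The backpropagation outcome above amounts to applying the chain rule backwards through a network's computation. A minimal illustrative sketch for a single-neuron "network" (not the course's own implementation):

```python
import math

# Illustrative sketch: gradients of a one-neuron network
# a = sigmoid(w*x + b) with squared-error cost C = (a - y)^2,
# obtained by chaining derivatives backwards through the computation.

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def forward(w, b, x):
    return sigmoid(w * x + b)

def gradients(w, b, x, y):
    """Backpropagate dC/dw and dC/db via the chain rule."""
    a = forward(w, b, x)
    dC_da = 2.0 * (a - y)     # derivative of the squared error
    da_dz = a * (1.0 - a)     # derivative of the sigmoid
    return dC_da * da_dz * x, dC_da * da_dz
```

A real network stacks many such layers, but the chain-rule bookkeeping is the same idea repeated.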

Taylor series and linearisation:

• Recognise power series approximations to functions
• Interpret the behaviour of power series approximations for ill-behaved functions
• Explain the meaning and relevance of linearisation
• Select an appropriate representation of multivariate approximations
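A power series approximation of the kind covered here can be seen with a few lines of Python. This is an illustrative sketch only, showing the truncated series for e^x about x = 0:

```python
import math

# Illustrative sketch: truncated power series for e^x about x = 0,
# i.e. the sum of x^k / k! for k = 0 .. n_terms - 1.

def taylor_exp(x, n_terms):
    return sum(x ** k / math.factorial(k) for k in range(n_terms))

# Keeping more terms improves the approximation near the expansion
# point: taylor_exp(1.0, 2) is 2.0, while taylor_exp(1.0, 15) is
# already very close to e = 2.71828...
```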

Intro to optimisation:

• Recognise the principles of gradient descent
• Implement optimisation using multivariate calculus
• Examine cases where the method fails to return the best solution
• Solve gradient descent problems that are subject to constraints using Lagrange multipliers
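The core of gradient descent is only a few lines: repeatedly step downhill against the gradient. A minimal illustrative sketch on a simple convex function (not the course's own code):

```python
# Illustrative sketch: plain gradient descent on
# f(x, y) = (x - 2)^2 + (y + 1)^2, whose minimum is at (2, -1).

def gradient_descent(grad, start, lr=0.1, steps=200):
    point = list(start)
    for _ in range(steps):
        g = grad(point)
        point = [p - lr * gi for p, gi in zip(point, g)]
    return point

grad_f = lambda p: [2.0 * (p[0] - 2.0), 2.0 * (p[1] + 1.0)]
x_min = gradient_descent(grad_f, [0.0, 0.0])
# x_min converges to (approximately) [2.0, -1.0]
```

On a non-convex function the same procedure can settle in a local minimum rather than the best solution, which is exactly the failure case the outcomes above ask you to examine.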

Regression:

• Describe regression as a minimisation of errors problem
• Distinguish appropriate from inappropriate models for particular data sets
• Calculate multivariate calculus objects to perform a regression
• Create code to fit a non-linear function to data using gradient descent
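Fitting a non-linear function to data by gradient descent, as the final outcome describes, can be sketched in a few lines. The data below are synthetic (generated from y = 2x² + 1), chosen purely for illustration:

```python
# Illustrative sketch (synthetic data): fitting y = a*x^2 + b by
# gradient descent on the mean squared error.

xs = [0.0, 0.5, 1.0, 1.5, 2.0]
ys = [1.0, 1.5, 3.0, 5.5, 9.0]   # generated from y = 2x^2 + 1

a, b, lr = 0.0, 0.0, 0.01
for _ in range(5000):
    # partial derivatives of mean((a*x^2 + b - y)^2) w.r.t. a and b
    da = sum(2.0 * (a * x * x + b - y) * x * x
             for x, y in zip(xs, ys)) / len(xs)
    db = sum(2.0 * (a * x * x + b - y)
             for x, y in zip(xs, ys)) / len(xs)
    a, b = a - lr * da, b - lr * db
# a and b converge towards 2.0 and 1.0 respectively
```

With real, noisy data the fitted parameters will only approximate the underlying ones, but the minimisation-of-errors structure is the same.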

Who is this course aimed at?

This course is aimed at students from all undergraduate Science and Engineering degrees. It is recommended that you are comfortable with the material of the ‘Maths for Machine Learning: Linear Algebra’ course before you attempt this one. Please note that you should also be comfortable with A-Level Maths before you start, as the course is designed to challenge you.

How will this course be delivered?

This is an asynchronous module and will be delivered online, via the Coursera platform. Please visit the ‘links to courses’ tab in your Microsoft Teams space for information on how to get started.

How much time will the course take up?

The course is designed to take up a total of approximately 18 hours, to be distributed according to your own preference.