Richard Bergmair's Media Library



ML Lecture #1: The PAC Model

Given an approach to a machine learning problem, how do we know it’s correct? In what sense does it need to be correct? A theoretician’s answer: It needs to be PAC correct, i.e. “probably approximately correct”.
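To make “probably approximately correct” concrete, here is a minimal sketch of the standard sample-complexity bound for a consistent learner over a finite hypothesis class (the function name and interface are illustrative, not from the lecture):

```python
import math

def pac_sample_bound(h_size: int, epsilon: float, delta: float) -> int:
    """Sample size sufficient for a consistent learner over a finite
    hypothesis class of size h_size to be PAC: with probability at
    least 1 - delta, the learned hypothesis has error at most epsilon.
    Standard bound: m >= (1/epsilon) * (ln h_size + ln(1/delta))."""
    return math.ceil((math.log(h_size) + math.log(1.0 / delta)) / epsilon)
```

Demanding tighter accuracy (smaller epsilon) or higher confidence (smaller delta) drives the required sample size up, which is exactly the “probably” and “approximately” trade-off.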

Nov-07 2014


ML Lecture #2: Data Representation & Information Theory

Machine learning is about fitting a model to a data sample. But how do you measure the amount of information that’s contained within a sample? How do you measure the amount of information required to form a concept of that data? Information theory has some very good answers to questions of how to measure information.
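One standard answer from information theory is Shannon entropy. A minimal sketch of the empirical entropy of a sample (illustrative code, not the lecture’s own):

```python
import math
from collections import Counter

def entropy(sample):
    """Shannon entropy (in bits) of the empirical distribution of the
    values in `sample` -- a measure of how much information the sample
    carries per observation."""
    counts = Counter(sample)
    n = len(sample)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())
```

A sample where all values are identical carries zero bits per observation; a uniform sample over four distinct values carries two.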


Nov-21 2014


ML Lecture #3: Data Representation & Statistics

The data samples forming the point of departure for any machine learning problem come about through some kind of natural or computational process, which leaves recognizable patterns in the data.

Dec-05 2014

ML Lecture #4: Decision Trees & Issues in Representation

Decision trees are a highly generic concept class that can be used to fit a concept to almost any kind of data a machine learner might be given.
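To give a flavour of how a decision-tree learner works: at each node it picks the attribute that best separates the classes, typically by information gain, and recurses. A minimal sketch of that split-selection step for binary features (helper names are hypothetical):

```python
import math
from collections import Counter

def entropy_bits(labels):
    """Shannon entropy (in bits) of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def best_split(X, y):
    """Index of the feature with the highest information gain -- the
    criterion a decision-tree learner applies recursively at each node."""
    base = entropy_bits(y)
    best_gain, best_j = -1.0, 0
    for j in range(len(X[0])):
        groups = {}
        for x, label in zip(X, y):
            groups.setdefault(x[j], []).append(label)
        # Expected entropy remaining after splitting on feature j.
        remainder = sum(len(g) / len(y) * entropy_bits(g)
                        for g in groups.values())
        if base - remainder > best_gain:
            best_gain, best_j = base - remainder, j
    return best_j
```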

Dec-19 2014

ML Lecture #5: Machine Learning Evaluation & Methodology

Evaluation methodology greatly influences the outcome of any effort to solve a machine learning problem: it provides the theoretical framing of the problem at hand and determines how to tell whether one solution is better than another.
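One workhorse of such methodology is k-fold cross-validation: each fold is held out once as a test set while the model is fit on the rest. A minimal sketch (illustrative, not the lecture’s own code):

```python
def k_fold_splits(n, k):
    """Yield (train_indices, test_indices) pairs for k-fold
    cross-validation over n data points: each point is held out
    exactly once across the k folds."""
    folds = [list(range(i, n, k)) for i in range(k)]
    for i, test in enumerate(folds):
        train = [idx for j, fold in enumerate(folds)
                 if j != i for idx in fold]
        yield train, test
```

Averaging a score over the k held-out folds gives a less optimistic estimate of generalization than evaluating on the training data itself.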

Jan-09 2015

ML Lecture #6: Data Representation & Similarity Measures

Notions of similarity and dissimilarity, as well as of closeness and distance, are at the heart of the kinds of mathematical models that enable machine learning.
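Two of the most common such notions, sketched minimally (illustrative code): Euclidean distance as a dissimilarity, and cosine similarity as a similarity.

```python
import math

def euclidean_distance(x, y):
    """Euclidean distance between two vectors -- a dissimilarity:
    larger means further apart."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(x, y)))

def cosine_similarity(x, y):
    """Cosine of the angle between two vectors -- a similarity:
    1 for parallel vectors, 0 for orthogonal ones."""
    dot = sum(a * b for a, b in zip(x, y))
    norm_x = math.sqrt(sum(a * a for a in x))
    norm_y = math.sqrt(sum(b * b for b in y))
    return dot / (norm_x * norm_y)
```

Note the inverted senses: a distance of zero means identical, while a cosine similarity of one means maximally alike, so the two cannot be swapped without care.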

Jan-23 2015

ML Lecture #7: Nearest Prototype Methods

By nearest prototype methods, we mean methods such as k-means and k-nearest neighbour. These are also sometimes referred to as instance-based methods.
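A minimal sketch of the k-nearest-neighbour half of that family (illustrative code, assuming plain Euclidean distance and majority vote):

```python
from collections import Counter

def knn_classify(query, points, labels, k=3):
    """Classify `query` by majority vote among its k nearest training
    points, ranked by squared Euclidean distance."""
    ranked = sorted(
        (sum((a - b) ** 2 for a, b in zip(p, query)), label)
        for p, label in zip(points, labels)
    )
    votes = Counter(label for _, label in ranked[:k])
    return votes.most_common(1)[0][0]
```

There is no training step to speak of: the instances themselves are the model, which is why these are called instance-based methods.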

Feb-20 2015

ML Lecture #8: Bayesian Decision Theory & Gaussians

Bayesian Decision Theory establishes how prior and posterior probabilities factor into decision making in the presence of uncertainty. It can therefore be thought of as one of the most essential and versatile tools of any data scientist.
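The core decision rule can be sketched in a few lines: pick the class whose posterior, proportional to prior times likelihood, is largest. Here with univariate Gaussian class-conditional densities (illustrative names and parameters, not from the lecture):

```python
import math

def gaussian_pdf(x, mu, sigma):
    """Density of a univariate Gaussian N(mu, sigma^2) at x."""
    z = (x - mu) / sigma
    return math.exp(-0.5 * z * z) / (sigma * math.sqrt(2.0 * math.pi))

def bayes_decide(priors, likelihoods, x):
    """Choose the class with the largest posterior:
    P(c | x) is proportional to P(x | c) * P(c)."""
    return max(priors, key=lambda c: priors[c] * likelihoods[c](x))
```

With equal priors, the decision boundary between two equal-variance Gaussians falls halfway between their means; unequal priors shift it toward the less probable class.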

Mar-20 2015

ML Lecture #9: Naive Bayes & Bayesian Networks

Naive Bayes and Bayesian Networks apply Bayesian Decision Theory to data sets consisting of data points represented as vectors of features that can each be present or absent.
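A minimal sketch of the Naive Bayes side for such present/absent features, i.e. a Bernoulli Naive Bayes with Laplace smoothing (function names and the smoothing choice are illustrative):

```python
import math

def train_bernoulli_nb(X, y):
    """Fit a Bernoulli Naive Bayes model with Laplace smoothing.
    X: binary feature vectors (1 = feature present, 0 = absent)."""
    model = {}
    for c in set(y):
        rows = [x for x, label in zip(X, y) if label == c]
        prior = len(rows) / len(X)
        # Smoothed estimate of P(feature_j present | class c).
        probs = [(sum(r[j] for r in rows) + 1) / (len(rows) + 2)
                 for j in range(len(X[0]))]
        model[c] = (prior, probs)
    return model

def predict_bernoulli_nb(model, x):
    """Class with the highest log-posterior under the naive assumption
    that features are conditionally independent given the class."""
    def log_posterior(c):
        prior, probs = model[c]
        return math.log(prior) + sum(
            math.log(p if xi else 1.0 - p) for p, xi in zip(probs, x))
    return max(model, key=log_posterior)
```

The “naive” conditional-independence assumption is what lets the joint likelihood factor into a product over individual features; a Bayesian Network relaxes exactly that assumption.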

Apr-17 2015