
Machine Learning Summer School - Speakers & Courses


Game Theory and Clustering


Title: From Cliques to Equilibria: The Dominant Set Framework for Pairwise Data Clustering.

Content: The pairwise clustering problem. Dominant sets and their properties. Overview of (evolutionary) game theory. Dominant sets as game equilibria. Algorithms. Applications.

Course Description: The course will provide an overview of recent work on pairwise data clustering which has led to intriguing connections between unsupervised learning and (evolutionary) game theory. The framework is centered around the notion of a "dominant set," a novel graph-theoretic concept which generalizes that of a maximal clique to edge-weighted graphs. Algorithms inspired by evolutionary game theory, and applications in computer vision and pattern recognition, will be discussed.
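
As a concrete illustration of the kind of algorithm alluded to above, the sketch below runs discrete-time replicator dynamics on a small, made-up affinity matrix; the support of the converged weight vector is read off as a cluster. This is only a minimal sketch under the assumption of a symmetric, non-negative affinity matrix with zero diagonal, not the course's reference implementation.

    import numpy as np

    def replicator_dynamics(A, x=None, tol=1e-8, max_iter=10000):
        """Discrete-time replicator dynamics on an affinity matrix A.

        A is assumed symmetric, non-negative, with zero diagonal. Returns a
        probability vector over the vertices; entries with non-negligible
        weight are read off as a dominant-set cluster.
        """
        n = A.shape[0]
        if x is None:
            x = np.full(n, 1.0 / n)          # start from the barycenter of the simplex
        for _ in range(max_iter):
            Ax = A @ x
            x_new = x * Ax / (x @ Ax)        # multiplicative (replicator) update
            if np.linalg.norm(x_new - x, 1) < tol:
                x = x_new
                break
            x = x_new
        return x

    # toy affinity matrix: a tight group {0, 1, 2} and a looser pair {3, 4}
    A = np.array([[0.0, 0.9, 0.8, 0.1, 0.0],
                  [0.9, 0.0, 0.85, 0.05, 0.1],
                  [0.8, 0.85, 0.0, 0.0, 0.05],
                  [0.1, 0.05, 0.0, 0.0, 0.7],
                  [0.0, 0.1, 0.05, 0.7, 0.0]])
    x = replicator_dynamics(A)
    print(np.round(x, 3))                          # weights concentrate on one group
    print("dominant set:", np.where(x > 1e-4)[0])  # expected: vertices 0, 1, 2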

Pre-requisites: Linear algebra and calculus. Some familiarity with the basic notions of optimization theory and dynamical systems.

Lecturer:

Marcello Pelillo is an associate professor of computer science at Ca' Foscari University, Venice, where he also serves as the chair of the board of studies of the Computer Science School. He held visiting research positions at Yale University, University College London, McGill University, the University of Vienna, and York University (England). He has published more than 100 technical papers in refereed journals, handbooks, and conference proceedings in the areas of computer vision, pattern recognition and neural computation. He was actively involved in the organization of several scientific meetings and, in 1997, he co-established a new series of international workshops devoted to energy minimization methods in computer vision and pattern recognition (EMMCVPR), which has now reached its sixth edition (www.emmcvpr.org). He is regularly on the program committees of the major international meetings in his fields and serves on the editorial boards of the journals IEEE Transactions on Pattern Analysis and Machine Intelligence and Pattern Recognition. Prof. Pelillo is a Senior Member of the IEEE and a Fellow of the IAPR.


Learning Theory


Title: Unifying Risk, Divergence and Information

Content: Dual views of Binary Classification. Scoring Rules. Information, Risk and Divergence. Divergence Estimation.

Course Description: This course highlights some relationships between surrogate losses, scoring rules, f-divergences, Bregman divergences, statistical information and ROC curves and their implications for applications such as divergence estimation.
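
To give a flavour of the relationships the course explores, the sketch below checks numerically that the KL divergence between two made-up discrete distributions coincides both with the f-divergence generated by f(t) = t log t and with the Bregman divergence generated by negative entropy. The distributions and helper names are illustrative only, not part of the course material.

    import numpy as np

    def kl(p, q):
        # Kullback-Leibler divergence between discrete distributions
        return np.sum(p * np.log(p / q))

    def f_divergence(p, q, f):
        # D_f(p || q) = sum_i q_i * f(p_i / q_i)
        return np.sum(q * f(p / q))

    def bregman(p, q, phi, grad_phi):
        # B_phi(p, q) = phi(p) - phi(q) - <grad phi(q), p - q>
        return phi(p) - phi(q) - np.dot(grad_phi(q), p - q)

    p = np.array([0.5, 0.3, 0.2])
    q = np.array([0.2, 0.5, 0.3])

    neg_entropy = lambda x: np.sum(x * np.log(x))
    grad_neg_entropy = lambda x: np.log(x) + 1.0

    print(kl(p, q))
    print(f_divergence(p, q, lambda t: t * np.log(t)))   # f(t) = t log t
    print(bregman(p, q, neg_entropy, grad_neg_entropy))  # generator: negative entropy
    # all three print the same value (about 0.224)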

Pre-requisites: Basic probability theory and statistics, some familiarity with convex analysis.

Lecturer:

Mark Reid is a post-doctoral Research Fellow in the Research School of Information Sciences and Engineering at the Australian National University, Canberra.


Unsupervised Learning


Title: Un-supervised, Semi-supervised and Partially-supervised Learning

Course Description: The first part of this tutorial will discuss un-supervised, semi-supervised and partially-supervised learning. Convex relaxations will be presented for un-supervised and semi-supervised training of support vector machines, max-margin Markov networks, log-linear models and Bayesian networks. The concept of partially-supervised training will then be introduced, with convex relaxations developed for training multi-layer perceptrons and deep networks. Relationships of these methods to classical training algorithms (EM, Viterbi-EM, and self-supervised training) will be discussed. Limitations of convex relaxations will also be considered. The tutorial will then present methods for scaling up such training algorithms. Finally, some simple approximation bounds will be introduced, along with a rudimentary generalization theory for self-supervised training.
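
The convex relaxations themselves do not fit in a short snippet, but the classical self-training baseline the tutorial relates them to does. The sketch below is an illustrative assumption: synthetic two-blob data with only two labelled points, scikit-learn's LogisticRegression as the base learner, and confidently predicted points promoted to labels each round. It is not the tutorial's method.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    def self_train(X_lab, y_lab, X_unlab, threshold=0.95, max_rounds=10):
        """Classical self-training: repeatedly fit a classifier on the labelled
        pool and promote confidently predicted unlabelled points to labels."""
        X, y = X_lab.copy(), y_lab.copy()
        pool = X_unlab.copy()
        clf = LogisticRegression()
        for _ in range(max_rounds):
            clf.fit(X, y)
            if len(pool) == 0:
                break
            proba = clf.predict_proba(pool)
            keep = proba.max(axis=1) >= threshold
            if not keep.any():
                break
            X = np.vstack([X, pool[keep]])
            y = np.concatenate([y, clf.classes_[proba[keep].argmax(axis=1)]])
            pool = pool[~keep]
        return clf

    # two Gaussian blobs, with labels for only one point per blob
    rng = np.random.default_rng(0)
    X_lab = np.array([[-2.0, 0.0], [2.0, 0.0]])
    y_lab = np.array([0, 1])
    X_unlab = np.vstack([rng.normal([-2, 0], 0.5, (50, 2)),
                         rng.normal([2, 0], 0.5, (50, 2))])
    model = self_train(X_lab, y_lab, X_unlab)
    print(model.predict([[-1.5, 0.3], [1.7, -0.2]]))   # expect [0 1]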

Dual Dynamic Programming and Reinforcement Learning

The second presentation will cover a dual approach to reinforcement learning and dynamic programming for Markov decision processes. This dual approach is based on representing state-action visit distributions instead of value functions (which summarize expected future discounted returns). The alternative dual representation leads to a new family of dynamic programming and reinforcement learning algorithms that converge in situations where standard primal approximation strategies diverge. Some applications will be discussed.
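
As a small numerical illustration of the primal/dual contrast for policy evaluation, the sketch below uses a made-up three-state Markov chain (a fixed policy already folded into the transition matrix) and checks that the value-function view and the visit-distribution view give the same expected discounted return. It sketches the general idea only, not the specific algorithms covered in the talk.

    import numpy as np

    # A 3-state MDP with a fixed policy folded in:
    # P[s, s'] is the transition matrix under the policy, r[s] the expected reward.
    P = np.array([[0.7, 0.3, 0.0],
                  [0.1, 0.6, 0.3],
                  [0.2, 0.0, 0.8]])
    r = np.array([1.0, 0.0, 2.0])
    gamma = 0.9
    mu0 = np.array([1.0, 0.0, 0.0])      # initial state distribution

    # Primal view: value function v = (I - gamma P)^{-1} r, return = mu0 . v
    v = np.linalg.solve(np.eye(3) - gamma * P, r)
    primal_return = mu0 @ v

    # Dual view: discounted state-visit distribution
    # d = (1 - gamma) * mu0^T (I - gamma P)^{-1}, return = d . r / (1 - gamma)
    d = (1 - gamma) * np.linalg.solve((np.eye(3) - gamma * P).T, mu0)
    dual_return = d @ r / (1 - gamma)

    print(primal_return, dual_return)    # identical up to numerical precision
    print(d.sum())                       # d is a proper probability distribution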

Lecturer:

Dale Schuurmans is a Professor in the Department of Computing Science at the University of Alberta.


Document Analysis


Title: Latent Variable Models for Document Analysis

Content: Introduction to NLP, document analysis and information access from a machine learning perspective; latent variable models in text applications.

Course Description: We will consider various problems in document analysis (named entity recognition, natural language parsing, information retrieval) and look at various probabilistic graphical models and algorithms for addressing these problems. This will not be exhaustive coverage of information extraction or natural language processing, but rather a look at some of the theory, methods and practice of particular cases.
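
As one concrete example of a latent variable model on text (the course does not prescribe a particular model or library), the sketch below fits a two-topic latent Dirichlet allocation model to four made-up snippets with scikit-learn; the corpus and parameter choices are illustrative only.

    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.decomposition import LatentDirichletAllocation

    docs = [
        "the match ended with a late goal and a penalty",
        "the team won the cup after a tense final",
        "the central bank raised interest rates again",
        "markets fell as inflation and rates worried investors",
    ]

    # bag-of-words counts, then a 2-topic LDA fitted by variational inference
    counts = CountVectorizer(stop_words="english").fit(docs)
    X = counts.transform(docs)
    lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)

    vocab = counts.get_feature_names_out()
    for k, topic in enumerate(lda.components_):
        top = topic.argsort()[-5:][::-1]
        print(f"topic {k}:", [vocab[i] for i in top])   # top words per topic
    print(lda.transform(X))                             # per-document topic proportions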

Pre-requisites: Graphical models, probabilistic models, maximum likelihood methods.

Lecturer:

Wray Buntine is a Principal Researcher in the Statistical Machine Learning Group at NICTA's Canberra Laboratory and an Adjunct Professor at the Research School of Information Sciences and Engineering at the Australian National University.


Group Theory in Machine Learning


Title: Aspects of Group Theory in Machine Learning

Content: Basic Theory, Orbit Counting, Coloring with Symmetry, Fastest Mixing Markov Chains with Symmetry. Fourier Transform on Groups. Integral Geometry in Pattern Analysis.

Course Description: This course covers diverse aspects of the role played by symmetry in pattern analysis and machine learning. It is designed to provide background knowledge using examples and to touch on current research topics without over-emphasizing formalizations and technical descriptions.
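
Orbit counting is the most self-contained of the listed topics, so here is a minimal sketch of Burnside's lemma on a made-up example: counting necklaces of n beads in k colors up to cyclic rotation. The function name and numbers are illustrative, not course material.

    from math import gcd

    def necklace_colorings(n, k):
        """Orbit counting via Burnside's lemma: number of k-colorings of an
        n-bead necklace, counted up to cyclic rotation.

        The rotation by j positions has gcd(n, j) cycles, and a coloring is
        fixed by it iff it is constant on each cycle, so that rotation fixes
        k**gcd(n, j) colorings; Burnside averages these counts over the group.
        """
        return sum(k ** gcd(n, j) for j in range(n)) // n

    # 6 beads, 2 colors: 14 distinct necklaces up to rotation
    print(necklace_colorings(6, 2))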

Pre-requisites: Basic algebra

Lecturer:

Marconi Barbosa is a Researcher at NICTA's Canberra Laboratory, where he is a member of the Statistical Machine Learning research group. He is also an Adjunct Research Fellow at the Research School of Information Sciences and Engineering, Australian National University.


Introduction to Statistical Machine Learning


Content: Bayesian inference and maximum likelihood modeling; regression, classification, density estimation, clustering, principal component analysis; parametric, semi-parametric, and non-parametric models; basis functions, neural networks, kernel methods, and graphical models; deterministic and stochastic optimization; overfitting, regularization, validation.

Course Description: This course provides a brief overview of the methods and practice of statistical machine learning, which is concerned with the development of algorithms and techniques that learn from observed data by constructing stochastic models that can be used for making predictions and decisions. The idea of the course is (a) to give a mini-introduction and background for logicians interested in the AI courses, and (b) to summarize the core concepts covered by the machine learning courses during this week.
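
Several of the listed themes (basis functions, regularization, overfitting, validation) can be seen in one tiny experiment. The sketch below fits a degree-9 polynomial to noisy samples of a sine curve by regularized least squares and picks the regularization strength on a held-out validation set; all data and constants are made up for illustration.

    import numpy as np

    rng = np.random.default_rng(0)

    # noisy samples of a sine curve, split into training and validation sets
    x = rng.uniform(0, 1, 40)
    y = np.sin(2 * np.pi * x) + 0.2 * rng.standard_normal(40)
    x_tr, y_tr, x_val, y_val = x[:25], y[:25], x[25:], y[25:]

    def design(x, degree=9):
        # polynomial basis functions phi_j(x) = x^j
        return np.vander(x, degree + 1, increasing=True)

    def ridge_fit(x, y, lam, degree=9):
        Phi = design(x, degree)
        # closed-form regularized least squares: (Phi^T Phi + lam I)^{-1} Phi^T y
        return np.linalg.solve(Phi.T @ Phi + lam * np.eye(Phi.shape[1]), Phi.T @ y)

    # model selection: pick the regularization strength by validation error
    for lam in [0.0, 1e-6, 1e-3, 1e-1]:
        w = ridge_fit(x_tr, y_tr, lam)
        err = np.mean((design(x_val) @ w - y_val) ** 2)
        print(f"lambda={lam:g}  validation MSE={err:.3f}")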

Lecturer:

Marcus Hutter is an Associate Professor in the RSISE at the Australian National University in Canberra and an adjunct at NICTA.


Computer Vision


Title: Pseudo-Boolean and Discrete Optimization and its Applications in Computer Vision

Course Description: A pseudo-boolean function is a function from the space B^n of boolean (0-1) vectors to the real numbers. Such functions occur naturally in problems in computer vision related to segmentation, where every pixel in an image should be labelled 0 or 1 so as to minimize a certain cost function. Although the minimization of such functions is in general NP-hard, many techniques have been developed to minimize certain classes of such functions. This is the topic of pseudo-boolean optimization, which will be the subject of this talk. Useful methods include graph-cut algorithms, message passing and linear programming relaxation. The extension to functions with a finite label set will also be considered.
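
The sketch below shows, on a made-up one-dimensional "image" of four pixels, the standard reduction of a submodular binary labelling energy to a minimum s-t cut, solved here with networkx. The unary and pairwise costs are invented for illustration, and this is only a minimal instance of the graph-cut technique mentioned above.

    import networkx as nx

    # Binary labelling of 4 "pixels" on a line: minimize
    #   E(x) = sum_i D_i(x_i) + sum_{(i,j)} w_ij * [x_i != x_j]
    # with non-negative (hence submodular) pairwise weights, solved exactly
    # by a minimum s-t cut.
    D = [(0.0, 5.0), (1.0, 4.0), (4.0, 1.0), (6.0, 0.0)]   # (cost of label 0, cost of label 1)
    edges = [(0, 1, 2.0), (1, 2, 2.0), (2, 3, 2.0)]        # (i, j, w_ij)

    G = nx.DiGraph()
    for i, (d0, d1) in enumerate(D):
        G.add_edge("s", i, capacity=d1)   # cut when pixel i takes label 1
        G.add_edge(i, "t", capacity=d0)   # cut when pixel i takes label 0
    for i, j, w in edges:
        G.add_edge(i, j, capacity=w)      # cut when x_i = 0 and x_j = 1
        G.add_edge(j, i, capacity=w)      # cut when x_j = 0 and x_i = 1

    cut_value, (source_side, sink_side) = nx.minimum_cut(G, "s", "t")
    labels = {i: (1 if i in sink_side else 0) for i in range(len(D))}
    print("minimum energy:", cut_value)   # expected 4.0 for this toy instance
    print("labelling:", labels)           # expected {0: 0, 1: 0, 2: 1, 3: 1}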

Lecturer:

Richard Hartley is a professor at the Australian National University and NICTA, best known for his work in geometric computer vision (multi-view geometry). Previously, he was employed for 16 years at General Electric Research Laboratories in Schenectady, New York, where he worked on many applications of computer vision to industrial problems.


Reinforcement Learning


Title: Introduction to Reinforcement Learning

Content: Markov Decision Processes (MDPs), Model-based MDP solutions (dynamic programming, policy iteration, linear programming), Model-free MDP solutions (Monte Carlo, Q-learning, Temporal Differences), Function approximation, Hybrid planning and learning.

Course Description: This course covers the theory and application of reinforcement learning: the task of learning to make optimal sequential decisions when given a delayed reward signal. Topics will include planning in known and unknown environments and will place equal emphasis on theoretical results and practical implementation issues in the context of various applications.
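
As a minimal, self-contained illustration of model-free learning from the topic list above, the sketch below runs tabular Q-learning with epsilon-greedy exploration on a made-up five-state chain in which the only reward is for reaching the right-most state; all constants are arbitrary choices for the example.

    import numpy as np

    # Tabular Q-learning on a small deterministic chain:
    # states 0..4, actions 0 = left, 1 = right, reward 1 for reaching state 4.
    n_states, n_actions = 5, 2
    gamma, alpha, eps = 0.95, 0.1, 0.2
    rng = np.random.default_rng(0)
    Q = np.zeros((n_states, n_actions))

    def step(s, a):
        s_next = max(0, s - 1) if a == 0 else min(n_states - 1, s + 1)
        reward = 1.0 if s_next == n_states - 1 else 0.0
        return s_next, reward, s_next == n_states - 1

    for episode in range(500):
        s = int(rng.integers(n_states - 1))          # random non-terminal start state
        for _ in range(50):
            # epsilon-greedy exploration
            a = int(rng.integers(n_actions)) if rng.random() < eps else int(Q[s].argmax())
            s_next, r, done = step(s, a)
            # Q-learning (off-policy temporal-difference) update
            Q[s, a] += alpha * (r + gamma * (0.0 if done else Q[s_next].max()) - Q[s, a])
            s = s_next
            if done:
                break

    # greedy policy should move right everywhere (the terminal state's entry is arbitrary)
    print("greedy policy (0=left, 1=right):", Q.argmax(axis=1))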

Pre-requisites: Basic discrete probability theory.

Lecturer:

Scott Sanner is a Researcher at NICTA's Canberra Laboratory, where he is a member of the Statistical Machine Learning research group. He is also an Adjunct Research Fellow at the Research School of Information Sciences and Engineering, Australian National University.


Graphical Models


Title: Graphical Models

Content: Basic Theory, Bayesian Networks, Markov Random Fields, Belief Propagation, Junction Trees, Sampling, EM-algorithm, Applications

Course Description: This course covers the basics of Probabilistic Graphical Models, including the basic theory of Bayesian Networks and Markov Random Fields, as well as inference and learning algorithms and applications.
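
To make the inference side concrete, here is a minimal sketch of exact inference by enumeration in a toy sprinkler-style Bayesian network with made-up conditional probability tables; real applications would use the more scalable algorithms listed above (belief propagation, junction trees, sampling).

    from itertools import product

    # Toy Bayesian network: Cloudy -> Sprinkler, Cloudy -> Rain,
    # (Sprinkler, Rain) -> WetGrass, with illustrative CPT numbers.
    P_C = {1: 0.5, 0: 0.5}
    P_S_given_C = {1: 0.1, 0: 0.5}              # P(S=1 | C=c)
    P_R_given_C = {1: 0.8, 0: 0.2}              # P(R=1 | C=c)
    P_W_given_SR = {(1, 1): 0.99, (1, 0): 0.9,  # P(W=1 | S=s, R=r)
                    (0, 1): 0.9,  (0, 0): 0.01}

    def joint(c, s, r, w):
        # product of the local conditional distributions (chain rule for BNs)
        p = P_C[c]
        p *= P_S_given_C[c] if s else 1 - P_S_given_C[c]
        p *= P_R_given_C[c] if r else 1 - P_R_given_C[c]
        p *= P_W_given_SR[(s, r)] if w else 1 - P_W_given_SR[(s, r)]
        return p

    # exact inference by enumeration: P(Rain = 1 | WetGrass = 1)
    num = sum(joint(c, s, 1, 1) for c, s in product([0, 1], repeat=2))
    den = sum(joint(c, s, r, 1) for c, s, r in product([0, 1], repeat=3))
    print(num / den)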

Pre-requisites: Basic linear algebra, statistics and calculus.

Lecturer:

Tiberio Caetano is a senior researcher at NICTA's Canberra Laboratory, where he is a member of the Statistical Machine Learning research group. He is also an Adjunct Research Fellow at the Research School of Information Sciences and Engineering, Australian National University.


Data Mining


Content: Contrast pattern mining and its applications

Course Description: The ability to distinguish, differentiate and contrast between different datasets is a key objective in data mining. Such an ability can assist domain experts to understand their data, and can help in building classification models. This presentation will introduce the principal techniques for contrasting datasets. It will also focus on some important real world application areas that illustrate how mining contrasts is advantageous.
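
As a toy illustration of the idea (not of the specific techniques the presentation covers), the sketch below scans small itemsets in two made-up transaction datasets and reports those whose support in the first dataset is both high and much larger than in the second, a simple "growth rate" notion of contrast.

    from itertools import combinations

    # Two small transaction datasets, e.g. records from two classes of patients.
    D1 = [{"a", "b"}, {"a", "c"}, {"a", "b", "c"}, {"b", "d"}]
    D2 = [{"c", "d"}, {"b", "c", "d"}, {"c", "d"}, {"a", "d"}]

    def support(itemset, data):
        return sum(itemset <= t for t in data) / len(data)

    items = sorted(set().union(*D1, *D2))
    contrasts = []
    for size in (1, 2):
        for itemset in combinations(items, size):
            s = frozenset(itemset)
            s1, s2 = support(s, D1), support(s, D2)
            # growth rate: how much more frequent the pattern is in D1 than in D2
            growth = float("inf") if s2 == 0 else s1 / s2
            if s1 >= 0.5 and growth >= 2:
                contrasts.append((set(s), s1, s2, growth))

    for pattern, s1, s2, growth in contrasts:
        print(pattern, f"support D1={s1:.2f}, D2={s2:.2f}, growth={growth}")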

Lecturer:

Professor Ramamohanarao (Rao) Kotagiri received his BE from Andhra University, his ME from the Indian Institute of Science, Bangalore, and his PhD from Monash University. He was awarded the Alexander von Humboldt Fellowship in 1983. He has been at the University of Melbourne since 1980 and was appointed a professor in computer science in 1989. Rao has held several senior positions, including Head of Computer Science and Software Engineering, Head of the School of Electrical Engineering and Computer Science at the University of Melbourne, Deputy Director of the Centre for Ultra Broadband Information Networks, Co-Director of the Key Centre for Knowledge-Based Systems, and Research Director for the Cooperative Research Centre for Intelligent Decision Systems. He served as a member of the Australian Research Council Information Technology Panel and on the Prime Minister's Science, Engineering and Innovation Council working party on Data for Scientists. He also served on the Editorial Board of the Computer Journal. At present he is on the editorial boards of Universal Computer Science, the Journal of Knowledge and Information Systems, IEEE TKDE (Transactions on Knowledge and Data Engineering), the Journal of Statistical Analysis and Data Mining, and the VLDB (Very Large Data Bases) Journal. He has served as a program committee member of several international conferences, including SIGMOD, IEEE ICDM, VLDB, ICLP and ICDE, and was Program Co-Chair for the VLDB, PAKDD, DASFAA and DOOD conferences. He is a steering committee member of IEEE ICDM, PAKDD and DASFAA. Rao is a Fellow of the Institution of Engineers Australia, the Australian Academy of Technological Sciences and Engineering, and the Australian Academy of Science. His research interests include database systems, logic-based systems, agent-oriented systems, information retrieval, data mining, intrusion detection and machine learning.