Great Ocean Road

We work on deep learning, from fundamental mathematical theory through to real-world applications.

There is a new industrial revolution on the way, and it is imperative that Australia increases the growth rate of research and applied expertise in deep learning. To this end we aim to publish research in the top conferences (e.g. NeurIPS, ICML, ICLR) and train Masters and PhD students in this quickly emerging field. We hope that some of these students will found companies, providing employment for other mathematicians and contributing to Australian productivity growth through new forms of perceptual and cognitive automation.

We are part of the School of Mathematics and Statistics at the University of Melbourne. We run a seminar on deep learning. We are looking for highly motivated students to join our group, at either Masters or PhD level (see the Research projects section below). Your interests can range from the engineering aspects of deep learning all the way through to the algebraic geometry and statistics of neural networks.

Curious? Feel free to drop by for a chat in our public office hours on Zoom (on hiatus for one week while a new time is decided; see Discord). Our Discord hosts a deep learning study group; all are welcome.

People

The group involves five faculty from across the School of Mathematics and Statistics. The primary researchers are those for whom deep learning is a major component of their overall research agenda:

  • Mingming Gong
  • Daniel Murfet
  • Susan Wei

The other researchers in the group, for whom deep learning is a minor research area, are:

  • Jesse Gell-Redman
  • Thomas Quella

Our chief composer is Lucas Cantor and our favourite piece of music is his Softbank Sinfonia.

Research projects

Students affiliated with the group are primarily supervised by one of Gong, Wei, or Murfet and are expected to participate in the group seminar. We supervise students at both Masters and PhD level. Here are some of the currently active projects for which we are seeking student contributors:

  • Singular learning theory: (led by Susan Wei, Daniel Murfet, Jesse Gell-Redman, Thomas Quella) Applications of algebraic geometry and stochastic processes to the development of a foundational theory of deep learning, following the work of Sumio Watanabe (the central formula is sketched after this list).

  • Causal Discovery: (led by Mingming Gong) develop methods to infer causal graphs from various kinds of observational data, for example incomplete time series, noisy data, and nonstationary/heterogeneous data.

  • Causal Domain Adaptation: (led by Mingming Gong) leverage causal information to develop machine learning models that can adapt to distributions different from the training distribution.

  • Deep Generative Models: (led by Mingming Gong) leverage the power of neural networks to learn a transformation that maps a simple latent distribution to an approximation of the true data distribution (a minimal code sketch follows this list).

  • Fairness in deep learning: (led by Susan Wei) develop and implement statistical methods to counter algorithmic bias, by improving techniques for imposing invariance on deep learning algorithms.

  • Reasoning in deep reinforcement learning: (led by Daniel Murfet) in follow-up work to the simplicial Transformer, we are applying these methods to the study of error-correcting codes in the design of topological quantum computers, along the lines of Sweke et al. (joint with James Wallbridge and James Clift). There are a variety of other possible projects in the context of deep reinforcement learning and Transformer architectures for scientific applications.

  • Program synthesis in linear logic: (led by Daniel Murfet) building on a series of recent papers with James Clift, we are using differential linear logic to lay the foundations for a theory of gradient-based program synthesis (survey), also in the context of singular learning theory. This project involves logic as well as implementation in TensorFlow or PyTorch. These topics are discussed in a recent talk.
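
The singular learning theory entry above refers to a central formula. Here is a sketch, in generic notation rather than anything specific to the group's papers, of Watanabe's asymptotic expansion of the Bayes free energy, in which a birational invariant replaces the parameter count of classical model selection:

    % Watanabe's free energy asymptotics (a sketch; notation is generic).
    % n: sample size; L_n: empirical negative log likelihood; w_0: a true
    % parameter; \lambda: real log canonical threshold (RLCT); m: multiplicity.
    F_n = n L_n(w_0) + \lambda \log n - (m - 1) \log \log n + O_p(1)
    % For regular models \lambda = d/2 (half the parameter count) and m = 1,
    % recovering the BIC penalty; for singular models such as neural
    % networks \lambda is typically much smaller.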

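To make the deep generative models entry concrete, here is a minimal sketch, not the group's code, of the underlying idea: train a generator network so that latent noise pushed through it approximates a target distribution. The example uses a tiny GAN in PyTorch; the architecture, data, and hyperparameters are all illustrative assumptions.

    # Minimal GAN sketch (illustrative only): fit a generator G so that
    # G(z), z ~ N(0, I), approximates a 1-D Gaussian data distribution.
    import torch
    import torch.nn as nn

    torch.manual_seed(0)
    latent_dim = 4

    # Generator: maps latent noise to samples in (1-D) data space.
    G = nn.Sequential(nn.Linear(latent_dim, 32), nn.ReLU(), nn.Linear(32, 1))
    # Discriminator: scores how "real" a sample looks.
    D = nn.Sequential(nn.Linear(1, 32), nn.ReLU(), nn.Linear(32, 1))

    opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
    opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
    bce = nn.BCEWithLogitsLoss()
    ones, zeros = torch.ones(64, 1), torch.zeros(64, 1)

    for step in range(2000):
        real = 3.0 + 0.5 * torch.randn(64, 1)  # "true" distribution: N(3, 0.25)
        fake = G(torch.randn(64, latent_dim))  # generator pushforward of N(0, I)

        # Train D to separate real samples from generated ones.
        loss_d = bce(D(real), ones) + bce(D(fake.detach()), zeros)
        opt_d.zero_grad()
        loss_d.backward()
        opt_d.step()

        # Train G so its samples fool D, pulling G(z) towards the data.
        loss_g = bce(D(fake), ones)
        opt_g.zero_grad()
        loss_g.backward()
        opt_g.step()

    # The mean of generated samples should approach 3.0.
    print(G(torch.randn(1000, latent_dim)).mean().item())

GANs are only one realisation of this idea; variational autoencoders and normalizing flows train the same kind of pushforward map with different objectives.
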
The required background for these projects varies widely. For the more engineering-led projects you should already be a highly competent programmer, and some kind of coding test may be part of the application process. For the more theory-led projects we are looking for students with a strong pure mathematics background and basic programming skills (and the willingness to quickly develop those skills).

To apply, send an email to one of the primary supervisors (Gong, Wei, or Murfet) with your CV and transcript. Note that the official process is no different from a normal Masters or PhD application; in particular, we do not currently have any extraordinary scholarships to offer.

Events

We run a research seminar on a range of topics within deep learning; it is on hiatus for Semester 1 of 2020. For past seminars, see here. The best way to be notified of upcoming deep learning classes, bootcamps, or seminars run by the group is to subscribe to the group mailing list.