Yani Ioannou, PhD

Dept. of Electrical and Software Engineering
Schulich School of Engineering
University of Calgary

I'm an Assistant Professor in the Department of Electrical and Software Engineering at the Schulich School of Engineering, University of Calgary, where I lead the Calgary Machine Learning Lab.

I was previously a Postdoctoral Research Fellow at the Vector Institute and University of Guelph, working with Prof. Graham Taylor, and a Visiting Researcher at Google Brain Toronto/Google AR Core.

I completed my PhD at the University of Cambridge in 2018 supported by a Microsoft Research Ph.D. Scholarship, where I was supervised by Professor Roberto Cipolla and Dr. Antonio Criminisi.

I am currently interested in efficient deep learning, particularly for computer vision, and in sparse neural network training. In the past, I have worked on exoplanet detection with NASA, medical imaging, and 3D computer vision methods for processing and recognizing objects in large point clouds.

Selected Publications

For a full publications list, see Google Scholar.

Dynamic Sparse Training with Structured Sparsity

M. Lasby, A. Golubeva, U. Evci, M. Nica, Y. Ioannou

International Conference on Learning Representations (ICLR) 2024, Vienna, Austria.

Gradient Flow in Sparse Neural Networks and How Lottery Tickets Win

*U. Evci, *Y. Ioannou, C. Keskin, Y. Dauphin
* Equal Contribution

Oral Presentation
36th AAAI Conference on Artificial Intelligence (AAAI) 2022, Vancouver, BC, Canada

Rapid Classification of TESS Planet Candidates with Convolutional Neural Networks

H. Osborn, M. Ansdell, Y. Ioannou, M. Sasdelli, et al.

Astronomy & Astrophysics
Volume 633, Number A53
January, 2020

Scientific Domain Knowledge Improves Exoplanet Transit Classification with Deep Learning

M. Ansdell, Y. Ioannou, H. P. Osborn, M. Sasdelli, et al.

The Astrophysical Journal Letters
Volume 869, Number 1
December 5, 2018

Structural Priors in Deep Neural Networks

Y. Ioannou

Ph.D. Thesis, Department of Engineering, University of Cambridge, Sept. 2017.
PDF (Print) BibTeX LaTeX Source

Deep Roots: Improving CNN Efficiency with Hierarchical Filter Groups

Y. Ioannou, D. Robertson, R. Cipolla, A. Criminisi

IEEE Conference on Computer Vision and Pattern Recognition (CVPR) 2017, Honolulu, Hawaii.
BibTeX Poster

Training CNNs with Low-Rank Filters for Efficient Image Classification

Y. Ioannou, D. Robertson, J. Shotton, R. Cipolla, A. Criminisi

International Conference on Learning Representations (ICLR) 2016, San Juan, Puerto Rico.
BibTeX Poster Models

Decision Forests, Convolutional Networks and the Models in-Between

Y. Ioannou, D. Robertson, D. Zikic, P. Kontschieder, J. Shotton, M. Brown, A. Criminisi

Microsoft Research Tech Report
MSR-TR-2015-58, April, 2015.

Difference of Normals as a Multi-Scale Operator in Unorganized Point Clouds

Y. Ioannou, B. Taati, R. Harrap, M. Greenspan

3D Imaging, Modeling, Processing, Visualization and Transmission (3DIMPVT) 2012, Zürich, Switzerland.
BibTeX Poster

Automatic Urban Modelling using Mobile Urban LIDAR Data

Y. Ioannou

Thesis (M.Sc. Computing), Queen's University, Canada, 2010.


Courses I'm Teaching or Have Taught Recently


Recent Invited Talks and Lectures

Aligning Research in Sparsity with Hardware

Workshop on Sparsity in Neural Networks,
ICLR 2023, Kigali, Rwanda

May 5, 2023

Training Unstructured Sparse Neural Networks

Visual Computing Lab, Electronic Engineering Department
Hanyang University, South Korea

August 18, 2022

Inductive Bias in Deep Learning

IEEE Southern Alberta Computer Society Chapter,
University of Calgary, Calgary, Alberta, Canada

April 20, 2022

Curriculum Vitae/Resume

See LinkedIn for more information.

Contact Me

The preferred channels for career and research inquiries are linked below; alternatively, e-mail me.

You can also find out more about me through one of the many social links below.