Spring 2020

April 21st: Stefanie Jegelka – “Representation and Learning in Graph Neural Networks”

April 28th: Mads Nielsen & Akshay Pai – “Risk assessment of severe COVID-19 infection”

May 26th: Ravi Shankar & Ambar Pal – “Non-Parallel Emotion Conversion in Speech via Variational Cycle-GAN” & “A Regularization view of Dropout in Neural Networks”

July 16th: Eli Sherman – “Identification Theory in Segregated Graph Causal Models”

Fall 2020

September 1st: Enzo Ferrante – “Towards anatomically plausible medical image segmentation, registration and reconstruction”

September 8th: Anima Anandkumar – “Bridging the Gap Between Artificial and Human Intelligence: Role of Feedback”

September 15th: Giles Hooker – “Ensembles of Trees and CLTs: Inference and Machine Learning”

September 22nd: Jelena Diakonikolas – “On Min-Max Optimization and Halpern Iteration”

September 29th: Tom Goldstein – “Evasion and poisoning attacks on neural networks: theoretical and practical perspectives”

October 6th: Daniel Hsu – “Contrastive learning, multi-view redundancy, and linear models”

October 13th: Kate Saenko – “Learning from Small and Biased Datasets”

October 27th: Rama Chellappa – “Generations of Generative Models for Images and Videos with Applications”

November 3rd: Adam Charles – “Data Science in Neuroscience: From Sensors to Theory”

November 10th: SueYeon Chung – “Emergence of Separable Geometry in Deep Networks and the Brain”

November 17th: Kimia Ghobadi – “Inverse Optimization”

November 24th: Poorya Mianjy – “Understanding the Algorithmic Regularization due to Dropout”

December 1st: Eva Dyer – “Representation learning and alignment in biological and artificial neural networks”

December 15th: Ida Momennejad – “Multi-scale Predictive Representations”

Spring 2021

January 26th: Surya Ganguli – “Weaving together machine learning, theoretical physics, and neuroscience”

February 2nd: Wiro Niessen – “Biomedical Imaging and Genetic Data Analysis With AI: Towards Precision Medicine”

February 16th: Andrej Risteski – “Representational aspects of depth and conditioning in normalizing flows”

February 23rd: Mario Sznaier – “Easy, hard or convex? The role of sparsity and structure in learning dynamical models”

March 2nd: Lalitha Sankar – “Alpha-loss: A Tunable Class of Loss Functions for Robust Learning”

March 9th: Daniela Witten – “Selective inference for trees”

March 16th: Smita Krishnaswamy – “Geometric and Topological Approaches to Representation Learning in Biomedical Data”

March 23rd: Rong Ge – “A Local Convergence Theory for Mildly Over-Parameterized Two-Layer Neural Network”

March 30th: Juan Carlos Niebles – “Event Understanding: a Cornerstone of Visual Intelligence”

April 6th: Maria De-Arteaga – “Mind the gap: From predictions to ML-informed decisions”

April 13th: Kristen Grauman – “Sights, sounds, and space: Audio-visual learning in 3D environments”

April 20th: Su-In Lee – “Explainable Artificial Intelligence for Biology and Health”

April 27th: Sharon Yixuan Li – “Towards Reliable Open-world Machine Learning”

Fall 2021

August 31st: Ben Grimmer – “Radial Duality: Scalable, Projection-Free Optimization Methods”

September 7th: Praneeth Netrapalli – “Pitfalls of Deep Learning”

September 14th: Sara A. Solla – “Population Dynamics in Neural Systems”

September 21st: Mário Figueiredo – “Three Recent Short Stories About Image Denoising”

September 28th: Betsy Ogburn – “Disentangling confounding and nonsense associations due to dependence”

October 5th: Alex Dimakis – “Generative models and Unsupervised methods for Inverse problems”

October 12th: Bruno Olshausen – “Perception as Inference”

October 19th: Claire Boyer – “Sampling rates for l1-synthesis”

October 26th: Laura Balzano – “Finding low-dimensional structure in messy data”

November 2nd: Pragya Sur – “Precise high-dimensional asymptotics for AdaBoost via max-margins & min-norm interpolants”

November 9th: Jean-Philippe Vert – “Deep learning for biological sequences”

November 16th: Anqi (Angie) Liu – “Towards Trustworthy AI: Distributionally Robust Learning under Data Shift”

November 23rd: Fall Break

November 30th: Soledad Villar – “Equivariant Machine Learning, Structured Like Classical Physics”