•   When: Tuesday, February 16, 2021 from 11:00 AM to 12:00 PM
•   Speaker: Vaggos Chatziafratis, Visiting Faculty Researcher, Google Research
•   Location: Zoom

Abstract: 

In this talk, we shed new light on two basic questions in machine learning using ideas from approximation algorithms, optimization and dynamical systems. 

The first question concerns Hierarchical Clustering, a popular tool in unsupervised learning that partitions a dataset in a hierarchical manner. Despite its long history and a plethora of heuristics, a principled framework for understanding its optimization properties had been missing. Our work takes a formal approach and puts Hierarchical Clustering on firm theoretical ground, highlighting new connections to convex optimization and graph algorithms.
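
To make the object concrete, below is a minimal sketch of bottom-up (agglomerative) hierarchical clustering in Python. The single-linkage rule and all names here are illustrative choices of ours, not the talk's, which concerns the optimization objectives such trees should minimize rather than any particular merging heuristic.

    # A minimal sketch of bottom-up (agglomerative) hierarchical clustering.
    # Illustrative only: the talk studies objectives for such trees,
    # not this particular heuristic.
    from itertools import combinations

    def agglomerative(points):
        """Repeatedly merge the two closest clusters (single linkage).

        Returns a nested-tuple tree over the indices of `points`.
        """
        # Each cluster is a pair (tree, member_indices); leaves are bare indices.
        clusters = [(i, [i]) for i in range(len(points))]

        def dist(a, b):
            return sum((x - y) ** 2 for x, y in zip(points[a], points[b])) ** 0.5

        while len(clusters) > 1:
            # Pick the pair of clusters with the smallest single-linkage distance.
            i, j = min(
                combinations(range(len(clusters)), 2),
                key=lambda p: min(dist(a, b)
                                  for a in clusters[p[0]][1]
                                  for b in clusters[p[1]][1]),
            )
            (ti, mi), (tj, mj) = clusters[i], clusters[j]
            merged = ((ti, tj), mi + mj)  # new internal node with two children
            clusters = [c for k, c in enumerate(clusters) if k not in (i, j)]
            clusters.append(merged)

        return clusters[0][0]

    points = [(0.0, 0.0), (0.1, 0.0), (5.0, 5.0), (5.1, 5.0)]
    print(agglomerative(points))  # ((0, 1), (2, 3)): nearby points merge first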

The second question concerns the benefits of depth in neural networks. A crucial element in the success of deep learning is the deployment of progressively deeper networks, but is there a mathematical explanation behind this phenomenon? Introducing new ideas from discrete dynamical systems, we present depth vs width tradeoffs, showing that for certain tasks, depth can be exponentially more important than width.
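
A classical illustration of this phenomenon, in the same iterated-maps spirit though not necessarily the construction from the talk, is the tent map: a fixed width-2 ReLU layer computes it exactly, and composing k copies of that layer (depth k) yields a function that crosses the level 1/2 a full 2^k times, an oscillation count that a shallow network can match only with exponentially many units. A self-contained Python sketch:

    def relu(x):
        return max(x, 0.0)

    def tent(x):
        # Tent map: t(x) = 2x on [0, 1/2] and 2(1 - x) on [1/2, 1],
        # written as a single width-2 ReLU layer.
        return 2.0 * relu(x) - 4.0 * relu(x - 0.5)

    def deep_tent(x, depth):
        # Compose the same width-2 layer `depth` times.
        for _ in range(depth):
            x = tent(x)
        return x

    def crossings(f, n=100_000):
        # Count how often f crosses the level 1/2 on a fine grid over [0, 1].
        above = [f(i / n) > 0.5 for i in range(n + 1)]
        return sum(a != b for a, b in zip(above, above[1:]))

    for depth in range(1, 8):
        print(depth, crossings(lambda x: deep_tent(x, depth)))  # prints 2^depth

Running it prints crossing counts 2, 4, 8, ..., doubling with every extra layer while the width stays fixed at two.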

Speaker Bio: 

Vaggos Chatziafratis' primary interests are in Algorithms and Machine Learning Theory. He is currently a Visiting Faculty Researcher at Google Research in New York, hosted by Mohammad Mahdian and Vahab Mirrokni, where he is part of the Algorithms and Graph Mining team. Prior to that, he received his Ph.D. in Computer Science from Stanford, where he was part of the Theory group, advised by Tim Roughgarden and co-advised by Moses Charikar. His Ph.D. thesis was on algorithms and their limitations for Hierarchical Clustering. Before Stanford, he received a Diploma in EECS from the National Technical University of Athens, Greece.
