Season 1 - Episode 04: Energy functions and shortcut learning

This week we are joined by Kyunghyun Cho. He is an associate professor of computer science and data science at New York University, a research scientist at Facebook AI Research, and a CIFAR Associate Fellow. On top of this, he also co-chaired the recent ICLR 2020 virtual conference.

We talk about a variety of topics in this week's episode, including the recent ICLR conference, energy functions, shortcut learning, and the role that popularized deep learning research areas play in answering the question “What is intelligence?”

Underrated ML Twitter: https://twitter.com/underrated_ml

Kyunghyun Cho Twitter: https://twitter.com/kchonyc

Please let us know who you thought presented the most underrated paper in the form below:

https://forms.gle/97MgHvTkXgdB41TC8

Links to the papers:

"Shortcut Learning in Deep Neural Networks" - https://arxiv.org/pdf/2004.07780.pdf

"Bayesian Deep Learning and a Probabilistic Perspective of Generalization" - https://arxiv.org/abs/2002.08791

"Classifier-agnostic saliency map extraction" - https://arxiv.org/abs/1805.08249

"Deep Energy Estimator Networks" - https://arxiv.org/abs/1805.08306

"End-to-End Learning for Structured Prediction Energy Networks" - https://arxiv.org/abs/1703.05667

"On approximating ∇f with neural networks" - https://arxiv.org/abs/1910.12744

"Adversarial NLI: A New Benchmark for Natural Language Understanding" - https://arxiv.org/abs/1910.14599

"Learning the Difference that Makes a Difference with Counterfactually-Augmented Data" - https://arxiv.org/abs/1909.12434

"Learning Concepts with Energy Functions" - https://openai.com/blog/learning-concepts-with-energy-functions/
