I am a staff research scientist at DeepMind. My research interests are in machine learning and natural language processing.
I received my PhD from Carnegie Mellon University where I was advised by Noah Smith as a member of Noah's ARK.
Prior to CMU, I was a Monbukagakusho (MEXT) fellow at the University of Tokyo in the Tanaka-Ishii Lab.
Google Scholar, Twitter.
- November 2020: I gave talks at NYU and Cambridge on Semiparametric Language Models (video).
- November 2020: I will serve as an area chair for ACL 2021 and ICML 2021.
- July 2020: I gave a talk, Retrospectives on Learning Language Representations, at the ICML 2020 Retrospectives Workshop.
- June 2020: I will serve as a virtual infrastructure chair for EMNLP 2021.
- June 2020: I will serve as an area chair for ICLR 2021.
- April 2020: I will serve as an action editor for TACL (2020-2022).
- April 2020: A paper on the cross-lingual transferability of monolingual representations and a position paper on unsupervised cross-lingual learning have been accepted to ACL 2020.
- March 2020: I will serve as an area chair for NeurIPS 2020.
- February 2020: A new preprint on modeling latent skills for multitask language generation is out.
- February 2020: I will serve as an area chair for EMNLP 2020.
- December 2019: A paper on an information theoretic perspective of language representation learning has been accepted to ICLR 2020 as a spotlight presentation.
- December 2019: Flying to NeurIPS 2019. Check out our episodic memory paper!
- November 2019: I will serve as an area chair for ICML 2020.
- October 2019: I will serve as an area chair for ACL 2020.
- October 2019: AlphaStar in Nature!
- August 2019: I will serve as an area chair for ICLR 2020.