EMNLP 2020

Kumarmanas Nethil Dec 21, 2020

Individual summary notes from EMNLP 2020.

Amresh says

I have recently been following model explainability work for practical reasons, so I was delighted to see EMNLP experiencing a surge of similar research. I encountered Minimum Description Length (MDL) as a measure of a model's ability to capture linguistic properties. The core idea is to recast probing from predicting labels to transmitting them: if a representation encodes a property well, the labels can be compressed into fewer bits.
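That transmission view can be sketched as "online coding": a probe is trained on a growing prefix of the data, and each new chunk of labels costs -log2 p(y|x) bits under the current probe. A minimal toy sketch follows; the synthetic features stand in for frozen model representations, and the chunk sizes and data are illustrative, not any paper's actual setup:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 16))                               # stand-in for frozen embeddings
y = (X[:, 0] + 0.1 * rng.normal(size=1000) > 0).astype(int)   # stand-in linguistic label

# Transmit the labels in chunks: each chunk is encoded with a probe
# trained only on the data sent so far, at a cost of -log2 p(y|x) bits.
splits = [64, 128, 256, 512, 1000]
codelength = float(splits[0])  # first chunk sent with a uniform 1-bit-per-label code
for start, end in zip(splits[:-1], splits[1:]):
    probe = LogisticRegression(max_iter=1000).fit(X[:start], y[:start])
    p = probe.predict_proba(X[start:end])[np.arange(end - start), y[start:end]]
    codelength += -np.log2(p).sum()

print(f"online codelength: {codelength:.1f} bits for {len(y)} labels")
```

A lower codelength means the representations make the labels easier to transmit, i.e. the probed property is more readily extractable.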

Karthik says

The Dialogue and Interactive Systems track at EMNLP was very interesting and introduced novel techniques for some of the ML problems I am currently working on. For example, MAD-X is a framework for effective transfer learning to new languages using an adapter-based architecture, and TOD-BERT pre-trains Transformer LMs on open-source dialogue datasets for better few-shot learning, which could help mitigate the cold-start problem of an SLU module. The generative approaches to dialogue state tracking were also interesting, since they may generalise better in few-shot settings than discriminative ones.
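The adapter idea behind MAD-X can be sketched as a small bottleneck module inserted into each Transformer layer, with only the adapter weights trained for the new language or task. A minimal sketch of such a bottleneck (dimensions, initialisation, and the plain-NumPy forward pass are illustrative, not MAD-X's actual implementation):

```python
import numpy as np

def adapter(h, W_down, W_up):
    """Bottleneck adapter: down-project, nonlinearity, up-project, plus residual."""
    z = np.maximum(W_down @ h, 0.0)  # ReLU in the low-rank bottleneck
    return h + W_up @ z              # residual connection keeps the base model's output reachable

rng = np.random.default_rng(0)
d, r = 768, 64                       # hidden size and bottleneck size (assumed values)
h = rng.normal(size=d)               # stand-in for a Transformer hidden state
W_down = rng.normal(size=(r, d)) * 0.01
W_up = rng.normal(size=(d, r)) * 0.01
out = adapter(h, W_down, W_up)
print(out.shape)  # (768,)
```

Because only the small `W_down`/`W_up` matrices are trained per language, adapters make transfer to a new language cheap relative to fine-tuning the whole model.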

Lohith says

“Productising” ML models has become one of the primary answers to many of the complex problems we face in day-to-day life, and explaining how they work has become a bigger concern. I was happy to see a lot of papers and demos in that direction, like the LIT tool, which helps both developers and users understand the inner workings of an ML model: how it looks at each input and how the inputs feed into its decisions. I was also glad to see many papers applying NLP techniques to better, more efficient healthcare, which we can all agree is more important than ever, given what we have seen this year.

Manas says

The Insights from Negative Results in NLP workshop was a highlight for me. It revolved around what makes a research question interesting and how to connect ideas that didn't work out. These questions feel central to our team's research efforts, and the keynotes offered a rewarding, scientific route to exploring open problems in speech tech.

Swaraj says

The papers I liked from this year’s EMNLP were: “Incremental Processing in the Age of Non-Incremental Encoders” by Madureira & Schlangen (2020), which examined how close transformer encoders come to the way humans incrementally process natural language and provided a way to probe their inner workings; and “Digital Voicing of Silent Speech” by Gaddy & Klein (2020), which was presented very well; I felt they did a good job modelling the EMG features and achieved solid improvements on silent speech. There were also many papers and workshops on model interpretability, which is essential given the direction the field is heading, and the Gather sessions did a good job of ensuring one didn’t miss out on the social aspects of attending a conference.