SDR Deposit of the Month: Dissertation on AI breakthrough makes leaderboard

Occasionally I review the analytics for content published via the Stanford Digital Repository to see what is currently trending. Upon returning to my Lathrop desk in January after the winter break, I checked in and discovered that a dissertation submitted last month by student Danqi Chen had garnered a whopping 2,736 pageviews in just the four weeks since it was published on December 11, 2018. That is an extraordinarily impressive number! I had to find out why this publication was of such widespread interest.
Chen's work, titled "Neural Reading Comprehension and Beyond," describes her research to address "one of the most elusive and long-standing challenges of artificial intelligence": teaching machines to understand human language documents. Her work boils down to four significant contributions to the field (described in some detail on pages 4-5 of the dissertation), pointing the way toward advances in natural language processing through improved performance, proven and generalizable models, and potential applications for "question answering systems."
A quick Google search revealed that news of Chen's research had spread rapidly via Twitter and other newsfeeds devoted to machine learning topics. I reached out to Chen to let her know about the high SDR pageview numbers and asked her for a brief phone interview. She spoke with me from New Jersey, where she is starting as an Assistant Professor at Princeton University.
When I asked Chen why her dissertation was attracting such strong interest so quickly, she replied: "My thesis includes three papers produced in the course of my PhD studies, and offers the first comprehensive history of this particular field, which is young — it started in 2015 — and is very active right now.... The work is important for electronic processing." This characterization struck me as humble.
For additional perspective, I contacted Chen's dissertation advisor, Christopher Manning, Professor of Linguistics and Computer Science. He remarked, "Danqi Chen is a pioneer in taking neural network approaches to natural language understanding, and her simple and clean, highly successful models have drawn a lot of interest... The dissertation focuses on neural network reading comprehension and question answering. These emerging technologies are leading to much better information access methods: Rather than a search system that simply returns relevant documents, the system can actually answer your precise questions."
We congratulate Chen on her achievements as a Stanford graduate student and her status as a Stanford Digital Repository depositor: hers is the 6th most visited SDR PURL in 2018 and the 20th most visited SDR PURL in the past decade!