2021 was a very productive year for natural language processing (NLP) and machine learning (ML), so it is time to take stock of the papers the two fields published last year.
Marek Rei, a machine learning and natural language processing researcher from the University of Cambridge, has compiled and analyzed the statistics of ML and NLP publications in 2021, covering the field's major conferences and journals: ACL, EMNLP, NAACL, EACL, CoNLL, TACL, CL, NeurIPS, AAAI, ICLR, and ICML.
The analysis was carried out with a series of automated tools, which are not perfect and may contain some flaws and errors. For whatever reason, some authors have started publishing their papers in obfuscated form to prevent copying or automated content extraction; these papers were excluded from the analysis.
Now let's take a look at Marek Rei's results.
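To give a flavor of what such automated counting involves, here is a minimal Python sketch that tallies papers per venue and year from a list of metadata records. The record format and the sample entries are illustrative assumptions only, not Marek Rei's actual tooling or data.

```python
from collections import Counter

# Toy paper-metadata records; format and contents are placeholders.
papers = [
    {"venue": "ACL", "year": 2021, "orgs": ["Google"]},
    {"venue": "NeurIPS", "year": 2021, "orgs": ["CMU"]},
    {"venue": "NeurIPS", "year": 2020, "orgs": ["MIT"]},
]

# Count papers per (venue, year) pair.
per_venue_year = Counter((p["venue"], p["year"]) for p in papers)

for (venue, year), count in sorted(per_venue_year.items()):
    print(f"{venue} {year}: {count} paper(s)")
```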
The number of papers at most conferences continues to rise and break records. ACL appears to be the exception; AAAI has almost leveled off, while NeurIPS is still growing steadily.
The organization with the most published papers in 2021 is undoubtedly Google; Microsoft ranked second, with CMU, Stanford, Meta and MIT close behind and Tsinghua University in seventh place. Microsoft, CAS, Amazon, Tencent, Cambridge, Washington, and Alibaba stand out with a sizable proportion of their papers at NLP conferences, while the other top organizations appear to focus mainly on the ML venues.
Over 2012-2021, Google ranks first with 2,170 papers, just ahead of Microsoft's 2,013; CMU is third with 1,881 papers.
Most institutions continue to increase their annual publication counts. Google's output used to grow almost linearly; that growth has eased, but it still publishes more papers than before. CMU plateaued last year but has made up for it this year, and IBM appears to be the only major institution whose paper count is slightly declining.
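Per-organization counts like these can be tallied in a few lines, assuming each paper record lists its authors' affiliations. The crediting convention below (each distinct organization gets one count per paper) is a plausible choice for illustration, not necessarily the one used in the original analysis.

```python
from collections import defaultdict

# Sketch: each paper credits every distinct affiliation it lists once.
# Record format and crediting rule are assumptions for illustration.
def org_counts(papers):
    counts = defaultdict(lambda: defaultdict(int))  # org -> year -> count
    for p in papers:
        for org in set(p["orgs"]):
            counts[org][p["year"]] += 1
    return counts

# Placeholder records, not real data.
papers = [
    {"year": 2021, "orgs": ["Google", "CMU"]},
    {"year": 2021, "orgs": ["Google"]},
    {"year": 2020, "orgs": ["IBM"]},
]
print(dict(org_counts(papers)["Google"]))  # {2021: 2}
```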
Next, let's look at the researchers who published the most papers in 2021. Sergey Levine (Assistant Professor of Electrical Engineering and Computer Science at UC Berkeley) ranked first with 42 papers; Liu Tieyan (Microsoft), Zhou Jie (Tsinghua University), Mohit Bansal (University of North Carolina at Chapel Hill), and Graham Neubig (CMU) also rank highly.
Over the whole 2012-2021 period, Sergey Levine likewise ranks first: he was sixth last year and has jumped to the top this year. Yoshua Bengio (Montreal), Graham Neubig (CMU), Zhang Yue (Westlake University), Zhou Ming (Chief Scientist of Innovation Works), Ting Liu (Harbin Institute of Technology) and others also rank highly in total publication count.
Sergey Levine set a new record by a considerable margin. Mohit Bansal's output also increased significantly, with 31 papers in 2021, the same number as Graham Neubig; Yoshua Bengio's paper count dipped in 2020 but is now rising again.
The researchers who publish the most papers overall are usually postdocs and supervisors; in contrast, those who publish many papers as first author tend to be the ones doing the hands-on research.
Ramit Sawhney (Technical Director at Tower Research Capital) published 9 papers as first author in 2021, while Jason Wei (Google) and Tiago Pimentel (PhD student at the University of Cambridge) each published 6.
Over 2012-2021, Ivan Vulić (University of Cambridge) and Zeyuan Allen-Zhu (Microsoft) are tied for first place, each with 24 papers as first author; Yi Tay (Google) and Li Jiwei (Shannon Technology) follow with 23 and 22 first-author papers respectively; Ilias Diakonikolas (University of Wisconsin-Madison) has published 15 NeurIPS papers as first author.
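The distinction between overall counts and first-author counts boils down to how authorship is credited. A small sketch, using placeholder names and a hypothetical record format rather than real data:

```python
from collections import Counter

# Toy records; names and counts are placeholders, not real data.
papers = [
    {"year": 2021, "authors": ["A. Researcher", "B. Supervisor"]},
    {"year": 2021, "authors": ["A. Researcher", "C. Colleague"]},
    {"year": 2021, "authors": ["D. Student", "A. Researcher", "B. Supervisor"]},
]

# Overall count: credit every author on every paper.
overall = Counter(a for p in papers for a in p["authors"])
# First-author count: credit only the first name in each author list.
first_author = Counter(p["authors"][0] for p in papers)

print(overall["A. Researcher"])       # 3 papers overall
print(first_author["A. Researcher"])  # 2 papers as first author
```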
By country, the United States published the most papers in 2021, with China and the UK ranking second and third respectively. In the United States and the United Kingdom, NeurIPS accounts for the largest share of papers, while in China AAAI accounts for the largest share.
Almost all of the top-ranked countries continued to increase their publication counts and set new records in 2021; the United States saw the largest increase, further extending its lead.
In the United States, Google, Microsoft and CMU once again lead the list in terms of number of publications.
In China, Tsinghua University, the Chinese Academy of Sciences and Peking University published the most papers in 2021.
Statistics based on topic correlation
The authors can also be visualized, although this visualization is a bit harder to interpret.
Statistics based on keywords
The word “neural” seems to be on a slight downward trend, although you can still find it in 80% of papers. At the same time, the proportions of “recurrent” and “convolutional” are also declining, while the word “transformer” appears in more than 30% of papers.
Looking at the word “adversarial” on its own, it is very common at ICLR, where almost half of the papers mention it. Its share at ICML and NeurIPS seems to have already peaked, while at AAAI it has not yet. The term “transformer” has become very popular over the past few years: it is especially widespread in NLP papers, appearing in over 50% of them, and its popularity is still steadily increasing across all the ML conferences.
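Keyword percentages like these reduce to a simple ratio: for each year (or venue), the fraction of papers whose text contains the term. A minimal sketch, assuming the papers are available as plain text and using case-insensitive substring matching as a simplification of however the original analysis tokenized them:

```python
from collections import defaultdict

# Fraction of papers per year that mention a given keyword.
def keyword_fraction(papers, keyword):
    totals, hits = defaultdict(int), defaultdict(int)
    for p in papers:
        totals[p["year"]] += 1
        if keyword.lower() in p["text"].lower():
            hits[p["year"]] += 1
    return {year: hits[year] / totals[year] for year in totals}

# Placeholder texts, not real papers.
papers = [
    {"year": 2021, "text": "We fine-tune a Transformer encoder ..."},
    {"year": 2021, "text": "A convolutional baseline ..."},
    {"year": 2020, "text": "Recurrent neural networks ..."},
]
print(keyword_fraction(papers, "transformer"))  # {2021: 0.5, 2020: 0.0}
```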