ACL 2025 Opens Amid a Deepening Shift in Global NLP Research Participation
ACL 2025 has officially opened in Vienna, marking the 63rd edition of the premier NLP conference. This year's event is the largest ACL gathering to date, in both attendance and sheer volume of research output. The numbers reveal a continuation and amplification of the global realignment in NLP research.
Author Demographics: A Trend Intensifies
- All authors (~20 K total), ACL 2025: China 51.0% · United States 18.6% · South Korea 3.4% · United Kingdom 2.9% · Germany 2.6% · Singapore 2.4% · India 2.3% · Japan 1.6% · Australia 1.4% · Canada 1.3% · Italy 1.3% · France 1.2%
- First authors, ACL 2025: China 51.3% · United States 14.0% · South Korea 5.2% · India 4.6% · Germany 2.8%
- First authors, ACL 2024: China 30.5% · United States 29.5% · United Kingdom 4.5% · India 3.6% · Germany 3.4%
Chinese scholars now contribute over half of all authors. Rather than a sudden pivot, this is a steep acceleration of a trend already visible in 2024: among first authors, China's share rose from 30.5% in 2024 to 51.3% in 2025.
Conference by the Numbers
- Main conference papers: 1,700
- Findings papers: 1,400
- Industry track papers: 108
- CL (Computational Linguistics) journal track: 17
- TACL papers: 40
- Keynotes / Panels: 2 keynotes, 1 panel
- Workshops: 28 (800+ submissions)
- Tutorials: 8
- Demos: 64
- Student Research Workshop papers: 104
Additional diversity & inclusion (D&I) programming includes a Birds-of-a-Feather (BoF) and affinity-group lunch, mentorship events, and the introduction of the ACL Doctoral Dissertation Award.
Submission & Acceptance Trends
Submissions to ACL, the field's flagship conference, have grown roughly tenfold over the past 10 years and fourfold over the past 5 years.
- 2025 submissions: 8,360 in total (+70 % year over year), with 1,699 main-conference papers accepted (20.3 %) and 1,392 Findings papers accepted (16.7 %); a quick derivation of these rates follows this list.
- Desk rejections rose by 160 % (template issues 31 %, responsible checklist 27 %, anonymity 20 %, limitations section 17 %).
- Reviewing capacity expanded: 5,903 reviewers (+38 %), 1,122 ACs (+58 %), 169 SACs (+150 %).
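For reference, the acceptance rates above follow directly from the accepted and submitted counts; the combined main-plus-Findings figure in this back-of-the-envelope check is our own derivation, not an officially reported number.

```python
# Back-of-the-envelope check of the acceptance rates reported above.
# The combined main+Findings figure is derived here, not reported by ACL.
submitted = 8_360
main_accepted = 1_699
findings_accepted = 1_392

print(f"Main:     {main_accepted / submitted:.1%}")                        # ~20.3%
print(f"Findings: {findings_accepted / submitted:.1%}")                    # ~16.7%
print(f"Combined: {(main_accepted + findings_accepted) / submitted:.1%}")  # ~37.0%
```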
Research Focus Areas (Main Track)
- NLP Applications: 13.1 %
- Resources & Evaluation: 12.4 %
- Multimodality & Language Grounding: 7.3 %
- Efficient/Low-Resource Methods: 7.0 %
- Language Modeling: 6.6 %
- Interpretability & Analysis: 6.4 %
- Remaining categories (e.g., Generation, Dialogue, Ethics): ≤ 6 % each
Other Notable Statistics
- 67 % of titles/abstracts include “LLM”
- 9 % mention GPT
- 8 % mention LLaMA
- 2 % mention DeepSeek, BERT, or Gemini/Gemma
- 50 authors submitted ≥ 10 papers; 23 % of authors submitted ≥ 2 papers
- 250 papers list ≥ 10 authors; 20 papers are single-authored
- 65 % of paper titles contain a colon (:); a toy script for computing statistics like these appears below.
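The following is a hypothetical sketch of how title statistics like those above could be computed from a list of paper titles. The TITLES list and the keyword set are illustrative placeholders, not the actual ACL 2025 metadata or the organizers' methodology.

```python
# Hypothetical sketch: keyword and colon prevalence over a list of titles.
# TITLES and KEYWORDS are placeholders, not the real ACL 2025 data.

TITLES = [
    "Example Paper: An LLM-Based Study of Something",
    "Another Paper on Dialogue State Tracking",
    "GPT in the Wild: A Case Study",
]

KEYWORDS = ["LLM", "GPT", "LLaMA", "DeepSeek", "BERT", "Gemini", "Gemma"]

def share(predicate, titles):
    """Fraction of titles for which predicate(title) is True."""
    return sum(bool(predicate(t)) for t in titles) / len(titles)

for kw in KEYWORDS:
    frac = share(lambda t, kw=kw: kw.lower() in t.lower(), TITLES)
    print(f"Titles mentioning {kw}: {frac:.0%}")

print(f"Titles containing a colon: {share(lambda t: ':' in t, TITLES):.0%}")
```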
The ACL 2025 Doctoral Dissertation Award
Winner (2025 inaugural edition): Sewon Min
- Affiliation: Assistant Professor, EECS, UC Berkeley; Research Scientist, Allen Institute for AI (AI2)
- Thesis: Rethinking Data Use in Large Language Models (157 pp., University of Washington, 2024)
- Core contributions:
- Demonstrated that in-context learning relies heavily on knowledge memorized from training corpora.
- Proposed nonparametric LLMs that treat training data as a retrieval database, boosting accuracy and updatability (a toy sketch of the idea follows this list).
- Built one of the first widely adopted neural retrieval systems and a single-stage retrieval-generation pipeline.
- Outlined future goals: efficient scaling, factuality enhancement, and decentralized architectures.
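Conceptually, the nonparametric approach replaces some of what a model memorizes in its weights with lookups into an external datastore at inference time. Below is a minimal, purely illustrative sketch of that retrieve-then-generate loop; the bag-of-words scorer and the generate() stub are stand-ins of our own (real systems use learned dense retrievers and an actual language model), not Min's implementation.

```python
# Illustrative retrieve-then-generate loop for the "nonparametric LM" idea:
# treat a text corpus as a retrieval datastore and condition generation on
# retrieved passages. Toy scorer and generate() stub are assumptions.

from collections import Counter
from math import sqrt

DATASTORE = [
    "ACL 2025 is held in Vienna, Austria.",
    "Nonparametric language models retrieve evidence from a datastore at inference time.",
    "Dense passage retrieval encodes queries and passages into vectors.",
]

def bow(text: str) -> Counter:
    """Bag-of-words term counts (a stand-in for a learned encoder)."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words vectors."""
    dot = sum(a[t] * b[t] for t in a)
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(query: str, k: int = 2) -> list[str]:
    """Return the k datastore passages most similar to the query."""
    q = bow(query)
    return sorted(DATASTORE, key=lambda p: cosine(q, bow(p)), reverse=True)[:k]

def generate(query: str) -> str:
    """Stub generator: a real system would feed the retrieved passages plus
    the query into a language model; here we just surface the evidence."""
    evidence = retrieve(query)
    return f"Q: {query}\nEvidence: {' | '.join(evidence)}"

if __name__ == "__main__":
    print(generate("Where is ACL 2025 held?"))
```

Updating such a system amounts to editing the datastore rather than retraining the model, which is the updatability benefit the dissertation highlights.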
Looking Ahead
ACL 2025’s record size and shifting author demographics underline a rapidly globalizing NLP landscape. The strengthened presence of emerging research communities, especially in East Asia, suggests that collaboration networks, review pools, and leadership roles will continue to diversify. Meanwhile, the rise of nonparametric approaches and a heavy focus on LLMs point to data-centric methods and model modularity as defining themes for the year ahead.
Stay tuned for the Best Paper announcements and more detailed workshop outcomes as the conference progresses.