🚨AAAI & NeurIPS’ Ruthless Early Rejection Waves: A Perfect Storm Brewing for ICLR 2026?
-
A Storm of Rejections: The New Norm at Top ML Conferences
In the fall of 2025, the academic ML community was hit by a double blow:
- AAAI 2026 introduced a draconian early rejection phase, slicing off 50–67% of submissions in Phase 1 alone.
- Just days later, NeurIPS 2025 shocked the community again by rejecting ~400 papers that had already been conditionally accepted, due to venue constraints.
Let that sink in: hundreds of papers that passed peer review and received acceptance nods were retroactively cut. Welcome to the new reality of top-tier ML venues.
The Numbers Are Brutal
Here’s a quick snapshot of the recent madness:
- AAAI 2026: received a record-breaking ~29,000 submissions.
  - CV: ~10,000 papers
  - ML: ~8,000 papers
  - NLP: ~4,000 papers
  - These three domains alone accounted for ~75% of the total volume.
  - Rejection rate in Phase 1: up to 67% in the hottest areas.
- NeurIPS 2025: due to "physical resource constraints," ~10,000 papers were rejected, including ~400 that had already been approved by reviewers and ACs.
Now do the math:
AAAI early rejects (~13,000) + NeurIPS culls (~10,000) = ~23,000 high-effort papers looking for a new home.
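The tally above can be sketched as a quick back-of-envelope calculation. Note that these are the article's rough public figures, not official conference statistics:

```python
# Rough tally of papers displaced toward other venues after the
# AAAI 2026 Phase 1 cuts and the NeurIPS 2025 capacity cull.
# All numbers are approximate figures from community reports, not official stats.

aaai_early_rejects = 13_000   # AAAI 2026 Phase 1 rejections (approx.)
neurips_culled = 10_000       # NeurIPS 2025 rejections, incl. ~400 conditional accepts

displaced = aaai_early_rejects + neurips_culled
print(f"~{displaced:,} papers potentially seeking a new venue")
# prints "~23,000 papers potentially seeking a new venue"
```

Even if only a fraction of these resubmit to ICLR 2026 on top of its usual volume, the load on reviewers grows sharply.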
And all eyes are now turning to…
ICLR 2026: The Imminent Avalanche
With both AAAI and NeurIPS purging submissions at unprecedented rates, ICLR 2026 is set to receive an overwhelming deluge of re-submissions — possibly north of 35,000 in total, many of them polished and improved versions of rejected papers.
But here's the twist: ICLR may not be ready, nor willing, to absorb this wave.
ICLR's Uniquely Transparent Model: A Double-Edged Sword
Unlike AAAI and NeurIPS, ICLR uses an open-review system where all reviews, positive or negative, are publicly posted on OpenReview and persist forever, even if the paper is withdrawn or rejected.
This transparency, while laudable for scientific discourse, also introduces real fear among junior researchers, particularly those from regions with less academic security or those navigating a hyper-competitive job market.
“You mean my paper will be permanently public — even if it's ripped apart in the comments and gets rejected?”
— Anonymous Junior Researcher
This makes ICLR an intimidating venue for those afraid of public scrutiny, even if it’s technically more “welcoming.”
🧪 Review Quality Under Strain: Crowdsourcing vs. Collapse
Facing such volume, conferences are scrambling to keep up:
- AAAI has adopted “peer reviewing among authors”, where submitting a paper may automatically enroll you as a reviewer.
- ICLR has introduced rules for LLM-assisted reviewing, allowing reviewers to use AI tools under responsible disclosure.
- NeurIPS tried to scale by adding dual locations, yet still couldn’t accommodate its full program.
The consequences? Inconsistent reviews, reviewer burnout, and, ironically, early rejection as the only defense mechanism keeping the system from collapsing.
The Big Picture: A Crisis in the Making?
This situation raises urgent questions:
- Is the current peer review system sustainable for fields like ML, CV, and NLP that grow exponentially?
- What is the true cost of excessive early rejections?
- It disproportionately affects novel, niche, or interdisciplinary works.
- It discourages junior and international researchers who may not have “insider knowledge” on how to survive these filters.
- Does transparency (as in ICLR) help or hinder scientific growth, when rejection becomes public record?
And perhaps most critically:
Are we optimizing for conference logistics... or scientific progress?
What Needs to Change?
Here are a few ideas circulating in the community:
- Cap submissions per author to reduce flooding.
- Introduce a rolling review system across conferences, reducing redundant reviews.
- Improve reviewing incentives, to encourage quality over quantity.
- Establish conference tiers or tracks (e.g., main vs. emerging topics) to avoid one-size-fits-all standards.
🧠 Final Thoughts
The current conference system is not just overwhelmed; it’s breaking.
As researchers, reviewers, and community members, we need to engage in systemic reform. Otherwise, the next wave of talented minds may find themselves locked out — not due to lack of merit, but due to arbitrary quota-based gatekeeping.
Join the conversation: What reforms would YOU like to see in the top-tier peer review process?