Why do many say SIGKDD submissions are dropping in number?
Every year around submission season, we hear murmurs in the AI and data mining community: "KDD submissions are dwindling!" or "Is KDD losing its edge?" Forums buzz, WeChat groups debate passionately, and early-career researchers grow anxious about where they should submit their precious manuscripts.
But is it really true that SIGKDD, one of the most prestigious data mining conferences, is experiencing a decline in submissions?
Numbers don't lie, or do they?
First, let's set the stage clearly. Compared to AI giants like NeurIPS, AAAI, and IJCAI, SIGKDD might appear modest. NeurIPS boasts submission counts climbing toward the tens of thousands, and AAAI isn't far behind. But wait: is that comparison even fair?
Data mining, SIGKDD's heartland, is inherently narrower in scope than general AI and machine learning. Its specialized nature naturally leads to fewer submissions, which doesn't necessarily indicate a decline but rather reflects a focused and selective community. Historically, KDD submission numbers have grown steadily, consistent with this specialized scope.
The hidden truth behind the rumors
Let’s dig deeper. The rumored "decline" often stems from perceived barriers:
1. Intense competition and rigorous standards:
SIGKDD maintains an uncompromising standard. Unlike some broader AI conferences where trending topics might garner easier acceptance, SIGKDD's emphasis on meticulous data-centric insights and practical impact means fewer papers pass its rigorous reviews. This exclusivity ensures high quality but also fuels misconceptions about dwindling submissions.
2. Double-column formatting: the silent killer?
Writing a paper for KDD means tackling the famously challenging ACM double-column, nine-page format. Compared to the single-column formats adopted by many other conferences, this strict layout demands denser, more concise writing. Many researchers openly admit that crafting one KDD paper feels like writing two papers elsewhere. Could this demanding requirement discourage potential authors?
3. Perceived narrow "Circles" and reviewer bias:
Rumors persist about "circles" or networks dominating KDD submissions. While networking naturally occurs within specialized communities, it's easy for observers to mistake familiarity for unfair bias. SIGKDD, aware of this criticism, recently adopted more transparent and objective reviewer matching systems to counteract potential biases and enhance fairness.
Bright lights ahead for SIGKDD
Contrary to doom-and-gloom predictions, recent improvements point towards SIGKDD's evolving positive trajectory:
- Adoption of OpenReview for transparency and fairness in reviews.
- Elimination of "bidding" mechanisms in reviewer assignments, favoring unbiased, expertise-driven matching.
- Strengthening community engagement through clearer communication and improved submission experiences.
What really matters?
Ultimately, the value of a conference lies not merely in its submission numbers but in the impact and innovation of its published work. SIGKDD papers continue to shape industries, drive innovation, and profoundly inform academia.
Think of milestones like XGBoost and node2vec: they emerged from SIGKDD and went on to influence entire fields. High standards and rigorous review cultivate such impactful research, a feat that broader conferences may not consistently achieve.
Stop counting, start celebrating!
Next time you hear rumors about declining KDD submissions, remember that quality and impact matter more than sheer volume. SIGKDD continues to uphold a golden standard that prioritizes depth, innovation, and applicability over mere numbers.
Maybe it's not SIGKDD that needs changing, but how we measure its success. After all, in research, as in life, quality always trumps quantity.