20260328.0003v1 · Theory · Archived: March 28, 2026

Epistemic Throughput: Fundamental Limits of Attention-Constrained Inference

Lei You
AISTATS
Reviewed by AISTATS Agent
Top 6%
Refs Verified (46/48)
Claims Verified

Abstract

Recent generative and tool-using AI systems can surface a large volume of candidates at low marginal cost, yet only a small fraction can be checked carefully. This creates a decoder-side bottleneck: downstream decision-makers must form reliable posteriors from many public records under scarce attention. We formalize this regime via Attention-Constrained Inference (ACI), in which a cheap screening stage processes $K$ records and an expensive verification stage can follow up on at most $B$ of them. Under Bayes log-loss, we study the maximum achievable reduction in posterior uncertainty per window, which we call \emph{epistemic throughput}. Our main result is a ``JaKoB'' scaling law showing that epistemic throughput has a baseline term that grows linearly with verification and prevalence, and an additional \emph{information-leverage} term that scales as $\sqrt{JKB}$, where $J$ summarizes screening quality. Thus, expanding cheap screening can nonlinearly amplify scarce verification, even when informative records are rare. We further show that this scaling is tight in a weak-screening limit, and that in the sparse-verification regime ($B \ll K$), substantial leverage requires heavy-tailed score distributions; for light-tailed scores the amplification is only logarithmic.
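Read informally, the JaKoB law says per-window throughput behaves like $T(K, B) \approx c_1 \pi B + c_2 \sqrt{JKB}$, where $\pi$ is the prevalence of informative records and $c_1, c_2$ are constants (this display, and the symbols $\pi$, $c_1$, $c_2$, are our shorthand for the abstract's qualitative statement, not notation from the paper). The sketch below is a minimal Monte Carlo illustration, not the paper's construction: it assumes a toy model in which each of $K$ records carries a latent label $Y_i \sim \mathrm{Bernoulli}(\pi)$, the cheap screen emits a Gaussian score whose mean separation $\mu$ stands in for the screening quality $J$, and verification reveals $Y_i$ exactly, so verifying record $i$ removes $h(p_i)$ nats of posterior uncertainty, where $p_i = \Pr(Y_i = 1 \mid S_i)$ and $h$ is the binary entropy. A second loop contrasts heavy-tailed (Pareto) and light-tailed (Gaussian) scores in the sparse-verification regime $B \ll K$. All parameter values are illustrative.

import numpy as np

rng = np.random.default_rng(0)


def binary_entropy(p):
    """Entropy of Bernoulli(p) in nats, with endpoints clipped."""
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return -(p * np.log(p) + (1 - p) * np.log1p(-p))


def throughput(K, B, pi, mu, n_trials=500):
    """Mean log-loss reduction per window, screened vs. unscreened.

    Toy model (not the paper's construction): record i has a latent
    label Y_i ~ Bernoulli(pi); the screen emits S_i ~ N(mu * Y_i, 1),
    so mu stands in for the screening quality J; verification reveals
    Y_i exactly. Observing Y_i removes h(p_i) nats of posterior
    uncertainty, so in this model the best policy verifies the B
    records whose posterior entropy h(p_i) is largest.
    """
    screened = unscreened = 0.0
    for _ in range(n_trials):
        y = rng.random(K) < pi
        s = rng.normal(mu * y, 1.0)
        log_lr = mu * s - 0.5 * mu**2          # log N(s;mu,1) - log N(s;0,1)
        p = 1.0 / (1.0 + (1.0 - pi) / pi * np.exp(-log_lr))
        h = binary_entropy(p)
        screened += np.sort(h)[-B:].sum()      # verify the most uncertain B
        unscreened += h[:B].sum()              # verify an arbitrary B
    return screened / n_trials, unscreened / n_trials


pi, mu, B = 0.02, 2.0, 5
print("Screening leverage (nats per window):")
for K in (50, 500, 5000):
    lev, base = throughput(K, B, pi, mu)
    print(f"  K={K:5d}: screened {lev:5.2f}  vs  unscreened {base:5.2f}")

# Tail dependence when B << K: the sum of the top-B order statistics
# grows polynomially in K for a heavy-tailed score (Pareto) but only
# slowly, like a power of log K, for a light-tailed (Gaussian) one.
print("Mean top-B score mass, heavy vs. light tails:")
for K in (10**3, 10**4, 10**5):
    g = np.mean([np.sort(rng.normal(size=K))[-B:].sum() for _ in range(50)])
    hvy = np.mean([np.sort(rng.pareto(1.5, size=K))[-B:].sum() for _ in range(50)])
    print(f"  K={K:6d}: Gaussian {g:6.1f}   Pareto(1.5) {hvy:9.1f}")

On typical runs of this toy model, the screened policy's gain keeps growing as $K$ rises while the unscreened baseline stays near $B\,h(\pi)$, and the top-$B$ mass of the Pareto scores grows roughly like $K^{1/1.5}$ versus roughly $\sqrt{2\log K}$ per record for the Gaussian, echoing the abstract's claim that light-tailed scores permit only logarithmic amplification.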

Keywords

attention-constrained inference, epistemic throughput, JaKoB scaling law, screening, verification, information gain, log-loss

Citation

@article{You2026Epistemic,
  title={Epistemic Throughput: Fundamental Limits of Attention-Constrained Inference},
  author={Lei You},
  year={2026},
  url={https://cspaper.org/openprint/20260328.0003v1},
  journal={OpenPrint:20260328.0003v1}
}

Version History

Version        Archived Date    Submitter
v1 (Current)   Mar 28, 2026     Lei You