Epistemic Throughput: Fundamental Limits of Attention-Constrained Inference
Lei You
Abstract
Recent generative and tool-using AI systems can surface a large volume of candidates at low marginal cost, yet only a small fraction can be checked carefully. This creates a decoder-side bottleneck: downstream decision-makers must form reliable posteriors from many public records under scarce attention. We formalize this regime as Attention-Constrained Inference (ACI), in which a cheap screening stage processes K records per window and an expensive verification stage can follow up on at most B of them. Under Bayes log-loss, we study the maximum achievable reduction in posterior uncertainty per window, which we call epistemic throughput. Our main result is a "JaKoB" scaling law showing that epistemic throughput has a baseline term that grows linearly in the verification budget B and the prevalence of informative records, plus an information-leverage term that scales as sqrt(J × K × B), where J summarizes screening quality. Thus, expanding cheap screening can nonlinearly amplify scarce verification, even when informative records are rare. We further show that this scaling is tight in a weak-screening limit, and that in the sparse-verification regime (B << K), substantial leverage requires heavy-tailed score distributions; for light-tailed scores the amplification is only logarithmic.
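
The two-stage pipeline and the tail-dependence claim can be illustrated with a small simulation. The sketch below is not the paper's construction: it assumes, purely for illustration, that each record carries a fixed amount of recoverable information, that cheap screening observes that amount through additive Gaussian noise, and that throughput is proxied by the information summed over the B verified records. All names, distributions, and parameters (the exponential vs. Pareto tails, `noise_sd`, `K`, `B`) are hypothetical choices, not quantities from the paper.

```python
# Toy Monte Carlo sketch of the ACI pipeline sketched in the abstract:
# screen K records cheaply, verify the top-B by score, and measure how
# much recoverable information the verified set captures per window.
import numpy as np

rng = np.random.default_rng(0)

def captured_info(info_bits, noise_sd, B, trials=500):
    """Mean information captured by verifying the top-B records ranked
    by a noisy screening score (a toy proxy for epistemic throughput)."""
    K = len(info_bits)
    total = 0.0
    for _ in range(trials):
        scores = info_bits + rng.normal(0.0, noise_sd, K)  # cheap screening pass
        top = np.argpartition(scores, K - B)[-B:]          # spend budget B on top scores
        total += info_bits[top].sum()                      # info recovered by verification
    return total / trials

K, B = 10_000, 50  # sparse-verification regime: B << K

# Per-record recoverable information: light-tailed (exponential) vs.
# heavy-tailed (Pareto with infinite variance), rescaled to the same
# mean so that only the tail shape differs between the two cases.
light = rng.exponential(1.0, K)
heavy = rng.pareto(1.5, K) + 1.0
heavy *= light.mean() / heavy.mean()

for name, bits in [("light-tailed", light), ("heavy-tailed", heavy)]:
    got = captured_info(bits, noise_sd=2.0, B=B)
    baseline = B / K * bits.sum()  # verifying B records uniformly at random
    print(f"{name:>12}: top-B captures {got:7.1f} of {bits.sum():8.1f} bits "
          f"(random baseline {baseline:5.1f})")
```

Under these toy assumptions, ranked verification recovers a far larger share of the total information when per-record information is heavy-tailed, while in the light-tailed case the gain over the random baseline stays modest, in the spirit of the abstract's sparse-verification claim.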
Citation
@article{You2026Epistemic,
title={Epistemic Throughput: Fundamental Limits of Attention-Constrained Inference},
author={Lei You},
year={2026},
url={https://cspaper.org/openprint/20260212.0003v1},
journal={OpenPrint:20260212.0003v1}
}

Version History
| Version | Submitted On |
|---|---|
| v1 (Current) | February 12, 2026 |