We modern humans have become quite used to practically unstoppable technological progress. Every computer is expected to be faster, each aircraft more efficient. In cameras, too, we expect every upgrade to bring us, simultaneously, more pixels, better highlight handling and lower noise.
The question is: why do we have that expectation? Every practical digital cinematography device in the world uses the same underlying science. Photons strike an electronic device made, principally, of silicon, and the electrons liberated are a measure of the brightness of that light. That phenomenon, the photovoltaic effect, has been known since the 1830s, and the physics does what it does.
Still, advances are to be had in how sensors are built and how signals are processed, and that is largely how we’ve come to this point. Increases in sensitivity and dynamic range come with improvements in noise reduction, which serves to make shadow detail more usable. Bigger pixels, too, are more sensitive simply because their larger area makes photons more likely to strike them. Higher resolution, on the other hand, fights against these factors by forcing down pixel size.
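The pixel-size argument can be made concrete with photon statistics. Photon arrivals follow a Poisson distribution, so the noise (standard deviation) on a pixel collecting N photons is √N, and the signal-to-noise ratio is N/√N = √N. A minimal sketch of that arithmetic, with the photon counts chosen purely for illustration:

```python
import math

def shot_noise_snr(photons_per_pixel: float) -> float:
    """Photon arrivals are Poisson-distributed, so the noise standard
    deviation is sqrt(N) and the signal-to-noise ratio is N / sqrt(N) = sqrt(N)."""
    return math.sqrt(photons_per_pixel)

# A photosite with 4x the area catches roughly 4x the photons under the
# same light, which doubles its SNR (illustrative numbers, not real sensors):
small = shot_noise_snr(2_500)   # smaller pixel
large = shot_noise_snr(10_000)  # same scene, 4x the collecting area
print(small, large)  # 50.0 100.0
```

This is why shrinking pixels to raise resolution works against sensitivity: quadrupling the pixel count quarters the light per pixel and halves each pixel's SNR, before any cleverness in readout or processing claws some of it back.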
So when we talk about noise reduction as a benefit of downscaling high-resolution images, this is becoming a zero-sum game. Barring new science (which is naturally possible), there are absolute limits that will at some point conspire to make cameras stop getting better. If that happens, and considering how difficult it can be to make camera equipment pay for itself during a nine-month window of popularity, some of us will probably breathe a sigh of relief.
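The zero-sum nature of downscaling can be shown numerically: averaging groups of pixels with independent noise reduces that noise by the square root of the group size, so a 2×2 bin (roughly a 4K-to-HD downscale) halves the noise at the cost of exactly the resolution you paid for in the first place. A simplified simulation, assuming independent Gaussian noise on a flat scene (hypothetical values throughout):

```python
import random
import statistics

random.seed(0)

def noisy_pixels(true_value: float, sigma: float, n: int) -> list:
    """n pixels viewing the same flat scene value, each with independent
    Gaussian noise of standard deviation sigma (a simplified noise model)."""
    return [true_value + random.gauss(0, sigma) for _ in range(n)]

def bin_2x2(pixels: list) -> list:
    """Average groups of 4 pixels, standing in for a 2x2 downscale."""
    return [sum(pixels[i:i + 4]) / 4 for i in range(0, len(pixels), 4)]

src = noisy_pixels(100.0, sigma=10.0, n=400_000)
binned = bin_2x2(src)

print(round(statistics.stdev(src), 1))     # ~10.0: noise before binning
print(round(statistics.stdev(binned), 1))  # ~5.0: averaging 4 halves it
```

Halving the noise this way is equivalent to what one big pixel of the same total area would have delivered directly, which is the sense in which the game is zero-sum: the high-resolution sensor plus downscaling buys back sensitivity only by spending the extra resolution.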