The basic issue with maximum sensor resolution is the wavelength/’size’ of light:
Visible light has a wavelength of around 500 nm, so you get 2,000 wavelengths per millimetre.
That would make the maximum possible resolution/light sensitivity:
(Where 35mm sensors are 36*24mm):
2,000 per mm * 36 mm = 72,000 pixels wide
2,000 per mm * 24 mm = 48,000 pixels tall
72,000 * 48,000 ≈ 3,456 megapixels (about 3.5 gigapixels)
Therefore: the theoretical maximum sensor resolution for a full-frame sensor is roughly 3.5 gigapixels. That’s amazing…
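The arithmetic above can be sanity-checked in a few lines (a quick sketch using the 500 nm and 36 × 24 mm figures from the text):

```python
# Back-of-the-envelope check of the wavelength-limited pixel count
# for a 36 x 24 mm full-frame sensor.
WAVELENGTH_NM = 500
SENSOR_W_MM, SENSOR_H_MM = 36, 24

pixels_per_mm = 1_000_000 // WAVELENGTH_NM    # 1 mm = 1,000,000 nm -> 2,000
width_px = pixels_per_mm * SENSOR_W_MM        # 72,000
height_px = pixels_per_mm * SENSOR_H_MM       # 48,000
total_mp = width_px * height_px / 1_000_000   # total in megapixels

print(width_px, height_px, round(total_mp))   # 72000 48000 3456
```

So the wavelength-limited ceiling works out to about 3,456 megapixels.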
But it seems that optics are the current limit:
“About 55 to 60 lpmm (line pairs per millimeter) is the max for the best quality glass currently available.”
For a full-frame sensor (36mm wide), that’s 60 * 36 = 2,160 line pairs across the frame, and at the Nyquist limit you need two pixels per line pair: 4,320 pixels wide by 2,880 tall, or only about 12 megapixels. So current glass, not the sensor, looks like the bottleneck by a factor of nearly 300. Anyone see a flaw in that reasoning?
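The optics-limited figure can be worked out the same way (a sketch assuming the quoted 60 lp/mm and two pixels per line pair at Nyquist):

```python
# Optics-limited resolution: ~60 line pairs/mm is the quoted ceiling
# for the best current glass; Nyquist sampling needs 2 pixels per
# line pair, on a 36 x 24 mm full-frame sensor.
LP_PER_MM = 60
SENSOR_W_MM, SENSOR_H_MM = 36, 24

width_px = LP_PER_MM * 2 * SENSOR_W_MM     # 4,320
height_px = LP_PER_MM * 2 * SENSOR_H_MM    # 2,880
total_mp = width_px * height_px / 1_000_000

print(width_px, height_px, round(total_mp, 1))  # 4320 2880 12.4
```

About 12 megapixels, versus the 3,456-megapixel wavelength limit above.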
After this come questions of how closely you can pack the light sensors, and then super-resolution: longer capture times not only let more photons arrive, but the data can be interrogated for more information, with the best likely image computed from averages, so the diffraction limit of the system is transcended and the noise calculated out. Either way, a 21.6-quadrillion-pixel camera sounds pretty interesting.
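The averaging idea behind long-capture techniques can be illustrated with a toy simulation (hypothetical numbers, not from the post): repeated noisy readings of the same true value average toward it, with the error of the mean shrinking roughly as 1/sqrt(N).

```python
import random
import statistics

random.seed(0)  # deterministic demo

# Toy model: one photosite sees a fixed true signal plus Gaussian
# read noise on every frame; averaging many frames beats the noise down.
TRUE_VALUE = 100.0   # hypothetical true brightness
NOISE_SIGMA = 10.0   # hypothetical per-frame noise
N_FRAMES = 400

frames = [TRUE_VALUE + random.gauss(0, NOISE_SIGMA) for _ in range(N_FRAMES)]
estimate = statistics.fmean(frames)

# Expected error of the mean: sigma / sqrt(N) = 10 / 20 = 0.5
print(round(estimate, 1))
```

Real super-resolution also uses sub-pixel shifts between frames, but the noise-averaging part is just this.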
Storage and processing capacity are just a matter of following Moore’s Law down the rabbit hole, so that’s not interesting from a theoretical point of view, but it is of course a major constraint for any practical implementation.