It’s less about individual small screenshots (though PNGs of real photographs are large enough to take minutes to load on a bad connection) and more about the cumulative weight of multiple images on one site. User retention is strongly affected by latency and loading speed, and the most effective way to improve those metrics is to reduce network traffic. Images are usually the biggest part of a page load.
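Converting photographic PNGs to a lossy format is usually the single biggest win. A minimal sketch using Pillow (the filename `photo.png` and the quality setting are placeholders, and WebP support depends on how your Pillow build was compiled):

```python
import os
from PIL import Image  # pip install Pillow

src = "photo.png"  # placeholder: any photographic PNG
img = Image.open(src)

# JPEG has no alpha channel, so flatten to RGB first
img.convert("RGB").save("photo.jpg", "JPEG", quality=80)
img.save("photo.webp", "WEBP", quality=80)

for path in (src, "photo.jpg", "photo.webp"):
    print(f"{path}: {os.path.getsize(path) / 1024:.1f} KiB")
```

For actual photographs the lossy versions typically come in at a small fraction of the PNG’s size with no visible difference at normal viewing distance.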
All digital cameras are imperfect - there is always a bit of noise, but usually it doesn’t show because the scene is bright enough to make small amounts of noise imperceptible. In a completely dark room the camera still tries to read data from the image sensor, but the noise (created by temperature fluctuations, imperfections in the chip itself, and so on) is all you get. You might theoretically be able to predict the noise on short time scales, but it’s a chaotic system.
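To turn that dark-frame noise into usable entropy, you’d typically keep only the least significant bits of each pixel and hash the result to whiten it. A rough sketch with OpenCV (camera index 0 and the single-frame read are assumptions; a real harvester would gather many frames and estimate the actual entropy before trusting the output):

```python
import hashlib
import cv2  # pip install opencv-python

cap = cv2.VideoCapture(0)  # assumption: first attached camera
ok, frame = cap.read()     # one frame, lens covered / dark room
cap.release()
if not ok:
    raise RuntimeError("could not read from camera")

# Keep only the least significant bit of each pixel value --
# in a dark frame that's where sensor noise dominates.
lsbs = (frame & 1).tobytes()

# Hash to whiten: compresses the biased raw bits into a uniform seed.
seed = hashlib.sha256(lsbs).digest()
print(seed.hex())
```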
No, Cloudflare doesn’t use lava lamps to generate random numbers; that was a marketing stunt. A camera in a completely dark room is a better source of entropy than one pointed at lava lamps.
Also, nobody is saying that computers create a number out of nothing. The environment is a great source of entropy (temperature fluctuations, user input timings, and so on), and a CSPRNG then stretches that seed material into an arbitrarily long stream of pseudorandom output.
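As a toy illustration of that expansion step, an extendable-output function can stretch a short high-entropy seed into as many pseudorandom bytes as you want (real systems use vetted DRBG constructions; SHAKE-256 here is just a stand-in):

```python
import hashlib
import os

seed = os.urandom(32)  # 256 bits of entropy gathered from the environment

# SHAKE-256 is an extendable-output function: one fixed-size seed in,
# as many pseudorandom bytes out as you ask for.
stream = hashlib.shake_256(seed).digest(1024)
print(len(stream), stream[:16].hex())
```

Note the output never contains more than the seed’s 256 bits of entropy, however long it gets - it’s just computationally infeasible to distinguish from true randomness, which is all a CSPRNG promises.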
I had to run experiments that generated a lot of data (think hundreds of megabytes per minute). Our laptops had very little internal storage. I wasn’t allowed to use an external drive, my own NAS, or the company share - instead they said “can’t you just delete the older experiments?”… Sure, why would I need the experiment data I’m generating? Might as well pipe it straight to /dev/null!