The obvious answer would be VR and AR, where the faster the refresh rate, the less likely you are to get motion sick. A display with a refresh rate that high would be putting up a new frame every millisecond, meaning that if the rest of the hardware could keep up, a headset using this display could properly track the micro-movements your head makes.
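To put rough numbers on it, frame time is just the inverse of refresh rate, so going from a typical headset's 90 Hz up to 1000 Hz cuts the per-frame window from about 11 ms down to 1 ms. Quick back-of-the-envelope sketch (plain Python, the refresh rates are just illustrative examples):

```python
# Frame time is simply 1000 ms divided by the refresh rate.
# The rates below are examples, not claims about any specific headset.
for hz in (90, 120, 240, 1000):
    frame_time_ms = 1000 / hz
    print(f"{hz:>5} Hz -> {frame_time_ms:.2f} ms per frame")
# 90 Hz -> 11.11 ms per frame ... 1000 Hz -> 1.00 ms per frame
```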
Mold is actually the biggest concern with LTO, the most popular archival format. EMPs aren't much of a concern. With flash for archival storage, bit flips and bit rot are traditionally your main worries. If you go the flash route, it's recommended to keep your array hot (i.e. powered on) and use a file system with data-scrubbing capabilities such as ZFS.
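The scrubbing part is easy to automate once the pool is set up. Here's a minimal sketch (Python wrapping the standard zpool commands; the pool name "tank" is just a placeholder for whatever you call yours):

```python
#!/usr/bin/env python3
"""Kick off a ZFS scrub and print its status -- a minimal sketch."""
import subprocess
import sys

POOL = "tank"  # hypothetical pool name, change to match your setup

def scrub(pool: str) -> None:
    # `zpool scrub` walks every block, verifies checksums, and repairs
    # from redundancy (mirror/raidz) where it can -- this is what
    # catches silent bit rot before it spreads.
    subprocess.run(["zpool", "scrub", pool], check=True)

def status(pool: str) -> str:
    # `zpool status` reports scrub progress and any checksum errors found.
    result = subprocess.run(["zpool", "status", pool],
                            capture_output=True, text=True, check=True)
    return result.stdout

if __name__ == "__main__":
    try:
        scrub(POOL)
        print(status(POOL))
    except subprocess.CalledProcessError as err:
        sys.exit(f"zpool command failed: {err}")
```

In practice most people just run something like this from a monthly cron job or systemd timer, which is why keeping the array powered on matters in the first place.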
Eh, this is to be expected given how Nvidia handles its architecture. For the last few generations Nvidia has designed its architecture around the enterprise world's needs, with gaming as an afterthought. The fact is their gaming GPUs are defective enterprise GPUs; not to say that's a bad thing, but Nvidia doesn't sell any GeForce GPU that uses the full die. Simply put, the enterprise world is all in on AI workloads, and ROPs don't benefit those workloads, so why would Nvidia dedicate precious silicon to an afterthought?