I can't believe games used to look that fuzzy. They seemed just fine at the time.
I do a little photography every once in a while, so I know a *little* about this stuff, and I'm fairly certain two things are happening.
First, old analog video picks up a ton of errors when it's digitized. Scan an old analog/film photo, print it, and compare with the original. You'll see what I mean. The capture and compression steps add a bunch of "noise" that degrades image quality.
Second, old cathode-ray tube TVs (CRTs) (yes, the ones you could hear whine when they were on, that built up tons of static, that you could accidentally discolor with a magnet, and that even emitted small amounts of X-rays) had a softer, less precise image than the liquid crystal displays used today in pretty much every digital screen. That smoothing effect is extremely important for graphics from before 2005 or so. Old video game graphics (90s and early 2000s) suffer on modern digital screens because the images were never crisp, and they were never meant to be: artists drew those sprites counting on the CRT to blend the pixels together.
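If you want to see the difference for yourself, here's a toy sketch (not an accurate CRT model, just an illustration): a tiny "sprite" scaled up with nearest-neighbor keeps hard pixel edges, which is roughly how a modern LCD shows old graphics, while a simple blur, standing in for the CRT's softer beam, produces smooth in-between values.

```python
# Toy illustration: hard-edged upscaling (LCD-like) vs. a crude blur
# (loosely CRT-like). Pure stdlib, brightness values in [0, 1].

def nearest_neighbor(img, factor):
    """Upscale a 2D grid of brightness values by an integer factor."""
    return [[px for px in row for _ in range(factor)]
            for row in img for _ in range(factor)]

def box_blur(img):
    """Average each pixel with its neighbors -- a crude 'CRT glow'."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            vals = [img[ny][nx]
                    for ny in range(max(0, y - 1), min(h, y + 2))
                    for nx in range(max(0, x - 1), min(w, x + 2))]
            out[y][x] = sum(vals) / len(vals)
    return out

sprite = [[0, 1], [1, 0]]            # 2x2 checkerboard "sprite"
sharp = nearest_neighbor(sprite, 4)  # blocky: only pure 0s and 1s
soft = box_blur(sharp)               # softened: in-between shades appear

print(sorted({v for row in sharp for v in row}))  # [0, 1]
```

The sharp version contains nothing but 0s and 1s (hard edges), while the blurred version has intermediate brightness values, which is the kind of blending a CRT did for free.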
TL;DR – digitization compresses and degrades image quality, and modern screens are so crisp and clear that they show every single analog imperfection.
At least that's how I understand it.