The NES was actually 256x240, using all 240 lines of an NTSC field, but many TVs cut off part of the picture at the top and bottom, so 256x224 was the "usable" space that was safe on most sets. How much got cropped by overscan was inconsistent from one TV to another, though, and modern TVs and emulators will usually show you all 240 lines.
The SNES's vertical resolution was configurable to either 224 or 240 lines, as the article mentions. Most games stuck with 224, as the longer vertical blanking interval gives you more time to transfer graphics to the PPU.
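The trade-off between active lines and vblank time can be sketched with some back-of-the-envelope numbers. This is only an approximation using round NTSC figures (262 lines per non-interlaced frame, ~63.6 us per scanline), not exact SNES master-clock math:

```python
# Rough illustration of why 224-line mode leaves more vblank time
# for PPU transfers than 240-line mode. Approximate NTSC timings.
LINES_PER_FRAME = 262   # non-interlaced NTSC frame
LINE_TIME_US = 63.6     # rough duration of one scanline, microseconds

for active in (224, 240):
    vblank_lines = LINES_PER_FRAME - active
    print(f"{active}-line mode: ~{vblank_lines * LINE_TIME_US:.0f} us of vblank per frame")
```

Roughly 2.4 ms of vblank in 224-line mode versus about 1.4 ms in 240-line mode, so the shorter picture buys you nearly twice the time for graphics transfers each frame.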
> 256x224 was the "usable" space that was safe on most TVs
Adding further complication: although most arcade games also used 15 kHz CRTs similar to de-cased televisions, the cabinets were assembled by the manufacturer using CRTs they specified, so designers could take some liberties with the resolution, frame rate, scanline count and scanning frequency of the video signal their game's hardware generated. Being analog devices, CRTs of this era were generally tolerant of such variations within specified ranges, since they had to sync up with inputs from disparate over-the-air television channels, cable boxes, VCRs or even live cameras. This let arcade hardware designers optimize their circuits for slightly better resolution and frame rates, or alternatively reduce them somewhat where their hardware couldn't quite generate enough pixels in real time. For example, the Mortal Kombat 1, 2 and 3 cabinets displayed video at 54 Hz (instead of the NTSC-standard 59.94 Hz), enabling higher horizontal resolution. Designers could also choose to use the entire overscan safe area for active game pixels, since they knew the CRT's width and height adjustments could be dialed in on the cabinet manufacturing line to ensure the entire active picture area was visible inside the bezels, whereas few consumer TVs exposed all of these adjustments to users.
All this subtle variation in classic arcade hardware makes clock-for-clock, line-for-line accurate emulation especially challenging. Fortunately, the emulation community has solved this thorny set of problems with GroovyMAME, a special version of MAME designed specifically to generate precisely accurate emulated output signals, so these classic arcade games can be perfectly displayed on analog CRTs. This requires a PC graphics card with native analog RGB output, which describes most graphics cards made up to 2015, but sadly none since. GroovyMAME works with specially modified Windows graphics card drivers to generate correct signal ranges from the analog output hardware of most off-the-shelf, pre-2015 Radeon and Nvidia cards, which are still widely available on eBay ($10-$50).
For arcade preservationists and retro gaming purists, the resulting output to a CRT is sublime perfection, identical to the original cabinet hardware circuit boards, many of which are now dead or dying. This enables building an emulation arcade cabinet either around a period-correct CRT matching the traits of a certain series of original cabinets, or around a special tri-sync or quad-sync analog RGB CRT which is able to display a wide variety of signals in the 15 kHz, 25 kHz, 31 kHz and 38 kHz ranges. That's what I have in my dedicated analog CRT cabinet: using GroovyMAME along with a 2015 Radeon GPU, it can precisely emulate 99+% of raster CRT arcade cabinets released from 1975 to the early 2000s, accurately (and automatically) recreating hundreds of different native resolutions, frame rates, pixel aspect ratios and scanning frequencies on my cabinet's 25-inch CRT. For more info on GroovyMAME and accurate CRT emulation visit this forum: http://forum.arcadecontrols.com/index.php/board,52.0.html.
Fascinating post, and innovation to get this all emulated with more recent hardware. Thank you for sharing.
What is the future for RGB-output video cards looking like? Are there more specialised cards still in production?
And are these tri-/quad-sync analog CRTs still manufactured?
The feeling of CRTs and contemporaneous hardware provokes almost overwhelming nostalgia for me, and I feel like modern television hardware is only just beginning to catch up with respect to UI responsiveness and reliability, for instance changing channel & volume, and playback functions like pausing, fast-forwarding and rewinding videos.
> What is the future for RGB-output video cards looking like? Are there more specialised cards still in production?
Sadly, no graphics card manufacturer still makes cards with native analog RGB output and, AFAIK, there haven't been any since ~2015. There may be cards which have analog output, but it's not natively generated with variable analog timing (dot clocks etc.). Instead it's created as a native digital signal and then converted to analog, at which point it's no better than hanging an external analog converter off an HDMI or DisplayPort connector (pointless and not worth doing).
On the positive side, there are a ton of used graphics cards with native analog output available on eBay for dirt cheap (or free if you have PC hobbyist friends or access to a typical corporate or edu recycle pile). Arcade cabinet games which output to CRT monitors stopped being made by around 2005, and game consoles which hooked to CRT TVs ended with the sixth generation (PS2, GameCube, Dreamcast). This is good news because emulating the vast majority of arcade cabinet and console games up through the early 2000s doesn't require a fast GPU or CPU, so an older GPU with native analog output does everything you need (and saves a lot of money).
The last, best GPU made with native analog output was the Radeon R9 380X, launched at the end of 2015. I have this card in my arcade cabinet emulation system, plugged into a 2014 HP ProDesk 600 G1 with an i5-4590 Haswell CPU (~$70 used on eBay). This PC is more than fast enough to perfectly emulate everything relevant to CRT gaming, and the 380X is substantial overkill. Being the last analog output card, the 380X is overpriced on eBay at >$50, but I only got it because I have a quite rare Wells Gardner D9200 quad-sync industrial CRT made specifically for arcade cabinets, and that monitor is fairly unique in that it can scan up to 38 kHz (800x600 resolution). No games originally designed for CRTs use resolutions that high, so it's only relevant for some PS2 games, and only then if I use non-authentic 2x upscaling or HD texture packs in the emulator. So I might theoretically, occasionally need the otherwise uselessly excess power of the 380X. If you're not me, just use almost any Radeon graphics card from 2012-2014, which can be had for ~$10-$15, to drive your CRT with GroovyMAME. Card compatibility list: https://emulation.gametechwiki.com/index.php/GroovyMAME. GroovyMAME forum: https://forum.arcadecontrols.com/index.php/board,52.0.html
> And are these tri-/quad-sync analog CRTs still manufactured?
All CRT manufacturing stopped around 2010. I was fortunate to buy my industrial-grade quad-sync CRT new, directly from the manufacturer, in 2009. However, there are a lot of used CRT TVs available locally via Craigslist and thrift stores, many of them free or close to it. Higher quality CRTs like the Sony PVM and BVM series made for video production studios and broadcasters are now collectibles selling for astronomical prices. However, high-quality consumer TVs from the late 90s and early 2000s, like the Sony WEGA line and any of dozens of models based on the well-regarded Sony BA-5 chassis, can be had in good condition for fairly reasonable prices. Many of these can also be modded to accept direct analog RGB input in addition to composite or S-Video, elevating their quality significantly (modding guide: https://sector.sunthar.com/guides/crt-rgb-mod/sony-ba-5.html). With the exploding interest in CRT retro gaming (for example: https://www.reddit.com/r/crtgaming), I'm surprised no one has yet restarted CRT manufacturing, but CRTs are pretty complex beasts, essentially a kaiju-scale vacuum tube with arcane analog driver circuitry bolted on.
> I feel like modern television hardware is only just beginning to catch up
To be fair, with expanded color spaces and wide color gamuts (HDR10 etc.), higher contrast, higher peak brightness, faster gray-to-gray response times, black frame insertion and VRR, the latest, most expensive digital flat screen tech is getting closer in many ways. I can imagine it eventually getting there, but unfortunately the hardest part may be actually finding a modern television without ads, apps, online updates and DLC bloat.
Although I'm a retro purist and will never part with my beloved CRT-based emulation cabinet, I know not everyone is quite as obsessed or has space for such a system. So it's important to also share that, in recent years, modern GPU-based pixel-shader CRT emulation has gotten impressively close to emulating analog CRTs, including shadow masks, analog glow, glass warping and even ray-traced bezel reflections. If you can't play on a real CRT, I encourage everyone to at least play games which were originally created for CRTs via CRT emulation. It's easy to do, and retro pixel art looks so much better when presented as originally intended. See this image comparison: https://x.com/CRTpixels/status/1408451743214616587. Without CRT scanlines and phosphor glow, the art looks terrible and is just completely wrong. Check out RetroArch's shader community (https://forums.libretro.com/c/retroarch-additions/retroarch-...) and ReShade.
I only discovered that a lot of games output at weird refresh rates when I was putting together a MiSTer. My VRR TV handles most of the weird refresh rates and resolutions, but not all (in particular, not Bad Dudes vs. DragonNinja).
I didn't know MK 1, 2 and 3 run at 54 Hz! MiSTer doesn't have support for the MK boards, so I've only played them via emulation on a PC, which means they've been running too fast! (I think)
One thing: my PC has an Nvidia 3070. If I tell RetroArch to output at the original refresh rate (which my VRR TV should be able to handle), will I get the correct refresh rate?
I think what you've been saying is that post-2015, analogue output on graphics cards isn't natively generated, so it's as bad as an HDMI-to-analogue adapter. Digital output to a VRR display is a completely separate matter.
> this means they have been running too fast! (I think)
Not necessarily. There are settings in MAME which provide some options on how to address frame rate mismatches. I think they all have various trade-offs like dropping, doubling or blending frames but I'm not current on what they are since all my serious retro gaming is on my CRT-based arcade cabinet :-). In theory at least, a modern GPU's ability to synthesize motion interpolated frames should allow fairly decent frame rate matching, even without VRR.
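For a sense of scale, the mismatch is easy to quantify. This is just arithmetic illustrating the "too fast" scenario (no compensation at all), not a description of any particular MAME setting:

```python
# How much faster a 54 Hz game runs if every native frame is simply
# shown at NTSC's 59.94 Hz with no frame-rate compensation.
native_hz = 54.0
display_hz = 59.94
speedup_pct = (display_hz / native_hz - 1) * 100
print(f"~{speedup_pct:.0f}% too fast")  # ~11% too fast
```

An 11% speedup is definitely noticeable in a fighting game, which is why the frame-matching options (dropping, blending, VRR) matter.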
> if I tell retro arch to output at the original refresh rate (which my vrr TV should be able to handle) I'll get the correct refresh rate?
Yes. VRR is basically intended to do with a digital display what an analog CRT has always done: vary the display's refresh rate to match the source clock. However, I'll add a small caveat. VRR is relatively new, and advanced digital display features newly added to revisions of existing consumer video standards have a tendency to go through teething pains as various device and display manufacturers figure out the nuances. I've only played around a little with VRR and haven't done any serious validation myself. Until it's more mature, I wouldn't assume correctness or compatibility of any recent addition to HDMI 2.1 (looking at you, Source-Based Tone Mapping!). So... trust but verify :-)
Also, since you mentioned RetroArch, here's a pro tip: RetroArch is admittedly convenient, but for serious retro gaming I generally recommend using the original emulators directly, especially if you're striving for emulation accuracy and display correctness. MAME's interface is definitely clunkier, and it's probably possible to achieve identical results with RetroArch, but as a wrapper it adds another layer of abstraction and potential for gremlins. There's also the potential for cores to not be up to date, and the RA authors do change some things and make certain trade-offs to integrate disparate cores into their architecture. For CRT users, I also don't know if there's even a GroovyMAME core for RetroArch.
> Retroarch is admittedly convenient but for serious retro gaming I generally recommend using the original emulators directly, especially if you're striving for emulation accuracy and display correctness.
This comes with the caveat that sometimes RetroArch's frontend is better than the standalone emulator's frontend -- RetroArch's graphics and input is quite mature and configurable on all platforms, and I've definitely had problems with bugs or latency in some less-maintained standalone emulators that aren't a problem when running through RetroArch. But yeah, agreed otherwise -- RetroArch is another layer between you and the emulator core that doesn't always do what you want or expose the options you need.
Yeah, on RetroArch vs. specific emulators, you're right. I've been messing around with emulators since about 1997 and it's only in recent years I've been using RetroArch; I miss the days of ZSNES and Genecyst. I actually think RetroArch's UI is pretty awful. Also, I don't get why their website reads like an infomercial. For example, they really want to make the point that FPGA emulation is no match for RetroArch, which is silly; they both have advantages and drawbacks. Maybe they make money off RetroArch, not sure.
The layer-of-abstraction point you make is spot on. I've been using my Steam Deck a lot, with EmuDeck, which installs EmulationStation, which installs RetroArch. Configuration is scattered everywhere.
I haven't mucked about much with individual emulators in a while, so I'm not sure if they support run-ahead latency reduction; that's the one big thing I like in RetroArch.
Edit: my main issue currently is figuring out what settings I should be using for particular cores/emulators. The Steam Deck screen isn't VRR, but it does allow refresh limiting, so that's its own set of problems. Similarly, I think I'm using the right settings for my PC VRR setup, but I'm never certain.
Actually, I spend more time fiddling with settings than playing games!
The Mega Drive also uses a 320-wide mode for most games. The "width" of an analogue TV picture is somewhat arbitrary, based on things like available bandwidth and sample rates, so it's a bit flexible depending on system design.
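The point about width being arbitrary can be made concrete: the visible portion of each scanline is fixed in time, so horizontal resolution is just a function of how fast pixels are clocked out. The ~47 us visible-line figure below is a rough approximation, not an exact console timing:

```python
# Sketch: with a fixed visible portion of an NTSC scanline (~47 us,
# an approximation), a wider mode just means a faster dot clock.
VISIBLE_LINE_US = 47.0
for width in (256, 320):
    dot_clock_mhz = width / VISIBLE_LINE_US
    print(f"{width} px across: ~{dot_clock_mhz:.2f} MHz dot clock")
```

Both modes fill the same stretch of the analog signal; the 320-wide mode simply emits narrower pixels at a higher rate, which is why the two can coexist on the same TV.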