This page is a mirror of Tepples' nesdev forum mirror (URL TBD).
Last updated on Oct-18-2019

Which emulator most accurately represents NTSC colors?
by on (#39493)
I've been testing my game on FCEUXD SP 1.07, and I think the title screen looks great on that emulator.

When I run the game on Nestopia 1.40, the title screen looks noticeably different. All the colors seem to have an extra greenish tint to them.

This is what I'm talking about:

[Image: side-by-side title screens, FCEUXD SP 1.07 on the left and Nestopia 1.40 on the right]

The title screen on the left says, "Look at me! I'm vibrant! I'm fun! PRESS START already, willya?"

The title screen on the right says, "Life sucks."

Then it occurred to me: What if the Nestopia colors are actually more accurate? It'd be better to find out sooner rather than later, so I can choose my colors appropriately.

Nestopia has a reputation for being extremely accurate; does that reputation apply to its color output as well?

by on (#39494)
Don't worry about it; it will look different on various TVs anyway. Some people say NTSC stands for "Never Twice the Same Color".

by on (#39495)
There is no single accurate palette, since every TV generates color differently depending on its color decoder. That said, Nestopia can accurately reproduce any appropriate color TV decoder, provided you have its R-Y, G-Y, and B-Y angles and gains. Nestopia starts out with a Consumer decoder, the one known as the Sony CXA2095S/U. I've read that the default Canonical decoder is accurate for PAL games but not NTSC games, and that a canonical decoder for NTSC TVs, which is apparently uncommon, would actually be shifted 15 degrees forward. If you test different decoders, you'll see that hues $2 and $8 are unreliable: $2 can be either a greenish or a reddish blue, and $8 can be either a yellowish orange or a yellowish green. I'm not sure whether this inconsistency extends to Japanese NTSC-J TVs as well; both of the Japanese decoders I know of make $2 a reddish blue and $8 a yellowish green.
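
For concreteness, here is a minimal sketch (in Python) of the decoder math being described: the demodulation angle and gain chosen for each color-difference axis determine where a given NES hue ends up in RGB. The PLACEHOLDER_DECODER numbers and the hue-to-phase mapping below are made-up illustrations, not measurements of the Sony chip or any other real decoder.

Code:
import math

# Placeholder decoder description: (demodulation angle in degrees, gain) for
# each color-difference axis.  Illustrative numbers only, not from a real chip.
PLACEHOLDER_DECODER = {
    "R-Y": (112.0, 0.83),
    "G-Y": (252.0, 0.30),
    "B-Y": (0.0, 1.00),
}

def decode(luma, chroma, hue, decoder=PLACEHOLDER_DECODER):
    """Turn luma, chroma amplitude, and an NES hue ($1-$C) into linear R, G, B.

    NES hues sit 30 degrees apart on the color subcarrier; the absolute phase
    offset used here is a guess, since only the relative spacing matters when
    comparing one decoder against another.
    """
    phase = math.radians(hue * 30.0)
    diffs = {}
    for axis, (angle, gain) in decoder.items():
        # Demodulating the chroma along this axis yields one color difference.
        diffs[axis] = chroma * gain * math.cos(phase - math.radians(angle))
    return (luma + diffs["R-Y"], luma + diffs["G-Y"], luma + diffs["B-Y"])

Feeding the same hue through two decoder tables that differ only in their angles (a 15-degree rotation, say) shifts every color, which is exactly why hues like $2 and $8 can drift between their reddish and greenish variants.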

If you're using a computer monitor (I believe LCDs especially), or any other monitor at brighter-than-default settings, you'll need to be wary of how you use certain color combinations. This can be a problem if you're mixing colors like $0c and $01, or $07 and $06; on an overly bright or high-gamma monitor, it may either look bad or even mess up the luminance order altogether. I made a thread about that here. Set your computer monitor to default settings to see if it can roughly match CRT gamma. I've read that LCD computer monitors (maybe not TVs?) have linear gamma, so you may need to adjust your video card/palette to a gamma correction of 0.45 or so.
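
As a rough sketch of that "0.45 or so" adjustment: a power-law curve applied to the palette before display. The function name, sample values, and exponent below are only illustrations of the post's ballpark figure, not calibrated numbers, and whether you want 0.45 or its reciprocal depends on what the monitor and video driver already apply.

Code:
def apply_gamma(value, exponent=0.45):
    """Apply a power-law curve to a palette value in the 0.0-1.0 range."""
    return min(max(value, 0.0), 1.0) ** exponent

# Example: remapping a few sample levels with the suggested exponent.
for v in (0.10, 0.25, 0.50, 0.75):
    print(f"{v:.2f} -> {apply_gamma(v):.2f}")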

by on (#39496)
If you care about NTSC artifacts, look into an emulator with an NTSC filter, like Nestopia; otherwise, the palette is pretty much empirical for each emulator author. You could try my emu; its palette has been well received.

<joke>As a last resort, try NESticle!</joke>

by on (#39501)
Fx3 wrote:
If you care about NTSC artifacts, look into an emulator with an NTSC filter, like Nestopia; otherwise, the palette is pretty much empirical for each emulator author. You could try my emu; its palette has been well received.

I just experimented with the NTSC filter on Nestopia, and I must say, it's pretty awesome. I think I'll go with that until I'm ready to test on actual hardware.

EDIT: Dude, you're the guy who made RockNES? That's rad! I've been using that emulator for years.

by on (#39504)
SecretServiceDude wrote:
I just experimented with the NTSC filter on Nestopia, and I must say, it's pretty awesome.

Yeah... That filter is a pretty good preview of what the game will look like on a TV.

by on (#39512)
strangenesfreak wrote:
I've read that LCD computer monitors (maybe not TVs?) have linear gamma

The controller chip in every PC LCD monitor that I've tested has implemented something close to the sRGB curve.
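
For reference, here is the standard sRGB transfer function next to a plain 2.2 power law; the formula is the published sRGB definition and isn't specific to any particular monitor, so this is just to show how close "something like the sRGB curve" is to ordinary 2.2 gamma.

Code:
def srgb_to_linear(v):
    """Standard sRGB electro-optical transfer function (0.0-1.0 in and out)."""
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

def power_to_linear(v, gamma=2.2):
    """Plain power-law display, for comparison."""
    return v ** gamma

for v in (0.1, 0.25, 0.5, 0.75, 1.0):
    print(f"{v:.2f}: sRGB {srgb_to_linear(v):.4f}  vs  2.2 power {power_to_linear(v):.4f}")

The two curves track each other closely over most of the range, so a monitor that implements the sRGB curve behaves much more like a roughly-2.2-gamma display than a linear one.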

by on (#39529)
tepples wrote:
The controller chip in every PC LCD monitor that I've tested has implemented something close to the sRGB curve.

Yeah, I've read about that too... it's really confusing. My PC LCD at default settings passes an LCD gamma test for 2.2, but doesn't work at all with a CRT gamma test. My PC CRT at default settings gives me 3.0 from that CRT gamma test, and from the LCD test it gives 2.85 at 48% and 2.5 at 25% and 10%, but I think CRTs and LCDs convert to 2.2 gamma differently. I have a hunch that CRT PC monitors assume a higher input gamma than LCD PC monitors... does anyone know whether CRTs and LCDs interpret input gamma differently?

EDIT: I think the real problem is that computer monitors just interpret input gamma differently from TVs. After configuring my PC CRT to a gamma of 2.2, it now looks very similar to the PC LCD. So maybe the input gamma of the NES is 2.2, and the input gamma of computers is linear?