The problem with gamma-correctness in emulators is that you ALSO have to emulate the limited nominal ranges (digitally, 16-235 for Y, 16-240 for U/V) on top of doing the NTSC or PAL coloring correctly.
Since doing limited ranges on sRGB looks so weird (mainly, 16 is surprisingly bright for being black), some people gamma correct for 1.8 instead of 2.2... which is a hilarious value when you realize that is the gamma value for classic Macs.
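For what it's worth, a minimal sketch of that kind of gamma adjustment, treating both the assumed source gamma and the display gamma as plain power laws (real sRGB has a linear toe near black, which this ignores; the function name is made up for illustration):

```c
#include <math.h>

/* Re-encode a normalized channel value (0.0-1.0) that was encoded for a
 * display with src_gamma so it reads correctly on a display with dst_gamma.
 * Pure power-law approximation. */
static double regamma(double v, double src_gamma, double dst_gamma)
{
    double linear = pow(v, src_gamma);     /* decode with the assumed gamma   */
    return pow(linear, 1.0 / dst_gamma);   /* re-encode for the actual display */
}

/* regamma(x, 2.2, 2.2) is a no-op; swapping 1.8 in on either side shifts the
 * mid-tones and near-black values up or down, which is the effect being
 * argued about above. */
```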
I feel pretty good about the "unfiltered" display on an emulator being the original RGB. Seems like adding that 16-235 NTSC YUV conversion on top of 5-bit-per-channel SNES RGB would just make banding even worse, and not really look different?
I mean, I don't like running with the accurately rainbowy and scanliney composite cable mode filters myself, and I thought I cared.
To do this naively: mathematically, 5 bits is 32 values (0 through 31). Multiply each value by 8, and you get 0, 8 ... 248, roughly filling the full 0-255 range. Multiply each value by 7 instead, and you get 0, 7 ... 217; offset that by 16, and you get 16, 23 ... 233, which stays inside the 16-235/16-240 nominal ranges.
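As a rough sketch of that arithmetic (the scale factors 8 and 7 and the +16 offset are taken straight from the description above, not from any particular emulator):

```c
#include <stdint.h>

/* Expand a 5-bit SNES channel value (0-31) to full-range 8-bit: 0, 8, ..., 248. */
static uint8_t expand_full_range(uint8_t c5)
{
    return (uint8_t)(c5 * 8);
}

/* Naively squeeze the same 5-bit value into the limited range instead:
 * 0, 7, ..., 217, then offset by 16 to land on 16, 23, ..., 233. */
static uint8_t expand_limited_range(uint8_t c5)
{
    return (uint8_t)(c5 * 7 + 16);
}
```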
Also, some emulators do offer chroma subsampling effects just to make certain things look more accurate.
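As a rough illustration of what that kind of effect does (this assumes the image has already been split into Y/U/V planes; the function name and layout are invented for the sketch, not taken from any emulator):

```c
#include <stddef.h>
#include <stdint.h>

/* Horizontally average a chroma plane in pairs (a crude 4:2:2-style
 * subsample), halving color detail while luma stays untouched.
 * `chroma` is one U or V plane; width is assumed even for this sketch. */
static void subsample_chroma_422(uint8_t *chroma, size_t width, size_t height)
{
    for (size_t y = 0; y < height; y++) {
        uint8_t *row = chroma + y * width;
        for (size_t x = 0; x + 1 < width; x += 2) {
            uint8_t avg = (uint8_t)((row[x] + row[x + 1] + 1) / 2);
            row[x] = avg;
            row[x + 1] = avg;
        }
    }
}
```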
That said, yes, all my emulation days involved "incorrect" SNES rendering: 5-bit-per-channel SNES RGB, with each step being +8, with no NTSC adjustment.