Hi there!
I recently got myself an OSSC, and I am extremely happy with the results I am getting on an Amiga 500 with a SCART RGB cable, and on a Sega Megadrive.
The Megadrive RGB signal is surprisingly crisp, at least to me (despite it being a model 1 console). The image I am getting on a modern 4K TV could be mistaken for the output of an emulator. Super sharp pixels!
Delving a bit into the FPGA technology used in retro gaming, I was looking at what the MiSTer project is doing (I know, totally different thing: they are running systems on an FPGA, not doing extra-low-latency line doubling), and I got a bit curious about an option the Genesis/MegaDrive core there has called "Composite blending". I have no first-hand experience with that option myself, but I find the idea interesting.
A lot of games on that system exploit dithering patterns and alternating pixels on foreground sprites to simulate a wider color gamut and transparency, two things the hardware is not technically able to do.
I know the OSSC's goal is to digitize the signal and increase resolution by processing just one line at a time, and that the filtering options available in that scenario are extremely limited (basically, I understand how the scanlines and the bob de-interlacing methods can work under that constraint). I do not know how the filter is implemented on MiSTer, but I am wondering if, within a single line of pixels, there is a way to apply that sort of "signal degradation", and to dose it somewhere in the middle between the razor-sharp raw RGB signal and the full composite blur.
Here's a video with a timecode that showcases some of the natural artifacts these low-quality video outputs produce: https://youtu.be/x0weL5XDpPs?t=314
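To make it concrete, here is roughly the operation I have in mind, written as plain C just to illustrate the math. This is my own sketch, not how MiSTer actually does it; the function name and the `strength` dial are made up. It only ever looks at the current line, averaging each pixel with its left neighbour, one color channel at a time:

```c
#include <stdint.h>

/* Blend each pixel of one scanline with its left neighbour.
 * strength = 0 leaves the line untouched; strength = 255 is a full
 * 50/50 average (roughly what I understand "Composite blending" to be).
 * Works on a single color channel; run it once each for R, G, B.
 * Only the current line is needed, no frame buffer. */
void blend_scanline(uint8_t *line, int width, uint8_t strength)
{
    uint8_t prev = line[0];
    for (int x = 1; x < width; x++) {
        uint8_t cur = line[x];
        /* 50/50 average of this pixel and its original left neighbour */
        uint8_t avg = (uint8_t)(((int)cur + (int)prev) >> 1);
        /* "dose" the effect: mix between untouched and fully blended */
        line[x] = (uint8_t)((cur * (255 - strength) + avg * strength) / 255);
        prev = cur;
    }
}
```

On a dithered checkerboard of two alternating colors, that average lands exactly on the in-between color the artists were aiming for, which is why the effect reads as transparency on a real CRT over composite.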
I legitimately have no idea if there's space on the OSSC's FPGA to add that sort of filtering after the ADC/sampling stage, and I also do not know how exactly it could be implemented. I have a vague idea of how a post-processing shader would blend colors like that; I have no clue how you would implement it with straight-up logic gates.
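My naive guess (and it really is just a guess) is that in hardware you would never touch "the whole line" at all: you would keep the previous pixel in a register and compute the average with one adder per channel, one pixel per clock. Here is the same idea written as C that mimics that structure; the names are mine and nothing here is real OSSC or MiSTer code:

```c
#include <stdint.h>

/* My guess at how the blend maps onto FPGA logic: no loops, no
 * multipliers, just a one-pixel delay register and an adder per
 * channel. Each call models one pixel clock. Entirely hypothetical. */
typedef struct {
    uint8_t prev; /* models the one-pixel delay register */
} blend_state;

static uint8_t blend_clock(blend_state *s, uint8_t in, int enable)
{
    /* (prev + in) >> 1 is a 9-bit adder plus a wired shift in logic,
     * so the gate count should be tiny; fitting it into the existing
     * pipeline timing is the part I cannot judge */
    uint8_t out = enable ? (uint8_t)(((int)s->prev + (int)in) >> 1) : in;
    s->prev = in; /* the register updates on the "clock edge" */
    return out;
}
```

If that intuition is anywhere near right, the cost would be a few registers and adders per channel, not a big filter block, but I may well be missing something about where in the OSSC pipeline such a stage could even sit.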
Any thoughts? 🤔