Choosing a CRT Shader for YouTube
This is part two of my series on capturing Stars for YouTube.
Video Generation
The process is rather simple:
- Run the demo in DOSBox and capture the video. You get a 70 fps, 320x200 video using the lossless ZMBV codec for the images, plus lossless PCM audio. As the demo is a plain VGA program, this is a perfect copy of what the demo displays and plays.
- Play the 320x200 video in RetroArch with the CRT shader applied, and record the output.
In the RetroArch settings, I choose the target resolution and enable lossless video recording plus GPU (post-shader) recording. Since RetroArch won't play ZMBV video, I first transcode it to "lossless" H.264, also setting the aspect ratio for the next pass.
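For reference, that first transcode looks roughly like this; the filenames and exact flags here are illustrative, not the literal commands from my makefile linked below:

```sh
# Transcode the DOSBox ZMBV capture to "lossless" H.264 so RetroArch can play it.
# -qp 0 puts libx264 in lossless mode (the RGB->YUV conversion is the only
# remaining loss, hence the quotes around "lossless"); yuv444p avoids chroma
# subsampling that would smear the crisp VGA pixel edges. setdar bakes in the
# 4:3 aspect ratio for the RetroArch pass. Filenames are placeholders.
ffmpeg -i stars-dosbox.avi \
       -c:v libx264 -qp 0 -preset veryslow -pix_fmt yuv444p \
       -vf "setdar=4/3" \
       -c:a copy \
       stars-lossless.mkv
```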
After RetroArch, I transcode the video to H.264 at Q1 to reduce its size from 30 GB to 10 GB, with no visible quality loss.
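Roughly like this, reading "Q1" as a near-lossless quantizer setting; treat the exact flag as an assumption:

```sh
# Recompress the huge RetroArch recording to a much smaller, visually
# identical H.264 file for upload. -crf 1 is near-lossless; filenames
# are placeholders.
ffmpeg -i stars-retroarch.mkv \
       -c:v libx264 -crf 1 -preset slow \
       -c:a copy \
       stars-final.mkv
```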
Here is the makefile I used for the conversion: https://gist.github.com/kassoulet/485ce8bb3c29461ae67a5aeb5a683fbe
(Remember, the audio comes from the remastered version I produced in part one.)
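Muxing that audio back in is a simple stream-copy job, something along these lines (filenames hypothetical):

```sh
# Take the video stream from the shaded, recompressed capture and the
# audio stream from the part-one remaster, without re-encoding either.
ffmpeg -i stars-final.mkv -i stars-remaster.mkv \
       -map 0:v -map 1:a \
       -c copy \
       stars-youtube.mkv
```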
Shader Selection
Now it is time to actually select which CRT shader to use. The main problem here is YouTube's recompression, which threatens to destroy all my loving work. So I tested dozens of short video clips to choose the best one.
Here are the final contenders: crt-aperture and crt-lottes-multipass, with a big-pixel nearest-neighbour version for comparison.
All screenshots are taken from 1080p YouTube-encoded videos.
Let's look at the gorgeous logo pixelled by Ra.
This is the raw version, without any CRT shader. Clean, sharp... and pixelated.
This is the crt-aperture version. Still clean and sharp, pixels are gone!
And finally the crt-lottes version. Now we are seriously blurry!
Without the YouTube recompression, my choice would be crt-aperture: the pixels are gone, and the images are sharp and beautiful.