Perhaps it's time to start a collection of ffmpeg commands for every situation, similar to bro pages [1]. I feel like I have to look at my old projects every time I need to use it. It took me a lot of trial and error to write an all-purpose ffmpeg command.
The hover previews are an awesome example. Thanks for sharing!
I like collections of commands. However, the challenges that seem unsolved are (1) keeping the example in sync with the CLI options, and (2) making it easy to dig into parts of examples. The former is a classic documentation problem, of course.
The video processing one is more mature. It's been converting various torrented movies to playable mp4s for a few years.
The 10x1s preview looks magical once you see it in production. It's also a great introduction to ffmpeg filter syntax, which really isn't that complex.
I just noticed you have one for "Create single-image video with audio".
This seemingly easy task is actually pretty hard to get right with FFmpeg.
Just leaving a few notes here in case anyone wants to use it.
Video part
Chroma subsampling
Without a `-pix_fmt` argument, FFmpeg defaults to 4:4:4 chroma subsampling instead of the typical 4:2:0; 4:4:4 has very limited player support (Firefox doesn't support it, for example).
To fix: add `-pix_fmt yuv420p`, or chain a `format=yuv420p` filter onto the end of your other video filters.
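A minimal sketch of both fixes (filenames here are placeholders, and `-shortest` still has the length bug discussed below):

```shell
# Option 1: force the output pixel format directly
ffmpeg -loop 1 -i image.png -i sound.mp3 -pix_fmt yuv420p -shortest video.mp4

# Option 2: do it as the last step of the filter chain
ffmpeg -loop 1 -i image.png -i sound.mp3 -vf "format=yuv420p" -shortest video.mp4
```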
Color space
By default, FFmpeg uses BT.601 coefficients when converting your RGB images to YUV. This is a problem if your image / output video is HD: almost all video players (including browsers) assume BT.709 for anything >= 720p, so colors shift during playback (255,0,0 becomes 255,24,0, for instance).
To fix: add `-vf zscale=matrix=709`.
Note: some other video filters can do the same thing, the most famous being good ol' `scale` (based on libswscale). However, it has a notorious bug that causes colors to shift on their own (toward yellow) if the input is BGR (all the `.bmp`s). See https://trac.ffmpeg.org/ticket/979, so stick with `zscale`.
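Putting the two fixes together (a sketch; `zscale` requires an ffmpeg build with libzimg, and the filenames are placeholders):

```shell
# zscale does the RGB->YUV conversion with the BT.709 matrix;
# format=yuv420p then fixes the chroma subsampling
ffmpeg -loop 1 -i image.png -i sound.mp3 \
  -vf "zscale=matrix=709,format=yuv420p" -shortest video.mp4
```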
Framerate etc.
You can set the framerate with `-r`, using a small number for both the input (reading the same image X times per second) and the output (since it's a still image, you can get away with a very low framerate). `-tune stillimage` should also be used with the (default) x264 encoder.
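As a sketch, with the framerate and encoder tuning added (placement of `-r` before `-i` makes it an input option; after, an output option):

```shell
# Read the still once per second, emit a 1 fps video, and tune x264 for static content
ffmpeg -loop 1 -r 1 -i image.png -i sound.mp3 \
  -r 1 -tune stillimage -pix_fmt yuv420p -shortest video.mp4
```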
Even if we ignore all the image/video troubles above, `ffmpeg -loop 1 -i image.png -i sound.mp3 -shortest video.mp4` still doesn't work well. The `-shortest` argument has a long-standing bug (https://trac.ffmpeg.org/ticket/5456) where the output video ends up longer than the input audio (by quite a few seconds, worse when using `-r 1`). There are some workarounds (listed in the ticket) but they don't eliminate the issue entirely.
Your best bet (if matching lengths is crucial) is to convert the output video again, using `-t` to cut it to the proper length.
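One way to sketch that two-pass workaround, using `ffprobe` to read the audio duration (filenames are placeholders):

```shell
# 1. Render the video; output runs a few seconds long due to the -shortest bug
ffmpeg -loop 1 -i image.png -i sound.mp3 -pix_fmt yuv420p -shortest long.mp4

# 2. Query the audio duration in seconds
duration=$(ffprobe -v error -show_entries format=duration -of csv=p=0 sound.mp3)

# 3. Cut the video to match; re-encoding gives a frame-accurate cut
#    (-c copy would be faster but can only cut on keyframes)
ffmpeg -i long.mp4 -t "$duration" video.mp4
```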
This is how I convert videos to 720p web-playable videos (if it's not already web-playable):
https://github.com/nicbou/homeserver/blob/22c0a160f9df5f4c34...
This is how I create hover previews like on modern streaming sites:
https://github.com/nicbou/timeline/blob/9d9340930ed0213dffdd...
[1] http://bropages.org/tar