this ended up being a large refactoring of the way pixel formats are
handled in the new ffmpeg pipeline. This was done to make reasoning
about which format to use when a bit easier -- it's still complicated.
The main issue here was 10-bit output: when the incoming video used
something like yuv420p10le but we were running with hwaccel, we had to
be sure to use the equivalent hardware format -- p010, in this case --
when reformatting frames from hardware; p010le and yuv420p10le carry
the same data, just laid out differently.
The way this is done now is by keeping the notion of "wrapper" pixel
format types while also introducing a delineation between software and
hardware pixel formats. While the naming isn't technically "correct"
(we could use p010le on hardware, if we wanted), it is meant to make
clear which formats can be used at the software/hardware boundary, and
when.
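The software-to-hardware mapping described above could be sketched
roughly like this (names and the exact format table are illustrative,
not the actual types from the pipeline):

```python
# Hypothetical sketch of the software/hardware pixel format delineation.
# Maps a software pixel format to the equivalent format used when
# reformatting frames coming off a hardware device.
SW_TO_HW_FORMAT = {
    "yuv420p10le": "p010le",  # 10-bit 4:2:0: hardware side uses the P010 layout
    "yuv420p": "nv12",        # 8-bit 4:2:0: hardware side uses NV12
}

def boundary_format(sw_format: str, hwaccel: bool) -> str:
    """Pick the pixel format to use at the software/hardware boundary.

    Without hwaccel the software format is used directly; with hwaccel
    we substitute the equivalent hardware-friendly format, falling back
    to the software format if no mapping is known.
    """
    if not hwaccel:
        return sw_format
    return SW_TO_HW_FORMAT.get(sw_format, sw_format)
```

For example, `boundary_format("yuv420p10le", hwaccel=True)` would pick
`p010le`, while the same input without hwaccel stays `yuv420p10le`.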