I've compressed tens of thousands of videos in recent years and critically looked at the results.
My gut feeling is that hardware-accelerated implementations are simply not as good as software implementations, perhaps because of localised resource constraints in the hardware when the search goes deeper, or some such.
The algorithms are the same; the chip versions are just more resource-constrained as the stack goes deeper, if you will.
Hardware compressors are much, much faster, no doubt, by an order of magnitude or more... but software compressors left to run slowly, with two passes, just produce a much better result.
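For a concrete (if hypothetical) illustration of the two-pass point, a software encode driven through ffmpeg might look roughly like the sketch below; the file names and target bitrate are made up, and it assumes an ffmpeg build that includes libx264 and h264_nvenc:

    import subprocess

    SRC = "master.mov"    # hypothetical high-quality original
    BITRATE = "2500k"     # hypothetical streaming target

    # Two-pass software encode: pass 1 gathers complexity statistics,
    # pass 2 spends the bit budget where those statistics say it matters.
    subprocess.run(["ffmpeg", "-y", "-i", SRC, "-c:v", "libx264",
                    "-preset", "slow", "-b:v", BITRATE,
                    "-pass", "1", "-an", "-f", "null", "/dev/null"], check=True)
    subprocess.run(["ffmpeg", "-y", "-i", SRC, "-c:v", "libx264",
                    "-preset", "slow", "-b:v", BITRATE,
                    "-pass", "2", "sw_2pass.mp4"], check=True)

    # Single-pass hardware encode (NVENC here) at the same bitrate:
    # far faster, but the rate control only ever sees the video once.
    subprocess.run(["ffmpeg", "-y", "-i", SRC, "-c:v", "h264_nvenc",
                    "-b:v", BITRATE, "hw_1pass.mp4"], check=True)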
Standard hardware compressor settings tend to leave moiré patterns when the camera pans across a chain-link fence onto a player inside a court (for example), or leave a blocky halo around the significant figure of interest moving in an otherwise stillish scene. It's the subtle small things that grate when you see them.
The root cause of the perceived downgrade in quality is bulk-pushing high-quality originals through hardware-accelerated compression to save time and reduce cost when generating lower-bitrate versions to stream.
Absolutely. Software encoders are constantly pushing the envelope in terms of quality, and they have tunable knobs that let you pick the speed vs quality trade-off.
Hardware encoders are much more restrictive. They target a maximum resolution and frame rate, with realtime encoding and possibly low latency as requirements.
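As a rough sketch of those knobs (the file name and CRF value are arbitrary, and it assumes ffmpeg with libx264 on the PATH), the same constant-quality target can be traded against encode time just by changing the preset:

    import subprocess, time

    SRC = "clip.mp4"   # hypothetical test clip
    CRF = "22"         # constant-quality target; lower means better quality

    # Same quality target, wildly different encode times and compression
    # efficiency depending on how much work the encoder is allowed to do.
    for preset in ["ultrafast", "medium", "veryslow"]:
        start = time.time()
        subprocess.run(["ffmpeg", "-y", "-i", SRC, "-c:v", "libx264",
                        "-preset", preset, "-crf", CRF,
                        "out_" + preset + ".mp4"], check=True)
        print(preset, round(time.time() - start, 1), "s")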
The standards define the bitstream, but there is lots of scope for cheating to allow a simpler hardware implementation. You could skip motion prediction and use simple quantisation; it would just contribute to the residual entropy that needs to be encoded. Then you can use inefficient implementations of the entropy encoders. The end result is something that can still be interpreted as a valid bitstream, but which threw out all the algorithmic tricks that give you high-quality compressed video.
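To make the entropy point concrete, here's a toy numpy sketch (not any real codec) that estimates how many bits per pixel the residual would need with and without a crude motion search; skipping prediction leaves far more entropy for the entropy coder to mop up:

    import numpy as np

    def entropy_bits(x):
        # Shannon entropy in bits per sample of an integer-valued residual.
        _, counts = np.unique(x, return_counts=True)
        p = counts / counts.sum()
        return float(-(p * np.log2(p)).sum())

    rng = np.random.default_rng(0)
    prev = rng.integers(0, 256, (64, 64)).astype(np.int16)
    # "Current" frame: the previous frame shifted 3 px right, plus a little noise.
    curr = np.roll(prev, 3, axis=1) + rng.integers(-2, 3, (64, 64))

    # No motion prediction: residual against the co-located pixels.
    no_mc = curr - prev

    # Crude full-search motion estimation over +/-4 px horizontal shifts.
    best = min(range(-4, 5),
               key=lambda d: np.abs(curr - np.roll(prev, d, axis=1)).sum())
    with_mc = curr - np.roll(prev, best, axis=1)

    print("no motion comp:   %.2f bits/px" % entropy_bits(no_mc))
    print("with motion comp: %.2f bits/px" % entropy_bits(with_mc))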
I think specifically in terms of YouTube, it's not a hardware encoding issue but a purposeful optimisation for storage and bandwidth.
There are tradeoffs. IIRC Netflix uses software encoders and they work pretty hard to squeeze the maximum quality out of a given bitrate. Makes sense for their use case. OTOH, for things like video calls, certain kinds of archiving, and in general cases with few or no views, it makes sense to use hardware encoding even if it means compromised image quality.
Sure, there are many use cases and Netflix generally has streams with high quality and few artifacts.
Real-time video communications are going to go for fast, low-bitrate encodes, and that's more or less expected.
In other cases where encoders have no real-time constraint, eg pirate torrents and youtube alternate versions, there are multiple tiers of quality - for a given resolution (say 720p) and bitrate / file size, the offerings can be split by time spent encoding, from quick and shitty to slow with few visible quirks.
I don't have a deep dive, but I can confirm youtube has made things worse. I came across a copy of 『the TV show』 that I downloaded in 2009: 720p, H.264, 2460kbps. Looking again a couple of weeks ago, the current version was 720p, VP9, 1300kbps. VP9 is more efficient, but at half the bitrate it looks much worse.
Poking around more, there's an AV1 version that's a little less bad at about the same bitrate, but it's still significantly worse than the old encode. There's also a new H.264 version that's 2/3 the size of the old one. It's about as bad as the VP9 version, but the artifacts hit different parts of the image.
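For anyone who wants to repeat that kind of check, here's a hedged ffprobe sketch (the file names are placeholders, and for some containers only the container-level bitrate is reported, not the per-stream one):

    import json, subprocess

    def video_info(path):
        # Codec of the first video stream plus overall bitrate (bps) per ffprobe.
        out = subprocess.run(
            ["ffprobe", "-v", "error", "-select_streams", "v:0",
             "-show_entries", "stream=codec_name:format=bit_rate",
             "-of", "json", path],
            capture_output=True, text=True, check=True).stdout
        data = json.loads(out)
        return data["streams"][0]["codec_name"], data["format"].get("bit_rate")

    for f in ["show_2009_download.mp4", "show_2024_download.webm"]:  # placeholders
        print(f, *video_info(f))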
You'd think after 15 years of storage prices dropping, bandwidth prices cratering, and screens getting bigger, youtube might decide to maintain bitrates and let new codecs increase the quality of videos. Or split the difference between quality and savings. But nah, worse.
It doesn't make much sense to compare 720p to 720p like that. What matters is the bitrate. So the better question is - does youtube offer better quality video at a given bitrate constraint now in comparison to the past?
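One way to actually answer that, assuming an ffmpeg build that includes libvmaf (not all do) and assuming you have both encodes plus a common reference at hand, is to score each encode against the same reference, roughly like this sketch:

    import subprocess

    REF = "reference.mp4"     # hypothetical high-quality source
    OLD = "encode_2009.mp4"   # placeholder names for the two encodes to compare
    NEW = "encode_2024.webm"

    # libvmaf conventionally takes the distorted clip as the first input and the
    # reference as the second (check your build's docs); inputs should already
    # match in resolution and frame rate, or be scaled to match first.
    for distorted in (OLD, NEW):
        subprocess.run(["ffmpeg", "-i", distorted, "-i", REF,
                        "-lavfi", "libvmaf", "-f", "null", "-"], check=True)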
I am comparing the maximum quality available. I don't much care that today's 1300kbps video looks better than 2009's 1300kbps video. I care that they used to offer higher quality, and now they don't. I am not bitrate-limited, or at least my bitrate limit is higher than youtube would ever consider going.
What matters is the bitrate, and they're being overly stingy on bitrate.
Ok, this wasn't clear to me. Perhaps it was a video with too few views and thus not considered worth spending storage on? I mean, popular videos on youtube have up to 4K and pretty high bitrate.
seems more like goog/yt shifted bitrate/resolution downwards because plebs demand 4k, and now the "High Definition" version that was good enough for cinemas looks like crap.
I thought it was well known that YouTube cut bitrates during the pandemic? In any case, I think you're quite right, I only watch YouTube in 720p and some videos seem comparable to stuff I downloaded from Kazaa back in the day. It's ok, though. I don't pay for YouTube and I think it's good for them to offer higher quality options for a charge rather than hound every single viewer for payment.
can't shake the feeling that most videos at 720p took a hard hit to quality within the last few years (or my eyes are getting better)
any deep dives on the topic appreciated