
It's similarly annoying how many websites take the existence of the lossy format as a license to recompress all WebP uploads, or to convert other file types to WebP, even when it increases the file size. It's like we're returning to ye olden days of JPEG artifacts on every screenshot.


I was thinking about this with YouTube as an example. A lot of people complain about YouTube's compression making things look awful, but I bet there's a reasonable number of high-end content creators out there who would run a native(-ish, probably Electron) app on their local system to produce a higher-quality encode to YouTube's specifications before uploading.

In many (most?) cases, it's possible to get better compression and higher quality if you're willing to spend the CPU cycles on it. That means YouTube could reduce its encoding load and increase quality at the same time, and content creators could put out videos that retain more detail.
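
For the curious, the core of such a tool could be as simple as shelling out to ffmpeg with a slow preset. A minimal sketch, assuming ffmpeg is installed and that H.264 at a near-transparent CRF is an acceptable upload format (the function name and the specific flag values are my own illustration, not YouTube's actual recommended settings):

    # Sketch of a local "slow but high quality" pre-encode.
    # Assumes ffmpeg is on PATH; flag values are illustrative.
    import subprocess

    def encode_for_upload(src: str, dst: str) -> None:
        """Re-encode with a slow preset: more CPU time, better quality per bit."""
        subprocess.run(
            [
                "ffmpeg", "-i", src,
                "-c:v", "libx264",      # H.264 is a safe, widely accepted upload codec
                "-preset", "veryslow",  # trade encode time for compression efficiency
                "-crf", "17",           # near-transparent quality target
                "-c:a", "aac",          # re-encode audio to a broadly compatible codec
                "-b:a", "384k",
                dst,
            ],
            check=True,
        )

    encode_for_upload("master.mov", "upload.mp4")

The point of "veryslow" is exactly the trade described above: the encoder spends far more CPU per frame searching for better predictions, which buys quality at the same bitrate.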

It would certainly take longer to upload the multiple versions of everything, and the encoding itself would take longer too, but it would also ease YouTube's burden and produce a better result.

Ah well, a guy can dream.


AFAIK you can upload any bitrate to YouTube as long as the file is under 256 GB.

So you could upload a crazy-high-bitrate file for a 20 min video, which I suspect would be close to "raw" quality.
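
Quick back-of-the-envelope on what that cap allows for a 20 min video (my arithmetic, not anything from YouTube's docs):

    # Rough arithmetic for the 256 GB upload cap mentioned above.
    size_bits = 256e9 * 8              # 256 GB expressed in bits
    duration_s = 20 * 60               # a 20 minute video
    max_bitrate = size_bits / duration_s
    print(f"{max_bitrate / 1e6:.0f} Mbit/s")  # ~1707 Mbit/s, i.e. ~1.7 Gbit/s

~1.7 Gbit/s is comfortably above mezzanine codecs like ProRes 422 HQ (on the order of 700 Mbit/s at UHD), so "close to raw quality" sounds plausible.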

I don't know how many corners YouTube cuts on encoding, though.

I suspect most of the problem is people exporting 4K at a "web" bitrate preset (15 Mbit/s?), which is going to get murdered by the second encode far more than by any quality shortfall on YouTube's side.
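
If you wanted to test that theory, ffmpeg's ssim filter can score a re-encode against the original. A rough sketch, where both the 15 Mbit/s "web" export and the 8 Mbit/s stand-in for the platform's second encode are illustrative guesses, not YouTube's real pipeline:

    # Sketch: simulate a "web preset" export, then a second encode on top of it,
    # and measure the generation loss with ffmpeg's ssim filter.
    import subprocess

    def ssim_against(reference: str, distorted: str) -> None:
        """Print an SSIM score for distorted vs. reference (1.0 = identical)."""
        subprocess.run(
            ["ffmpeg", "-i", distorted, "-i", reference,
             "-lavfi", "ssim", "-f", "null", "-"],
            check=True,
        )

    # First generation: the creator's bitrate-capped "web" export.
    subprocess.run(["ffmpeg", "-i", "master.mov", "-c:v", "libx264",
                    "-b:v", "15M", "gen1.mp4"], check=True)
    # Second generation: a stand-in for the platform's re-encode of that export.
    subprocess.run(["ffmpeg", "-i", "gen1.mp4", "-c:v", "libx264",
                    "-b:v", "8M", "gen2.mp4"], check=True)

    ssim_against("master.mov", "gen2.mp4")

Running the same comparison starting from a high-bitrate upload instead of gen1 would show how much of the damage comes from the first export rather than the platform's encode.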



