Which reveals that "everyone can open a ZIP file" is a lie. Sure, everyone can open a ZIP file, as long as that file uses only a limited subset of the ZIP format's features. That's why formats that use ZIP as a base (Java JAR files, OpenDocument files, newer Office files) standardize such a subset; but for general-purpose ZIP files, there's no such standard.
(I have encountered such ZIP files in the wild; "unzip" can't decompress them, though p7zip worked for these particular ZIP files.)
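To make "limited subset" concrete, here's a quick sketch using Python's stdlib `zipfile` module: it checks each entry's compression method against the handful of methods the module can actually decompress. The `unsupported_entries` helper and the `SUPPORTED` set are my own illustrative names, not anything standard.

```python
import io
import zipfile

# Compression methods the Python stdlib zipfile module can decompress.
# The numeric codes come from PKWARE's APPNOTE.TXT:
# 0 = stored, 8 = deflate, 12 = bzip2, 14 = LZMA.
SUPPORTED = {zipfile.ZIP_STORED, zipfile.ZIP_DEFLATED,
             zipfile.ZIP_BZIP2, zipfile.ZIP_LZMA}

def unsupported_entries(zip_source):
    """Names of entries whose compression method zipfile can't read."""
    with zipfile.ZipFile(zip_source) as zf:
        return [zi.filename for zi in zf.infolist()
                if zi.compress_type not in SUPPORTED]

# Demo: a bzip2-compressed archive falls inside the supported subset.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w", zipfile.ZIP_BZIP2) as zf:
    zf.writestr("a.txt", "some text")
print(unsupported_entries(buf))  # []
```

A check like this is cheap because the methods are listed in the central directory; you don't have to attempt (and fail) a full extraction to discover an archive uses a codec your tool lacks.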
> On 15 June 2020, Zstandard was implemented in version 6.3.8 of the zip file format with codec number 93, deprecating the previous codec number of 20 as it was implemented in version 6.3.7, released on 1 June.[36][37]
> The `zip` command on Ubuntu is 6.0, which was released in 2009 and does not support zstd. It does support bzip2 though!
You probably mean the "unzip" command, for which https://infozip.sourceforge.net/UnZip.html lists 6.0 as the latest release, from 20 April 2009. Relevant to this discussion, that release added support for 64-bit file sizes, the bzip2 compression method, and UTF-8 filenames.
The "zip" command is listed at https://infozip.sourceforge.net/Zip.html with 3.0 as the latest release, from 7 July 2008. That release likewise added support for 64-bit file sizes, the bzip2 compression method, and UTF-8 filenames.
It would be great if both (or at least unzip) were updated to also support LZMA/XZ/ZSTD as compression methods, but given that there have been no new releases for over fifteen years, I'm not too hopeful.
there are A LOT of zip files using lzma in the wild.
also, how about people learn to use updated software? should newer video compression technologies not be allowed in mkv/mp4?
if you can't open it, well... then stop using '90s WinZip
No. You can't get people to use updated software. You can't get a number of people to update past Windows 7. This has been and will likely remain a persistent issue, and it's certainly not one you're going to fix. All it will do is limit your ability to work with people. This isn't a hill you should die on.
I'm okay with that. That being said, I have not had a single issue delivering zip files with lzma, and I KNOW that I have received MANY from random sources.
I would also expect people to be able to decode h265 in an mp4 file.
Your proposal seems, to put it bluntly, absurd. You would have mp4 frozen on h264 for ETERNITY, and then invent a new format as a replacement? Or would you just say "god has bestowed upon the world h264, and it shall be the LAST CODEC EVER!"?
get with the program. Things change; you cannot expect to be forward compatible forever. Sometimes people have to switch to newer versions of software.
If your customer is stuck in the 90s because his 90s technology works perfectly fine and he has no intention of fixing things that are not broken, then deliver stuff that is compatible with 90s technology. He will be happy, will continue to work with you, and you will make money.
If your customer is using the latest technologies and values size efficiency, then use the latest codecs.
I usually default to being conservative, because those who are up to date usually don't have a problem with bigger files, but those who are not are going to have a problem with recent formats. Maybe overly so, but that's my experience with working with big companies with decades long lifecycles.
Your job is not to lecture your customer, unless he asked for it. And if he did ask, he probably expects better arguments than "update your software, idiot". Your job is to deliver what works for him. Now, of course, it is your right to be picky and leave money on the table; I will be happy to go after it and take it.
not everything is a client<->customer relationship.
Professionally I can definitely support old stuff. It most often costs extra.
Conservative doesn't have to mean stuck. I am not recommending we send h266 to everyone now, but h265 is well supported, as is AV1.
lzma in zip has been widely supported for many years at this point. I am going to choose my "sane defaults", and if someone has a problem with that, they can simply do what they need to do to open it, or give me a damn good reason to go out of my way.
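As one data point for that support, even Python's stdlib `zipfile` module has been able to write and read LZMA-compressed archives (method 14) since Python 3.3. A minimal round-trip sketch:

```python
import io
import zipfile

# Write an LZMA-compressed ZIP entirely in memory (method 14 per
# APPNOTE.TXT; exposed as zipfile.ZIP_LZMA in the stdlib).
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w", zipfile.ZIP_LZMA) as zf:
    zf.writestr("notes.txt", "lzma inside a zip\n" * 100)

# Read it back and confirm the method actually used for the entry.
with zipfile.ZipFile(buf) as zf:
    info = zf.getinfo("notes.txt")
    print(info.compress_type)  # 14 (zipfile.ZIP_LZMA)
    data = zf.read("notes.txt")
assert data == b"lzma inside a zip\n" * 100
```

Whether a given recipient's tool opens it is, of course, exactly the compatibility question being argued here; the point is only that LZMA-in-ZIP is long-standing, not exotic.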
Installing new software has a real time and hassle cost, and how much time are you actually saving over the long run? It depends on your usage patterns.
the developer is hired by someone who gets to make that decision. Ultimately the customer does. That's why some people spend extreme resources on legacy crap: someone has deemed it worth it.
In the middle of San Francisco, with Silicon Valley level incomes, very possible. In the real world I still exchange files with users on rustic ADSL, where every megabyte counts. Many areas out there, in rural Mongolia or parts of Africa that have only just got internet access, are even worse in that regard.
English is evolving as a hieroglyphic language. That floppy disk icon stands a good chance of becoming simply the glyph meaning "save". The UK still uses an icon of an 1840s-era bellows camera for its speed camera road signs. The origin story will be filed away neatly and only its residual meaning will be salient.
A more 'useful' one is WebP. It has both a lossy and a lossless compression algorithm, which have very different strengths and weaknesses. I think nearly every device supports reading both, but so many 'image optimization' libraries and packages don't, often just doing everything as lossy when it could be lossless (icons and whatnot).
It's similarly annoying how many websites take the existence of the lossy format as a license to recompress all WebP uploads, or sometimes other filetypes converted to WebP, even when it causes the filesize to increase. It's like we're returning to ye olden days of JPEG artifacts on every screenshot.
I was thinking about this with YouTube as an example. A lot of people complain about the compression on YouTube videos making things look awful, but I bet there's a reasonable number of high-end content creators out there who would run a native(-ish, probably Electron) app on their local system to do a higher-quality encoding to YouTube's specifications before uploading.
In many (most?) cases, it's possible to get better compression and higher quality if you're willing to spend the CPU cycles on it, meaning that YouTube could both reduce their encoding load and increase quality at the same time, and content creators could put out better quality videos that maintain better detail.
It would certainly take longer to upload the multiple versions of everything, and it would definitely take longer to encode, but it would also ease YouTube's burden and produce a better result.
AFAIK you can upload any bitrate to YouTube as long as the file is <256GB.
So you could upload a crazy-high-bitrate file for a 20 min video, which I suspect would be close to "raw" quality.
I don't know how many corners YouTube cuts on encoding though.
I suspect most of the problem is people exporting 4k at a 'web' bitrate preset (15mbit/s?), which is gonna get murdered on the 2nd encode more than any encoding quality issue on YouTube's side?
So apparently WebP is also RIFF-based, which it seems is the container for WAV files as well. I did not know this. Also, WebP has its own specialized lossless algorithm. For things like icon art I generally just continue to use PNG. Is there an advantage to using WebP lossless?
Tar files also have the miserable limitation of having no index; extracting an individual file requires scanning through the entire archive until you find it, and then continuing through the rest of the archive, because a tar file can contain the same file path multiple times.
That makes them useful for transferring an entire set of files that someone will want all or none of, e.g. source code, but terrible for a set of files that someone might want to access arbitrary files from.
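Both properties are easy to demonstrate with Python's stdlib `tarfile` module: below, the same path is added twice, and `getmember` (which has to scan the member headers since there is no index) resolves the name to its *last* occurrence, as the tar format's append semantics dictate. The file name and payloads are arbitrary examples.

```python
import io
import tarfile

# Build a tar in memory containing the same path twice.
buf = io.BytesIO()
with tarfile.open(fileobj=buf, mode="w") as tf:
    for payload in (b"old", b"new"):
        ti = tarfile.TarInfo("a.txt")
        ti.size = len(payload)
        tf.addfile(ti, io.BytesIO(payload))

buf.seek(0)
with tarfile.open(fileobj=buf, mode="r") as tf:
    # No index: tarfile walks the archive's headers to find the name,
    # and the last occurrence of a duplicate path wins.
    member = tf.getmember("a.txt")
    data = tf.extractfile(member).read()
print(data)  # b'new'
```

This "last one wins" behavior is also why appending to a tar is cheap (just write new headers and data at the end) while random access stays expensive.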
You can use ZSTD with ZIP files too! It's compression method 93 (see https://pkware.cachefly.net/webdocs/casestudies/APPNOTE.TXT which is the official ZIP file specification).
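The method number lives in a fixed spot in each ZIP header, so you can check which codec an archive uses without any ZIP library at all. A sketch that parses the 2-byte compression-method field of the first local file header (offsets per APPNOTE.TXT); the archive here is built with deflate just to have bytes to inspect:

```python
import io
import struct
import zipfile

# Build a small deflate-compressed ZIP to have real header bytes.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
    zf.writestr("hello.txt", "hello world " * 10)

raw = buf.getvalue()
# A local file header starts with the signature PK\x03\x04; the
# little-endian 2-byte compression method field sits at offset 8
# (0 = stored, 8 = deflate, 14 = LZMA, 93 = Zstandard).
assert raw[:4] == b"PK\x03\x04"
(method,) = struct.unpack_from("<H", raw, 8)
print(method)  # 8 (deflate)
```

A Zstandard entry would carry 93 in that same field; whether your extractor honors it is exactly the compatibility question discussed above.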