Ubuntu is evaluating it as the default in order to see if it’s ready. That’s something you want to do before declaring something 1.0.
If it’s not ready, they’ll roll it back.
Part of why you have to do something like this is that the test suite just isn't comprehensive, nor should we expect it to be. Real-world usage is what shakes out the long tail of bugs. You just have to have some sort of stage like this in order to get things into a good state.
No. Because the tests that don’t pass are edge cases and corners that most people wouldn’t notice. It’s arguably more important to fix bugs that impact actual usage, so it can be a valid strategy to do this even before you hit 100% coverage, to help you prioritize the remaining bugs to fix.
In other words, there may be more serious bugs outside the test suite than the failing ones inside it. And you only find that out through real usage.
> Because the tests that don’t pass are edge cases and corners that most people wouldn’t notice.
This standard may be justified when there is significant benefit. There is not in this case. And some projects have stricter standards.[1]
> In other words, there may be more serious bugs outside the test suite than the failing ones inside it. And you only find that out through real usage.
You should assume everyone understands how Ubuntu's decision would benefit this project. You should assume most Ubuntu users do not care. You replied to a comment which told you this before.[2]
It is about discovering how less mature a project is so that its maturity can be improved. Nothing has been replaced yet. That decision will be made once its maturity has been evaluated.
> It is about discovering how less mature a project is so that its maturity can be improved.
Must I repeat? You should assume everyone understands how Ubuntu's decision would benefit this project. You should assume most Ubuntu users do not care.
> Nothing has been replaced yet.
Replace means put something in the place of another thing. Not eradicate the other thing.
Many of the utils, such as sort, aren't locale-aware. Considering that most of the world does not use English/ASCII, do you still consider that an irrelevant edge case?
I don't consider it irrelevant, but neither does uutils. However, it's also not something that is currently at zero. I'm not even sure that this percentage of tests is related to locale support specifically. I'm sure parity will be reached here.
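To make the locale point concrete, here is a minimal Python sketch (assuming a de_DE.UTF-8 locale is installed on the system; the word list is just an illustration) of how ordering changes once sorting becomes locale-aware:

    import locale

    words = ["zebra", "Äpfel", "apple"]

    # Plain sort compares Unicode code points, so "Äpfel" lands after "zebra".
    print(sorted(words))

    # A locale-aware collation key groups "Äpfel" with the other A-words,
    # which is what a German-speaking user would expect.
    locale.setlocale(locale.LC_COLLATE, "de_DE.UTF-8")
    print(sorted(words, key=locale.strxfrm))

GNU sort does the equivalent via LC_COLLATE, which is the behaviour being discussed here.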
So they can see issues that arise from real-world usage that tests might not cover? The same Ubuntu version also bundles the latest kernel, which is not considered stable to begin with.
The point of the blog is that even with supposedly deterministic generative sampling, non-determinism creeps in. This in turn has disastrous effects on very real experiments.
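One toy illustration of where this creeps in (not necessarily the exact mechanism the blog describes): floating-point addition is not associative, so if a framework reduces the same numbers in a different order (different batch size, different GPU kernel), the results can differ in the last bits, and a greedy argmax or sampling threshold can then flip:

    # Summing the same values in a different order gives a (slightly) different result.
    a, b, c = 0.1, 0.2, 0.3
    print((a + b) + c)                  # 0.6000000000000001
    print(a + (b + c))                  # 0.6
    print((a + b) + c == a + (b + c))   # False

In a large model such last-bit differences accumulate over millions of operations, so even with a fixed seed and temperature 0 two runs can diverge once a single token flips.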
I'm pretty sure they didn't do their research well. They probably think Mastodon's app is the top result that comes up when "Mastodon" is typed into Google. They also decided to block MeWe which is weird because nobody I know has ever heard of it. Another interesting choice was Rumble. Twitch was left alone but Rumble was blocked.
> MeWe which is weird because nobody I know has ever heard of it. Another interesting choice was Rumble. Twitch was left alone but Rumble was blocked
From experience, this is a symptom of them wanting to censor a specific piece of content which is on all those platforms. Look for it; you may discover something interesting.
I live in Tunisia, which had one of the most heavily censored internets in the world before 2011.
> decided to block MeWe which is weird because nobody I know has ever heard of it
Seems to indicate they're not actually trying to prevent their citizens from doing anything in particular, they're just trying to get these international companies to follow their local laws since they operate there.
One could argue that. There were also a few services that complied a long time ago: TikTok, Viber et al. Twitter (X) is currently in discussions with the government about this. Also, a large part of the population in Nepal seems to agree with this decision. I could see a lot of people celebrating the decision to block these services.
Mastodon gGmbH does not have an operating presence in Nepal - but it is not the whole of Mastodon (or the Fediverse) by any means!
I had a quick look at the maps I could find that indicate the locations of Mastodon server instances, and I was not able to find anything local to Nepal - of course, that's not to say there isn't one or more. It is important to the network that there be many Mastodon instances, in many places, so it would be great if there were some!
At the same time, manufacturers do not release operating systems with extremely obvious flaws that have (at least so far) no reasonable guardrails and pretend that they are the next messiah.
pdfjam [1] is included with the TeX Live distribution and acts as a wrapper around a LaTeX package under the hood. With this, I believe your example could be done with a single pdfjam invocation.
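A rough sketch only, with hypothetical filenames and a two-pages-per-sheet layout, since the exact flags depend on what the original command was doing:

    # hypothetical filenames and layout
    pdfjam --nup 2x1 --landscape input.pdf --outfile output.pdf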
From my understanding of cryptography, most schemes are designed under the assumption that _any_ function that does not have access to the secret key has only a probabilistically small chance of decoding the correct message (usually O(exp(-key_length))). As LLMs are also just functions, it is extremely unlikely for cryptographic protocols to be broken _unless_ LLMs enable new types of attacks altogether.
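To put a rough number on "probabilistically small", here is a back-of-the-envelope in Python (the attacker speed is a made-up figure, and real attacks would not be pure brute force, but it shows the scale):

    # Time to enumerate a 128-bit keyspace at a (hypothetical) 10^12 guesses/second.
    key_bits = 128
    guesses_per_second = 10**12
    seconds = 2**key_bits / guesses_per_second
    years = seconds / (60 * 60 * 24 * 365)
    print(f"{years:.2e} years")   # roughly 1e19 years

A model that can't beat chance at guessing key bits is in the same position as any other keyless function.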