While it's true that there are fewer opportunities to be tracked through leaky / malicious apps, "dumbphones" remain, well, phones, subject to data retention laws in almost all countries in the world (and the subsequent leaks / hacks; see the recent Salt Typhoon events for a good example), plus extra vulnerabilities due to poorly developed operating systems.
That depends on what threat scenario you're looking at. If your government wants to track you, they'll find a way, regardless of what phone you're using (or not).
But the privacy gains of not installing a load of third-party apps from a dozen different data-selling businesses (or using an operating system built by the mother of all data-selling businesses) are very substantial.
My benchmarks are not public, but on AMD EPYC processors ChaCha12 (5 GB/s) is faster than hardware-accelerated AES-256-GCM (3.5 GB/s).
Unfortunately, this is comparing apples to oranges because AES-256-GCM is authenticated, so you need a MAC with ChaCha12 (usually Poly1305), which ultimately makes ChaCha12 in AEAD mode slower than AES-256-GCM.
But the real question is: What is fast enough?
I believe that between 1 and 2 GB/s per core for an AEAD is fast enough, as I/O will be your bottleneck way before that.
This is why I will always favor a ChaCha20/ChaCha12-based AEAD over AES and its many footguns.
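For reference, this is roughly what the ChaCha-based AEAD path looks like in code; a minimal sketch assuming the RustCrypto `chacha20poly1305` crate (full-round ChaCha20 rather than the reduced-round ChaCha12 from my benchmarks, and the crate choice is my own, not something from the numbers above):

```rust
// Hedged sketch: assumes `chacha20poly1305` (~0.10) with default features in Cargo.toml.
// Shows ChaCha20-Poly1305 used as an AEAD: the Poly1305 tag is what makes the
// comparison with AES-256-GCM apples-to-apples.
use chacha20poly1305::{
    aead::{Aead, AeadCore, KeyInit, OsRng},
    ChaCha20Poly1305,
};

fn main() {
    let key = ChaCha20Poly1305::generate_key(&mut OsRng);
    let cipher = ChaCha20Poly1305::new(&key);
    // 96-bit nonce; must be unique per message under the same key.
    let nonce = ChaCha20Poly1305::generate_nonce(&mut OsRng);

    let ciphertext = cipher
        .encrypt(&nonce, b"plaintext message".as_ref())
        .expect("encryption failure");
    // Decryption verifies the Poly1305 tag before returning any plaintext.
    let plaintext = cipher
        .decrypt(&nonce, ciphertext.as_ref())
        .expect("decryption failure (tag mismatch?)");
    assert_eq!(&plaintext, b"plaintext message");
}
```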
1. Rust has had 10x better tooling right from the start. Cargo vs SBT, no joke.
2. Rust has improved its compiler performance by >3x in the last 5 years.
3. Rust hasn't gotten any major new language feature in the last 6 years. Most language improvement is improving orthogonality of existing features.
4. Rust has an excellent backwards compatibility story also right from the start. Code/libraries written in 2015 can be still compiled and used together with code from 2025.
5. Rust has a serious backing from all major IDE / editor providers.
6. Rust integrates easily with other languages and ecosystems. It's easier to call Rust from Python than to call Scala from Python. It's easier and more performant to call C from Rust than to call C from Scala (a minimal sketch below shows the Rust side). Things like that. It's sad that Scala has even struggled with interoperability with Java, which should theoretically be easy since it's the same platform.
Seriously, it doesn't look to me like the same mistakes.
It's actually quite the opposite.
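To make point 6 concrete, calling C from Rust needs little more than an extern block; a minimal sketch (binding to libc's `abs` is just my illustrative choice):

```rust
// Minimal sketch of Rust's C FFI: declare the foreign function, then call it.
// `abs` comes from the C standard library, which is already linked in.
// (In the 2024 edition this block is spelled `unsafe extern "C"`.)
extern "C" {
    fn abs(input: i32) -> i32;
}

fn main() {
    // The call is `unsafe` because the compiler can't verify the foreign signature.
    let value = unsafe { abs(-3) };
    println!("abs(-3) = {}", value);
}
```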
> Rust hasn't gotten any major new language feature in the last 6 years. Most language improvement is improving orthogonality of existing features.
Rust also ships with a functioning epoch/edition system that makes it feasible to rework existing language features for improved generality, elegance etc. without losing backward compatibility. I'm not sure if Scala has anything like that.
Also, the nightly/stable split in Rust means that most new features don't reach the final adoption stage unless there's a very real consensus about their design. A lot of effort is made to get rid of incidental complexity wherever feasible, before the new features are actually stabilized.
Most language implementations that "haven't gotten any major new language feature in the last 6 years" would be described as broadly stagnant or stale, but Rust is quite different.
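A concrete illustration (my own sketch, not something from the parent comment) of the kind of rework editions enable: Rust 2021 changed closures to capture only the fields they actually use, while crates staying on older editions keep compiling unchanged.

```rust
struct Point { x: i32, y: i32 }

fn main() {
    let mut point = Point { x: 0, y: 0 };
    // Edition 2021: the closure captures only `point.x` (disjoint capture).
    let mut inc_x = || point.x += 1;
    // Reading `point.y` here is fine in 2021; under edition 2018 the closure
    // would have captured all of `point` mutably and this line would not compile.
    println!("y = {}", point.y);
    inc_x();
}
```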
>Rust has had 10x better tooling right from the start. Cargo vs SBT, no joke.
Cargo is a really good tool, but I wouldn't say it is 10x better than SBT. On the other hand, Scala can be used with different build tools: Maven, Gradle, Mill. With Rust you have no choice (or none that I'm aware of).
Rust-analyzer is better than Metals, can't argue with that...
>Rust has improved its compiler performance by >3x in the last 5 years
The Rust compiler is very slow. I can't imagine what compilation times were like 5 years ago, but even now it is incredibly slow, even compared to the "slow" Scala compiler (which is no longer true). I'm working on a little project with Bevy, and even on such a small project the compilation time sucks. You can't compare it with Scala; it's just a completely different experience.
>Rust hasn't gotten any major new language feature in the last 6 years.
That's not true. The Rust team handles these changes correctly. Rust has the concept of "editions", and every 3 years a new edition introduces significant changes. I really like how editions work; it's a great way to maintain backwards compatibility.
>4 Mostly agree
>5 Fully agree
>It's easier and more performant to call C from Rust than to call C from Scala.
Yep, but that's not a Scala limitation, it's a JVM one. JNI sucks, but there is an effort to improve it (Project Panama).
Overall, both Scala and Rust are great languages. The Rust team has addressed many typical issues from the beginning. Scala has had its mistakes, but it's moving in the right direction.
There is no SHA-256 in the C or C++ standard library either. A cryptographic hashing algorithm like SHA-256 is not something that necessarily has to be in a standard library.
There are plenty of old modules in the Python standard library that hardly anyone uses. But removing them would break backwards compatibility guarantees. The Rust project has decided that they want to keep the standard library lean. And this works well in practice, because cargo is a really good package manager.
It is minimalist because it has to support many platforms and must be lightweight and easy to port. It also must be stable for a long time. Once something lands in the standard library, it is very hard to remove / modify / redesign. How many half-broken date/time implementations did Java end up with?
But it makes up for the minimalist stdlib by making it a breeze to use third-party libraries.
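For example, pulling in SHA-256 is a one-line dependency; a minimal sketch assuming the RustCrypto `sha2` crate (my choice of crate, not anything mandated by the Rust project):

```rust
// Hedged sketch: SHA-256 via the third-party `sha2` crate (add `sha2 = "0.10"` to Cargo.toml),
// since the standard library deliberately ships no cryptographic hashes.
use sha2::{Digest, Sha256};

fn main() {
    let mut hasher = Sha256::new();
    hasher.update(b"hello world");
    let digest = hasher.finalize();

    // Print the 32-byte digest as lowercase hex.
    for byte in digest.iter() {
        print!("{:02x}", byte);
    }
    println!();
}
```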
Care to elaborate on which changes you consider a mistake?
To me the goals are sensible: the flagship goals (for 2025H1, for example [1]) are all about DX and a focus on downstream devs and their needs (all-hands, Rust in the kernel).
The usual problem of "compile times" [2] (usually link times, LLVM throughput, and so on) is being worked on, but fundamentally it can only be solved by caching, incremental compilation, and better developer tooling; those are not language problems.
Yet it's Go that added big language features recently (generics, iterators, changing the semantics of the for loop), not Rust. Rust hasn't gotten any major new features since adding async ~7 years ago. Rust is much more stable language-wise than Go, partly because it had many more features at 1.0, and Go has had to add them gradually (what's next? enums?). It also has much more advanced mechanisms for evolving the language without breaking existing code (editions), something that neither Go nor Scala has figured out yet.
Go has proven that people want cheap threads. If it didn't have them, it wouldn't have gotten anywhere.
Now even Java has cheap threads (Loom).
And Go even has generics, just 20-ish years later than Java. And it's likely that more of the features Java has now will trickle into Go. If Go wants to survive, that is.
All this to say that there is no space in the market for another language that is just a stupid-simple Algol. Go already occupies that space. And even Go will have to add the features developers want/need if it doesn't want to get cornered out of the market.
Go has proven that we still need the backing of large corporations for success in the tech world, despite the fact that open source powers most major tech companies today.
Makes me want to buy it less for sure, since the maker is clearly focusing on a feature (set) I don’t want to use or support. In the worst case, the AI stuff cannot be disabled and will annoy me on a daily basis.
Less. It means that at best there's stuff I'm going to have to waste time disabling, at worst the product will have that crap using resources all the time for no benefit.
I associate it with marketing and privacy invasion, in most cases.
I keep griping about this, but I hate that features we used to just love, built on more traditional ML, now have to have "AI" branding slapped on them. The image search on iOS where I can just search "beer" and find a glass of beer I had a month ago is absolutely incredible, but it came early enough to not get hit with the "AI" branding. If that feature, or any of the other ML features all smartphones have today, came out now, it'd be labeled "AI". It's a broad term that has now been shoved into everything, even in places where it may technically apply, but it feels over the top.
With Pixel it doesn't really matter to me. I'll install GrapheneOS the minute I get it out of the box.
As for other vendors -- definitely less. I just want a decent phone with good software and minimum bloatware. I see all these AI features as bloatware.
Nobody asked Microsoft to put an unremovable Copilot button next to the cursor in every Office app. Shoving AI into everything has little to do with what customers want, and much to do with checking a box re: org strategy and investor PR.
1) the phone is already data mining the shit out of me; this is just paying more money for something that's going to destroy my privacy more effectively.
2) AI adds no value to what I need this phone to do: take pictures, calls, text, web browsing, and a few other apps as needed.
3) the utility of AI is still dubious in general -- Copilot and ChatGPT still get enough things wrong that I cannot trust them in a work context, and the code they provide is basic but wonky in weird ways. The utility, or marginal utility, here is effectively non-existent.
Mostly irrelevant to me. Maybe a tiny bit more *if* that means it comes with more RAM at the base price (as with "Copilot+ PC"s starting with a minimum of 16GB).
If the AI features ran totally locally, it would be a different story but there’s no way I want features to be locked behind having coverage/a subscription etc.