Recall suffered from a classic Microsoft mistake, one they've made time and again and never learned from: how to correctly market and package a feature.
Microsoft always tends to "go big" with their integrations, often to their detriment, in order to drive adoption of new features. One notable example was Windows 8. They really, REALLY wanted people to try out the new Metro UI, so they deeply integrated it into the OS, pushed it in every marketing campaign, and made it the first screen you saw on login. It had some great features - better performance, better search results - but it wasn't opt-in. The reaction from customers who took a casual look was, "they removed the desktop!" That wasn't true, but because Microsoft was so overzealous in pushing the new feature, that became the takeaway.
The same thing is happening here - Microsoft pushed what is objectively a great tool, but in a way that never gave users a choice about whether they wanted it. They've also framed the messaging and marketing in a way that obscures what is actually happening. Look at how much of this blog post is dedicated to saying how important security is to them, without ever going into what the security issues are or how they're being addressed.
Sloppy marketing plus forced integration has bitten Microsoft so many times now. I'm always shocked that they never learn from it.
The problem is not marketing. The problem is that the tool is fundamentally not secure, and in my opinion, fundamentally not securable without major changes.
The core issue is that everyone has things on their computer that they want to be transient. I don't ever want my computer taking screenshots when I'm entering, say, my credit card number. More importantly, though, I oftentimes have text editors containing "scratch pads" that may contain sensitive data that I never want to persist.
Microsoft just never thought through the security implications of this feature.
How is this objectively a great feature? It's spyware that stores screenshots unencrypted (and thus accessible to any other spyware). I'm also not convinced the AI tools would stay offline, effectively sharing all of your data with Microsoft (even more than before).
From a privacy perspective, this feature is an abomination
I'd caution us to separate out the feature from the implementation.
The feature provides the ability to search through all of the previous things you've done and gain context in an instant, in a way that can be queried with natural language. I think we can agree what it aims to achieve is beneficial.
The implementation is what you're debating. I see these as two separate things, but they go hand in hand: get the implementation wrong and it can easily tank the feature.
Still, the documentation for this seems to disagree with what you're saying.
> This is a spyware that stores screenshots unencrypted
This page[1] states "Snapshots are encrypted by Device Encryption or BitLocker". They suggest that things aren't shared with Microsoft, though I totally understand the skepticism there.
> This page[1] states "Snapshots are encrypted by Device Encryption or BitLocker".
That sounds like it just means it's encrypted at rest - i.e., while you're logged out - but transparently decrypted, much like everything else on the system, while you're logged in. That is to say, any running malware would have just as much access as it would on a system that doesn't use encryption.
From a functional point of view, it can be treated as being equivalent to being unencrypted, with the exception being when you aren't logged in - at which point you're not running any programs anyway.
While the claim that BitLocker is used to encrypt them is true, it's really not good enough here. The files are transparently decrypted during a live session, which makes them an easy target for malware.
Not just during a live session -- whenever Windows is running. Nobody needs to be logged in or actively using the machine for the files to be readable in unencrypted form.
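The "functionally unencrypted" point above can be sketched in a few lines. Full-disk encryption is transparent to every process while the OS is running, so reading Recall's SQLite store requires no keys and no decryption step - an ordinary database open works. (The path in the comment is the location reported by security researchers; treat it and the table name as assumptions.)

```python
import sqlite3

def list_tables(db_path):
    """List the tables in a SQLite database file.

    No decryption is needed: BitLocker/Device Encryption is
    transparent to any process running while Windows is up, so this
    opens Recall's store exactly as it would any plain file.
    Reported (assumed) location of the store:
      %LOCALAPPDATA%\\CoreAIPlatform.00\\UKP\\{GUID}\\ukg.db
    """
    con = sqlite3.connect(db_path)
    try:
        rows = con.execute(
            "SELECT name FROM sqlite_master WHERE type='table'"
        ).fetchall()
        return [name for (name,) in rows]
    finally:
        con.close()
```

Any process running as the user (or with local admin rights) could call this on the database path and enumerate its contents - which is exactly why "encrypted by BitLocker" is not a meaningful defense against malware on a running system.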
"Objectively" is very strong, but I'd love a tool like this.
Except it's so thoroughly invasive and ripe for abuse that I can't imagine ever using something like this that isn't open source and thoroughly vetted. I think your very valid points stem from that - MS's implementation was hamfisted and halfassed, and people don't trust them even when they do get it right. But those are issues with the implementation and the implementer, in my mind, not with the conceptual feature.
What's funny is that if they had marketed it as Apple does (and had as much credibility with their fans as Apple does), everyone would love it. I seriously doubt they intend to do much differently from "Apple Intelligence" - i.e., local access to all your data, plus uploads of the data you use in cloud apps.
Recall as implemented is an absolute security and privacy nightmare, and would absolutely become a tool of oppression for abusers. MS deserved to reap the whirlwind here, as would any firm that shipped the same sort of feature.
There is no equivalence. Apple has been building on this technology for years now, all with a focus on privacy. Microsoft neither has the engineering talent, the time, nor the development ecosystem to catch up.
With Windows 8, Microsoft thought that tablets and touchscreens were the future, and Metro was designed for those. Tablets being the future of computing meant they made the new experience the default. Turns out keyboards and mice are still vastly more popular a decade later.