I was recently surprised to sell an old portable MiniDisc player on fb for close to $100. (FWIW, it was mint). I’m still nostalgic for them, and have another portable player and recording deck, but I’m left scratching my head at how much folks are willing to pay to pick up their first player. Shrug
Unless you want to turn it off, which I haven't been able to figure out how to do. Every now and then my phone will randomly prompt me to "ask Gemini", which is really annoying. When I want to use the LLM, I will go to it, stop shoving it in my face over and over.
I can't turn it off on my Samsung Galaxy S23, which I originally bought because it was not marketed with AI. They patched it in later, and ever since it just randomly starts up as if it were listening all the time.
Apple’s analytics probably support this, which is exactly why Siri still sucks. But yeah, everyone will continue to think they somehow know better and that Apple is wrong and poorly executing.
I can count on one hand the number of times I have gone “gosh, I so wish my voice assistant was better”.
It queues up music correctly, and picks the right destination on maps in my car. 98% use case satisfied. Would I like it to be better? Don’t really care. Is it a purchasing point? Nope. Would I miss it if it disappeared tomorrow? Also nope.
A better voice assistant is a major selling point for me. I need glasses to use my phone. Messages, email, purchases, directions, constantly. Good voice commands would be a godsend. Siri doesn’t work very well.
I use the phone voice assistants to set timers, and call people when I'm driving.
It is objectively worse at calling people than Assistant was. If I ask you to call someone, don't come up with a scrolling list of phone numbers that I have to pick from. At least Assistant called the primary designated number for someone; Gemini just froze and wouldn't take voice commands to pick the number, but forced me to pick up my phone.
I turned that bullshit off a couple of days after they forced it on me without asking.
I still feel like they are in an incredible position when it comes to AI because of their hardware integration/advantage across all of their devices. I think they see immense value in getting things on-device and not having to rely on any of these other companies.
When it comes to AI, there's ~5 trillion dollars of datacenter revenue Apple could be competing for, but isn't. That's not good.
Now, maybe it would be justifiable if there were great local AI experiences on iPhone, or an easy $5 trillion to be made elsewhere. Until then, Apple is bleeding money hand-over-fist by refusing to sign the CUDA UNIX drivers and sell the rackmount Mac as a cutting-edge TSMC inference box. The Grace superchip is absolutely eating Apple's ARM lunch right now.
> That junior engineer possibly hasn't programmed without the tantalizing, even desperately tempting option to be assisted by an LLM.
This gives me somewhat of a knee jerk reaction.
When I started programming professionally in the 90s, the internet came of age, and I remember being told "in my days, we had books and we remembered things" which of course is hilarious because today you can't possibly retain ALL the knowledge needed to be a software engineer, given the sheer amount of knowledge required to produce a meaningful product. It's too big and it moves too fast.
There was this long argument that you should know things and not have to look it up all the time. Altavista was a joke, and Google was cheating.
Then syntax highlighting came around and there'd always be a guy going "yeah nah, you shouldn't need syntax highlighting to program, your screen looks like a Christmas tree".
Then we got stuff like auto-complete, and it was amazing, the amount of keystrokes we saved. That, too, was seen as heresy by the purists (followed later by LSP, which many today call heresy).
That reminds me also, back in the day, people would have entire encyclopaedia collections on DVDs. Did they use them? No. But they criticised Wikipedia for being inferior. Look at today, though.
Same thing with LLMs. Whether you use them as a powerful context-based auto-complete, as a research tool faster than Wikipedia and Google, as a rubber-duck debugger, or as a text generator -- who cares: this is today, stop talking like a fossil.
It's 2025 and junior developers can't work without LSP and LLM? It's fine. They're not in front of a 386 DX33 with one K&R C book and a blue EDIT screen. They have massive challenges ahead of them, the IT world is in complete shambles, and it's impossible to decipher how anything is made, even open source.
Today is today. Use all the tools at hand. Don't shame kids for using the best tools.
We should be talking about sustainability of such tools rather than what it means to use them (cf. enshittification, open source models etc.)
>"in my days, we had books and we remembered things" which of course is hilarious
It isn't hilarious, it's true. My father (now in his 60s), who came from a blue-collar background with very little education, taught himself programming by manually copying and editing software out of magazines, like a lot of people his age.
I teach students now who have access to all the information in the world, but a lot of them are so scatterbrained and heedless that they quite literally can't process anything that isn't catered to them. Not having working focus and memory is like having muscle atrophy of the mind; you just turn into a vegetable. Professors across disciplines have seen a decline in student abilities, and for several decades now, not just due to LLMs.
Information 30 years ago was more difficult to obtain. It required manual labor, but in today's context there was not much information to be consumed. Today we have the opposite: a vast amount of information that is easy to obtain, but to process? Not so much. Decline is unavoidable. Human intelligence isn't increasing at the pace advancements are made.
Agreed, although LLMs definitely qualify as enabling developers compared to <social media, Steam, consoles, and other distractions> of today.
The Internet itself is full of distractions. My younger self spent a crazy amount of time on IRC. So it's not different than spending time on say, Discord today.
LLMs have pretty much a direct parallel with Google: the quality of the response has much to do with the quality of the prompt. If anything, it's the overwhelming nature of LLMs that might be the problem. Back in the day, if you had, say, library access, the problem was knowing what to look for. Discoverability with LLMs is exponential.
As for LLMs as auto-complete, there is an argument to be made that typing a lot reinforces knowledge in the human brain, much like writing does. That is getting lost, but it comes with productivity gains.
Watching my juniors constantly fight the nonsense auto completion suggestions their LLM editor of choice put in front of them, or worse watching them accept it and proceed to get entirely lost in the sauce, I’m not entirely convinced that the autocompletion part of it is the best one.
Tools like Claude code with ask/plan mode seem to be better in my experience, though I absolutely do wonder about the lack of typing causing a lack of memory formation
A rule I set myself a long time ago was to never copy paste code from stack overflow or similar websites. I always typed it out again. Slower, but I swear it built the comprehension I have today.
> Watching my juniors constantly fight the nonsense auto completion suggestions their LLM editor of choice put in front of them, or worse watching them accept it and proceed to get entirely lost in the sauce, I’m not entirely convinced that the autocompletion part of it is the best one.
That's not an LLM problem, they'd do the same thing 10 years ago with stack overflow: argue about which answer is best, or trust the answer blindly.
No, it is qualitatively different because it happens in-line and much faster. If it’s not correct (which it seems it usually isn’t), they spend more time removing whatever garbage it autocompleted.
People do it with the autocomplete as well so I guess there's not that much of a difference wrt LLMs. It likely depends on the language but people who are inexperienced in C++ would be over-relying on autocomplete to the point that it looks hilarious, if you have a chance to sit next to them helping to debug something for example.
For sure, but these new tools spit out a lot more and a lot faster, and it’s usually correct “enough” that the compiler won’t yell. It’s been wild to see its suggestions be wrong far more often than they are right, so I wonder how useful they really are at all.
Normal auto complete plus a code tool like Claude Code or similar seem far more useful to me.
I spent the first two years or so of my coding career writing PHP in notepad++ and only after that switched to an IDE. I rarely needed to consult the documentation on most of the weird quirks of the language because I'd memorized them.
Nowadays I'm back to a text editor rather than an IDE, though fortunately one with many more creature comforts than n++ at least.
I'm glad I went down that path, though I can't say I'd really recommend it, as things felt a bit simpler back then.
I have the same policy. I do the same thing for example code in the official documentation. I also put in a comment linking to the source if I end up using it. For me, it’s like the RFD says, it’s about taking responsibility for your output. Whether you originated it or not, you’re the reason it’s in the codebase now.
I have worked with a lot of junior engineers, and I’ll take comprehension any day. Developing their comprehension is a huge part of my responsibility to them and to the company. It’s pretty wasteful to take a human being with a functioning brain and ask them to churn out half understood code that works accidentally. I’m going to have to fix that eventually anyway, so why not get ahead of it and have them understand it so they can fix it instead of me?
LLMs are in a strange position: on one end they're the promised solution for most of the expected economic growth and a tool to improve programmer productivity and skill, while on the other they're merely better than doomscrolling?
That comparison undermines the integrity of the argument you are trying to make.
> When I started programming professionally in the 90s, the internet came of age, and I remember being told "in my days, we had books and we remembered things" which of course is hilarious because today you can't possibly retain ALL the knowledge needed to be a software engineer, given the sheer amount of knowledge required to produce a meaningful product. It's too big and it moves too fast.
But I mean, you can get by without memorizing stuff sure, but memorizing stuff does work out your brain and does help out in the long run? Isn't it possible we've reached the cliff of "helpful" tools to the point we are atrophying enough to be worse at our jobs?
Like, reading is surely better for the brain than watching TV. But constant cable TV wasn't enough to ruin our brains. What if we've got to the point it finally is enough?
I'm sure I'm biased by my age (mid 40s) but I think you are onto something there.
What if this constant decline in how people learn (on average) is not just a grumpy-old-man feeling? What if it's something real that was smoothed out by the sheer increase of the student population between 1960 and 2010 and the improvements in tooling?
Ah, but let's do leetcode on a whiteboard as the interview, re-balancing a red-black tree, regardless of how long those people have been in the industry and the job position they are actually applying for.
> "in my days, we had books and we remembered things" which of course is hilarious because today you can't possibly retain ALL the knowledge needed to be software engineer
Reading books was never about knowledge. It was about knowhow. You didn't need to read all the books. Just some. I don't know how many developers I met who would keep asking questions that would be obvious to anyone who had read the book. They never got the big picture and just wasted everyone's time, including their own.
"To know everything, you must first know one thing."
I typically jump on the latest macOS with enthusiasm. I once made the mistake of installing the beta version of the next OS, and well, that didn't go well for me. But typically, by X.1, I'm there.
However, something shifted with this "visionOS"-melted version of macOS (Tahoe); I have absolutely no intention of upgrading from Sequoia. I hope they will fix it by the time I'm forced to upgrade (post support deadline).
It started with the macOS that brought the iOS settings panel. We went from a logical structure of easily findable stuff to a complete mess. Just open the "Keyboard" settings on macOS today and it's bewildering how they could ship this and think this is fine. Steve would roll in his grave.
The process to allow running applications that are unsigned is just a horrible hack. It feels like a last minute "shove it and move on!".
By 2035 I wonder if we'll be all running KDE or WindowMaker and the hell with modern OS GUI.
From a Gestalt standpoint, human relations with desktop computers are not the same as with thumb driven mobile OS or air-pinch driven vision OS, period. The hell with "glass" or "flat" design. Desktop OS should be as forgettable as possible, as it's about having long stints of flow, not giving a feeling of "air" or "play".
> It started with the macOS that brought the iOS settings panel. We went from a logical structure of easily findable stuff to a complete mess.
It’s difficult to pinpoint when exactly the decline started. But one key event before the Settings app was the Catalyst apps, which were straight-up dismal ports of their iOS versions. To date, none of those work well or can be navigated properly using the keyboard: Reminders, Messages, Notes and more.
Craig Federighi seems to be taking on more and more authority without having a trusted set of people under him, and his leadership (or lack of it) has resulted in neglected software across device platforms. Some of the Apple apps on tvOS with paid subscriptions are worse, because the bugs in them don’t get any attention at all.
If you cut text in a "separate window" note, it will delete the text, but it won't actually copy it unless you issued the command for a note in the main window. So when you go to paste it elsewhere, you find it's gone, and then you often find Notes has lost the undo history too.
I'm curious if this is a bug that other people deal with, but I have to screenshot stuff to send folks all the time.
Screenshot, right click, and "copy" doesn't appear. Sometimes moving the app to another screen makes it appear, sometimes just switching to another app and back will help, sometimes I can't get it to be an option at all and I have to close screenshot and retry.
Really awful. Just make it an option all the time.
When Microsoft decided to rewrite the screenshot app from Win32 to WinUI, it had plenty of bugs, to the point I kept the old exe around for about one year.
The one that bothered me most was not being able to select desktop regions when using multiple monitors; the rectangle selection went nuts about what was possible to select.
I just want to be able to save the image to a folder and copy it to my clipboard when taking a screenshot. iirc in KDE Plasma's Spectacle, these options are checkboxes, you can enable as many at once as you like.
I actually have a simple shortcut created just to turn off Bluetooth entirely (helps speed up watchOS updates by forcing it to download directly over WiFi). I also have shortcuts to turn off WiFi completely and to turn off both WiFi and cellular data without putting it in airplane mode.
I’ve always felt the decline in Mac OS started on the day of the ‘Back To The Mac’ event in 2010. And has continued since. Symbolically this event made clear the iOS first focus of the company. And since then Mac OS updates have continued to be secondary/lesser to iOS.
Mac OS is still my system of choice, but I don’t have as much confidence in it as I would like.
The big thing from around fifteen years ago is the mixed modes for autosave, where they sort of half heartedly changed the language around save/save as and just sort of… left it. Some apps use their new (for the 2010s) auto save system and some don’t. And it’s up to the user to muddle through. Weird. And there are many half baked things like this in the OS now.
Mac hardware, on the other hand, has never been better than it is right now!
> The big thing from around fifteen years ago is the mixed modes for autosave, where they sort of half heartedly changed the language around save/save as and just sort of… left it. Some apps use their new (for the 2010s) auto save system and some don’t.
I may be mistaken, but AFAIK, all Apple’s apps auto-save on quit and restore state on open. If so, what do you suggest they do about making third party applications do that as well?
I've had the preferences "close windows when quitting an application" and "ask to keep changes when closing documents" checked since the day they appeared in System Preferences.
With these two, most applications behave as they did in the pre-Lion document model.
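For anyone who prefers setting those from the terminal, here's a minimal sketch; it assumes the commonly documented global defaults keys for the two checkboxes, so treat it as a starting point rather than gospel (log out and back in for it to take effect):

    # "Ask to keep changes when closing documents" (checked)
    defaults write -g NSCloseAlwaysConfirmsChanges -bool true

    # "Close windows when quitting an application"
    # (checked means windows are NOT restored on relaunch)
    defaults write -g NSQuitAlwaysKeepsWindows -bool false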
> Mac hardware, on the other hand, has never been better than it is right now!
I thought the same until trying a Framework laptop with Ubuntu. Mac is the “IBM” choice; no one gets fired for choosing it, but quite frankly there are better options these days.
A framework laptop is very nice, and definitely has a lot of upsides, but it can't match screen, keyboard, trackpad, camera, or speaker quality with a MacBook Pro, not to mention the battery life.
Those may be nice for people who need them, and are ok with the software side.
For me, the screen on the Framework is OK. I think there's little to gain with LCDs at this point. The trackpad on the Framework is smaller, so it's better. A nicer camera requires a nicer piece of tape to cover it, I guess. Notification beeps do not require Atmos or whatever. I can pack a powerbank for trans-oceanic flights, but I'm usually at a desk if I work long stretches.
Having nicer stuff would be nice, but the value proposition does not work for me in light of the software situation.
1 - compromised hardware over better software is a trade-off you're willing to make and
2 - you believe that the Framework software experience is better than macOS
I can concede 2 (if true; I've not used a Framework laptop), but I don't understand point 1. Packing a powerbank, for example, just feels ancient if you've used the ARM-chip Macs. Then again, I'm now pushing my own trade-off.
It's going to be a different experience for everyone. For example, I never get why people care about laptop weight. You put it in your backpack anyway (unless it's a small handbag-sized laptop situation, then fair enough); it's not like anything below 5 kg will be noticeable in reality. Yet for others it's a big deal. Personal preferences...
"Packing a powerbank" was more of a hypothetical, as I've never actually had to.
My point was that it's a tradeoff between software preference, tech politics, price, and hardware features. I think it's pretty easy to understand. It's not like Apple has an insurmountable lead; there are some benefits for some use cases.
I concur. I have a Framework 13 I use as my personal laptop and a work-issued M3 MacBook Pro. While I love the freedom that my Framework 13 provides in terms of user serviceability and operating system choice, the MacBook Pro feels more premium, and it has absolutely amazing battery life.
What do people like about MacBook trackpads? I can't stand them because you can only do a "click" action at the bottom of the thing, but there's nothing tactile that would help you to find it.
By 2035, I'm not even sure I'll have a computer. (Sort of a joke, but like, at this rate...)
My current OS X update strategy is: I don't, mostly. I'm a few versions behind, and at this point, I'd rather keep an OS that sort of works and just deal with the script kiddies than upgrade to an OS that doesn't work and have to deal with my OS vendor.
The majority of users are content with Chromebooks; what does that tell you about the requirements of desktop computers today? It tells me that they are just niche professional tools, and professional tools largely suck for UX.
I had an interesting realisation the other day (that's tangentially related): on my iPhone and iPad I can't access my work emails or chats at all. Yet on my significantly more difficult-to-secure laptops: no problem.
The mobile platforms have built-in mechanisms for remote attestation. Desktop operating systems do not.
I think as soon as companies realise that an iPad is "good enough" for email/excel/word workers, we'll see an even more precipitous decline of the desktop operating system experience.
Professional software is aimed at people who use it day in day out so they’re optimising for a different problem than software that’s aimed at the casual user.
Intuitiveness is often seen as an outright positive by most people, but actually it’s more of a trade-off. Often the greatest efficiency is achieved by interfaces that require a bit of learning by the user. The ultimate example of that is command line interfaces, which are very powerful and efficient but require you to know what you’re doing and give you relatively little help.
You’re on the other side of a steep learning curve for a lot of professional software you use. A steep learning curve is bad UX.
"I think as soon as companies realise that an iPad is "good enough" for email/excel/word workers, we'll see an even more precipitous decline of the desktop operating system experience."
This has a ring of the Surface Pro as a corporate EUC choice. Quite common these days.
I regularly wait almost a year after a given version of MacOS has been released before upgrading. I don't care about new features, and I already spend all day fixing bugs of my own creation. That leaves very little time for debugging other people's software.
About 5 months ago I jumped ship to KDE Plasma and it's been great. It took a month or two to get the most prized things working the way I wanted, but KDE is so configurable that you can get it to work pretty much identically to a Mac. Toshy gives you all the familiar Mac keyboard shortcuts and lets you do per-application configs. I can't see going back to a Mac unless an employer mandated it. The freedom you have is refreshing. If something doesn't work the way you want it, you can change it.
> If something doesn't work the way you want it, you can change it.
This sentence here is my biggest heartbreak with modern “computing.” I came up in the Windows 98/XP days, and over about 7 years from '98 to '05 gained full mastery of basically every aspect of Windows and how to change it. From '03 on I also started using Mac OS X daily and found it to be just as customizable or more, in most ways that mattered. I felt that my computer was my own and loved having full control, making it perfect for me.
None of that is possible now. You cannot even select your own notification sound for Messages on MacOS anymore. Only the 20 sounds packaged with the OS. What. The. F%$k.
This is because in any monopoly/duopoly/oligopoly, the product inevitably stops being about what the user wants and becomes about what the monopolist wants. They're removing features like this because simplifying configurations translates to reduced support costs, and reducing their costs and padding their margin is the name of the game for a monopolist. They believe there's nowhere else for you to go, so they can and will hose you over and over again.
We're now paying the piper for many years of accrued monopoly effects, it turns out the way our IP law is structured, the rights we've granted corporations to sue people who attempt any kind of reverse engineering etc. all privilege the monopolist and encourage the formation of the monopoly, because the entire legal and regulatory system is designed to juice corporate profits and pesky old laws like the Sherman Act which got in the way have essentially been ignored for decades.
One really important thing for people to understand is that until there's a serious change to these dynamics, IT WILL GET WORSE. Mac OS will get worse, FOREVER. So will Windows and all other monopolist products. This is why you really need to switch away from them as soon as you can; life will be an order of magnitude more miserable for whoever's still using these products a decade from now. They will just keep on squeezing whoever's left, harder and harder until the heat death of the universe.
> They're removing features like this because simplifying configurations translates to reduced support costs, and reducing their costs and padding their margin is the name of the game for a monopolist. They believe there's nowhere else for you to go, so they can and will hose you over and over again.
There may be some truth to that, but I really don't think it's the whole story. Otherwise how do you explain spending so much effort on eye candy like MacOS "liquid glass", or the redesigned settings app? For that matter, why bother with an annual release at all?
To me, I think it's a pretty obvious case of prioritizing style over substance. For whatever reason, but not to save money. If they really wanted to save money they'd stop with the gratuitous change.
> You cannot even select your own notification sound for Messages on MacOS anymore
I don’t see a UI for it, but when I drop a sound in ~/Library/Sounds (tested with .aif and .m4r; .aiff will likely work too, looking at /System/Library/Sounds), it shows up in the “Sound Effects/Alert sound” pop-up for me.
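For reference, a minimal sketch of that workflow (the file name is just an example):

    # install a custom alert sound for the current user only
    mkdir -p ~/Library/Sounds
    cp ~/Downloads/MyAlert.m4r ~/Library/Sounds/
    # it should then appear under System Settings > Sound > Alert sound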
Yes, that's the alert sound. I have 100 sounds in there and can use them for the System alert sound. However, when you receive a message in "Messages" that plays the sound chosen in Messages app's Settings window -> Message received sound. That one only shows the builtin sounds that are on the sealed, signed, tamper-proof volume.
To be fair, Windows 8 only came out in 2012, so they haven't had that much time to finish the settings migration. But they're making good progress. If they keep up this pace of moving 2 settings per month, they should be able to finish by 2053.
If the goal was to move everything to Settings, sure. But Settings seems to be for the most common settings the average user will want to look at, whereas more detailed options are elsewhere. It's a way to funnel users away from settings that have more impact on system stability. In this view, the Windows 11 release solidified that pretty well.
The character repeat and cursor blink rate settings were already in Settings but it just opened up the older windows forms. This just gives them a new coat of paint by putting them in the Settings app.
Windows went through a pendulum swing of integrating touch (I think they ended up in a place where they expect users to use more of a multi-modal approach instead of touch-only).
I suspect Mac is going through the same thing right now as ipad is "growing up" and they're trying to reconcile all their UI. I'm a little surprised that Macs have never introduced touch.
Just a data point, in case it’s useful to anyone: I saw the screenshots and blog posts about Liquid Glass and thought it looked miserable. But then I had a hardware issue and had to upgrade.
I swear after the first 3 days, I’ve only even noticed the new UI maybe a dozen times. If I stop and really pay attention to it, I think the old UI was a bit better, but strangely I don’t really seem to notice the new UI. If it’s worse, I rarely notice.
Then again I spend most of my time in jetbrains IDEs, iterm2, and Firefox (none of which have changed much). So I might be a special case.
> The process to allow running applications that are unsigned is just a horrible hack. It feels like a last minute "shove it and move on!".
If you're talking about the process that just says "Foo.app is damaged and can’t be opened." and the only way around that is to manually remove the com.apple.quarantine extended attribute, that's arguably working as intended. Apple doesn't want users to run untrusted apps period. They want only apps approved by them.
As a dev and open source dev I don't like it. But I can't be totally against it, I think. It is safer for some users, and experts can learn how to remove the attribute with `xattr -d com.apple.quarantine filename`.
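For what it's worth, a small sketch of that workflow (the path is just an example; adjust it to wherever the app lives):

    # see whether the app is quarantined at all
    xattr -l /Applications/Foo.app        # look for com.apple.quarantine

    # clear the flag; -r recurses into the app bundle
    xattr -dr com.apple.quarantine /Applications/Foo.app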
Saying "Foo.app is damaged" is lying to the user though, which is not nice, and not a good sign, in general, for the health of a company / its culture.
Saying it's damaged is by design. Apple wants to scare you away. I agree it feels bad from one POV; that was my initial reaction. I also agree, though, that steering grandma away from evil apps is good too.
Part of the reason computer users like your grandma are so helpless is that OSes have devolved to be completely untrustworthy. Everything lies, and error messages now look like "oopsy windows made a fucky! >_<"
It's no wonder granny has zero confidence in the computer and is always behind.
Yeah, by design, of course, but I still think it's bad (& there are plenty of ways to scare grandma without lying to her, if you really need to do that).
In general I'd contend that the mindset which leads you to believe "we need to lie to our users because they are dumb" isn't conducive to making good software.
> By 2035 I wonder if we'll be all running KDE or WindowMaker and the hell with modern OS GUI
I love Linux, but I doubt that will happen. If anything, by then Linux will be a feature of a workstation OS running in a hypervisor, just like it is with Windows and ChromeOS today.
> The hell with "glass" or "flat" design. Desktop OS should be as forgettable as possible, as it's about having long stints of flow, not giving a feeling of "air" or "play".
There's nothing stopping you from running a Linux desktop with a minimalist tiling window manager - I have for years and found it does exactly what you say.
But it sounds like it's more that you don't like that there aren't many product offerings like that. That is true. Even computers with Linux pre installed use "bouncy" desktop environments like Gnome/KDE by default.
My preference - ChromeOS - comes the closest but is still nowhere near as stripped down as i3 tiling window manager (which I also think is great).
> Do you run ChromeOS Flex on some thinkpads or do you work on a Chromebook?
Chromebook.
> What are the pros/cons vs running a debian if you can elaborate?
I like minimalist desktop environments. I like full screen window tiling using keyboard shortcuts, power management, fingerprint readers, accelerated displays, phone tethering, touch screen, passkey support for auth, and verified boot, and preferences synced across devices.
And I like all that to work out of the box with no fiddling.
Thanks for opening my mind to this. I actually threw the ChromeOS Flex version onto a Thinkpad I had laying around and I was really surprised by how "ready" the setup is.
I feel the same about a good setup that works out of the box. Everything works, from sleep, to cameras, Bluetooth, and shutdown actually shuts down, which I can't get with debian, arch or bsd!
I will trial it further; the Linux WSL-type enclosure feels right. I'm trying to understand whether Penguin is actually a web-rendered terminal or native. I only really know Ghostty, which I can't get to run as it's missing some GNOME libs, I think. I will see if Alacritty is better.
Yeah, I'm even worse; I was on Mojave until last year.
Update frustration has long set in.
I'm gonna buy the new MacBook Pro with the M5 Max when it comes out in a couple of months (from an Intel 2019), and this will probably be my last Mac. I'm giving them one last chance before I move to Linux.
I'm still on Monterey, on a 2021 M1 that works just fine. I'm not buying a new Mac this year specifically to avoid having to spend days dealing with all the potential headaches of updating my dev environments. I hate upgrading. I don't want any of the new stuff. I just want something that works. The first thing I do when I get a new Mac is uninstall every piece of Apple software that can be uninstalled, then use Little Snitch to block all their IP addresses.
That being said, now AWS is forcing all my RDS instances to upgrade to mysql 9 (also: Why???), so I need to get 9 working on my dev box, and tonight I'm up against a wall trying to work through Homebrew issues. There's no way to win.
Aaaannnnd.... I figured out last night that I don't need to go to 9, only to 8.4. Strangely, what version I needed to upgrade to from 8.0 was not stated anywhere in the mass of emails Amazon sent me. I hadn't yet gone on RDS and noticed that 9 is not on the list of options, I was just trying to build 9 on my Mac.
8.4 won't build in homebrew under Monterey, though, so I'm stuck with 8.3 for my dev stack. I guess I can live with that. I'm dreading the next forced upgrade.
"macOS encounters an error or fault, but doesn’t report that to the user, instead just burying it deep in the log."
This is another huge facet of the problem. Not only does it hide glaring problems from the user and prevent him from taking action, but it prevents him from reporting it to Apple for potential redress.
Apple loves to hide information, with the excuse that it's "too scary" for the "average user." This has always been bullshit. If "the average user" is put off by information he receives, he can at least use it to consult someone who isn't.
iOS Mail is a great example. It can utterly fail to access your mail server because of wrong credentials or whatever, but it won't tell you. In fact, it'll claim, "Updated just now." So a day or two goes by and you've missed important work or personal E-mails before you even decide to investigate. This is obviously offensive, because Apple has decided that your work and your communications are less important than hiding their defects... which might not even have been to blame!
When you combine the glaring QA failures piling up with the obnoxious douchebaggery and law-flouting that Apple has engaged in with its app store, it's pretty clear that the company needs a major management housecleaning.
Apple loves to coddle and promote certain pets, who are often incompetent but for some reason curry favor with management. Look at the "Liquid Glass" fiasco and hideous UI regressions in Mac OS and iOS. This is what happens when you put an unqualified packaging designer in charge of UI at a company that's held out as the paragon of "elegant" design. Jony Ive was a pompous hack with one idea... or actually two: 1. "Thinner" 2. Less useful
We had a brief respite with his departure, but now... things might be even worse. And at a time when Windows has been degraded into unredeemable garbage... it's a grim outlook for popular computing.
I disagree with your take that we should show all information to all users. It’s not always the case that they have someone on hand to help, and users do get anxiety when they don’t understand or are presented with too many options. But macOS should give the full-fat answers and UI when asked for.
The rest of your comment I can’t argue against at all.
On the other hand, presumably Steve was happy with the insanity of the iOS settings app, where applications had their settings only accessible in another application.
I mean, when the apps are small and have just a couple of settings, you save every app from having its own settings widget that takes you to another panel, etc.
(But a "Good" iOS app in my mind would still have a widget in the app to take you straight to the correct pane in Settings where you configure it.)
I kind of like the iOS settings application. If I want to change some settings for an application, I have one place I have to go to find it. I don't need to launch the app and try to guess where their designer decided to stick settings (probably buried deep in a hamburger menu). I don't have to guess whether a dark gray switch on a light gray background is "on" or whether a light gray switch on a dark gray background is "on" because the app's designer thought it would be cool to not use native controls.
I honestly wish this "central settings" app idea would spread to desktop operating systems.
KDE is so good. Every release they make tangible UX improvements, to the point where now most subsystems are almost perfect. There have always been things where I think "oh well, that could be better": the notification center, the Kickoff menu, KRunner, the desktop overview. And then it just got better and better.
Tahoe is SOOOO ugly! The huge rounded corners are atrocious. The fonts look terrible. The windows keep snapping, expanding and contracting with no obvious pattern. Yuck.
And iOS's transparencies are disastrous. They make so much of the text illegible.
I have updated as soon as possible, and if you asked me, I couldn't tell you what is different now. Everything I do on a daily basis still works exactly the same. If there are some weird more rounded corners somewhere now, I don't consciously notice them. The glassy effects look cool, but after a week you don't even think about them any more.
> By 2035 I wonder if we'll be all running KDE or WindowMaker and the hell with modern OS GUI.
That is what I wished for back in the 2000s. Eventually I went back to Windows, starting with Windows 7, because I got fed up with laptop support.
That was only for the main laptop OS; I never stopped having UNIX-based systems on servers across many customer projects, or trying out the flavour of the month via local VMs, ever since hardware virtualization became a commodity.
> Desktop OS should be as forgettable as possible, as it's about having long stints of flow, not giving a feeling of "air" or "play".
100% agree, though I wonder how much of an influence casual users are having on Apple's marketing of macOS...
It's almost as if Apple doesn't want to sell "trucks" anymore (as Steve would say) and would prefer to slowly morph macOS into a sedan like the iPad (because that's where the money is).
> By 2035 I wonder if we'll be all running KDE or WindowMaker and the hell with modern OS GUI.
TBH this is probably me in 2026 or 2027, I think...
I think peak KDE was version 3. A project called Trinity Desktop Environment aims to maintain it, but I never really tried it for fear of realising those glasses are indeed rose-tinted.
It's basically a "when to rip the band-aid off" type of situation.
Briefly poked around with Linux again for the first time in years (Omarchy, DHH's take on Arch + Hyprland), and hoo boy, it's come a long way! Nothing like the KDE/GNOME+X jankery of the olden times. Very polished, very slick, very nice.
I did try Omarchy on an old laptop and it was fairly painless to get started. I did develop an unease the more I read about DHH unfortunately and decided to bail.
If anything though, Omarchy shows it's not impossible to get a nice working environment on linux.
Steve understood better than anyone that having a finite amount of time to build means you can't please everyone. The vast majority of Apple's customers just do not care about the Keyboard settings UI or the clarity of unusual error messages.
Users do care; they just don't have the words to explain what it is that's frustrating them. Just a silent "I find myself using this less" sort of thing.
Not for everything, but the excuse of "normies don't give a shit" is a bullshit one.
I wonder how many care that messages lights up like a Christmas tree on speed on iPadOS, battery life dropped 90%, calculator requires 32 GB of ram, offline maps stranded them in the woods, iOS can no longer keep two apps loaded at once, ocr screenshots broke, the magnifier “flashlight” button no longer fits on the screen, or the ai text suggestions in notes are simultaneously garbage and undeletable.
Those are just some of the bugs I hit. I’d guess most normal users hit 4-5 problems this upgrade cycle.
For my side gig I need to quickly take multiple pictures (with my iphone) of subjects that aren’t still or cooperative. This used to work fine. Now the camera just quits with no crash or notice so I think I’m taking pictures but I’m not. Closing the camera app doesn’t disable or stop the camera, I have to wait or reboot. But hey, I can take really cool photos I can view in the Apple Vision I don’t own.
I'm one of the rare remote workers at an office where most are there full time, and I'm in one day a week.
I have no idea how they get anything done in there. I feel they can only focus before and after business hours.
So don't be so sure. Home has distraction when the mind is distracted. But once working I feel we are much more productive and capable due to long uninterrupted stints.
It does take discipline but that's what deadlines are for.
I haven't used Windows for years, but the sheer amount of commentary on recent changes, and the claims, are beyond belief...
It reads like a company that is only there to squeeze money out of existing customers and hell bent on revenues above growth. Like one of those portfolio acquisitions.
I just built a gaming PC after 10+ years without touching Windows, and I gotta say the experience is truly awful.
Small stuff such as: the keyboard shortcut that is set up for switching keyboard layouts is wrong; the one displayed to me in the UI is not the one that works. I discovered it because the shortcut for the Discord overlay (Shift + `) was making me switch keyboard layouts, and I couldn't comprehend why until I noticed that this shortcut consistently switched them while the one displayed in the UI did not. There's no way to change the shortcut: whatever I set up in the UI does not work, but Shift + ` always works, and I have no idea why.
Copy and paste has definitely surprised me sometimes. I was designing a custom livery for a sim racing game, copying images to use as stickers, and the clipboard would paste very different images from many "copies" ago out of nowhere. I couldn't create a reproducible way to file a bug report; it works sometimes and not at all at other times.
I set updates to happen at night, between 03:00 and 07:00. It doesn't matter: the computer rebooted a few times out of nowhere to apply updates, and I didn't even get a notification about it, I simply got the "Restarting" screen.
It's absolutely shoddy. As many complaints as I have with macOS over the past 8+ years, it's nowhere near as shitty an experience. I'm only a couple of months into Windows again, and it's way worse than I remember it from the days of Win2k/Windows XP/Windows 7.
I find most BSD users who say they use it on a laptop are just using a laptop-form-factor machine like a thinkpad that is plugged in, with a mouse not the touchpad, and connected via ethernet 99.9% of the time. There's nothing wrong with this, but it bears little resemblance to what I consider "using a laptop".
My experience with distros including Open- and FreeBSD on laptops has been universally negative. OpenBSD in particular is very slow compared to Linux on the same hardware, to say nothing of awful touchpad drivers and battery management.
I'm using OpenBSD on several laptops at the moment: a Dell x55, a ThinkPad X230, and a ThinkPad X270. Everything works on all of them: sleep, hibernate, wifi, touchpad, volume and brightness buttons, CPU throttling, etc.
On one of them I use a creative bt-w2 bluetooth dongle for audio output, openbsd removed software bluetooth support due to security concerns. The latest wifi standards are not supported on these models, which doesn't bother me. It's not the size of your network, it's what you do with it! I don't mind not having the latest flashy hardware - been there, done that.
I have to pay attention when I purchase hardware, and am happy to do so, because openbsd aligns much better with my priorities. For me that includes simplicity, security, documentation and especially stability through time - I don't want to have to rearrange my working configs every two years cuz of haphazard changes to things like audio, systemd, wayland, binary blobs, etc.
On OpenBSD right now with a Dell Latitude 7490. Works fine.
The reason I like the BSDs is that they are easily understood. Have you tried to troubleshoot ALSA? Or use libvirt? Linux has a lot of features, but most of them are not really useful to the general computer user. It feels like a B2B SaaS: lots of little stuff where you wonder why it was included in the design, or why it's even there in the first place.
For some reason I had a much easier time getting OpenBSD working on one specific laptop (a Thinkpad E585 where I had replaced the stock Wifi with an Intel card). A lot of Linux distributions got into weird states where they forgot where the SSD was, and there was a chicken-and-egg problem with the Wifi firmware.
OpenBSD at least booted far enough that I could shim the Wifi firmware in as needed. I probably picked the wrong Linux distribution to work with, since I've had okay luck with Debian and then Devuan on that machine's replacement (an L13).
Probably because OpenBSD developers use laptops, so they port the OS to laptops all the time.
FreeBSD has a few laptop developers, but most are doing server work. There is a project currently underway to help get more laptops back into support again: https://github.com/FreeBSDFoundation/proj-laptop
I've been running it on most of my personal laptops since around version 10. It's a lot like how Linux felt in the late 90s. Depends on your hardware and what you want to do. But it's solid.
If you could handle Linux in the late 90s you can handle it.
So you're saying KDE and Gnome and xfce and enlightenment and openBox, etc, are all desktops that run like the 90s? These current versions, and many more, run on FreeBSD.
It was early and I couldn't think of all the desktop names so I Googled for "popular Linux desktops" and that's one of them it gave me. Apparently Linux runs a "most 1990s WM that ever 1990s'd".
No offense, but they're about as similar as apples are to oranges: yes they're both fruits, but they're very different kind of fruits.
Zorin is a more "traditional" OS, where things work like most PC operating systems, whereas Bazzite is an immutable OS with atomic updates. Immutable means the core system files are read-only, which makes it less susceptible to corruption and breakage (due to user error or malware). Atomic updates means updates either apply or don't: there's no partial/failed state that can break your PC.
Updates are also image-based, where your entire OS image gets updated in one go, kinda like how mobile OS's work - this means there's no chance of package conflict/version/dependency issues that can sometimes plague regular Linux distros like Zorin. This also means that major OS upgrades are trivial - they're treated like any other update. In Zorin and even Windows for that matter, major OS upgrades are always messy, and there's a chance something can break or get corrupted. You don't have that issue with immutable, image-based distros like Bazzite.
The only area where Zorin would be better is low-level customisability: say you want to swap in a custom kernel, use a different DE, or change login managers, etc. You can't easily do that in an immutable system, as these are core components. But most people don't do this, so for regular users an immutable system like Bazzite would be a much better choice.
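To make the "atomic" part concrete, here is a rough sketch of the update flow on an rpm-ostree based image distro; Bazzite builds on Fedora Atomic, so these commands assume that standard tooling:

    rpm-ostree status      # show the booted image and any pending deployment
    rpm-ostree upgrade     # stage the new OS image; the running system is untouched
    systemctl reboot       # switch to the new image in one atomic step

    # if the new image misbehaves, boot the previous entry from the
    # boot menu or roll back explicitly:
    rpm-ostree rollback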
I remember that yes, expensive operations could take a while, but the interface was much faster than my M1 Max Studio for the sole reason that you do not have to wait for animations.
And not just because animations were sparse: they also never blocked input. For example, if you could see where a new element would appear, you could click there DURING the animation and start, e.g., typing, and no input would be lost. That meant apps you used every day and became accustomed to would just zip past at light speed, because there was no do-wait, do-wait pipeline.
The animations were there, but they were frame-based, with the number of frames carefully calculated to show UI state changes that were relevant. For example, when you opened a folder, there would be an animation showing a window rect animating from the folder icon into the window shape, but it would be very subtle - I remember it being 1 or 2 intermediate frames at most. It was enough to show how you get from "there" to "here" but not dizzyingly egregious the way it became in Aqua.
Truth be told, I do have a suspicion that some folks (possibly some folks close to Avie or other former NeXT seniors post-acquisition) noticed that with dynamic loading, hard drive speed, and ObjC's ubiquitous dynamic dispatch, OS X would just be extremely, extremely slow. So they probably conjured a scheme to show fancy animations and woo everyone with visual effects to conceal that a bit. Looney town theory, I know, but I do wonder. Rhapsody was also perceptually very slow, and probably not because of animations.
There were also quite a few tricks used all the way from the dithering/blitting optimizations on the early Macs. For example, if you can blit a dotted rect for a window being dragged instead of buffering the entire window, everything underneath, the shadow mask - and then doing the shadow compositing and the window compositing on every redraw - you can save a ton of cycles.
You could very well have do-wait-do-wait loops when custom text compositing or layout was involved and not thoroughly optimized - like in early versions of InDesign, for instance - but it was the exception rather than the rule.
> Truth be told, I do have a suspicion that some folks (possibly some folks close to Avie or other former NeXT seniors post-acquisition) noticed that with dynamic loading, hard drive speed, and ObjC's ubiquitous dynamic dispatch, OS X would just be extremely, extremely slow. So they probably conjured a scheme to show fancy animations and woo everyone with visual effects to conceal that a bit. Looney town theory, I know, but I do wonder. Rhapsody was also perceptually very slow, and probably not because of animations.
I've done exactly this myself to conceal ugly, inconsistent lags - I don't think it is that uncommon an idea.
I think that ObjC's dynamic dispatch is reasonably fast. I remember reading something about being able to do millions of dynamic dispatch calls per second (so less than 1 µs per call) a long time ago (2018-ish?), but I can't think how to find it. The best I could come up with is [1], which benchmarks it as 2.8 times faster than a Python call, and something like 20% slower than Swift's static calling. In the Aqua time-frame I think it would not have been slow enough to need animations to cover for it.
My most durable memory is all the reboots due to programs crashing. Didn't help that a null pointer deref required a system reboot - or that teenage me was behind the keyboard on that front.
Preemption is a very nice OS feature it turns out (particularly once multi-core rolled around). Still, I recall os 8 and 9 being generally snappier than windows 98 (and a lot snappier than early builds of OSX)
How does preemption work on a processor that barely has interrupts and has no way to recover state after a page fault, in an OS that has to fit into a couple dozen kilobytes of ROM?
There were plenty of preemptive multitasking systems for the original 68000, and regardless page fault recovery was fixed from the 010 onwards.
And it certainly was not a problem on PowerPC, which TFA is about.
Also, I'm not sure how you can say the 68000 "barely has interrupts"; I don't even know what you're on about.
MacOS was broken because Jobs forced it to be that way after he was kicked off the Lisa team. Which had a preemptive multitasking operating system on the 68000.
Preemptive multitasking is unrelated to page faults. And the 68k handled page faults just fine starting from the 68010.
Space constraints were certainly limiting on the earlier models, but later ones were plenty capable. Apple itself shipped a fully multitasking, memory protected OS for various 68k Mac models.
By the late 80s, the only reason the Macintosh system was still single-tasking with no protected memory was compatibility with existing apps, and massive technical debt.
Later Mac ROMs were 512KB, same with the later Amiga Kickstarts (3.x). That was a lot of space for the late 80s and early 90s. Interrupts were supported (8, if I recall). And 68000 machines didn't support virtual memory until the 68010 and later, so no issues with page faults.
I still remember the day teenage me got an Amiga 500 with a whopping 512K of RAM, and witnessed the power of multitasking, way back in 1988.
The Amiga had preemptive multithreading with multiple task priorities on the original MC68000. Preemption is distinct from memory protection or paging.
Those were the days and gone they have.