I'd at least, you know, pretend we had a top-secret amazing model. By airing all of this publicly, they've basically admitted that Claude is the best there is.
As someone who owns a small farm — and actually enjoys land, growing things — I'm just saying this is not a "solution" for the vast majority of people.
Don't get me wrong – I think if people grew more of their own food it would be fantastic! And technology combined with alternative farming practices has the potential to be hugely transformative. I'm all for it. But the way things are currently, it's a monumental effort to grow food for sustenance on any meaningful scale, and would take enormous amounts of time, energy, and effort – leaving very little for other pursuits. I want people to also be able to make art, music, etc.
That car's interior is a great example of what made Ferrari so iconic and aspirational. It's not trying to be nice and comfortable. It feels stiff, brutalist, direct. It has an immediacy and a danger.
As a heavy Xcode user and native app developer: pretty excited to dig into this.
I've recently pivoted to mainly working in either Claude or Codex on the command line, because they can run autonomously for long, extended periods in a way that wasn't possible with the turn-based Xcode intelligence features.
That, plus the fact that the agents could run command-line utilities and just generally be more "agent-y".
My way of working has really shifted this year. I've been experimenting with LLMs for coding for years now, but the models released in the past year really feel like they've hit a point where they write code much closer to (sometimes indistinguishable from) how I would write it.
One of my all-time favorite quotes is from Zen Mind, Beginner's Mind and it goes: “In the beginner’s mind there are many possibilities, but in the expert’s there are few.”
There's such a wide divergence of experience with these tools. Oftentimes people will say that anyone finding incredible value in them must not be very good. Or that they fall down when you get deep enough into a project.
I think the reality is that to really understand these tools, you need to open your mind to a different way of working than we've all become accustomed to. I say this as someone who's made a lot of software, for a long time now. (Quite successfully too!)
In some ways, while the ladder may be getting pulled up on junior developers, I think they're also poised to really utilize these tools in a way that those of us with older, more rigid ways of thinking about software development might miss.
Over the last 25 years of building commercial software (and having been a programming enthusiast since I was 15 years old), I've come to the conclusion that self-improvement, in the sense of gaining real expertise in a field, building a philosophy of things, and doing the right things, is in direct opposition to creating "value" in the corporate/commercial sense of today.
Using AI/LLMs, you will perhaps create more commercial value for yourself or your employer, but it will not make you a better learner, developer, creator, or person. Going back to the electronic calculator analogy that people like to refer to these days when discussing AI, I now think that, yes, electronic calculators actually made us worse at using our brains for complex things, which is the thing I value more than creating profits for some faceless corporation that happens to be my employer at the moment.
Why are you so certain that LLMs/AI can't be used as a tool to learn and grow?
Like Herbie Hancock once said, a computer is a tool, like an axe. It can be used for terrible things, or it can be used to build a house for your neighbor.
It's up to people how we choose to use these tools.
There have always been young people who can quickly hack something together with whatever new tools are available. That way of working never lasts, but the tools do last.
When tools prove their worth, they get taken into the normal way software is produced. Older people start using them, because they see the benefit.
The key thing about software production is that it is a discussion among humans. The computer is there to help. During a review, nobody is going to look at what assembly a compiler produces (with some exceptions of course).
When new tools arrive, we have to be able to blindly trust them to be correct. They have to produce reproducible output. And when they do, the input to those tools can become part of the conversation among humans.
(I'm ignoring editors and IDEs here for the moment, because they don't have much effect on design, they just make coding a bit easier).
In the past, some tools have been introduced, gotten hyped, and faded into obscurity again. Not all tools are successful; time will tell.
This reminds me of talking to my nephew at Thanksgiving years ago. He was studying for an exam after the holidays, and I was looking at his screen, open to a Google Doc that looked like his study notes, except they were being edited as I was watching, by someone else. I asked about it and he goes, "we have a single Google Doc where all students collaborate on the study notes." My mind was blown. I was also using Google Docs, but not in a million years would it have crossed my mind to use it the way he and his classmates were. Can't wait to see what new blood "juniors" bring to the table!
Collective cognition is effectively what all knowledge work is. The programmers are the dunces who can't keep it all in their heads and need explicit type systems and databases to manage state, unlike the genius business analysts and SMEs.
All students collaborating on notes kind of defeats the point, no? As I see it, study notes are reminders that link you back to when you were reviewing the material. If you never wrote the notes, you won't get that connection back to the material.
The shared study notes represent a shared understanding of the topics at hand. Different people grasp concepts in different ways, and seeing how other people think/understand/deduce/... (at least for me) makes a world of difference.
Like seeing a PR and going "holy s**, would never have dreamed of doing it that way" - I have learned A LOT in a looooong SWE career from that...
I was talking about this with someone today: before, perhaps there was an exactness you expected. But actually, what really matters is "good enough." And if AI-written code takes you to "good enough" according to whatever metric you've set, then what exactly is the problem? A lot of the technical part of the job is taking X data, doing an f(x) transformation on that data, and thus Y is born and handed to the next step. So if it passes whatever metric you have set to make sure that going from X to Y handles Z% of the problem space, and doesn't create downstream issues (this should probably be part of your metric), then you have done your job.

And yes, of course sometimes the job will require writing the code yourself because that level of precision is necessary. But why should we consider that always to be the case? There are probably new programming languages and paradigms we haven't thought of yet that would make this kind of problem solving more efficient, because right now we are not very effective at juggling both the human's and the machine's problem-space context. Except some experts who say they can orchestrate tens of agents all at once doing whatever.

I dunno. I think right now is exciting, not hand-wringing. A computer is meant to help you think. Why shouldn't new computational tools bring excitement?
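The "good enough by metric" idea above can be sketched in a few lines. This is a minimal illustration, not anything from the comment itself: the transformation, the contract, and the threshold are all made-up examples of accepting any implementation of f (human- or AI-written) that clears a metric over sampled inputs.

```python
# Hypothetical example: accept a candidate transformation f if it handles
# at least `threshold` of sampled inputs without violating its output
# contract. All names and cases here are illustrative.

def normalize(xs):
    """Candidate f: scale values into [0, 1]."""
    lo, hi = min(xs), max(xs)
    if lo == hi:
        return [0.0 for _ in xs]
    return [(x - lo) / (hi - lo) for x in xs]

def meets_metric(f, cases, threshold=0.95):
    """Return True if f satisfies the contract (all outputs in [0, 1])
    on at least `threshold` of the sampled cases."""
    ok = 0
    for xs in cases:
        try:
            ys = f(xs)
            if all(0.0 <= y <= 1.0 for y in ys):
                ok += 1
        except Exception:
            pass  # a crash on this case counts against the metric
    return ok / len(cases) >= threshold

cases = [[1, 2, 3], [5, 5, 5], [-10, 0, 10], [0.5]]
print(meets_metric(normalize, cases))  # → True
```

The point is that the acceptance criterion lives in `meets_metric`, not in who (or what) wrote `normalize`; swap in a different candidate f and the same gate applies.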
... and the biggest problem is that the people who _do_ know how hard it is to build software are the ones whose input on the matter is most likely to be discounted as "sour grapes"/"fear of obsolescence".
I definitely agree with this. Older folks have to deal with the double whammy of being familiar with what they already know, plus there is a good bit of research that learning and absorbing new things just gets harder past mid-40s or so.
That said, I don't think this negates what TFA is trying to say. The difficulty with software has always been around focusing on the details while still keeping the overall system in mind, and that's just a hard thing to do. AI may certainly make some steps go faster, but it doesn't change much about what makes software hard in the first place. For example, even before AI, I would get really frustrated with product managers a lot. Some rare gems were absolutely awesome and worth their weight in gold, but many of them were just never willing to go into the details and minutiae that are really necessary to get the product right. With software engineers, if you don't focus on the details the software often just flat out doesn't work, so it forces you to go to that level (and I find that non-detail-oriented programmers tend to leave the profession pretty quickly). But I've seen more than a few situations where product managers manage to skate by without getting to the depth necessary.
> Older folks have to deal with the double whammy of being familiar with what they already know, plus there is a good bit of research that learning and absorbing new things just gets harder past mid-40s or so.
Unfortunately, since the tech industry still largely skews young, reticence to chase every new hype cycle also feeds into the perception of an inability to learn new things, even after many prove to be fads (e.g., blockchain).
Long way from the "Think Different" era Apple. Been thinking about that campaign this week. Posters of people like MLK, which Jobs had to get permission for from each and every estate for the campaign. When Apple was solidly, unambiguously positioned as the computer for the rebels, the misfits, the crazy ones.
If nothing else, there was an opportunity to simply say that given the day's events, a movie screening didn't feel right. Or – "hey there's a gigantic winter storm covering half the country, and I have a lot of logistical stuff to take care of."
> Long way from the "Think Different" era Apple. Been thinking about that campaign this week. Posters of people like MLK, which Jobs had to get permission for from each and every estate for the campaign. When Apple was solidly, unambiguously positioned as the computer for the rebels, the misfits, the crazy ones.
"Think Different" was an advertising slogan that Apple Computers, a publicly-traded corporation with thousands of employees, paid an advertising company to create in the late 90s in order to promote a certain image of the company, because the leadership at the time thought it would help them sell more computers. And it clearly worked (or at least didn't abjectly fail) because Apple Computers is still around today and is an even larger corporate entity with a much larger market cap.
Whether a marketing campaign succeeds in appealing to you emotionally has nothing to do with whether rebels, misfits, crazy people, or any other category of person would be better off buying and using computer products made by Apple. I've personally never liked Apple products; I've always felt like they took a lot of practical control away from the end user in order to facilitate what Apple leadership thinks the end user _ought_ to want. So I avoid using their products, and I think other people should too, although I respect that the walled garden Apple provides is a computing product some people do find it useful to pay for.
Leadership at Apple Computer was probably engaging in political activities some people at the time objected to when those ads were made, just as they are today.
I dunno. I also think about how Jobs swindled and mistreated Woz time after time, many of the nerds who cut their teeth on the Apple IIc were appalled by the road he took. By the time the Macintosh released, Apple had cemented themselves as a capricious OEM with no interest in serving every minority niche.
Actions like this feel fully contiguous with Jobs' personality, to me. He wasn't afraid to mistreat large swathes of customers, fans, or employees if it meant that Apple could cosmetically pull ahead of its competitors. He didn't feel obligated to fight fair or defend his moral righteousness, and neither does Cook. This is the exact same Apple you always knew; they've just quit virtue signalling.