It really seems that nearly everything that's not a full native ObjC/Swift stack, or a web browser (or based on one, like Electron), is not ready yet. It really seems Apple did not care enough to get golang and Rust stuff in particular stable for their hardware release. I can't escape the impression that they didn't really shower the wider ecosystem in DTKs and software support, although I'd be happy to be proven wrong there.
Oh well, at least Rosetta2 seems to be working really well - you will be able to run a lot of the software you need quite well, though not to its fullest potential. The execution of Rosetta2 is really good and that's important. But I think it does go to show that the "Pro" in "MacBook Pro 13" does not mean all that much. At least not if they're going to ship with the majority of pro software not being native, many popular developer toolchains still months from being ready, and very limited I/O and RAM options. The MacBook Air and Mac Mini I fully get for the first releases on new hardware, but the MacBook Pro 13 really feels odd in this lineup if the word Pro is supposed to mean anything.
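(For the curious: Rosetta 2 is transparent enough that it can be hard to tell which processes are even being translated. On Big Sur you can query the `sysctl.proc_translated` key, which reports 1 under Rosetta 2 translation and 0 for native arm64 processes; the key doesn't exist at all on Intel Macs. A minimal sketch, which falls back gracefully on systems without the key:)

```shell
# Ask the kernel whether the current process runs under Rosetta 2.
# sysctl.proc_translated: 1 = translated x86_64, 0 = native arm64;
# the key is absent on Intel Macs (and non-macOS systems), in which
# case sysctl errors and we report "unavailable" instead.
translated=$(sysctl -n sysctl.proc_translated 2>/dev/null || echo "unavailable")
echo "proc_translated: $translated"
```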
Well...when was the last time the entry level smallest MacBook Pro was really pro? If it ever was?
Let’s face it, that MacBook Pro is mainly there to make their buyers feel pro, while not really providing performance benefits over the Air.
It’s like you have the stock car (MacBook Air), the “sports” version of that car which just has some stripes sprayed on and a red-colored gear shift knob (entry level MacBook Pro), and then the actual race version of that car which has a tuned engine etc. (other MacBook Pros).
I don’t think whether it’s “pro” or not is well defined enough to litigate over, but the reviews have made clear the M1 air throttles after a few minutes of max CPU load while the Pro doesn’t, which seems like a big performance difference to me.
I mean, some tests show that the throttling takes six or seven minutes of pure CPU hammering to kick in. For some tasks, I agree, that can make a difference, but in the context of these devices with these specs, I just don’t see it. Like, if you need that much sustained CPU usage without throttling, I don’t think the M1 is the right chipset for you. The M1X or M3 or whatever they call the ones that they put in the actual high-end machines and not the entry level stuff seems more apt.
The biggest advantage I see between the Pro and the Air, based on everyone I’ve talked to with both, is battery life. That might be worth the $200 or $250 depending on your storage configuration.
No, but that wasn’t so much Apple’s choice as it was a consequence of Intel making promises they couldn’t keep.
For a while, the Air has been the form-factor Apple expected to be getting entry-level-MBP perf from, and requesting chips from Intel to satisfy that; and for a while now, the response from Intel has been a chip that thermal-throttles so hard in that form-factor that the performance has bombed it down to a lower class of computer.
The base-model MBP, then, has been Apple’s compromise: it’s the result of them taking those chips that were supposed to be just fine running in an Air, and giving them enough chassis and fans to make them perform the way Intel originally promised they would.
In other words, the base MBP is “a MacBook Air” in terms of what performance Apple targeted the Air to achieve each gen; and the Air itself is the pretty design Apple’s IxD dept put out in anticipation of that target, mated to an altogether-worse processor in order to get it out of the gate.
Now that Apple has a chip with actual thermal headroom, this duality will go away. You’ll get base-MBP perf in Air chassis, and these overlapping categories will merge into one.
> and for a while now, the response from Intel has been a chip that thermal-throttles so hard in that form-factor that the performance has bombed it down to a lower class of computer.
Are we talking about the same computer, where the fan is not even thermally connected to the CPU heatsink[1]? That's Apple's fsckup, not Intel's.
I would argue yes. The two port MacBook Pro, in my opinion, was launched to replace the MacBook Air when it came out in 2016 [1]. It didn’t have a Touch Bar, for example, and based on my conversations with Apple at that time, I feel very strongly it was meant to replace the Air in the lineup (the 12” MacBook was another attempt to replace the Air, that one I think we can blame a lot more of on Intel). Apple never told me this outright, but that was absolutely the impression I got about how it was being positioned against the MacBook Pro with four ports and it was how I reviewed the first release of that model.
For a variety of reasons, it didn’t work. Not only was the price higher, the port selection (just two TB3 ports, at a time when the industry hadn’t moved en masse to USB-C; remember, this was four years ago) was really limiting. And of course, the keyboard drama.
It is my contention, though I have no proof, that Apple didn’t want to release the redesigned Retina MacBook Air in 2018, but had to based on continued sales of the older model and the lack of love for the Touch Bar free MacBook Pro. (Recall, even after the redesign, Apple was still selling a Broadwell-based MacBook Air, technically into 2019. That was essentially the same MacBook Air that was first released in March 2015.)
Once the MacBook Air was redesigned, the two-port MBP never made any sense, even with the re-added Touch Bar. In fact, every single year, when I participate in Jason Snell's Apple Report card [2], I comment on this weirdness in the lineup. I don’t think the two-port MacBook Pro needs to exist.
It's early days, but so far this transition has gone massively better than any previous similar processor transition. The PowerPC->Intel transition was much worse.
Not sure what you expected, but having seen previous transitions, this is smooth as butter. If you are a Pro, you know that jumping onto a platform early is fraught with potential gotchas.
> the Macbook Pro 13 really feels odd in this lineup if the word Pro is supposed to mean anything.
There has always been a bit of a blurry line between pro and non-pro Apple products. This model year it means just as much as it has on many other model years. Apple left the "higher end" Intel builds in their product line to address the needs of developers who want 32GB of RAM or various other configurations.
This Pro is exactly the machine that the developers porting Go or Rust over to macOS will likely be using.
I don't believe I'm saying, suggesting or hinting that this is not going smoothly for a transition like this. If you read my comment as somehow saying this transition is not going smoothly, I'd like to understand why you'd think that and I'll update my comment.
What I'm commenting on is that anything that is not already completely bought into Apple's stack is not there yet, and that reveals where Apple's priorities are. Anything that's cross platform or depends on cross platform components needs to play catch up now. It's fine if they have priorities though?
> What I'm commenting on is that anything that is not already completely bought into Apple's stack is not there yet, and that reveals where Apple's priorities are.
Not remotely.
It reveals who prioritizes updating their software to Apple's new platform. Apple can't control who ports a given piece of software to their new platform and how quickly it gets done. The surface area is too large.
All Apple can do is get the tools out there for the people who are doing the porting.
Most likely Go and Rust aren't ported simply because porting languages is hard and time consuming.
> Anything that's cross platform or depends on cross platform components needs to play catch up now.
I'm not sure why this is remotely surprising or noteworthy. Apple needed Xcode, Swift, and Objective-C running in order to build macOS. There would not be a platform to try to port to if Apple's toolchain wasn't working prior to day 1.
Rust, Go, and React Native are all, by necessity, going to rely on Apple's toolchain to run on Apple hardware, so by nature they take longer to build.
> Most likely Go and Rust aren't ported simply because porting languages is hard and time consuming.
So, to be clear, Rust is ported. It's not a tier 1 target yet, but it does work. Time is not the only issue here, as elaborated downthread. For the gory details, see https://github.com/rust-lang/rust/issues/73908
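(For context on what the tier distinction means in practice: for targets below tier 1, rustup can ship a prebuilt standard library so cross-builds work, but the Rust project's CI doesn't gate every merge on the target building and passing tests — that gating is what tier 1 adds. A sketch that checks whether the installed toolchain knows about the `aarch64-apple-darwin` target at all, with a fallback when rustup isn't present:)

```shell
# List rustup's known targets and look for the Apple Silicon triple.
# On a machine without rustup, print a fallback line instead of failing.
rustup target list 2>/dev/null | grep aarch64-apple-darwin \
  || echo "rustup not installed (or target not listed)"
```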
> I'm not sure why this is remotely surprising or noteworthy.
It might not be for you, but then I'm curious why it's noteworthy enough to have two layers of comments about it. If you don't want to talk about that then by all means let's not.
Apple has priorities, and these have results. I think it would be interesting to talk about that, as Apple could have invested time and money in getting some more things going (I'm thinking virtualization and Docker related things especially - you know, the stuff they demoed at WWDC!), but they didn't. It's fine they didn't, but we can still talk about that can't we?
> It might not be, but then I'm curious why it's noteworthy enough to have two layers of comments about it.
You noted it, I was trying to explain something which seems to me exceedingly obvious.
> Apple has priorities, and these have results.
The reason these tools were built/run first is down to the priorities and constraints of outside individuals, not Apple's. If you build a graphics editor or a text editor using Apple's toolchain and most of your clients run Apple, porting is likely high priority and not super hard.
A lot of these things you are complaining about are just really hard problems.
Docker requires a hypervisor which wasn't part of the A12Z processor they shipped in the DTK. It also depends on Go.
Go isn't available because porting languages to a new processor is non-trivial.
Rust has the above issues, plus didn't Mozilla lay off a huge chunk of the Rust team?
> Docker requires a hypervisor which wasn't part of the A12Z processor they shipped in the DTK. It also depends on Go.
Apple did have prerelease hardware that supported virtualization, which they supplied to Parallels. Docker has not worked with this hardware, based on their press release and GitHub issue [1][2], although they may have received some specs. In any case, Docker depends on Golang, which won't release until February.
If Apple did make Docker a priority (which you'd expect given the namedrop at WWDC), this seems quite strange to me.
> Go isn't available because porting languages to a new processor is non-trivial.
I'm sorry, I don't buy this at face value for Go and Rust. I believe these teams could support a new architecture very well and very quickly with the right tools and support, like how their stuff also runs on weird things like S390. They show day in, day out that their stuff is very flexible. By no means do I want to suggest it's a trivial matter, but these toolchains are made to be portable and currently support both much more exotic systems and very similar ones to aarch64 macOS at the same time - like x86-64 macOS and arm64 iOS. Rust's current bottleneck may be due to CI, sure, but then the question becomes: why weren't DTKs used for CI?
It seems that if Rust and Go developers were approached with the right tools and support, they wouldn't have had to figure things out now. Is it that bad that they have to figure things out now? Not really - but I do think it could have been avoided, and we wouldn't have had to wait for a golang release in February and a Docker release after that.
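(For what it's worth, once the backend support landed — Go 1.16 was the release that added `darwin/arm64` — cross-compiling in Go really is just two environment variables, no DTK needed for the build itself. The sketch below shows the invocation and then only inspects the installed toolchain's defaults, with a fallback if `go` isn't available:)

```shell
# Cross-compiling a cgo-free Go program for an M1 Mac from any host
# (requires a Go release with the darwin/arm64 backend, i.e. 1.16+):
#   GOOS=darwin GOARCH=arm64 go build ./...
# Show what the installed toolchain would target by default:
go env GOOS GOARCH 2>/dev/null || echo "go toolchain not installed"
```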
> Rust's current bottleneck may be due to CI sure, but then the question becomes why weren't DTKs used for CI?
Rust's CI is pretty demanding, both in general and because it's core to our stability story: we need it to be reliable and something we can rely on for a long time. DTKs are by their very nature not a long-term thing, but a short-term stopgap.
> if you're going to ship with the majority of pro software not being native
Shipping this gets M1 hardware in the hands of developers who can then use it to test their software. You mentioned Rust, which is currently blocked on getting hardware hooked up to CI (https://github.com/rust-lang/rust/issues/73908)
Also, such an impressive release of hardware shows third party devs that Apple is serious about transitioning, and doing so quickly. Before this laptop was released we didn’t know what the performance delta would be.
Isn’t this the modus operandi with a lot of stuff Apple does — release new things that the market hasn’t quite adapted to yet, and let the market catch up?
Removal of the headphone audio jack comes to mind.
Other than Windows/Bootcamp being dropped, what is there to catch up with?
Compared to the machines these new ones replace, non-native apps still run significantly faster under Rosetta 2 than they do on the outgoing Intel hardware.
When native ARM versions of those apps are released, you'll get another speed boost.
I fail to see what is negative or needs "catching up".
Just because M1 Macs can already compete with all but the fastest laptops and the majority of desktops doesn't mean that's their intended use. These are Apple's entry-level models. And they spank the vast majority of other machines even when running code not yet optimized for them.
Putting aside the fact that the “Pro” label hasn’t really meant professionals for over a decade (I would say you can see the beginning of the distinction becoming “plus” rather than “pro” with the unibody MacBook/MacBook Pro from 2008 and 2009), I think you misunderstand how the DTK program works.
Anyone could request one and I’m not aware of any developer I know, no matter how small, not being able to buy one from Apple. I was able to get one and I don’t even have anything in the App Store at the moment. As for Apple gifting them to OSS projects, I mean, I guess that would be nice, but frankly the corporate stewards of Go and Rust can buy their own, just as Electron and others did. I imagine some Debian people may have been given loaner machines or stuff gratis, given the custom Debian build proof of concept at WWDC, but I have no insight into that.
The real challenges with AS are going to be for anything that uses virtualization or lots of lower level libraries that need to be compiled for ARM64 and to be honest, that was clear to anyone who watched any of the Apple Silicon sessions at WWDC and read the accompanying documentation.
You’re exactly right that many popular developer toolchains aren’t ready right now. Most of us didn’t expect that, and some of us were screaming about it loudly (and getting yelled at and called haters by fanbois, even though we almost exclusively use Macs and Apple hardware) to prepare people for exactly this reality. The support will come over time. It’s also clear to me, at least, that the way some stuff works might not be as nice as it was under Intel or even PPC, just because of changing priorities with macOS, and we'll need to come to terms with that too.
You'll notice the 13” MacBook Pro that was replaced in the lineup, a device I’ve always found odd, period (just get a MacBook Air), is the tweener device with two ports and, originally, no Touch Bar. This isn’t the much more powerful 13” MacBook Pro that got a big update on Intel alongside the fixed keyboard this May. This is the one that got a fixed keyboard but was still running a two-year-old 8th-gen Intel processor, AKA the MacBook Pro you shouldn’t buy; you should really just get a MacBook Air instead (the Intel MacBook Air refresh was running a newer processor than the two-port MBP).
Honestly, this is a huge boon for non Apple Silicon ARM64 projects and libraries because a lot of developers won’t do this sort of work for Raspberry Pi or Pinebook or any number of ARM boards, but they will for Apple. And that will trickle downstream.
There is a range of machines in the MacBook Pro lineup. They only replaced their lowest spec MacBook Pro with the new M1 machine. And the lower model always had 2 thunderbolt ports and 16 gigs of lpddr4 ram maximum. The “higher end” models still have Intel chips, and when they get replaced with Apple silicon we’ll see higher specs and by that point better compatibility from compilers and whatnot.
This is the first, and worst / “least pro” Apple silicon machine Apple will ever make. But it absolutely won’t be the last.
> I can't escape the impression that they didn't really shower the wider ecosystem in DTKs and software support, although I'd be happy to be proven wrong there.
That’s interesting. I was shocked at how many green check marks there are for a chip that was announced in June. It takes time to write and test an application. A lot of teams must’ve really prioritized it.
Also, Apple Silicon compatibility is not enough. One needs Big Sur compatibility too.
I don't think the chip's announcement date should really factor into anyone's considerations here, since it was Apple's own choice to go with these timelines and release a "Pro" product in November :)
I mean, this point is largely moot given Rosetta covers all the basics well and still makes the MacBook Pro 13 a serviceable pro computer for a lot of (but not all) use cases. It just seems strange to me that popular developer things like Go, Rust, Docker, and virtualization of any kind are still months away despite these being super important use cases on the Mac. Maybe Apple just doesn't feel that way, or has numbers showing my impression isn't true, but it feels strange that they're letting the community just figure it out by themselves now.
Is this really a knock on Apple though? It’s not like all this is up to them. Personally I want MATLAB support but Apple doesn’t get to dictate Mathworks schedule.
> I can't escape the impression that they didn't really shower the wider ecosystem in DTKs and software support, although I'd be happy to be proven wrong there.
From all appearances, practically anybody who applied for a DTK got one so in many cases I would take lack of a green checkmark as the dev not applying for a DTK.
> I would take lack of a green checkmark as the dev not applying for a DTK.
Not entirely true. A lot of software is simply hard to port or has a lot of dependencies which need to be ported.
A lot of developers already have a full plate and porting to a new platform is low on their list of priorities. As the user-base expands, it will move up on their list of priorities.