This is the same guy who wrote a piece about brain drain on Apple's silicon team, and how the A15 proved that Apple's engineering prowess had faltered.
And then the A15 actually came out, and AnandTech's analysis showed it was a normal year for the A-series chips, with reasonable, even impressive, performance and efficiency gains year over year.
In other words, there are some interesting bits of information here, since the guy does appear to have some level of supply chain / industry access, but any time he leans into his own analysis or editorial, I'd be skeptical.
What is interesting is the observation that cost-per-transistor is rising:
"2018 was Apple’s last major step down in cost per transistor due to TSMC’s N10 to N7 process node shrink. With the transition from N7 to N5, the cost benefits were relatively small due to SRAM scaling issues. N7 is entering its 5th year of production, and N5 is now at its 3rd, yet the only hint of per wafer pricing changes have been in the wrong direction... Due to the way the 3 major DRAM companies have slowly increased output, cost per bit of DRAM has not really fallen."
Sophie Wilson (inventor of the original ARM instruction set) has been saying that cost-per-transistor has been rising for several years.
That's not a new observation, as you mention. However, as trailing-edge nodes improve and mature, their cost-per-area should still tend to drop over time. So, in a way, Moore's law might still be in full effect despite developments "running ahead of it" at the leading nodes.
Sophie's analysis assumes a set unit volume and fixed costs for design, verification, and tape-out.
Mine is pure cost per transistor based on wafer costs, since a player like Apple has the unit volume to spread those fixed costs over.
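To make the difference between the two framings concrete, here's a back-of-envelope sketch. All numbers below (wafer price, yielded dies, transistor count, NRE) are made-up placeholders for illustration, not real TSMC or Apple figures:

```python
# Cost per transistor under two framings: pure wafer cost vs.
# wafer cost plus amortized fixed (NRE) costs.
# All inputs are illustrative placeholders, not real figures.

wafer_cost = 17_000          # $ per leading-edge wafer (assumed)
dies_per_wafer = 600         # good dies after yield (assumed)
transistors_per_die = 15e9   # roughly A15-class transistor count

nre = 500e6                  # design/verification/tape-out cost (assumed)

# Marginal cost: wafer price spread over every transistor it yields.
marginal = wafer_cost / (dies_per_wafer * transistors_per_die)

def cost_per_transistor(units):
    """Marginal wafer cost plus NRE amortized over the production run."""
    return marginal + nre / (units * transistors_per_die)

# At Apple-scale volume the NRE term nearly vanishes and wafer price
# dominates; at modest fixed volume the NRE term dominates instead.
for units in (1e6, 10e6, 100e6, 200e6):
    print(f"{units/1e6:>5.0f}M units: ${cost_per_transistor(units):.2e} per transistor")
```

Under these toy numbers, NRE dominates at a million units but is nearly invisible at two hundred million, which is why the two analyses can both be right.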
It's true that the new core slipped from the A15 to the A16. The improvements, while real, were just clock bumps. I'm interested to see whether the M2 will use the new cores or not.
Look at the in-depth analysis I did of the die shot and SoC floorplan, or even read the article I originally wrote: I was right on the CPU architectural gains. Apple pushed clock speed up slightly, but the IPC gain was less than 5%.
I wasn't disputing your data point. Indeed, Apple's own provided comparisons indicated that we were going to see smaller IPC gains for the A15 Big core.
I was calling out your analysis. You took the lower than usual IPC gains, and some departures for Nuvia, and surmised that Apple's engineering team must be on the decline and bleeding irreplaceable talent.
It wasn't a terribly unreasonable take given the information at the time, before the A15 was even released to the public, but it was premature and a bit hot. Hell, that headline was blazing!
When the iPhone 13 actually released, we saw there was much more to the A15 than just pushing pure IPC: the focus for the design was clearly efficiency, and they frankly nailed it. It certainly does not look like the work of an in-decline, listless engineering team.
My original article that you are talking about explicitly stated that heterogeneous compute is the future. It talks about transistor budgets going to the LLC, GPU, media, and ISP blocks. Much of the CPU efficiency gain comes from that doubled LLC size.
Still vague. Again, if you read the AnandTech analysis, nothing indicates a decline in Apple's engineering team, which is what the writer of this story had posited.
I don't think there's necessarily ill-intent on the part of the writer. I just think he's missed the mark on some of his past analysis, so I bring some skepticism to anything he writes that goes beyond the facts he has access to.
I think the large push by manufacturers into large phones is primarily due to battery life and power-hungry CPUs.
But CPUs themselves have been great for a long time. The A13 is the fastest CPU on the planet for rendering JavaScript (though I'm not sure if it's faster than the M1, as they probably have the same pedigree), so surely some of that CPU performance can be sacrificed.
It would be nicer to have a slower/more efficient CPU and a smaller phone.
That some people like big phones is a happy accident, or a genius marketing play, or both.
> That some people like big phones is a happy accident, or a genius marketing play, or both.
I think around the time of the iPhone 4 there was a bunch of hubbub from Apple about how bigger phones aren't as usable with one hand, so they weren't making them. Then they go and release the Max a few years later. :)
I think people regard phones in different ways, with the phone power users wanting a mini tablet, and other people wanting something much smaller. There is a significant contingent of people who practically live on their phone, so I can understand wanting a huge screen.
> I think people regard phones in different ways, with the phone power users wanting a mini tablet, and other people wanting something much smaller. There is a significant contingent of people who practically live on their phone, so I can understand wanting a huge screen.
My mother wants a big phone, "biggest that still fits in my purse" as she puts it. She isn't a power user, and she doesn't use the phone all that much, it's her iPad that she lives on. I think the reason big phones won out over small ones isn't because of some sneaky marketing push, it's because the largest part of the market wants big phones instead of tiny screens that fit in one hand.
It's similar to the car enthusiasts who insist a manual brown diesel wagon would sell well, but of course that's just the loudest 10 people on the Internet.
This. I will never forget the 4S for Siri - it was the last 'meaningful' product release from Apple.
Everything else has been an optimization function - different sizes and colors to extract revenue from all consumer price points, etc. Unfortunately I get the feeling the A/M series are a vertical integration decision first, followed by the Apple product mission.
The way you can tell the M/A processors are primarily optimizations is in the marketing and API improvements. They tried trumpeting write-once-run-everywhere for iOS, iPad, and Mac in 2020, but it fell on deaf ears, so they just gave it up - and now it's primarily a battery life discussion (a longer-lasting horse?)...
To me, the reason a developer can't understand the messaging is precisely that the device classes are intermingled (calling back to capturing the demand curve rather than building for a consumer purpose [the complicated answer to the parent's question about consumer preferences, one that would have required a Jobs]), which complicates UI principles and code for developers. And really, a lot of apps are meant to live in just one device class - not every one of them is a global life service - which Apple must get right if they are to do AR right.
I will concede though, as of today, the optimizations do appear to serve the customer in a 'just-as-good' fashion. So my critiques are only a 'maybe they left some on the table' - they are still the best, bar none.
It has been observed that FinFET devices have lower reliability than planar transistors; it appears that 28nm planar would have better reliability than anything Apple is making now.
Of course, the CPU would be larger and perhaps less power-efficient. Higher reliability also might not be a design goal, as there is no reason to outlive the term of software support.
> I think the large push by manufacturers into large phones is primarily due to battery life and power hungry CPUs.
If that was all there was to it, they could just bring back the "brick" form factor that was all the rage pre-smartphones, just with a smaller touchscreen.
That's certainly part of it. Since 5G modulations actually only provide 15-20% increased throughput compared to 4G, almost all of the advertised "increases in speed" for 5G networks have to come from using new spectrum. And the only remaining spectrum, well, sucks.
So another part is the complete fiasco of the mixed-up, rarely overlapping mm-wave 5G band definitions across the world, meaning more discrete antennas are needed in each phone. Add to that the requirement for multiple antennas per band at the higher frequencies, because they have to beamform, and the power requirements for the frequency multipliers and other front-end RF stuff for each set of these.
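As a rough sketch of that arithmetic (the spectral-efficiency figures below are assumed round numbers, not measured values): throughput scales roughly as bandwidth times spectral efficiency, so an ~18% modulation gain barely moves the needle compared to a 20x wider mm-wave channel.

```python
# Rough framing: throughput ~ bandwidth * spectral efficiency.
# Efficiency numbers are assumed round figures for illustration.

lte_bw_mhz = 20          # typical LTE carrier width
lte_eff = 5.0            # bits/s/Hz, assumed LTE efficiency
nr_eff = lte_eff * 1.18  # ~18% modulation/coding gain for 5G NR (assumed)

lte_mbps = lte_bw_mhz * lte_eff    # ~100 Mb/s on LTE
nr_same_bw = lte_bw_mhz * nr_eff   # ~118 Mb/s: the modulation gain alone
nr_mmwave = 400 * nr_eff           # 400 MHz mm-wave channel: ~2.4 Gb/s

print(f"LTE, 20 MHz:      {lte_mbps:.0f} Mb/s")
print(f"5G NR, 20 MHz:    {nr_same_bw:.0f} Mb/s")
print(f"5G NR, 400 MHz:   {nr_mmwave:.0f} Mb/s")
# The headline multi-gigabit jump comes from the much wider channel
# in new spectrum, not from the modulation improvements.
```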
What makes you think the mini is EOL? I know it didn't sell that well, but that would surprise me. I think at worst they'll start using the current mini form factor for a new SE.
Sidebar: apparently the Mini is a flop, but ironically, in my social circle of engineers, 90% of iPhone users have a Mini. I hope its demise is just a rumor.
Every engineer I know has a laptop or desktop for “proper computing” and most rely on their phone primarily for status updates and information.
Whereas in the wider population the appeal of a full computer is more limited: people are more likely to throw cash at a bigger screen for better media consumption and gaming, and maybe have a cheap laptop for times where you really need a keyboard and desktop software.
> throw cash at a bigger screen for better media consumption and gaming
I think you are underestimating how much people are using their phones for content creation. My daughter uses her phone for her entire video production pipeline. She records and edits videos, does compositing and animation, mixes audio, does photo editing etc. all on her iPhone.
I've tried several times to show her the advantages of using a 'real' computer with 'real' software, but she just finds it slow and annoying.
Even the mini is too large to use with one hand unless you have very large hands. Probably half of American men and the majority of women are not able to use their thumb to reach the top and bottom of the screen while holding the phone in one place securely. I'm pretty surprised that this doesn't seem to be a concern for most people, I guess everyone's content to use two hands on their phone whenever using it?
Yeah. I bought a 13 Mini recently so I could use my phone with one hand, discovered I still couldn't (and I have large hands), and returned it. I don't see much point in a small-ish phone that people can't use with one hand, so I'm not surprised it's flopping. IMO they needed to either commit to a truly small phone, or give up on it. Rumor is they're doing the latter.
Guess it'll be pop-sockets for me from here on out!
You can swipe down on the very bottom of the screen (a bit awkward at first), and it pulls the whole screen down so you can reach the top parts easily. Helpful video: [0]. This way you can really do everything with one hand.
Was hoping not to have to jump through any hoops; if I'm going to jump through hoops, I'm gonna stick with a pop-socket. They work great for what they do
That is also my experience, plus many friends who have Android phones now eyeball the iPhone mini because no such thing exists in Android land... I myself made the switch because of this a year ago. I really like the size of my iPhone 12 mini: about the same screen size as the OnePlus 3 I came from, but in a much smaller phone.
I wish it had ProMotion. I wish they didn't associate "mini" with "non-pro" - I am sure battery life and space are the reasons, but I find it very hard to consider getting a phone with only a 60Hz refresh rate after having a 120Hz screen.
I have a 12 Mini and while I really like the size, I am not sure I will get the next Mini (if it exists) because the battery life has been disappointing.
I'm inclined to agree, and using the last gen part makes a lot of sense from a cost point of view, though it can make low power devices tricky. Apple is probably already at a point where it can make two different chips of the same generation meant for iPhone and the base model iPad.
Perhaps something like P-series for the "pro" model iPhones and a much smaller (transistor count, die size) lower power A-series for the main line. They could use the same core designs, similar to what they do with M1 and M1 Pro/Max.
There are lots of things I could nitpick in this article, such as the $40 increase in BOM (no, just no). But I have a hypothesis as to why only the iPhone Pro gets the A16 (4nm).
The M2 will be using the same 4nm process and the uArch, GPU, and NPU designs from the A16. (I am hoping it breaks the 2000 GB5 barrier.) And Apple will slowly roll out the M2 across the Mac over the course of the next 18 months, starting with the Mac Pro first. The whole Mac lineup is currently in huge demand, and if I remember correctly, Tim Cook himself mentioned the Mac is breaking shipment records. One could extrapolate using sales data, and I expect this to be something like 25M units rather than the usual 18-20M units. Apple wants to capture this market as soon as possible, considering they could have lost a lot of Mac users over the last 3-4 years. (They haven't updated the active Mac user count since 2018; some numbers just don't add up.)
That means making sure they have enough capacity to feed whatever Mac sales they get; as some have reported, customers are still waiting for their October/November MacBook Pro orders.
> To be clear, the work is a proof of concept: the researchers haven't meaningfully scaled the strategy. Fabricating a handful of transistors isn't the same as manufacturing billions on a chip and flawlessly making billions of those chips for use in laptops and smartphones. Ren also points out that 2D materials, like molybdenum disulfide, are still expensive, and manufacturing high-quality material at scale is a challenge.
Raises a question I've had for a while - is Moore's Law based on current production silicon or cutting edge research silicon that doesn't exist in consumer products yet?
The products in this article could be 5, 10, or 15 (or infinity) years away from large-scale manufacture. Do they count?
These "research" transistors have never had any bearing on the reality of economically manufacturing products. Here's a paper way back 50 years ago they were already making tiny transistors with electron beam processes https://ieeexplore.ieee.org/document/5391506 - it would be about another 15 years before commercial chips were made with this transistor dimension, and that's 15 years at the height of Moore's law.
Not to say they don't inform what might be possible and advance knowledge of the physics, but they're not much of a predictor of what might become economically viable to make.
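For a sense of scale, here's a toy calculation assuming the idealized two-year doubling cadence (which is itself an approximation):

```python
# How many Moore's-law doublings fit in a 15-year lab-to-fab lag?
# Assumes the idealized 2-year doubling period.
lag_years = 15
doubling_period = 2

doublings = lag_years / doubling_period
density_gap = 2 ** doublings

print(f"{doublings:.1f} doublings ~ {density_gap:.0f}x density gap")
# ~7.5 doublings, roughly 180x: a research device that far ahead of
# contemporary production took the whole lag for mainstream
# manufacturing to catch up, even at Moore's law's peak cadence.
```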