
The price was likely too high, though that is debatable. However the real takeaway is that if you want something like this to work out, you need to invest in it for years. There is nothing wrong with getting the size of the market wrong by that much - it happens too often for anyone to fault them for it. It isn't clear what was predicted, but marketing should have predicted a range of units sold (and various price points having different predicted ranges!).

They didn't have the app ecosystem - no surprise. However the only way to get that ecosystem is years of investment. The Windows phone failed a couple of years later for similar reasons - nice device (or so I'm told), but it wasn't out long enough to get a lot of apps before Microsoft gave up on it.



> There is nothing wrong with getting the size of the market wrong by that much - it happens too often for anyone to fault them for it. It isn't clear what was predicted, but marketing should have predicted a range of units sold (and various price points having different predicted ranges!).

Shout out to the Itanium sales forecast: https://upload.wikimedia.org/wikipedia/commons/8/88/Itanium_...


And its inverse, the IEA solar energy forecast: https://en.wikipedia.org/wiki/File:Reality_versus_IEA_predic...

(This version of the graph is pretty old, but it's enough to get the flavor. The rate of new installations is still increasing exponentially, and the IEA continues to predict that it'll level off any day now...)


If they keep predicting that, eventually they’ll be right!

(It’s hard to harvest more power from a star than a Dyson sphere is capable of)


Reminds me of something I heard: of the 3 most recent recessions, analysts predicted 20.


Very soon we will produce more solar electricity than the entire world's consumption. A "problem" that is even more severe than it looks, because we consume energy when the Sun is below the horizon too.

So, yeah, in a few years they'll be right. Even if for just a short time while the rest of the economy grows to keep up with the change.


Those 2 charts are amazing! At least the Itanium people adjusted their curves downward over time; it looks like the IEA just carried on regardless!


It wasn't the Itanium people so much as the industry analysts who follow such things. And, yes, they (including myself) were spectacularly wrong early on, but, hey, it was Intel after all, an AMD alternative wasn't even a blip on the radar, and 64-bit chips were clearly needed. I'm not sure there was any industry analyst--and I probably bailed earlier than most--who was saying this was going to be a flop from the earliest days.


  an AMD alternative wasn't even a blip on the radar
Aside from it not being 64-bit initially, uh... did we live through the same time period? The Athlons completely blew the Intel competition out of the water. If Intel hadn't heavily engaged in market manipulation, AMD would have taken a huge bite out of their market share.


In the 64-bit server space, which is really what's relevant to this discussion, AMD was pretty much not part of the discussion until Dell (might have been Compaq at the time) and Sun picked them up as a supplier in the fairly late 2000s. Yes, Intel apparently played a bunch of dirty pool but that was mostly about the desktop at the time which the big suppliers didn't really care about.


Opteron was a much bigger deal than you're making it sound. Market share was up to 25%.


But initial Opteron success was pretty much unrelated to 64-bit. As a very senior Intel exec told me at the time, Intel held back on multi-core because their key software partner was extremely nervous about being forced to support a multi-core world.

I'm well aware of Opteron's impact. In fact, the event where that info was related to me was partly held to scare the hell out of Intel sales folks. But 64-bit wasn't really part of the equation. Long time ago, and I'm not really disposed to dig into timelines. But multi-core was an issue for Intel before they were forced to respond with Yamhill to AMD's 64-bit extensions to x86.


> As a very senior Intel exec told me at the time, Intel held back on multi-core because their key software partner was extremely nervous about being forced to support a multi-core world.

That's one way to explain it. Alternatively, one might say that FSB-based Netburst servers would not benefit much from multi-core because the architecture (and especially the FSB) had hit its limits. Arguably, Intel had no competitive product in the mass server market until 2006 and the introduction of the Core-based Xeon 5100. Only enormous market inertia kept them afloat.

> In the 64-bit server space, which is really what's relevant to this discussion, AMD was pretty much not part of the discussion until Dell (might have been Compaq at the time) and Sun picked them up as a supplier in the fairly late 2000s.

That was one relatively small (server-count-wise) segment of the market. The introduction of Opteron servers and 64-bit Windows Server 2003 created a new segment of mass-market 64-bit servers, which very quickly took over the entire (at that time 32-bit) mass server market. That was the real market Intel wanted for itself with the introduction of the proprietary Itanium, but failed to acquire because of the compatibility issue. The high-end, mainframe-adjacent market segment did belong to Itanium for many years after, but that wasn't the goal of Itanium. Intel wanted a monopoly on the entire PC & server market with no cross-licensing agreements, but failed and had to cross-license AMD64 instead.


It’s understandable why companies try and sometimes succeed at creating a reality distortion field about the future success of their products. Management is asking Wall Street to allow them to make this huge investment (in their own salaries and R&D empire), and they need to promise a corresponding huge return. Wall Street always sees opportunities to jack up profits in the short term, and management needs to tell a compelling story about ROI that is a few years in the future to convince them it’s worth waiting. Intel also wanted to encourage adoption by OEMs and software companies, and making them think that they needed to support Itanium soon could have been a necessary condition to make that a reality.

I don’t know what factors would make IEA underestimate solar adoption.


> I don’t know what factors would make IEA underestimate solar adoption.

The IEA is an energy industry group from back in the days where "energy" primarily meant fossil fuels (i.e. the 1970s), and they've never entirely gotten away from that mentality.


There are trillions of dollars on the line in convincing people not to buy solar panels or other renewable sources.

Remember all the conspiracy theories about how someone invented a free energy machine and the government had to cover it up? Well they're actually true - with the caveat that the free energy machine only works in direct sunlight.


The IEA's purpose is to boost fossil fuels + nuclear?


How often are they reality distortion fields vs leadership trying to put on a face to rally the troops and investors? How do you do the second without the first?

Something I ponder from time to time, while trying to figure out how to be less of a cynic and more of a leader.


> Management is asking Wall Street to allow them to make this huge investment (in their own salaries and R&D empire), and they need to promise a corresponding huge return. Wall Street always sees opportunities to jack up profits in the short term, and management needs to tell a compelling story about ROI that is a few years in the future to convince them it’s worth waiting

Explain Amazon, Uber, Spotify, Tesla, and other publicly listed businesses that had low or even negative profit margins for many years.

The idea that Wall Street only rewards short term profit margins is laughable considering who is at the top of the market cap rankings.


The section of my comment you quoted directly addresses this! Wall Street can be convinced by a compelling story.


One thing I found amazing about the IEA chart is how similar the colors for each year were, making it very difficult to see which year was which. The gist of the chart was still clear, though.


Holy cow was that forecast bad!

It reminds me of a meeting long ago where the marketing team reported that oil was going to hit $400/bbl and that this would be great for business. I literally laughed out loud. At that price, gasoline would be about $18/gal and no one could afford to move anything except by ox cart.


> At that price, gasoline would be about $18/gal and no one could afford to move anything except by ox cart.

Just for some rough math here - I’m currently paying around $1.20/L for gas, and crude oil cost is roughly half of that, so if crude went up by 6x, I’d be looking at $5/L for gas. Gas is currently about 20% of my per-km cost of driving, so that price increase at the pump would increase my per-km cost by about 60%.

FWIW that’s roughly the same per-km cost increase that people have voluntarily taken on over the past decade in North America by buying more expensive cars.

(Though this does apply to personal transportation only, the math on e.g. transport trucks is different)
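
A quick sanity check of that back-of-the-envelope math, in throwaway Python (all inputs are the assumed numbers from the comment above, not real market data):

  # Hypothetical inputs from the comment above, not real market data.
  pump_price = 1.20        # $/L at the pump today
  crude_share = 0.5        # crude assumed to be ~half of the pump price
  crude_multiplier = 6.0   # $400/bbl vs roughly $65/bbl

  crude_cost = pump_price * crude_share
  other_costs = pump_price - crude_cost    # refining, taxes, margin, etc.
  new_pump = crude_cost * crude_multiplier + other_costs
  print(new_pump)          # ~4.2 $/L, i.e. roughly the ~$5/L above

  fuel_share = 0.20        # fuel as a share of per-km driving cost
  increase = fuel_share * (new_pump / pump_price - 1)
  print(increase)          # ~0.5, i.e. a ~50-60% higher per-km cost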


The issue isn't personal transport; it's shipping and home heating and agriculture.

I drive electric, so I like to imagine myself sheltered from gas price increases, but I know grocery costs would explode.


Especially if you live where gas costs a buck twenty a liter.


Well, it's that high because of taxes, so if crude goes up, the total price will go up proportionally less than in places where more of the gas cost is comprised of non-taxes. (Some of the taxes are flat, and some get waived when gas gets expensive.)


> by buying more expensive cars

Not to mention less efficient cars.


Not to mention, cars.


How can you possibly say that crude is half of the pump price? The economics are incredibly complex and murky, and the price of gas doesn't move with any sort of linear relation to crude except in very long timeframes. Regional refining capacity is way more important.


The price of gas isn't immediately and directly impacted by the price of crude because of futures contracts. This naturally means gas prices will move to match the price of crude over time. It's a feature of the current system, not an indication that the price of gas isn't heavily reliant on the price of crude. Nobody is making gas with spot prices.


> How can you possibly say that crude is half of the pump price?

I googled for a couple sources on the breakdown of the price of gasoline, and they seemed to be in agreement that the raw cost of crude is somewhere around half. (And broke refining out separately.)

I'm sure it's not perfect, but it seems fairly reasonable. (And it can be off by quite a lot and still not make a huge difference to the cost-per-km of driving.)


> How can you possibly say that crude is half of the pump price?

Look at gas prices in your area. Look at the price of crude. Divide.

How could you possibly not be able to estimate the fraction?

And yeah, ideally you use an average over some months and sample the crude earlier than the gas, but those are minor tweaks.
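
A minimal sketch of that estimate in Python, with the tweaks mentioned (averaging over months and lagging the crude sample); all prices here are made up for illustration:

  LITRES_PER_BARREL = 159.0

  crude_usd_per_bbl = [82, 78, 75, 80, 85, 79]               # hypothetical monthly averages
  pump_usd_per_litre = [1.22, 1.18, 1.15, 1.19, 1.25, 1.21]  # hypothetical monthly averages

  lag = 1  # sample crude one month earlier than the pump price
  crude_per_litre = [p / LITRES_PER_BARREL for p in crude_usd_per_bbl[:-lag]]
  pump = pump_usd_per_litre[lag:]

  avg = lambda xs: sum(xs) / len(xs)
  print(avg(crude_per_litre) / avg(pump))  # ~0.42 with these made-up numbers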


That's assuming the other costs (refining energy costs, transport, the company's gross margin) are uncorrelated with the price of crude oil, which seems unlikely.


A) Just calculating the percentage doesn't assume that.

B) They shouldn't correlate by a particularly large amount in a competitive environment. For an approximation as rough as "half" and assuming no other changes it's not a big deal.


Itanium needs a lot longer discussion than can be covered in an HN comment.

https://bitmason.blogspot.com/2024/02/the-sinking-of-itanic-...


I think Bob Colwell's account is the clearest short synopsis.

https://www.sigmicro.org/media/oralhistories/colwell.pdf

'And I finally put my hand up and said I just could not see how you're proposing to get to those kind of performance levels. And he said well we've got a simulation, and I thought Ah, ok. That shut me up for a little bit, but then something occurred to me and I interrupted him again. I said, wait I am sorry to derail this meeting. But how would you use a simulator if you don't have a compiler? He said, well that's true we don't have a compiler yet, so I hand assembled my simulations. I asked "How did you do thousands of line of code that way?" He said “No, I did 30 lines of code”. Flabbergasted, I said, "You're predicting the entire future of this architecture on 30 lines of hand generated code?" [chuckle], I said it just like that, I did not mean to be insulting but I was just thunderstruck. Andy Grove piped up and said "we are not here right now to reconsider the future of this effort, so let’s move on".'


I’m curious what kind of code his 30 lines were - I’m betting something FP-heavy, based on the public focus benchmarks gave that over branchy business logic. I still remember getting the pitch that you had to buy Intel’s compilers to get decent performance. I worked at a software vendor and later a computational research lab, and both times that torpedoed any interest in buying the hardware, because it boiled down to paying a couple of times more upfront and hoping you could optimize at least the equivalent gain back … or just buying an off-the-shelf system which performed well now and doing literally anything else with your life.

One really interesting related angle is the rise of open source software in business IT which was happening contemporaneously. X86 compatibility mattered so much back then because people had tons of code they couldn’t easily modify whereas later switches like Apple’s PPC-x86 or x86-ARM and Microsoft’s recent ARM attempts seem to be a lot smoother because almost everyone is relying on many of the same open source libraries and compilers. I think Itanium would still have struggled to realize much of its peak performance but at least you wouldn’t have had so many frictional costs simply getting code to run correctly.


I think you're right. The combination of open source and public clouds has really tended to reduce the dominance of specific hardware/software ecosystems, especially Wintel. Especially with the decline of CMOS process scaling as a performance lever, I expect that we'll see more heterogeneous computing in the future.


Nice insight, thank you.


This form versus substance issue is a really deeply embedded problem in our industry, and it is getting worse.

Time and again, I run into professionals who claim X, only to find out that the assertion was based only upon the flimsiest interpretation of what it took to accomplish the assertion. If I had to be less charitable, then I’d say fraudulent interpretations.

Promo Packet Princesses are especially prone to getting caught out doing this. And as the above story illustrates, you had better catch and tear down these “interpretations” as the risks to the enterprise they are, well before they obtain visible executive sponsorship, or the political waters get choppy.

IMHE, if you catch these in time and estimate the risk along with a solution, it usually defuses them and “prices” their proposals closer to a “market clearing rate” for the actual risk. They’re usually hoping to pass the hot potato to the poor suckers forced to handle sustaining work streams on their “brilliant vision” before anyone notices the emperor has no clothes.

I’d love to hear others’ experiences around this and how they defused the risk time bombs.


> “You're predicting the entire future of this architecture on 30 lines of hand generated code?"

It’s comforting to know that massively strategic decisions based on very little information that may not even be correct are made in other organizations and not just mine.


Everybody does it. Information only comes because you made your strategic decision, never before it.


There were a bunch of other issues but, yes, the compiler was a big one from which a number of the other issues stemmed.


I don’t think it is that simple. Itanium was supported for years by, for example, RHEL (including GCC working, of course; if anybody cared enough they could have invested in optimising it); it is not like the whole fiasco happened in one moment. No, Itanium was genuinely a bad design, which never got fixed, because it apparently couldn’t be.


Well, yes, the market didn't care all that much for various reasons. (There were reasons beyond technology.) RHEL/GCC supported it but, while I wasn't there at the time, I'm not sure how much focus there was. Other companies were hedging their bets on Itanium at the time--e.g. Project Monterey. Aside from Sun, most of the majors were placing Itanium bets to some degree, if only to hedge other projects.

Even HP dropped it eventually. And the former CEO of Intel (who was CTO during much of the time Itanium was active) said in a trade press interview that he wished they had just done a more enterprisey Xeon--which happened eventually anyway.


We're not living through this again at all with generative AI, right?


A small boardroom locked in groupthink, misled by one single individual’s weak simulated benchmark, with no indication of real world performance or customer demand?


The plan was to artificially suppress x86-64 to leave customers with no real alternative to Itanium. The early sales projections made sense under that assumption.


I had heard that it wasn't suppression as much as just not making it a thing at all, and that AMD used the opportunity to extend x86 to 64-bit, and Intel was essentially forced to follow suit to avoid losing more of the market. It also explains why the shorthand "amd64" is used; Intel didn't actually design x86_64 itself.


There were apparently early Pentium 4s that supported some version of a 64-bit ISA, support for which was fused off before shipping to customers in order to convince people to move to Itanium.

https://www.tomshardware.com/pc-components/cpus/former-intel...


I have some very old servers that have the Pentium 4 architecture with amd64 capability.


I've still got a couple of small-business models along these lines that are over 20 years old now. Still running, possibly because I always turn them fully off when not using them. No hibernation, sleep, or other monkey business.

One Dell has an early 64-bit mainboard but only a 32-bit CPU in that socket, just fine for Windows XP, and it will also run W10 32-bit (slowly); it's mainly dual-booting to Debian i386 now since it retired from office work. It puts out so much heat I would imagine there is a lot of bypassed silicon on the chip drawing power but not helping with processing. IIRC a 64-bit CPU for that socket was known to exist but was more or less "unobtanium".

Then there's a trusty HP tower with the Pentium D, which was supposedly a "double" with two x86 arch patterns on the same chip. This one runs everything x86 or AMD64, up until W11 24H2, where the roadblocks are insurmountable.


Interesting! I had no idea that was a thing


To this day, I don't know if Intel thought Itanium was the legitimately better approach. There were certainly theoretical arguments for VLIW over carrying CISC forward--even if it had never been commercially successful in the past. But I at least suspect that getting away from x86 licensing entanglements was also a factor. I suspect it was a bit of both and different people at the company probably had different perspectives.


Internal inertia is a powerful thing. This was discussed at length on comp.arch in the late 1990s/early 2000s by insiders like Andy Glew. When OoO started to dominate, Intel should have realized the risk, but they continued to cancel internal projects to extend x86 to 64 bits. Of which apparently there were multiple. Even then, the day that AMD announced 64-bit extensions and a product timeline, it should have resulted in Intel doing an internal about-face, acknowledging what everyone knew (in the late 1990s), and quietly scuttling IA-64 while pulling a backup x86 out of their pocket. But since they had killed them all, they were forced to scramble to follow AMD.

Intel has plenty of engineering talent; if the bean counters, politicians, and board would just get out of the way, they would come back. But instead you see patently stupid/poor execution like the still-ongoing AVX-512 saga. Lakefield is a prime example of WTFism showing up publicly. The lack of internal leadership is written as loud as possible on a product where no one had the political power to force the smaller core to emulate AVX-512 during the development cycle, or to NAK a product where the two cores couldn't even execute the same instructions. It's an engineering POC, probably being shopped to Apple or someone else considering an ARM big.LITTLE without understanding how to actually implement it in a meaningful way. Compare that with the AMD approach, which seems to best even ARM's big.LITTLE by simply using the same cores, process-optimized differently, to the same effect, without having to deal with the problems of optimizing software for two different microarchitectures.


Sophie Wilson (ARM instruction set designer) was very enthusiastic over her "Firepath" architecture that had VLIW aspects.

It was targeted at DSL modems, and I think the platform has faded and is now somewhat obscure.

https://royalsociety.org/people/sophie-wilson-12544/

https://old.hotchips.org/wp-content/uploads/hc_archives/hc14...


Intel and AMD have a cross-licensing agreement where they pay each other for the right to use various IP. One of the things Intel pays AMD for is x86_64.


x86_64 patents have expired by now, so they do not in fact pay for them.


Windows phones were incredible, the OS was the most responsive at the time by far. No apps though. They were building in Android app support when they pulled the plug.


Upvoted as my experience was similar. I owned 3 Windows phones over the years and they were always an absolute joy. The UI was very polished, the call quality was terrific, the camera was awesome, and it did have plenty of apps, even if it was a tiny percentage of Android or iPhone. To be honest though, I've never been one to care about apps. My experience was that anyone who actually took the time to play with one loved it. The hard part was getting people to give it a try. AT&T also did an awful job at the store, as none of their employees knew anything about it.


I worked as a Sales Consultant for AT&T wireless during this period. They really did do a great job training the employees. We attended day-long trainings and we were each given Windows phones as our work phones. I loved my Samsung and Nokia Windows phones and was quite knowledgeable. The issue was that we were commission-based employees. What do you think sales people pushed: the iPhone with an entire wall of accessories, or the Windows phone with two cases? Employees needed to have their commission structure altered to benefit significantly more from each Windows phone sale if this was ever to succeed. This is why iPhone competitors failed initially: the sales people took the path of least resistance and more money, just like most would.


Thanks for your story, I had no idea.


While I agree that Windows phone was actually quite nice, I wish they didn't have to kill Meego to make it by planting a mole CEO at Nokia.

If you think Windows phone was great you should have seen the Nokia N9. Still one of the best phones I ever owned.


The Nokia N9 was also the last phone by Nokia to be made in Finland. After that, the whole brand-licensing-to-HMD thing happened, and Nokia-branded phones were made in China going forward. Such a shame.


Glad to hear this sentiment, even all these years later. We got there finally, we really did. But oh my, was it a journey. The effort (and investment MS put in) moving mobile computing/devices forward during that time is (IMO) an unsung but major part of the work required to get to the modern-day cell phone/embedded device.

(I worked at MS starting during the PPC/TPC era through WM)


I really appreciated my brief experience with a Lumia - snappy UI, built in radio tuner, and a handful of apps. Not only was the UI responsive, it moved and flowed in a way that made it a joy to interact with. I’d say iPhone is the closest in smoothness, but nothing beats the windows phone UI experience - a sentiment I never thought I’d have.

I was talking to a coworker about Lumia a while ago when I was using it semi-regularly, and he told me he was friends with “the sole Windows Phone evangelist for MS”. We had already seen the signs of WP going out but it was just sad to see how little MS put into the platform. They have pockets deep enough - I saw Windows Stores in public years after I thought they would shutter lol


I thought it was fascinating, a good value proposition, a necessary diversification of the market. I almost wonder, looking primarily at Google's example, if a major key to success is just toughing it out and finding an identity and a niche in the early years. I feel like this could have been something meaningful and like the plug was pulled too quickly. To keep going back to Amazon Prime, which played the long, long game before becoming kind of a flagship offering.


I always say that many of the things we take for granted today came from Windows Phone

At the time everything was app-based: you are looking at a photo and want to share it? Why, of course you should switch over to the messaging app in question and start a new message and attach it. As opposed to "share the picture, right now, from the photos app"

Dedicated access to the camera no matter what you were in the middle of doing, even if the phone was locked

Pinning access to specific things within an app, for example a specific map destination, a specific mail folder, weather location info

Dedicated back button that enforced an intuitive stack. Watch someone use an iPhone and see how back buttons are usually in the app in a hard to reach place. This leaks into websites themselves too

I still miss the way messaging was handled, where each conversation was its own entry in the task switcher, instead of having to go back and forth inside the app


Sorry for such a delayed response, life. :-)

But I wanted to agree with you very much. Lots of behind-the-scenes/tech stuff as well. Some of our protocols and technical approaches have lived on, and very broadly. Exchange ActiveSync, for example. One technology that didn't live long (for obvious reasons), but that I still had a lot of fun working on, was recognizing when a phone was being dropped so we could automatically seat the hard drive heads to prevent head/disc damage. How else were you going to fit 2GB of mp3s on your phone if it didn't have a spinning drive?
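
For anyone curious how that kind of drop detection works in principle, here's a minimal sketch (not the actual Windows Mobile code; read_accel_g and park_heads are hypothetical stand-ins for the real driver calls). The trick is that an accelerometer reads near zero g while the phone is in free fall, so a few consecutive low readings trigger a head park:

  import math, time

  FREE_FALL_G = 0.3       # total acceleration well under 1 g implies free fall
  CONFIRM_SAMPLES = 3     # debounce so a bump doesn't park the heads

  def read_accel_g():
      """Hypothetical driver call returning (x, y, z) acceleration in g."""
      raise NotImplementedError

  def park_heads():
      """Hypothetical driver call that retracts the drive heads."""
      raise NotImplementedError

  def monitor():
      below = 0
      while True:
          x, y, z = read_accel_g()
          if math.sqrt(x*x + y*y + z*z) < FREE_FALL_G:
              below += 1
              if below >= CONFIRM_SAMPLES:
                  park_heads()  # seat the heads before impact
                  below = 0
          else:
              below = 0
          time.sleep(0.005)     # poll fast; a desk-height drop lasts ~300 ms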


The only Windows Phone people I know either worked for Microsoft, or were Microsoft superfans. (And the one friend who liked to just be a contrarian - this time he was right, but he's usually wrong)


I got one because I absolutely hated the duopoly between Google and Apple and wanted to see a third player. It was a pretty good phone. I ended up making quite a bit of money porting apps to it over the years as well.


I bought one cheap at Costco as a travel phone, and I enjoyed using it enough to make it a daily driver once I got home.


In my case I was a Windows user for work and Linux fanboy at home. I just hated the android experience at the time (phone before my Lumia was the original Galaxy I think which was a piece of garbage) and enjoyed playing with a Lumia at the store.


This made some memories pop. I was on the camera and photo app team. I was not an integral part at all. I think most of my code never made it into the app because being part of that org was a shocking experience. I came from building web apps in an org that got shut down to writing mobile apps that used the Windows build system. My psyche was not prepared.

But I remember I worked with 2 of the smartest people I’ve ever worked with - guy named Mike and guy named Adam. To this day I miss working with them.


We pulled out an old Windows Phone from a drawer at work a few years ago. I had never used one before, but I was actually quite impressed with the fluidity and design of the UI. The design was a little dark, but I could understand why it had its fans.

Ironically, Microsoft is a company that knows apps make the platform more than anything else, and yet they botched it so badly.


They shot themselves in the foot right out of the gate by trying to copy Apple's $99 annual fee for developers to publish their apps. Whatever initial enthusiasm there was for Windows Phone quickly disappeared when they added that requirement. When they finally figured out it wasn't going to be a new revenue stream, they reduced it for a while instead of eliminating it. When they finally realized just how badly they had messed up and removed all the fees, most developers had already moved on and never gave Windows Phone another look.

It reminds me of the failure of Windows Home Server. It was removed from MSDN because the product manager said developers needed to buy a copy of it if they wanted to develop extensions and products for Home Server. Very few bothered. However many dozen licenses the policy led to being purchased was dwarfed by the failure of the product to gain market share. Obviously that wasn't only due to alienating developers, but it certainly was part of it.


> When they finally realized just how badly they had messed up and removed all the fees

Apparently this didn't even happen until 2018, and only then as a limited-time promo! https://www.windowscentral.com/microsoft-slashes-windows-pho...

To be sure, as noted in this 12-year-old Reddit thread on the program https://www.reddit.com/r/windowsphone/comments/1e6b24/if_mic... - part of the reason for a fee-to-publish is to prevent malware and other bad actors. But it's not the only way to do so.

First-movers can get revenue from supply-quality guardrails. Second-movers need to be hyper-conscious that suppliers have every reason not to invest time in their platform, and they have to innovate on how to set up quality guardrails in other ways.


I personally pin the blame on their constant breaking of SDK and API surfaces. From 7 to 8 and then to 10, so many APIs that were in use just broke and had no real 1:1 equivalent. I also think the death of Silverlight had a hand in it.


Not to mention that when they moved to SDK 8, you could only develop from a Windows 8 machine, that famously popular OS. So many unforced errors, many seeming to stem from denial that Microsoft does not possess the Apple Reality Distortion Field.


What I don't understand is how, with all this MBA training, everyone thinks they can copy the crazy margins that Apple has pulled off while being 12-24 months behind them. Be that matching the iPad's price point with obviously inferior hardware and no ecosystem, like HP/WebOS, or tossing up little fees that act as roadblocks (which in the Apple ecosystem serve to keep out noise/trash) and end up just slowing the growth of the app market everywhere else.

And it continues to this day, when one looks at the QC/Windows laptop pricing, or various other trailing technology stacks that think they can compete in Apple's playground.


Up until 2011 I was still using one of those Samsung phones with the slide out keyboard, maybe an Intensity II or something. My first smartphone was a Windows phone, an HTC Titan. I really liked the phone and the OS - I thought it was very well done. The only problem: the app store was complete shit. There were barely any apps and the ones that were there were trash barely discernible from malware.

After about a year I bought a Nexus 4 instead.


WebOS was incredible on phones too. Android and iOS basically mined the Palm Pre for ideas for years. In 2010 I had a phone with touch based gesture navigation, card based multitasking, magnetically attached wireless charging that displayed a clock when docked.


Indeed.

As part of a carrier buyout a ~decade ago, my then-partner was given a "free" phone. IIRC, it was a Nokia something-or-other that ran Windows Phone 8.

The specs were very low-end compared to the flagship Samsung I was using. And as a long-time Linux user (after being a long-time OS/2 user), I had deep reservations about everything from Microsoft and I frankly expected them to be very disappointed with the device.

But it was their first smartphone, and the risk was zero, so I didn't try to talk them out of it.

It was a great phone. It was very snappy, like the early PalmOS devices (where everything was either in write-once ROM or in RAM -- no permanent writable storage) were. The text rendering was great. It took fine pictures. IIRC, even the battery life was quite lovely for smartphones of the time.

Despite their aversion to technology, it was easy enough for them to operate that they never asked for my help. And since they'd never spent any time with the Android or Apple ecosystems, they never even noticed that there were fewer apps available.

Their experience was the polar opposite of what I envisioned it would be.


I was a developer for Carrier apps. It was by far the best mobile developer experience by a landslide.

Really staked my career on it because of that. Whoops.

It wasn't until React launched that I felt there was finally a better system for frontend development.


A long time ago I was given an Android, Apple, and MS Windows phone to evaluate as company phones for the company I worked for. The MS Windows phone crashed almost straight out of the box. And crashed again. And again.


My Nokia Lumia 521 running Windows was the best phone I've ever owned. But when MS bought Nokia, they pushed out an update that made it really slow and buggy.


My experience with Windows phones around 2010 was the exact opposite: very slow and clumsy. I recall trying an HTC phone on WM 6.5, far behind the iPhone 3GS.


That was Windows Mobile, which was the end of the line for the old Windows embedded lineage, as opposed to Windows Phone, the brand-new OS made for modern (at the time) smartphones.

WP7 was the first of the new OS


Windows Phone 7 was another OS. Windows Phone 8 was the next totally incompatible OS, just a couple of years later.


I had Lumia 950, still my favorite phone.


> They were building in Android app support when they pulled the plug.

That then became WSL1


It also had the best “swipe” text typing mode for Turkish. iPhone got it very recently and it’s close to useless, and the Android one was meh last I checked.


I’d say for English too. I don’t know about non-standard keyboards, but WP swiping was better than both the stock iOS keyboard and gboard.


Windows Phone was good if you liked staring at "Resuming..." screens all day.


You don't have to be snarky. If you actually have something to say, just say it so people can understand what you're even talking about.


Okay: multitasking in Windows Phone was rubbish. When switching between apps you would constantly see a loading screen that lasted seconds. Of course, that was still better than the pile of garbage that Android was/is, so it was your only option if you, like me, weren’t able to afford an iPhone. But that doesn’t mean I’m going to pretend I miss it.


Thanks! I've owned one windows phone (I liked the UI) and multiple android phones and don't remember anything like that. Maybe it was a problem on some earlier (or cheaper) phones since I waited a bit before buying a smartphone.


> The price was likely too high, though that is debatable.

To me it feels like even in the modern day, products that would be considered okay on their own are more or less ruined by their pricing.

For example, the Intel Core Ultra CPUs got bad reviews due to being more or less a sidegrade from their previous generations, all while being expensive both in comparison to those products, as well as AMD's offerings. They aren't bad CPUs in absolute terms, they're definitely better than the AM4 Ryzen in my PC right now, but they're not worth the asking price to your average user that has other options.

Similarly, the RTX 5060 and the Intel Arc B580 both suffer from that as well - the Arc card because, for whatever reason, MSRP ends up being a suggestion that gets disregarded, and the entry-level RTX cards because Nvidia believes that people will fork over 300 USD for a card with 8 GB of VRAM in 2025.

In both of those cases, if you knocked about 50 USD off those prices, then suddenly they start looking like a better deal. A bit more and the performance issues could be overlooked.


The major complaint I have with the 5060 is it offers me no reason to update my 3060 Ti. It's 2 generations out and is somewhere around a 10% performance increase at roughly the same power envelope.

It seems like the only trick nVidia has for consumer cards is dumping in more power.


There was another reason behind the Windows Phone failure and the lack of apps - Google blocking Microsoft from using its platform's native APIs. Microsoft weren't allowed to use, for example, the YouTube API natively, so the "native" Windows OS app for YouTube had to use roundabout methods of getting YouTube data.


I remember doing some apps for Windows Phone and it really seemed they hated devs. Constantly breaking small things and then the switch to 10 made me give up. It was a nice OS though


Nokia made some pretty nice phones there for a while, and the OS looked pretty usable by Microsloth's standards.

I blame Ballmer; he's like Bill Gates's less intelligent but at least as evil brother.


> There is nothing wrong with getting the size of the market wrong by that much

Remember that the Apple Watch did this. The initial release was priced way outside of market conditions--it was being sold as a luxury-branded fashion accessory at a >$1k price point on release. It was subtly rebranded as a mass-affordable sports fitness tracker the next year.


I believe you are mistaken, in several aspects:

1) Entry level watch models were available for about $400 right away, which is still more or less the starting point (though due to inflation, that's a bit cheaper now, of course).

2) Luxury models (>$1K price) are still available, now under the Hermès co-branding.

The one thing that was only available in the initial release was the "Edition" models at a >$10K price point, but there was speculation that this was more of an anchoring message (to place the watch as a premium product) and never a segment meant to be sustained.

https://en.wikipedia.org/wiki/Apple_Watch


The luxury watch was released in April 2015. The cheaper stainless steel model wasn't released until the fall event a few months later.

But I was talking about branding and marketing; sorry if that wasn't clear. At release the Hermes and "Edition" models were the story. The Apple Watch was the next fashion accessory. You couldn't even buy it at an Apple Store -- you could get fitted, but had to order it shipped to store. But the Hermes store next door had the expensive models in stock.

It wasn't until 2016 that Apple partnered with Nike and changed their branding for the watch to be about health and fitness.


Yes, I agree that health and fitness are a much bigger part of the branding now than they were initially (but the basic features were there right from the beginning — I remember sitting in town halls, with "pings" ringing out at 10 to the hour, and everybody standing up for a minute).


That comports with my memory. I have no idea what Apple's internal sales projections were. But there was a ton of nerd and tech press criticism to the effect that young people didn't wear watches any longer so obviously this was a stupid idea for a product.

Even if I'm not really sold for day-to-day wear because of the limited battery life, I do have one.


Entry level watches are available from China for €40, with everything but Maps. Huawei/Honor Magicwatch 2 e.g.


Sure. My point was that entry level APPLE watches never changed much in their price point.


To me that was the issue: they wanted a 'me too' product without the belief behind it to back it... it was a fine device at the time, a little nicer than all the Android tablets around.


What I find interesting about your comment is that the iPhone launched without an ecosystem, and 4 years later an App Store was table stakes.


The iPhone opened up the smartphone market to many many more people.

We had smartphones before, but the iPhone didn't need to convert their tiny userbase to be a success (and I know some people who stuck with PocketPC-based smartphones for quite a while, because they had their use cases and workflows on them that other smartphones took time to cover).

Once the smartphone for everyone was a category, it was much more about fighting between platforms than grabbing users who weren't considering a smartphone before. And after the initial rush, it takes much more time to convince people to swap, and obviously app support etc. is directly compared. (E.g. for me personally, Nokia's Lumia line looked quite interesting at some point. But I wasn't the type to buy a new phone every year, and by the time I was actually planning to replace the Android phone I had, it was already clear they'd stop supporting Windows Phone.)


I got a Treo in 2006 mostly because I had a badly broken foot and needed an alternative to carrying a computer on some trips. Didn't get an iPhone until a 3GS or thereabouts in around 2010.


Apple's App Store was 3 years old at that point and white hot. The Samsung Galaxy was 2 years old then. If they wanted to go to market with an unpolished product differentiated by a few nifty features, they'd need to spend months paying loads of money to devs to fill out their app store to have a chance.


And Apple only sold 10 million iPhones the first year, out of 1 billion phones that were sold that year. Jobs himself publicly stated his goal was 1% of the cell phone market the first year.


> is years of investment.

Or just don't be greedy and have an open store ecosystem that doesn't seek to extract money from its own developers.

> to get a lot of apps

Phones are computers. For some reason all the manufacturers decided to work very hard to hide this fact and then bury their computer under a layer of insane and incompatible SDKs. They created their own resistance to app development.


Clearly you have never actually used a WebOS device. They supported app sideloading out of the box and were easy to get root on via an officially supported method. There was an extremely popular third-party app store called Preware that offered all sorts of apps and OS tweaks.


When I was a little kid I "jailbroke" my palm pre, and had all kinds of cool tweaks and apps loaded. I wish I could remember the name of this funny little MS-paint style RPG... WebOS was a great OS, shame what happened to it.


People really overestimate how much people care about indie developers, or how little difference the 15-30% commission actually makes.

Most of the popular non-game apps don't make money directly from consumers paying for them, and it came out in the Epic trial that somewhere around 90% of App Store revenue comes from in-app purchases from pay-to-win games and loot boxes.

If the money is there, companies will jump through any hoops to make software that works for the platform.


That seems like a reversal of cause and effect.

Indie developers were (and to an extent still are) pretty important on computers. People made (still make) a living selling software for double-digit dollars direct to the customer, and many of them were very well known.

The App Store model provoked a race to the bottom because everything was centralized, there were rules about how your app could be purchased, and pricing went all the way down to a dollar. The old model of try-before-you-buy didn't work. People wouldn't spend $20 sight-unseen, especially when surrounded by apps with a 99 cent price tag. It's not so much that people don't care about indie developers as that indie developers had a very hard time making it in a space that didn't allow indie-friendly approaches to selling software.

No surprise that such a thing ended up in a situation where high-quality software doesn't sell, and most of the revenue comes from effectively gambling.


If every single indie developer disappeared and didn’t make software for computers - to a first approximation, no one would notice a difference.


This did happen and you're right, no one noticed a difference.


We say all of this on top of a mountain of open source software. This isn't about market love of "indie developers." It's the basic software economy we've known and understood for decades now.

It was a 30% commission for the time frame we are discussing, plus an investment in hardware tools and desktop software on top of all that. It used its own proprietary system, which required additional effort to adapt to and increased your workload if you wanted to release on multiple platforms.

So users don't get to use their own device unless a corporation can smell money in creating that software for them? What a valueless proposition given everything we know about the realities of open source.

You've fallen into the same trap. This is a computer. There's nothing magic about it. The lens you view this through is artificially constrained and bizarrely removed from common experience.


Yes, the mountain of open source software is on the server and for developers. Regular users have never cared about open source or being in control of their computers.


That open ecosystem needs years of investment to develop. A few people will take the risk and make a first app, but a lot will wait longer.


I think you're genuinely forgetting how starved people were for phone applications when these devices first came on the market.

Developers were absolutely willing to make the investment. Billions of devices were about to come online.


Most of those developers were looking for revenue, though, and there’s a really wicked network effect rewarding the popular platforms. By the time the first WebOS device launched in 2009 Apple had already shipped tens of millions of iPhones and Android was growing, too. By the time decent WebOS hardware was available, there just weren’t many developers looking to target a user base at least an order of magnitude smaller – even Android struggled because not as many users were willing to actually buy software.


Makes me think about the VR market. Tons of hardware, very few apps. It's interesting.


I think Microsoft made a valiant effort with Windows Phone. They kept it in the market for years and iterated, they threw big budgets after it, they made deals with app developers to bring over their apps.

You can point to missteps like resetting the hardware and app ecosystem with the wp 7 to 8 transition and again with 8 to 10, or that wp 10 was rushed and had major quality problems, but ultimately none of that mattered.

What killed windows phone was the iron law that app developers just weren’t willing to invest the effort to support a third mobile platform and iOS and Android had already taken the lead. They could have added android app support and almost did, but then what was the point of windows phone? It was in its time the superior mobile OS, but without the apps that just didn’t matter.

This is what makes Apple’s current disdain for app developers so insulting. They owe their platform success to developers who chose and continue to choose to build for their platform, and they reward that choice with disrespect.


Windows phones were out for years, no?



