
For a long time, I had an MBP (this was in the Intel days) with a Linux VM. It was like a reverse mullet: party in front (multimedia), work in back (dev).

And then:

    - Butterfly keyboard
    - Touchbar
    - M-series CPUs, which, while technically awesome, did not allow for Linux VMs.
So I switched to System76/Linux (Pop OS), and that has been wonderful, not to mention much cheaper.

- No esc

See, I'm an ends-justify-the-means guy:

The more people forced into the beautiful world of capslock-as-escape, the better!


Your website has stained my screen. lol

    background-image: radial-gradient(circle at 12% 24%, var(--text-primary) 1.5px, transparent 1.5px),
                      radial-gradient(circle at 73% 67%, var(--text-primary) 1px, transparent 1px),
                      radial-gradient(circle at 41% 92%, var(--text-primary) 1.2px, transparent 1.2px),
                      radial-gradient(circle at 89% 15%, var(--text-primary) 1px, transparent 1px);


FWIW, on Reddit I am seeing more and more discussions on the Linux subreddits of people getting fed up with Windows and switching to Linux. Usually it's the Windows 11 upgrade that finally did it.

There is a good parallel here with Myspace and Facebook. Myspace added an ad network & was hammered by spammers around the same time Facebook was opening up user registration to everyone. Facebook had no ads. Myspace was dead.

This time Linux has very good game support to the point where some games have a higher FPS on Linux. It will be so expensive for Microsoft to attempt to turn this ship around, and it will likely still fail.

This is happening at the same time AI agents have gotten really good, so users will just use local AI agents to configure and troubleshoot the rough stuff about Linux. And then they will customize it so much they will never be able to go back to Windows.

Ubuntu is just fine for 99% of non-tech users. Windows has so many anti-patterns, tricks, and OneDrive rugpulls now that Ubuntu is actually much safer and simpler for non-techies to use. (I could also make the case that it beats iOS in that department.)


This seems like a good time to remind everyone of a letter by David Packard, to his employees. There is more morality, common sense and insightful business advice here than in any 1000 business titles you would care to name.

https://aletteraday.substack.com/p/letter-107-david-packard-...

I think that OP's essay identifies that something bad happened at HP but completely misses what it was. Look at this quote:

    Around 1997, when I was working for the General Counsel, HP engaged
    a major global consulting firm in a multi-year project to help 
    them think about the question: “What happens to very large companies that
    have experienced significant growth for multiple successive years?”
OP says that the findings and recommendations included: "the decade long trend of double-digit growth was unlikely to continue", and "the company [should] begin to plan for much slower growth in the future."

OP then goes on to talk about fighting for resources for investments, a "healthy back and forth" on these tradeoffs, and then losing the "will to fight" following this report. "The focus became how not to lose".

Unlike OP, I did not work at HP. But I have seen, up close, startups, mid-sized companies, and huge companies, and the transitions among those states. So I feel justified in saying: OP has missed the point. And in particular, he makes no reference to that letter from David Packard.

Look at this quote from the letter:

    I want to discuss why a company exists in the first place. ...  why 
    are we here? I think many people assume, wrongly, that a company 
    exists simply to make money. While this is an important result of 
    a company's existence, we have to go deeper and find the real 
    reasons for our being. ... a group of people get together and exist
    as an institution that we call a company so they are able to accomplish 
    something collectively which they could not accomplish separately. 
    They are able to do something worthwhile—they make a contribution 
    to society .... You can look around and still see people who are 
    interested in money and nothing else, but the underlying drives 
    come largely from a desire to do something else—to make a product—to 
    give a service—generally to do something which is of value.
I think this is the essence of what it means to do useful and interesting work in any technical field. Unfortunately, there are many, many examples of companies that have lost their way, forgetting this key insight. HP was certainly one of them. I would argue that Google and Microsoft are examples too. Boeing, for sure.

And sadly, there are very, very few companies that actually embody Packard's ideas. I think that JetBrains is such a company, familiar to many HN readers. Another one that comes to mind, from a very different field, is Talking Points Memo -- an excellent website that does news reporting and analysis, mostly on US politics. It started as a "blogger in a bathrobe", and 25 years later, it is a small, independent news organization, supporting itself mostly through paid subscriptions by a very loyal readership.

To me, the saddest part of the essay is this:

    In the last few years more and more business people have begun to
    recognize this, have stated it and finally realized this is their
    true objective.
(This is right before the "You can look around ..." section quoted earlier.) It seems to me that, even now, very few business people recognize the way to run a business that Packard outlined.


No, "we" are not replacing OOP with something worse. "We" are replacing layers of stupid shit that got layered on top of, and associated with OOP, with different renderings of the same stupid shit.

I have been programming since 1967. Early in my college days, when I was programming in FORTRAN and ALGOL-W, I came across structured programming. The core idea was that a language should provide direct support for frequently used patterns. Implementing what we now call while loops using IFs and GOTOs? How about adding a while loop to the language itself? And while we're at it, GOTO is never a good idea, don't use it even if your language provides it.
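
To make that concrete, here is a sketch of the two styles side by side, in C (my choice of language, purely for illustration):

    #include <stdio.h>

    int main(void) {
        int i;

        /* Pre-structured style: a "while loop" hand-built from if and goto. */
        i = 0;
    loop:
        if (i >= 3) goto done;
        printf("%d\n", i);
        i++;
        goto loop;
    done:
        /* Structured style: the same pattern, supported by the language itself. */
        i = 0;
        while (i < 3) {
            printf("%d\n", i);
            i++;
        }
        return 0;
    }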

Then there were Abstract Datatypes, which provided my first encounter with the idea that the interface to an ADT was what you should program with, and that the implementation behind that interface was a separate (and maybe even inaccessible) thing. The canonical example of the day was a stack. You have PUSH and POP at the interface, and the implementation could be a linked list, or an array, or a circular array, or something else.
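
A minimal sketch of that separation in C (all names invented for illustration): callers program against push and pop, while the representation stays hidden and can change freely:

    /* stack.h -- the interface: all a client ever sees. */
    typedef struct Stack Stack;             /* opaque: representation hidden */
    Stack *stack_new(void);
    void   stack_push(Stack *s, int value);
    int    stack_pop(Stack *s);

    /* stack.c -- one possible implementation (a plain array); a linked list
       or circular array would do just as well, with no change to callers.
       Error handling elided for brevity. */
    #include <stdlib.h>
    struct Stack { int items[100]; int top; };
    Stack *stack_new(void)           { return calloc(1, sizeof(Stack)); }
    void stack_push(Stack *s, int v) { s->items[s->top++] = v; }
    int  stack_pop(Stack *s)         { return s->items[--s->top]; }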

And then the next step in that evolution, a few years later, was OOP. The idea was not that big a step from ADTs and structured programming. Here are some common patterns (modularization, encapsulation, inheritance), and some programming language ideas to provide them directly. (As originally conceived, OOP also had a way of objects interacting, through messages. That is certainly not present in all OO languages.)

And that's all folks.

All the glop that was added later -- Factories, FactoryFactories, GoF patterns, services, microservices -- that's not OOP as originally proposed. A bunch of often questionable ideas were expressed using OO, but they were not part of OO.

The OOP hatred has always been bizarre to me, and I think mostly motivated by these false associations. The essential OOP ideas are uncontroversial. They are just programming language constructs designed to support programming practices that are pretty widely recognized as good ones, regardless of your language choices. Pick your language, use the OO parts or not, it isn't that big a deal. And if your language doesn't have OO bits, then good programming often involves reimplementing them in a systematic way.
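
To make that last point concrete, here is a rough sketch of what "reimplementing the OO bits in a systematic way" can look like in a non-OO language like C: hand-rolled encapsulation and dispatch, with all names invented:

    #include <stdio.h>

    /* An "object": data plus a slot holding its operation (hand-rolled dispatch). */
    typedef struct Shape Shape;
    struct Shape {
        double (*area)(const Shape *self);  /* the "method" */
        double w, h;
    };

    static double rect_area(const Shape *s)     { return s->w * s->h; }
    static double triangle_area(const Shape *s) { return s->w * s->h / 2; }

    int main(void) {
        Shape shapes[] = { { rect_area, 3, 4 }, { triangle_area, 3, 4 } };
        /* The caller "sends the message" without knowing the concrete kind. */
        for (int i = 0; i < 2; i++)
            printf("%f\n", shapes[i].area(&shapes[i]));
        return 0;
    }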

These pro- and anti-OOP discussions, which can get pretty voluminous and heated, seem a lot like religious wars. Look, we can all agree that the Golden Rule is a pretty good idea, regardless of the layers of terrible ideas that get piled onto different religions incorporating that rule.


I'm the kind of person that sees a bowl as a large cup without a handle.

Likewise, I see these patterns as equivalent style choices: the problem fundamentally dictates the required organization and data flow, the same optimal solution will be visible to any skilled developer, and these weak style choices of implementation are the only freedom developers actually have.

For example, these three are exactly the same:

    state = concept_operation(state, *args)
and

    class Concept:
        def operation(self, *args):
            self.state = <whatever with self.state>
and an API call to https://url/concept/operation with a session ID where the state is held.

I suspect that people who get emotional about these things haven't spent much time with the other approaches, or they would understand why each exists in such widespread use.

It's like food. If you go anywhere and see the common man eating something, there's a reason they're eating it: it's probably pretty OK, if you just try it. It's not that they're idiots.


A voice of sanity. The submitted article has zero meaningful content.

This comment I posted in an earlier thread, rehashing Inheritance vs. Composition for the gazillionth time, is highly relevant here: https://news.ycombinator.com/item?id=45943135

OOD/OOP has not gone away or shifted; it is alive and well, just packaged under different-looking gloss. The meta-principles behind it are fundamental to large-scale systems development, namely: Separation of Concerns, Modularization, Reuse, and Information Hiding.


I thought structured programming was about language support for control flow (simplification and formalization of control flow, and single return), not about ADTs.


I thought I expressed that: "I came across structured programming. The core idea was ... Then there were Abstract Datatypes, ..."


I can see what you mean. I interpreted everything as elaboration of your introductory remark:

> "We" are replacing layers of stupid shit that got layered on top of, and associated with OOP, with different renderings of the same stupid shit.

Meaning that both structured programming and ADTs are different names for the same "stupid shit", i.e., the same ideas as OOP. I agree with this for ADTs, which really are just the same thing under another name, but I failed to see what structured programming has to do with OOP.

I now see that the paragraph wasn't to be read like that.

---

This:

> All the glop that was added later -- Factories, FactoryFactories, GoF patterns, services, microservices -- that's not OOP as originally proposed. A bunch of often questionable ideas were expressed using OO, but they were not part of OO.

> The OOP hatred has always been bizarre to me, and I think mostly motivated by these false associations. The essential OOP ideas are uncontroversial. They are just programming language constructs designed to support programming practices that are pretty widely recognized as good ones, regardless of your language choices. Pick your language, use the OO parts or not, it isn't that big a deal. And if your language doesn't have OO bits, then good programming often involves reimplementing them in a systematic way.

is really the summary under every explanation or criticism of OOP. It is more deserving of being a blog post than most blog posts, but it is so concise that it hardly needs to be one.


> And while we're at it, GOTO is never a good idea, don't use it even if your language provides it.

Good luck with that if you're a C programmer.


These are not the same thing. The GOTO that people complained about, and that the famous article "Go To Statement Considered Harmful" is about, is called longjmp in C. Nearly all C programmers will agree with you that you shouldn't use longjmp. C's goto allows less freedom of control flow than the try-catch constructs of other languages.
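
For what it's worth, the one use of C's goto that is widely considered good style is exactly this restricted kind: forward jumps to centralized cleanup. A minimal sketch (names mine):

    #include <stdio.h>
    #include <stdlib.h>

    int process(const char *path) {
        int err = -1;
        char *buf = NULL;

        FILE *f = fopen(path, "r");
        if (f == NULL) goto out;

        buf = malloc(4096);
        if (buf == NULL) goto close_file;

        /* ... the actual work would go here ... */
        err = 0;

        free(buf);
    close_file:
        fclose(f);
    out:
        return err;
    }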


Well sure, but don't use it to implement if/while/for.


This is a very minor but pleasant surprise. An action like this is beyond what I thought the US government (my government, sadly) was capable of. It is kind of puzzling to me that this issue didn't get politicized like seemingly every other one, with right-wing talking heads bemoaning progress of any sort and appealing to the good old days, when America was great, the days that MAGAs want to return to.

It's a good start. Now let's do metric.


I came to source code control reluctantly. CVS, SourceSafe, others I’ve forgotten. One of them was very expensive, very complex, took months of customization, and then it scrambled our bits following a disk crash and the vendor had to piece things back together. An expensive nightmare.

Then I started using Subversion, and it finally clicked. Easy to understand and use. It did everything I needed, and it was intuitive. But git was gaining popularity, and eventually it was the only "choice". And I don't get git at all. I can do the few things that I need. I often have to consult Google or experts for help. While I get the concepts, the commands are incomprehensible to me. I hate it.


Subversion & Mercurial were decent. SourceSafe is utter trash. I've learned to use Git, but I've always used an IDE; I hate the CLI commands.


I got tinnitus in my late 20s. Forty years later, it's still there. Research into the causes, and treatments, has been disappointingly slow.

I would really like to experience total silence at some point, but that seems very unlikely.


Turbo Pascal was completely amazing. I remember resisting it for a long time because, IIRC, it implemented non-standard Pascal. But the competing tools were less powerful and far more expensive (e.g., the Microsoft tools). And then I tried it and was completely blown away. I no longer cared about the non-standard stuff. I had a fast, intuitive IDE running on my original IBM PC.

As for modern IDEs, IntelliJ has been orders of magnitude better than any competition for more than 25 years (I think). I have stayed away from Microsoft products for a very long time, so I can't comment on VSCode and its predecessors. The main competition I remember was Eclipse, which I always found sluggish, unintuitive, and buggy. The fact that it isn't even mentioned in this article is telling.

JetBrains, the company that created IntelliJ (and then PyCharm, CLion, and many others), is one of those extremely rare companies that defined a mission, stuck to it, and excelled at it for many years, without straying from the path, compromising, or selling out. It is so impressive to me that they maintain this level of excellence while supporting a vast and ever-growing collection of languages, coding standards and styles, and tools.


> so I can't comment on VSCode and its predecessors

VSCode is a pale imitation of its predecessors.

Visual C++ was amazing and remains my favorite IDE ever.

It was also the spiritual successor of the Borland TUI IDEs because MS stole all of Borland’s top compiler engineers.


Except for resource usage.

I chose it because I don't have access to neovim on my cloud desktop, and IdeaVim is a superior solution to any vim-like plugin for VSCode. It struggles with 4 cores and 16GB of RAM with only a few projects open at a time. Some of that is due to Win11 and the amount of security malware installed by my company, but VSCode doesn't seem to make the machine suffer as much.


Visual Studio still supports WinForms, including the graphical form designer, which is very close to the OG Delphi experience of the late 90s (especially since WinForms is such a blatant rip-off of VCL).


You are missing a step there: before Windows Forms there was WFC, Windows Foundation Classes (not to be confused with the other WFC from .NET), used in J++, one of the reasons for Sun's lawsuit.

Alongside events, and J/Direct, the precursor to P/Invoke.

https://news.microsoft.com/source/1998/03/10/microsoft-visua...

It was WFC that was the rip-off; WinForms is basically WFC redone in C#.


Indeed, but very few people remember J++ or WFC, so I simplified that story.

Either way, someone coming from VCL will see a lot of familiar things in System.ComponentModel.


Have you tried NetBeans? It's more intuitive and snappier than e.g. Eclipse.


I did, very briefly. It did not compare to IntelliJ.


Try doing JNI development in IntelliJ, with debugging across languages like in NetBeans.


Disappointingly shallow article.

One point, though: it is not necessarily the case that visual imagery is the only alternative to an inner dialogue. In both cases, the dialogue, or the images, are things you are aware of; something is going on behind the scenes to generate those experiences. An alternative to dialogue/images is just nothing being generated. I.e., there is something going on subconsciously, but nothing related to that activity breaks through to awareness.

We all experience this: some "inspired" breakthrough just suddenly appears in your mind. That breakthrough must have come from somewhere in your subconscious mind.

Anecdotally, this is a conversation that my wife and I have occasionally, about our different mental landscapes. She is a very organized person, with lots of lists and internal (and sometimes external) dialog. I need to let problems just simmer in my mind, without paying attention to them, and I eventually get an answer. She believes that my mind is usually empty. My view is that both of us think subconsciously, but the difference is that she has some mental dialog/imagery accompanying her subconscious thinking.

More generally, maybe the real thinking is always subconscious, and what we call thinking (the awareness of reasoning) is just accompanying imagery.


Go away kitten, your comment appears to be low-quality.

"Generate in a parallelogram" is pretty obvious, unless you are truly clueless. Every post of johndcook that I find posted to HN is interesting, and definitely of high quality.

