Hacker News | gxonatano's comments

I know there's a place for distros like these, designed to be familiar to users of Windows or MacOS, but to me it shows Linux at its laziest: where exciting new ideas in system and UI design are skipped over, in favor of bad design ideas from 1995 (looking at you, Start menu). On MacOS and Windows you're stuck with whatever OS UI those respective corporations decide you get—the Apple menu, the Start menu, floating window management, and so on—and there's nothing inherently good about those paradigms; they mostly just exist for legacy reasons. On Linux, you have the freedom to customize everything, and so it just seems sad that so much good development effort is going into building systems that value familiarity over innovation.

Put differently, I find it sad when user-friendliness is valued over user-centrism. Linux is full of software that is user-centric more than user-friendly: look at Vim, for instance, which is famously difficult to quit, yet is designed to be ergonomic and efficient in a way which puts the user first. The Vim philosophy (modal editing, ergonomic arrow keys, etc.) has even been extended to web browsers (Qutebrowser, for instance), and to window managers (i3, sway, etc.). These types of programs, in my opinion, are where Linux really shines.

Most people commenting here, however, describe this familiar/innovative or friendly/centric dichotomy in terms of user archetypes: "techie" and "normal" people. That feels unnecessarily essentialist, implying that "normal" people aren't curious enough to learn something unfamiliar, like a new style of user interface. But if we always assumed that, we'd never have had any innovative interfaces at all: mouse-driven desktop interfaces, smartphone touch screens, or any of it.

Of course, Linux distros are diverse enough to have something for everyone. I just think that conventional, familiar ones like this represent a missed opportunity.


> in favor of bad design ideas from 1995 (looking at you, Start menu)

For what it's worth, that's the point when your comment jumped the shark. I knew then that this was just a rant.

The Start menu was a _superb_ piece of design, as was Win95 in general. If nothing else, the existence proof of this is the sheer number of other desktops that imitate the design:

KDE; GNOME 1/2; MATE; Xfce; QNX Neutrino Photon; Inferno; OS/2 Warp 4; BeOS Tracker; Enlightenment; Moksha; XPde; Fvwm95; IceWM; JWM; Lumina; LXDE; LXQt; Cinnamon; GNOME Flashback; EDE; Budgie; UKUI; Deepin; Aura; FyneDesk.

I could probably find more, but 24 should do for now. Even combining forks, there are over 20.

You may not like it, and that's a legitimate view I am not arguing with, but billions of people use desktop interfaces modelled upon it, representing the combined work of thousands of developers, reimplementing it in dozens of languages.


> the existence proof of this is the sheer number of other desktops that imitate the design

That's where you're wrong. The desktop environments that imitate Win95 elements do it to provide something familiar for their users. The KDE team is not sitting around going, "you know what was designed really well? The Start Menu!" In fact, many of the desktop environments you mention (GNOME Flashback, Cinnamon) were a conservative reaction to the new GNOME 3 design which broke from the Windows aesthetic. The Wikipedia page for Cinnamon, for instance, says it aims to "follow traditional desktop metaphor conventions" and aims for a "gentle learning curve." They're explicitly choosing familiarity over innovation.

> The Start menu was a _superb_ piece of design

Not really. It achieves a reasonably clean look, but at the expense of burying programs and documents in deep hierarchies. GNOME's Activities panel lets you click "Activities" and then click the program you want to run. Even better, you can just tap the Super key, type a letter or two of the program's name, and press enter. On Windows 95, I remember launching a calculator by clicking Start, then Programs, then Utilities, then Calculator. In 1995, plenty of people were complaining about the Start Menu, how clunky it was and how it slowed down common tasks. GNOME 3's approach is better, as is MacOS's Launchpad, as are lots of other desktop launchers.

> billions of people use desktop interfaces modelled upon it, representing the combined work of thousands of developers, reimplementing it in dozens of languages.

The idea that pervasive ideas are somehow good, just because they're popular, is a well-known logical fallacy called Argumentum ad Populum. The Start Menu was never good. It was just popular. One does not follow from the other.


> Even better, you can just tap the Super key, type a letter or two of the program, and press enter.

Most DEs do that today, including KDE, Cinnamon, Xfce (at least in some configurations), and Windows itself (although that last one does too much, including web searches, when you do this).

The categories in the Start menu have their advantages: that's how, as a child on a KDE-based distro, I discovered which programs existed and what they did. I fail to see what's wrong with a Start menu plus the feature you're citing; it does discovery and efficiency very well. GNOME 3's Android-like icon grid without categories (from what I recall and from the screenshots I see) seems awful for discoverability. If you don't know the icon or the name of the program you're looking for, the icon grid doesn't help (although I recognize keyword search should get you there, keyword search plus the categories that other DEs provide seems more useful; and today, it's rare to have hierarchies as deep as Win95's).


I disagree on all points. I suspect that you lack the historical context for this design, and may be too young to thoroughly grasp it. I have read that younger people (millennials and younger) tend not to think in hierarchies and find them complex and difficult.

Tell me, what pre-Windows 95 GUI designs are you familiar with? I don't mean know slightly, I mean know well.


I agree that it's important for user-centrism to be a focal point of Linux, but I'm also happy that distributions like this exist. Yes, you do lose some of the _magic_ of Linux by replicating a workflow a user is already used to, but that's perfect for non-technical people, who simply don't care about that and just want to check their email and social media, do some shopping, research things, and watch some content online. They don't need to learn user-centric workflows like Vim to do such things.


As a longtime Mac user, I kinda wish it had a start menu. So much useful stuff easily accessible. But also, none of those choices affect my productivity all that much.


As a longtime Mac user, just use Command-Space.


I'm both. (Mac and Windows since 1988, before Linux existed.) The point of the start menu is that you can search; the point of Spotlight is that the computer searches.

With Spotlight, you're telling the computer to run something you know is there, without bothering to look for it. You need to know it's there.

With a dedicated app launcher, such as, say, the macOS Launchpad, you can explore what apps are available to you. Once you know, you can quickly open it with cmd+space and 2-3 letters.

You can't open things that aren't there. You need to find what's available.

They are different tools for different purposes, which is why Launchpad is also there.


Exactly, except the problem with Launchpad is that it's cumbersome and takes up the entire screen. I never use it; I just have to know exactly what I'm looking for and use cmd-space.


> just have to know exactly what I'm looking for

And how do you acquire that knowledge? Browsing. Looking and reading, and remembering.

Mac OS X had no mechanism for this, but iOS does. iOS's Springboard launcher is lifted directly from the Dashboard in OS X "Tiger". Apple simplified it for the phone to only show apps. Then later they grafted it back in its simplified phone form -- Dashboard having been removed in the meantime.

Before that you had to browse the filesystem. To do that you need to know where to look.

That's how it worked on classic MacOS, and Windows 1 and 2, and DR-GEM, and AmigaOS, and RISC OS, and basically all other 1980s GUIs.

(Proprietary Unix left you with a terminal. Job done.)

The innovation in Windows 3 was having an app launcher program with groups. It was called Program Manager. It had groups, because it's quicker to look in the group related to what you want than in all apps. ProgMan was stolen from OS/2 1.1 by the way.

Win 95 had a further innovation that built on that. It shrank Program Manager down from a full-screen app to a single button that opened a hierarchical list, and that list had icons in it because some people are more visual and recognise icons better than names.

Me, I'm a reader, I want words not pictures. Pictures waste my time and my screen space. That's why it's important to offer a choice. GNOME takes away choice. The GNOME devs have a Vision and you must use it. The KDE devs don't have a vision. They have nearly as many visions as developers, and they try to accommodate all of them.

Not everyone: just the devs. Examples:

* I use widescreens. We all use widescreens now. I want the title bars on the side, like in wm2, not on top. That's not an option.

* I liked BeOS. I think title bars should be tabbed, like in web browsers. That's not an option.

* I hate hamburger menus. I want menu bars. There is no global option for that. You can't have it.

* I hate CSD. I want a title bar I can middle-click to send behind all other windows, like KDE 1, 2, 3 and 4 did, as well as every other non-GNOME desktop. I also liked WindowShade on classic MacOS: the ability to roll up windows into the title bar. Again, like in some older KDE versions. There's no option for that any more.

There is important choice, accommodating different needs and usage patterns, and there is cosmetic choice, merely affecting how things look but not the underlying mechanisms of how they work.

Supporting diversity of usage is more important than diversity of appearance.

Both the full desktops that natively support Wayland fail to do this.


Realistically you shouldn't need thousands of applications, so remembering the names of the 20 or so apps I use isn't a big deal.


Good for you.

I've been doing tech support since 1988. I've seen many many PCs with hundreds of apps, some maybe heading into thousands, sometimes with custom hackery to get different versions running in parallel and stuff.

Yes this is a thing. It is common.

And the typical user does not know what an "app" is, or what OS they are using. I've lost count of the number of people that told me their computer was running Word or Office (not Windows), or who think they access the Web via Google because they don't know what a web browser is.

Here is some proof, in case you don't believe me.

https://www.youtube.com/watch?v=o4MwTvtyrUQ

If you can name the 20 apps you use the most often, you are the elite of the elite, the top 1% of 1% of computer users.

Normal people are not like that. They don't know what they use or what it's called or what OS they run or what an "OS" is, and they outnumber us by approximately a million to one.


I used to work tech support as well... In that case, if I was on a user's machine and needed to browse for apps, I would open Finder.app and navigate to /Applications. I did this even with Launchpad existing, because a standard window is more convenient to navigate and search than some full-screen thing. It also meant I could keep reading documentation alongside the Finder window.

> sometimes with custom hackery to get different versions running in parallel and stuff.

Most users just aren't doing this: they open up their Mac, install a few applications, and then just use it as intended; they don't need to customise things and wouldn't care to spend the time even if it benefited them. If they are capable of implementing these custom hacks, they are likely intelligent enough to navigate via Finder and Spotlight for almost all use cases.

> If you can name the 20 apps you use the most often, you are the elite of the elite, the top 1% of 1% of computer users.

I don't think so. Most people I know, whether friends or colleagues (some technical, some not), use maybe 3–5 apps:

* Browser
* Word processor
* Spreadsheets
* Notes
* Task management (Reminders, Asana, Jira, or something)
* Music player (some just use the browser)

people don't have 20+ applications to remember day in day out.

Hell even as a power user I only really use:

* Terminal
* Browser
* IDE
* Creative tools (Logic, Final Cut, Compressor, etc.)
* Notes
* Reminders
* Music player


It seems to me that your answer is really spelling out my point, illustrated with examples.

It says:

* "I'm a terminal user"

* "I am a programmer"

(Which means, "I am therefore more technical than 99.9% of people.")

* "I use an OS carefully handcrafted to make the file manager an acceptable app browser"

(Because what you describe is impossible on any form of Windows or non-Apple Unix)

* "I use a sophisticated tool which can find things both by name and by description"

You are not arguing my point; you are in fact reinforcing it.


The only UNIXes that I consider ever caring for the whole experience as a full stack, for users and application developers alike, were Irix, Sun NeWS, Solaris, NeXTSTEP and its evolution as OS X.

Sure, you can argue that Linux distributions can also offer something similar; the problem is which flavours and for how long, which brings us to shipping the Linux kernel underneath Java and Web frameworks as the most successful approach thus far.


> the problem is which flavours

Whatever distro comes pre-installed on the computer you buy, or whatever your geek acquaintance picks, or whatever you pick as a geek yourself (any mainstream distro will do).

> for how long

The lifetime of the computer in all of the cases, or, in the last case, until you want to try out something else.

Seems decent in any case.


Worked very well for netbooks, with OEM specific distros.

It was a vision of what something like Android would be.


Not sure what your point is; netbooks like the Eee PC are becoming ancient history anyway.

When it comes to our current era, Dell, Lenovo and HP computers sold with Linux are fine, and there's KDE-related hardware that seems nice too [1] (KDE understood that it's important to be the default OS, so they are pushing towards this). System76 too, I've heard. Obviously the choice is more limited than for Windows (although macOS is doing well with limited choice too), and more biased towards pro machines, but there are decent options. The installed distros are quite standard too; there's some customization, but no more than what we see on Windows computers.

[1] https://kde.org/fr/hardware/


> walk up to any computer

Windows users seem to think their OS is ubiquitous. But in fact for most hackers reading this site, using Windows is a huge step backwards in productivity and capability.


However, the facts speak otherwise: Windows at 70%+ versus 4.1% for Linux globally. https://gs.statcounter.com/os-market-share/desktop/worldwide


> But in fact for most hackers reading this site

https://survey.stackoverflow.co/2024/technology#1-operating-...


idk... I gave up after years of trying to switch to Linux as my main OS, given the obvious difference in stability, support, ecosystem, and... yes, even responsiveness in many apps.


Surely you're hinting at Linux, in which case this runs fine with WINE


After getting frustrated with how unusual and convoluted BASH syntax tends to be, especially in involved scripts, I've tried almost all of the alternative shells: Elvish, Fish, Oil, Xonsh (Python!), Emacs's eshell, and even the Haskell-based shell REPLs like Turtle and Shelly. The only one I really stuck with was Nushell. It's friendly, pretty, intuitive, easy to read, heavy on pipes, and super powerful for data analysis. Plus it's a modern replacement for a lot of tools, like `ls`, `jq`, `curl`, and so on. Writing a little command-line program is a joy in Nushell.


Almost every time I see claims to the effect of "learn any language" or "learn all languages," on HN or otherwise, it's about a smallish subset of languages. No matter how many languages you think you support, you don't support them all, so why make such a wild claim?


There are hundreds of these note taking programs now, and they all suck. Maybe they're fine if you don't really know how to use a computer, but if you're a developer or some other kind of hacker, and you use a text editor, like Vim/Neovim, Emacs, VS Code, or something else, your notetaking system should be in your text editor. These flashy new note taking programs basically just stick a crappy WYSIWYG text editor inside a web browser (Electron, yuck) and add a few opinionated components like tags or encryption, then try to charge you $9/mo for that, in perpetuity. But you've had the power within you this whole time: encryption with gpg, sync with git, search with grep, editing with your $EDITOR. If you need more, there are some editor plugins that supercharge note taking: vimwiki and vim-notes for vim; org-mode and org-roam for Emacs; Foam for VS Code, and so on. No need to reinvent the wheel.


America does not underestimate that difficulty. Donald Trump does.


If you read indiscriminately from lots of different news sources, bad ones and good ones alike, and you have an iPhone and do most of your reading on it, this might be useful. But I rarely read news stories that appear in 100 different outlets to begin with—they're usually articles that appear only in the New York Times: book reviews, commentaries, original data analysis, investigative reporting. The stories that do appear in a few different places are usually syndicated anyway, from AP or Reuters, so they're virtually identical. Particle appears to sort groups of articles by how many different outlets are reporting on an event, which to me is the opposite of how I want my news sorted. Besides that, I'm fairly picky about where I get my news: I subscribe to certain papers because they've developed good reputations for journalistic integrity, they've won Pulitzer Prizes, and they employ fact checkers. I avoid news sources that don't meet these criteria. Particle seems to lump them all together. Rather than de-noising news, "by using AI to help people understand more, faster," it appears to be introducing noise where there need not be any.


As an atheist, I couldn't agree more. The fastest way to become an atheist, as it's often pointed out, is to read the Bible, which explains why about 95% of the Christians I've known have never read it. When asked, they say things like "the _whole_ thing?" or "cover to cover?" as if it were the unabridged Oxford English Dictionary, instead of just a single volume.

As a work of literature, with an unparalleled influence on Western culture, it's required reading for anyone interested in literature, philosophy, history, or related disciplines. As a blueprint for how to live your life, however, any reasonably ethical person will find its teachings to be at odds with modern ethics. More often than not, it's morally repugnant, and unapologetic about it, too. You won't hear about how backwards its morality is by going to church, though, and so you need to read it for yourself. Even Jesus, one of the most upright citizens in the book, is revealed to be as full of hate and vengeance as he is a "prince of peace."

Of course, as with everything, it really matters what edition you get. The New Oxford Annotated Bible is a good starting point. The Norton Critical editions (in two volumes, the old and the new testaments) are great, too, and include some of the source materials. (Bet you didn't know that a lot of the Bible was plagiar...ahem... adapted from much earlier texts, and from other religions.) The Skeptic's Annotated Bible is a great edition, too, and annotates everything from a secular perspective.


That is a pretty shallow perspective. With all due respect.


In what way is it shallow? I'm advocating for reading the whole Bible, and reading it deeply and critically: by reading the best available scholarly editions, by reading it alongside critical commentary, and by reading it alongside other contemporaneous works of fiction. It seems to me that's a reasonably deep perspective. The shallow perspective is the one usually practiced by Christians: reading the bible piecemeal, cherry-picking passages, reading it in isolation from other fiction of the time, and reading it uncritically as an ethical manual.


Next we need JS and CSS keywords in Esperanto, so that we're not privileging English-speaking regions over others.


My name, Ĝonatano, contains a ĝ, which is an uncommon letter outside of my language, Esperanto. But when I go to set my username to "ĝonatano," I'm often told that usernames "may only contain letters or underscores," as if ĝ weren't a letter. (You can see that I've approximated it in my HN username, but I don't need to do that on web services that correctly understand that letters exist outside of ASCII and Latin-1.)


To be fair, Esperanto is, as far as I can tell, not very widely used. The letter ĝ mostly returns Esperanto results. Using that letter in a place where others may need to communicate or type the letter would be a severe burden on almost anyone else you interact with, outside of Esperanto communities.

I'm sure there are plenty of people who share your frustration with accented letters, ñ, umlauts, etc, though. I'd hope that most systems can handle those letters, although I wouldn't hold out hope that Ĝ/ĝ would be high on the priority list.


> as far as I can tell, not very widely used

Well, it's the most widely spoken international language, spoken in over a hundred countries by an estimated 2-5M people. There's a rich literature (probably 30-50K books) and a vibrant music scene, and support in open source software (Linux, Firefox, Google products) is usually pretty good.

But the issue is not how widely Esperanto, or any other language, is spoken. If you assume that languages should only be supported according to their number of speakers, you leave no room for useful languages, bridge languages, auxiliary languages, or growing languages. Even if Esperanto had only 100 speakers, it'd be worthwhile to support, if it's easy to learn, and easy for non-speakers to understand.

It's not a "severe burden" to consider non-ASCII letters as letters. Unicode is pretty straightforward to work with, and if you want to support more than just English, it's a necessity. There's no need to have a "priority list" of letters you consider more or less important than others. That attitude comes across as very Anglocentric.


What is the definition of an "international language" that makes Esperanto the most widely spoken one? Isn't Arabic an international language, for instance?


Esperanto is probably the most widely spoken international auxiliary language[0].

[0] https://en.wikipedia.org/wiki/International_auxiliary_langua...


For that matter, isn't English?

It's an official language in many countries, such as South Africa, India, and Ireland, each of which has other official languages.


Especially in Asia, where it's not exactly a primary language for anyone. It's funny that you'll be in a room with Chinese, Egyptians, Pakistanis, Indonesians, and Ukrainians, and everyone will speak English. I once saw a documentary which went deep into Borneo, via rivers, to a tribe that didn't wear shirts, and they spoke fluent English too.


I think the point is that of intentional international languages (i.e., international in the sense of not being tied to any particular country), Esperanto is the most widely used one.

That still is going to be a quite small number against English, Spanish, French &c. in terms of being a lingua franca.


> It's not a "severe burden" to consider non-ASCII letters as letters. [...] That attitude comes across as very Anglocentric.

Maybe I didn't communicate my thoughts clearly - the reason I call it a "severe burden" is because people won't know how to type it or how to pronounce it. I doubt many people have the ability to type the letter, and would have to copy-paste it. Even on Mac, where most diacritical characters are an opt+key away, the "ˆ" does not apply to the letter "g", resulting in "ˆg". "ĝ" would need to be treated the same as, for example, "¯\_(ツ)_/¯" - where users generally google it and then copy-paste it. Sure, there are ways to allow for easier retrieval (ex. I have "@shrug" set up to make the shrug), but most people will very rarely encounter "ĝ" or similar, and won't have a shortcut set up.


You also can't put in Cyrillic or CJK characters. It's a user name, not a human name; you should be fine just using the 26 ASCII letters for it. Basically anything that is a computer-centric string should be ASCII only, because supporting all of human writing is a never-ending task.


It's also a dangerous one. For example, there are a number of variants of "a" that are different characters in Unicode but are often indistinguishable in most fonts and/or at small font sizes: https://util.unicode.org/UnicodeJsps/confusables.jsp?a=abcde....
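To make the confusables point concrete, here's a quick Python illustration (nothing library-specific, just the standard unicodedata module):

    import unicodedata

    latin_a = "a"          # U+0061 LATIN SMALL LETTER A
    cyrillic_a = "\u0430"  # U+0430 CYRILLIC SMALL LETTER A

    print(latin_a == cyrillic_a)                    # False: different code points
    print(unicodedata.name(cyrillic_a))             # CYRILLIC SMALL LETTER A
    # Even compatibility normalization doesn't unify them:
    print(unicodedata.normalize("NFKC", cyrillic_a) == latin_a)  # False

Two usernames can render identically while being entirely different strings, which is exactly what homograph attacks exploit.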


Are you a native speaker of Esperanto?


that would be so nice


When I do a Ctrl+F search for “Gonatano” one of the search results is the actual name as typed with the circumflex. I think that is kind of a handy feature of the browser I’m using but at the same time it is sort of weird since it technically is not the same name without the circumflex, right?

Also not all database systems would think the non-circumflex version is equivalent to the circumflex version. Does anyone have thoughts or ideas about how or why they should be treated equivalently?

I also recognize this can get kind of political. There was a push in California recently to let people have accented letters in their name. Apparently it is legally not allowed. And yet some people claim their California birth certificate does contain accented letters.


Postgres has a module called unaccent[0] that removes diacritics for filtering. I expect your browser is doing something similar. While not appropriate when looking for exact matches, when doing user-input based searches, this should probably be the norm, as the user may be unaware of the accents or how to input them correctly on their keyboards.

Dove deep on this years ago when implementing a filter for wines and wine regions.

[0] https://www.postgresql.org/docs/current/unaccent.html
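For anyone curious, the same idea can be approximated outside the database with a few lines of Python; this is just a sketch of the general technique, not what Postgres's unaccent actually does internally:

    import unicodedata

    def unaccent(text: str) -> str:
        # Decompose characters (NFD), then drop the combining marks (category Mn),
        # so "Ĝonatano" and "Gonatano" compare equal for search purposes.
        decomposed = unicodedata.normalize("NFD", text)
        return "".join(ch for ch in decomposed if unicodedata.category(ch) != "Mn")

    print(unaccent("Ĝonatano"))                            # Gonatano
    print(unaccent("Ĝonatano").casefold() == "gonatano")   # True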


> but at the same time it is sort of weird since it technically is not the same name without the circumflex, right?

Assuming you have a "standard" keyboard, it's not weird at all for your browser to match the diacritic when you type the non-diacritic character since presumably the diacritic would be difficult to type. Firefox's search feature even has a [_] Match Diacritics checkbox which you can enable or disable.


This is absolutely the desired default behaviour for ctrl+F in a browser. e.g. I frequently read French, and don't normally want to have to put in accents in my search term when I'm searching text for a word containing an accent.

Firefox has a "Match Diacritics" checkbox right next to the "Match Case" box when you ctrl+F so you can configure as desired.

