Hacker News | new | past | comments | ask | show | jobs | submit | MomsAVoxell's comments | login

Kids not being informed about the war crimes of their state, and other states.

> Kids not being informed about the war crimes

Interesting to frame this as a bad thing. As a parent, I would take that as a feature, not a bug. To me it is very suspicious that there seem to be so many people here, who I am assuming are mostly adults, advocating so strongly for under-16-year-olds to be on social media, as if it were something they need.


You sound like a Russian government official.

Haha or a person who's been around lots of children of all ages.

An under 16 year old not seeing the social media version of war crimes is a good thing. And that's the upper limit of the age range of this ban.


I'm completely fine with there being a microphone in the thing. It's literally a remote eyes/hands interface, so it being an eyes/ears/hands interface is perfectly acceptable.

I felt the same, having moved to Los Angeles in the '80s and seen the AIDS crisis take its toll on the street life. It was harsh, man. But then, the '90s happened.

I think Apple have jumped the shark, personally. Sure, trillion-dollar business and all that - but at the folk level, they have become the very thing they were always resisting: a tired old monopoly enforcing principles on their customers which are not in the customers' best interests.

OS vendors have lost the plot. Where a company decides to try to build an operating system for mass acceptance at scale these days, they build an ad delivery platform - not an operating system. The interests of far too many third parties have been elevated at the kernel-extension layer, and lower, and this is as troubling as it ever was.

It's the 21st century and people still don't understand how to manage the filesystem, having ceded all agency over the task to the backend/cloud, which harvests their data instead of granting the user more agency. In fact, most people have less agency over their data - and simply do not care about it - because they have been lulled into accepting the state of affairs by OS vendors who simply don't want to write a better Finder/File Explorer for the end user - choosing instead to write an operating system for ad agencies to harvest user eyeballs.

Apple have traditionally avoided the usual pretence of 'ads in the start bar' by leveraging their platforms, and this is starting to fall apart at the seams. Convergence is going to be a joke, and will turn off a lot of computer users until a generation is raised, who will just accept the doctrine of their masters, and in so doing, lose knowledge to the generations.

I yearn for an OS vendor to build an operating system that makes user control over their computer and their data a number-one priority. Apple isn't it. Microsoft certainly isn't it. There are multiple Linux OS vendors who could be it, if only they'd get their hardware act into shape. There are hardware vendors struggling to attain this goal, too.

My next laptop won't be an Apple, after 30+ years of adoption of the platform. I fear the future that Apple is laying out ahead of us - just as I feared that of Microsoft and Oracle and IBM too, through the decades.

If there is hope, it lies with the (low-end open source hardware/software-agency-protecting) proles.


I have concerns too, but the file manager on macOS is not among them. The Finder has barely changed since its OS X 10.2 incarnation over two decades ago, except for gaining features (many of which were demanded by power users). A few settings need toggling on a fresh install (turning on status bar and path bar are musts, as is ~/Library visibility), but that’s the worst of it. Neither it nor the rest of macOS do much to go out of their way to obfuscate the filesystem.
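For anyone wanting to apply those toggles from the shell on a fresh install, a sketch (the `defaults` keys shown are the commonly cited ones and may vary between macOS versions):

```shell
# Show the status bar and path bar in Finder windows
defaults write com.apple.finder ShowStatusBar -bool true
defaults write com.apple.finder ShowPathbar -bool true

# Unhide ~/Library in the home folder
chflags nohidden ~/Library

# Relaunch Finder so the changes take effect
killall Finder
```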

iOS still needs work despite its file manager having become much more capable, but part of that comes down to the differing filesystem arrangement where user documents are kept within app bundles. If raw filesystem access were enabled, that model wouldn’t make a whole lot of sense even to many who are familiar with navigating filesystems. I could see the argument that it should be switched to a traditional desktop OS model, but that’s a deep architectural change.

Windows on the other hand… Explorer just keeps getting slower even if it's not losing functionality, and Windows has a long history of misrepresenting or obfuscating the filesystem. I hate trying to track down where files have been deposited on Windows boxes, and I would agree that it's been contributing to users not understanding filesystems.


People being able to organise their lives with their computers has been a thing since the beginning of the personal computer. The filesystems we have were never really the 'best' - just the most viable.

The filesystem UI has been abandoned in favour of newer, better abstractions, such as 'just throw it all at the Cloud and let our analysis software give you a front-end to it, eventually..'

I think users not understanding filesystems isn't really a computing problem, but a literacy one. In some senses, computing becomes the victim of itself.


I would seriously question if “throw it at the cloud” is actually better or if it only became popular because that’s what’s most profitable for companies to push users into using. Local hierarchical storage doesn’t give nearly as many opportunities for rent seeking and data harvesting.

Cloud storage certainly has a convenience factor which is worth considering, but at the same time most people don’t actually need everything available everywhere at all times and only have a handful of files that can intentionally be synced as needed. I really don’t believe that supplanting traditional filesystems with a big bag of data that lives in the cloud is the right answer.


I think the point is that the user's agency over the locality of, and control over, their data is based on decisions made by operating system vendors who, having 'given up' on trying to get users to understand the difference between folders and files, have figured out it's better to just put everything in the cloud and 'own the tags and other abstractions' which come from a subscription service.

In any case, we see eye to eye on the convenience factor - it is inescapable, the success rate is clear - but we are looking at the edge case of things anyway, no? The future of Apple is an interesting one - we long term users surely can have an opinion. (Hacked my first Apple in 1981, haven't stopped since.)


Unfortunately the hardware is peerless. The 15" Air is outstanding. No fan, no lag, no sleep/wake issues, no trackpad issues. Impossible to use a PC laptop after such quality. The software though is a steaming pile of foolish UI decisions.

Yes, I can imagine.

It is all I can do to consider the alternatives, and yet make up my own mind about what should happen.

If I, ever, were to see humanity under my thumb from so far away, I would delight in knowing I was far enough away to see just how fragile it is, and always will be.

We humans are a special lot. Thumb sized.


I wrote a “multimedia engine” in those days, also a booter which used no MS-DOS calls, BIOS and assembly only, to render vector graphics on both CGA and EGA-based systems .. the engine was used to produce 3 educational titles before the publisher went under.

Definitely fun days, working out how to be a bit faster than the BIOS and not use a single bit of DOS.


Eat one's own dog food, or in other words, get the company cooking something great together and sharing the results.

A great company basically opens on its first day, and 48 hours later there are a ton of well-fed customers who come back, not incidentally, again and again for what they perceive to be great.

But apropos feeding customers, if you can't 'eat your own food', dog or otherwise, why expect the user to want to do it ..

Use it. I agree.


This is evidence that there is a prior element to this 'problem', which is that - in order for Technology to exist, Ethics have to be aligned well enough to deliver, effectively, the result of the technology: a product.

The user, ethically, is another piece of evidence - especially in real time and at huge scale.

So you are so right about the user. The user comes first, the technology second, and when the service of the latter benefits the former, greatly, at scale, the people problems become, well, people solutions - i.e., the user.


The Santa Monica spot was, personally, a bit of an eyesore after about 8 years. I kept wishing someone else would rise to the flamboyance, but nobody ever really did. Well, I'm wrong of course, but I never did see such a striking spot until I got to Europe, or whatever ..

I've walked past the 'folding into itself' building in Prague quite a few times, and never once realised it was a Gehry. I have also walked a fair bit in downtown LA, during summer, and seen some hot spots without understanding the nature of Gehry's ability to reflect.

He lived a long time, and built a lot of interesting places for his fellow humans to reflect, and live, in.

An architect of light, mostly.

