Hacker News | jfim's comments

That's what's meant by efficiency, it's allocating it to the place that has the highest return on investment.

As you point out, in practice what's efficient is what can capture the highest return, not necessarily the highest return per se. If say investing in education had high returns society wide but those returns couldn't be captured, that's not an efficient use of private capital.


As somebody who doesn't consider himself a capitalist, wouldn't it be fair to say it is "the most efficient" at precisely one thing: capital reproducing itself?

And if so, why is that necessarily a good thing? Why should that be our goal as a society, as opposed to things like minimizing child mortality, increasing literacy rates, making sure we don't have a ton of our fellow humans living on the street in misery, etc. - things that make the lives of our fellow humans better? Why is capital growth the metric we have chosen to optimize for? Surely there are better things to optimize for?

Excuse the polemic, but infinite growth with no regard for anything else is the ideology of a cancer cell - and to me that is increasingly what it feels like when we are wasting all these resources on a dying planet just to make numbers go up.


Ultimately that money is made by people choosing to spend their money on something, because it helps them, because they like it, because they need it for whatever reason (real or imagined). That's what grounds the financial markets: eventually someone is buying a thing because they want the thing, and all the rest of it is basically just figuring out who can make the thing, how many people want the thing and how badly, and whether the stuff used to make the thing could make a different thing that people want more. Financial markets can depart from that reality for a while, but mainly because of a collective belief in some falsehood about the above (everyone really badly wants AI, right?).

Number go up infinitely is due to inflation and that's basically just an incentive to not hoard cash indefinitely, and instead use it for something useful. But the only thing that uses up is numbers. Everything else is because people, on average, want more stuff and are willing and able to work hard to get it.

(Of course, this generally means that the markets chase the desires of those who have something valuable enough. People who don't will be marginalised by this mechanism, for sure. And of course there's lots of opportunity for people to steal or abuse powerful positions in the market to the detriment of others. Which is why a free market is not the be all and end all of organising a society, and other organisational structures exist to regulate it and to allocate resources in a less transactional manner)


But isn't that counter to the very article we're commenting on? Everyone is shoving half baked AI junk into everything because that's what makes number go up on the stock market, but I'm pretty sure that's not actually what most people would want those resources to be used for.

I'd posit that markets are completely detached from the real world and are more of a speculative/religious element than an indicator of any ground truths.

Edit: I just realized I missed a sentence of yours where you kinda spoke to this. I still believe that this is more of a rule than an exception - there is nothing inherently tying markets and reality together - they're mostly about people making bets on what the next big hype is; not on what is actually useful to anyone.


Most people or most people with money? Ultimately it's the people making investment decisions you need to convince in order to get the money from them. Then the reality check is whether you can give them that money back and more. In theory those investors should be self-interested in making sure the thing is actually useful, because if it isn't, they'll lose their money. In practice they are not superhuman and are prone to fads and echo chambers (especially because a relatively small group can move a lot of money around, and short-term investing rewards running with the crowd to some degree). But there's no group of people that would not fall victim to poor decisions in this regard, whether you're imagining a centrally managed economy, the electorate, or some hypothetical benevolent dictator.

If you care about minimizing child mortality, increasing literacy, and pulling people up out of poverty, you should be a capitalist, as it's empirically the best way to meet those goals. This seems to be a hard thing for many to understand or accept because it is largely a second-order effect: the capitalist is primarily concerned with their own personal gain, but winds up improving the lives of others as a side effect.

This is the essence of Adam Smith's often misunderstood invisible hand metaphor. Of the individual he observed: "By pursuing his own interest, he frequently promotes that of the society more effectually than when he really intends to promote it." Second order effects stack up and improve quality of life for more people better than trying to do so explicitly.

Multiplying capital creates abundance, and that abundance allows for improved standards of living and the means to spend excess resources in support of charitable endeavors. Growth is good because it means more abundance and opportunity. I would argue that the pursuit of growth is not an ideology but a force of nature. Life is opportunistic and will expand wherever there are fertile conditions, and often adapt even where there are not. We are part of nature and understand this intuitively, seeking growth opportunities. As an example, one is better off being part of a growing company (more wages and opportunities) than one that is stagnant or declining (fighting for scraps and survival).


>If you care about minimizing child mortality, increasing literacy, pulling people up out of poverty, you should be a capitalist, as it's empirically the best way to meet those goals

If you look at it empirically, the majority of people brought out of poverty (and I suspect the other metrics, but am not as familiar with them) in the past few decades have been in China, as the result of deliberate policies by the CPC.

Then why are all of these metrics worse in the US than even in notoriously poor communist Cuba?

Optimizing for capital returns is a simplification of the real world, one that allows for comparing whether it makes more sense to put one's money into opportunity A or B.

There's a lot that's not captured by solely looking at dollars, like the examples you bring up: quality of life, human welfare, and so on.
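The A-or-B comparison above is often done as a net-present-value calculation. The sketch below is a toy illustration with made-up numbers; the "captured vs. society-wide" split echoes the education example earlier in the thread and is an assumption for illustration only.

```python
# Toy illustration: comparing two opportunities purely by the returns an
# investor can capture, ignoring uncaptured society-wide benefit.

def npv(cashflows, rate):
    """Net present value of a list of yearly cash flows (year 0 first)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

rate = 0.05  # assumed discount rate

# Opportunity A: a factory, all returns captured by the investor.
factory = [-100, 30, 30, 30, 30]

# Opportunity B: an education program with large society-wide returns,
# of which only a small slice is capturable by the investor.
education_total    = [-100, 60, 60, 60, 60]  # returns to society
education_captured = [-100, 10, 10, 10, 10]  # returns to the investor

print(npv(factory, rate))             # positive: capital flows here
print(npv(education_total, rate))     # large, but not what capital "sees"
print(npv(education_captured, rate))  # negative: capital stays away
```

Under these (hypothetical) numbers, private capital rationally picks the factory even though the education program creates more total value, which is exactly the "returns that can't be captured" point.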


It's been a while since I touched Scala, but wasn't that a thing in previous versions, except that the braces were present?

Yes, that's all just as it was, and in places braces were not required or were interchangeable, so this is more of an optional compiler choice than a real change.

They could use the json.org license: https://www.tldrlegal.com/license/the-json-license

It's literally the MIT license with an added clause that the software be used only for good, not evil.

Obviously, corporate attorneys will advise against using the software, since "good" and "evil" aren't well-defined legal terms. It's also not open source under the OSI definition.


It's also unenforceable, therefore useless.

The smell from those though is quite something

Because people aren't going on AWS for EC2, they go on it to have access to RDS, S3, EKS, ELB, SNS, Cognito, etc. Large enough customers also don't pay list price for AWS.

A lot of people do use AWS for EC2.

Of the services you list, S3 is OK. I would rather admin an RDBMS than use RDS at small scale.

> Large enough customers also don't pay list price for AWS.

At that scale the cost savings from not hiring sysadmins become much smaller, so what is the case for using AWS? The absolute savings from leaving will be huge.


In absolute numbers maybe it's a lot, but I doubt even 10% are EC2 only.

Even "only" EC2 users often benefit from load balancing there. Other clouds sometimes have their own (Hetzner), but generally it's kind of a hard problem to do well if you don't have a cloud service like Elastic IPs that you can use to handle failover.

Generally, everywhere I've worked has been pretty excited to have a lot more than just EC2 managed for them. There's still a strong perception that other people managing services is a wonderful freedom. I'd love it if that stance could shift some day, if the everyday operator felt a little more secure doing some of their own platform engineering, if folks had faith in that. Having a solid, secure day-2 stance starts with simple pieces but isn't simple; it's quite hard, with inherent complexity. I'm excited by the many folks out there saddling up for open source platform engineering work (operators/controllers).


>Even "only" EC2 users often benefit from load balancing there. Other clouds sometimes have their own (Hetzner), but generally it's kind of a hard problem to do well if you don't have a cloud service like Elastic IPs that you can use to handle failover.

Pretty much everyone offers load balancing and IPs that can be moved between servers and VPSs. Even if you have to switch to new IPs, DNS propagation will not take as long as waiting out an AWS shutdown.

10% of what? Users, instances/capacity...? If it's users, then it's a lot higher, because EC2-only usage gets more common the smaller the users are.

> There's still a strong perception that other people managing services is a wonderful freedom.

The argument is really about whether that is a perception or a reality. If you can fit everything on one box (which is a LOT these days), then it's easy to manage your own stuff. If you cannot, you are probably big enough to employ people to manage it (which is discussed in other comments), and you still have to deal with the complexity of AWS (also discussed elsewhere in the comments).


I'd be shocked if 10% of users only use EC2. And as you say, for the most part I expect these to be pretty small fry users.

I've used dozens of VPS providers in my life, and a sizable majority didn't advertise any load balancing offerings. I can open tickets to move IP addresses around, but that takes time. And these environments almost always require static configuration of your IP address, which you need some way to update effectively during an outage.

Anyone who declares managing their own stuff to be easy is, to me, highly suspect. Day-0 setup isn't so bad, day-1 running will show you some new things, but as time goes on there are always new and surprising ways for things to break, or not scale, or not be resilient, or for backups to not be quite right. You talk about employing people to manage it for you, but one to three people's salaries will buy you a lot of ElastiCache and RDS. As a business, it's hard to trust your DBAs and other folks, to believe that the half dozen people really have done a great job. Whereas you know there have been many people-decades of work put into resiliency at AWS or others, you know what you are getting, and it's probably cheaper than having your own team for many, many people.

I want to be clear that I am 100% for folks buying hardware and hosting themselves. I think it's awesome and wild how good hardware is. But what we run atop it is far too often more an afterthought than a well-planned, cohesive system that's going to work well over time. That's why I tip my hat to the open source platform engineering work out there. I think we're getting closer to some very interesting spaces where doing it for ourselves starts to be viable, in a way that's legitimate and runnable, where the everyone-figuring-it-out-for-themselves approach of the past was always quite risky, and where the business as a whole, or external systems reviewers, rarely had a good ability to evaluate what was really going on or how trustworthy it was.

I aspire for us to outgrow the perception that other people managing things for us is a great freedom. I really long for that. But the kind of "meh, it's not that hard" attitude I see here, to me, undermines the point: it underplays how hard it is to run systems, what a travail it is. It is a travail. But it's one that open source platform engineering is advancing mightily to meet, in exciting and clear ways, where the just-throwing-some-shit-up-there approach of the past always left things murky.


The last 18 years of tech companies I’ve worked for used AWS EC2, every single one.

A training corpus that includes the images from Google image search probably helps a lot.

Counting letters is tricky for LLMs because they operate on tokens, not letters. From the perspective of an LLM, if you ask it "this is a sentence, count the letters in it", it doesn't see a stream of characters like we do; it sees [851, 382, 261, 21872, 11, 3605, 290, 18151, 306, 480].
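To make the point concrete, here's a sketch of what a tokenizer does to that sentence. The token ids and splits below are made up for illustration (real BPE tokenizers produce different ids and boundaries); the key point is that the model consumes only the ids, while the letter content of each token lives in the tokenizer's vocabulary outside the model.

```python
# Hypothetical vocabulary: token id -> token string (made up for illustration).
vocab = {
    851: "this", 382: " is", 261: " a", 21872: " sentence",
    11: ",", 3605: " count", 290: " the", 18151: " letters",
    306: " in", 480: " it",
}
token_ids = [851, 382, 261, 21872, 11, 3605, 290, 18151, 306, 480]

# Recovering the letters requires the vocabulary lookup the model itself
# never performs explicitly; it only ever sees the ids.
text = "".join(vocab[t] for t in token_ids)
letters = sum(c.isalpha() for c in text)
print(text)
print(letters)  # 34 letters in "this is a sentence, count the letters in it"
```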

So what? It knows the number of letters in each token and can sum them together.

How does it know the letters in the token?

It doesn't.

There's literally no mapping anywhere of the letters in a token.


There is a mapping. An internal, fully learned mapping that's derived from seeing misspellings and words spelled out letter by letter. Some models make it an explicit part of the training with subword regularization, but many don't.

It's hard to access that mapping though.

A typical LLM can semi-reliably spell common words out letter by letter - but it can't say how many of each are in a single word immediately.

But spelling the word out first and THEN counting the letters? That works just fine.


If it did frequency analysis then I would consider it having a PhD level intelligence, not just a PhD level of knowledge (like a dictionary).

The automatic wipers on non-Tesla cars use infrared sensors that have existed for decades, so they're a well-known quantity. Tesla's wipers use the front cameras to detect rain, and those cameras are focused at a distance far enough to see other vehicles, not on the windshield, which is why they're unreliable.


Hadn't heard of jj. It's a source control tool that advertises being fast and compatible with the git on-disk format. https://github.com/jj-vcs/jj



It is a scalding hazard though. At 70 °C, skin burns occur in about a second.

The thermostatic valve makes it so that the water that comes out of the water heater is at a more reasonable temperature.
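The valve's job reduces to a simple energy balance. A back-of-the-envelope sketch (assuming perfect mixing and constant specific heat, which is close enough for water; the temperatures are example values):

```python
def hot_fraction(t_target, t_hot, t_cold):
    """Fraction of the flow drawn from the hot side to reach t_target,
    from the mixing balance: t_target = f*t_hot + (1-f)*t_cold."""
    return (t_target - t_cold) / (t_hot - t_cold)

# 70 °C tank, 10 °C cold supply, 48 °C at the tap:
f = hot_fraction(48, 70, 10)
print(round(f, 3))  # ~0.633: roughly 63% hot, 37% cold
```

This is also why a hotter tank "lasts longer": the hotter the stored water, the smaller the fraction of it drawn per liter of tap-temperature water.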


I was surprised at how cold "hot" water actually is. I thought it was 60-70 °C, but apparently what feels "hot" is around 45-50 °C. Especially for me, who finds anything beyond my shower's "middle" setting uncomfortably hot, I must be showering with around 40 °C water, which is basically "hot day" hot.


Going less than 55 °C tap temperature / 60 °C in the tank is bad though, at least in larger installations - otherwise, you risk legionella and other microbial infestations [1].

[1] https://www.verbraucherzentrale.sh/pressemeldungen/lebensmit...


Oh definitely, I'm purely talking about "feel".


Then again, I grew up with thermostatic valves in the shower mixer, and a "child safety" latch to restrict it to even less hot water in normal use. I don't see why putting the mixer in the water heater is that different.


It's supposed to be HOT water, not lukewarm... I have my water heater set to 150 °F (about 65 °C) because it makes the hot water last longer, especially in the winter when the incoming supply is barely above freezing, but that doesn't make the tank bigger.


I think what this "smart tank" does is mix the super-hot with cold.

If the automatic mixing feature malfunctions to super-hot, then it could be risky...

