While I agree with the principle of "clean markup", whenever I open up the DOM of websites I did not create (and I do so regularly), I generally do not see what I'd consider clean. The author is correct, but it's not that frequent a practice; I wish it were. Care for our craft is an uncommon thing.


react-query's useQuery hook makes proper data sharing and optimistic updates a breeze.
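
For anyone who hasn't used it, here's a minimal sketch of what that looks like, assuming @tanstack/react-query v5; the Todo type and the /api/todos endpoint are made up for illustration. Every component calling useTodos shares one cache entry, and the mutation patches that cache optimistically:

    import { useMutation, useQuery, useQueryClient } from "@tanstack/react-query";

    type Todo = { id: number; title: string };

    // Hypothetical fetch helper -- a stand-in for your own API layer.
    const fetchTodos = async (): Promise<Todo[]> =>
      (await fetch("/api/todos")).json();

    // Every subscriber to ["todos"] shares one cache entry; react-query
    // dedupes the network request across components.
    const useTodos = () => useQuery({ queryKey: ["todos"], queryFn: fetchTodos });

    const useAddTodo = () => {
      const queryClient = useQueryClient();
      return useMutation({
        mutationFn: (todo: Todo) =>
          fetch("/api/todos", { method: "POST", body: JSON.stringify(todo) }),
        // Optimistic update: patch the cache immediately, roll back on error,
        // then refetch to reconcile with the server.
        onMutate: async (todo) => {
          await queryClient.cancelQueries({ queryKey: ["todos"] });
          const previous = queryClient.getQueryData<Todo[]>(["todos"]);
          queryClient.setQueryData<Todo[]>(["todos"], (old = []) => [...old, todo]);
          return { previous };
        },
        onError: (_err, _todo, ctx) => queryClient.setQueryData(["todos"], ctx?.previous),
        onSettled: () => queryClient.invalidateQueries({ queryKey: ["todos"] }),
      });
    };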


react-query (or @tanstack/query, as it's now called) has a few things that make it a little awkward to use beyond very basic fetches:

1. No concept of normalization. Fetching a list of entities vs. fetching the detail of one of those entities means you end up fetching the same data twice, since the cache is keyed per query, not per entity (see the first sketch below). And good luck trying to mutate its cache yourself to force it to cache by entity.

2. No good way to get the data out of react-query and into, say, my mobx/redux store.

3. Its query cache is global, but the `select` cache is per component/subscriber (see the second sketch below).
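
To make 1 and 3 concrete, two sketches against a hypothetical API (the types and endpoints are invented for illustration). First, the normalization gap: the list query and the detail query each cache their own copy of the same user, and updating one entry does nothing to the other:

    import { useQuery } from "@tanstack/react-query";

    type User = { id: number; name: string };

    // Cached under ["users"] -- the list response includes user 42.
    const useUsers = () =>
      useQuery({
        queryKey: ["users"],
        queryFn: async (): Promise<User[]> => (await fetch("/api/users")).json(),
      });

    // Cached separately under ["users", 42] -- user 42 gets fetched and
    // stored again; no entity-level normalization ties the two together.
    const useUser = (id: number) =>
      useQuery({
        queryKey: ["users", id],
        queryFn: async (): Promise<User> => (await fetch(`/api/users/${id}`)).json(),
      });

Second, the `select` scoping (reusing the same import): the raw query result lives in the shared global cache, but each subscribing component computes and memoizes its own `select` projection:

    // The todos array is cached once, globally, under ["todos"]; the
    // derived count is memoized per component instance, not shared.
    const useTodoCount = () =>
      useQuery({
        queryKey: ["todos"],
        queryFn: async (): Promise<{ id: number }[]> => (await fetch("/api/todos")).json(),
        select: (todos) => todos.length,
      });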


Leaders have a responsibility to ensure their teams deliver. When the teams agree to async communication --and the channels remain silent-- where is the accountability when there is no deliverable and no visibility into the scenario that led to no output? To be clear, this doesn't mean 100% sync. But a 10 minute daily sync can ensure the team is focused on producing something of business value. Maximal autonomy within some guardrails keeps the ICs and management at their best.


> Leaders have a responsibility to ensure their teams deliver. When the teams agree to async communication --and the channels remain silent-- where is the accountability when there is no deliverable and no visibility into the scenario that led to no output?

A good leader then talks to the team as the professional adults they are to understand why there isn't communication at the cadence that was agreed upon and discusses why such communication is important.


If there's no deliverable and no visibility, that's a failure of management to define the deliverable at a granular enough level that an update, even an async one, could cover the progress -- which would also fix the lack of visibility. A developer piping up async and saying 'no updates, still working' is still an update and gives visibility.

If they (managers) want to lord over the process and do work for work's sake, then they should write out requirements that say "thou shalt commit one line of code by X date". It just comes down to the individuals doing the 'management' being lazy if they expect their 'reports' to drop everything and context-switch to a meeting when they can't be arsed to do the managing.


> If there's no deliverable and no visibility, that's a failure of management to define the deliverable at a granular enough level that an update, even an async one, could cover the progress -- which would also fix the lack of visibility. A developer piping up async and saying 'no updates, still working' is still an update and gives visibility.

That's really the bare minimum a developer can say and still get away with when you have a lenient manager. HN tends to take the most adversarial view of managers possible, but it's not always warranted.

But when you've worn both hats (manager and managed), you learn that sometimes employees simply don't want to do much work. Again, you can say "that's a management problem", but some people are bad actors and are very good at hiding it in "corporate speak". The internet is full of stories of coasting --- I can even attest to doing it many times and easily getting away with it. Getting someone in a room, or on camera, and actually running through their work can cut through a lot of bullshit and bullshitters. Some developers think "coding" takes precedence over the business -- but some of their "coding" time is completely useless from the business perspective. It seems crazy to insist that managers have no stake in trying to figure out whether that's the case.

Sometimes you also need to have a meeting because in an async setting some people have no idea what they're doing. Even with clear goals and deliverables, stuff can easily fall through the cracks.


I personally take an adversarial view of management because the large majority of the 'managers' I have had in the past were useless to the point of providing negative value, to both the technical and the business side, so my views are colored by that fact. Not all of them, of course -- those few were the good ones. But most were just managers in name only.

A coasting employee may be a problem of the employee's making, but it is still a failure of management: if an employee isn't doing anything, you fire them, and if that wasn't fixed right away, management failed. And again, if someone doesn't know what they are doing (if that's truly the case, why did you hire them? but that's another conversation), even async, a keen manager should be able to pick up on that and provide coaching, connections, or other resources -- you know, actually 'manage' this person. Failing all that, cut them loose. Either way, it's the manager's problem.


If they're coasting, then you have a 1-on-1 conversation and, if needed, fire them. A manager who needs some type of public shaming ritual to do that isn't a good manager. There's a management fallacy of focusing on minimizing the bad instead of maximizing the good. In my experience, it's better to focus on making your best employees more effective than on making your worst employees slightly less ineffective.


Most lived experiences with off-shore talent are shaped by labor costs. There are great offshore engineers, but many work for companies that aren't hiring at the top end of the local market: they're hiring off-shore to save money. You get what you pay for. And that leads to impressions, even if incorrect ones.


"You get what you pay for" is fair, but its also worth pointing out that in some places "money goes further".

In my city, I can go out and eat at a steakhouse -- 3 courses, with wine, 2 people -- and the total bill is $30-$40 (total, not each). Nice sit-down restaurant, good food, linen napkins.

Consequently, highly skilled, senior engineers can be paid < $100k and still live like kings. If the exact same person lived in the US, or worse, in an expensive part of the US, you'd pay more -- probably 5 times more.

Once you embrace remote work (WFH) you quickly discover this very real geographical swing in value-of-money.

Of course -most- remote workers are crap. Most local workers are crap too. The remote-hiring problem is as hard as the local-hiring problem, probably harder. But the cost savings are immense, and the long-term PR is significant. ("Yeah, we're laying off 10% of support, but they're all foreigners" -- which kinda skips over the point that they were -all- foreigners to begin with.)

You get what you pay for, but that bag of silver you have turns into a bag of gold elsewhere.


We're at the point where many newer engineers haven't had real hands-on DOM experience but are expected to deliver applications built in React. You need to know at least one level of abstraction below your current one to use the tool effectively. This all tracks with how the industry's changed, how we hire, how we train, and so on.


> You need to know at least one level of abstraction below your current one to use the tool effectively.

I'm not sure that's true in general.

For example, for some tools there's no single 'one level of abstraction below'. Let's take regular expressions as a simple example. You can use them effectively, no matter whether your regular expression matcher uses NFAs or Brzozowski derivatives under the hood as the 'one level of abstraction below'.

(Just be careful, if your regular expression matcher uses backtracking, you might get pathological behaviour. Though memoisation makes that less likely to hit by accident in practice.)
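
The pathological case is easy to reproduce in Node, whose built-in RegExp (like most mainstream engines) uses a backtracking matcher; a small demonstration, where each extra "a" roughly doubles the running time:

    // Nested quantifiers plus an input that almost matches force the
    // backtracking matcher to try exponentially many ways of splitting
    // the "a"s between the inner and outer "+" before giving up.
    const pathological = /^(a+)+$/;
    const input = "a".repeat(28) + "b"; // almost matches, then fails at "b"

    console.time("backtracking blowup");
    console.log(pathological.test(input)); // false, after ~2^28 attempts
    console.timeEnd("backtracking blowup");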


>For example, for some tools there's no single 'one level of abstraction below'. Let's take regular expressions as a simple example. You can use them effectively, no matter whether your regular expression matcher uses NFAs or Brzozowski derivatives under the hood as the 'one level of abstraction below'.

If you're OK with the occasional catastrophically slow regex (https://swtch.com/~rsc/regexp/regexp1.html), sure.

But you do need to understand the abstraction of strings, code points and so on, if you want to do regexes on unicode that doesn't stop at the ASCII level.

In general yes: it's not an absolute law that you need to "know one layer below". You can code in Python and never know about machine code.

But knowing the layers below sure does help you make better decisions if you want, e.g., to optimize that Python code. Knowing how the code will be executed -- memory layouts, how computers work, access latencies of memory, disk, network, etc. -- sure does help.


> If you're OK with the occasional catastrophically slow regex (https://swtch.com/~rsc/regexp/regexp1.html), sure.

I already addressed that in my original comment. Approaches based on NFAs and Brzozowski derivatives don't have these flaws; but you don't need to know anything about how they work to use them.

You just need to read one blog post that tells you to avoid regular expression matchers that use backtracking, and you are good to go. You don't even need to understand why matching via backtracking is bad.

> But you do need to understand the abstraction of strings, code points and so on, if you want to do regexes on unicode that doesn't stop at the ASCII level.

Why?


>You just need to read one blog post that tells you to avoid regular expression matchers that use backtracking, and you are good to go. You don't even need to understand why matching via backtracking is bad.

Yeah, no.

You might not be able to avoid using your standard lib's regex, or your project's chosen regex dependency, based on team/company policy. So it's not as simple as "use a regex engine that doesn't have this flaw".

Then, if you want to avoid the cost, you need to know what backtracking is, down to understanding which kinds of expressions can give you those performance issues.

>Why?

Because there are tons of factors that can affect your regex experience with unicode: normalization, different lower/upper-case treatment, composite characters that don't match even though it looks like you typed the same character in your query, handling of new unicode characters (7/8-bit ASCII has been fixed for decades), and so on.
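
The composite-character point in particular is easy to trip over. A small JavaScript example (the same applies in most languages): two strings that render identically can be different code-point sequences, and the regex compares code points, not rendered glyphs:

    // "é" precomposed as one code point (NFC) vs. "e" plus a combining
    // acute accent (NFD). They look the same on screen.
    const nfc = "\u00e9";
    const nfd = "e\u0301";

    console.log(nfc === nfd);                         // false
    console.log(/\u00e9/.test(nfd));                  // false: two code points, not one
    console.log(/\u00e9/.test(nfd.normalize("NFC"))); // true after normalization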


> You might not be able to avoid using your standard lib's regex, or your project's chosen regex dependency, based on team/company policy. So it's not as simple as "use a regex engine that doesn't have this flaw".

Well, yes, if someone forces you to use tools that have flaws, you need to learn about the flaws so you can work around them. Like when using a shoe as a hammer.

I'm not sure that proves anything about abstractions?

See also https://blog.codinghorror.com/the-php-singularity/

> Because there are tons of factors that can affect your regex experience with unicode: normalization, different lower/upper-case treatment, composite characters that don't match even though it looks like you typed the same character in your query, handling of new unicode characters (7/8-bit ASCII has been fixed for decades), and so on.

Thanks.


> (Just be careful, if your regular expression matcher uses backtracking, you might get pathological behaviour. Though memoisation makes that less likely to hit by accident in practice.)

Doesn’t that exactly demonstrate why using the tool effectively requires an understanding of the implementation (or possible implementations) behind the abstraction?


Understanding is perhaps sufficient, but not necessary.

You can just consult a list that tells you which regular expression matchers to avoid (like eg Perl), and which ones are good (like grep), and you are good to go. No need to understand anything.


It does. Their argument is a farce. By now they've had many chances and made many attempts to illustrate their point if they had one. They don't have one, but they somehow don't know it. What can ya do?


It does, and I had exactly that same reaction! Regexes are a leaky abstraction, like all of them.


Regular expressions are probably not a good example because you will eventually write a regex that has catastrophic backtracking behavior. I encountered one a few months ago, so it’s not at all uncommon or difficult to encounter. If you’re curious enough, you’d end up reading about how regexes work under the hood.

A better example might be that of a compiler, where you (very rarely) need to look at the asm output or encounter a case where the compiler generates incorrect code and you need to debug why.


> Regular expressions are probably not a good example because you will eventually write a regex that has catastrophic backtracking behavior.

Nope, that will never happen to me. No regular expression has that behaviour.

There are some bad implementations of regular expression matching that have these problems for some regular expressions. But you can avoid those bad implementations without understanding anything.

> I encountered one a few months ago, so it’s not at all uncommon or difficult to encounter.

That only happens, if you use a regular expression matcher that uses backtracking. Sane regular expression matchers take linear time on all input strings.

> If you’re curious enough, you’d end up reading about how regexes work under the hood.

There's no one single way regular expressions work under the hood. You had the misfortune of using a matcher that uses backtracking and is prone to catastrophic exponential runtimes.

There's multiple different ways to implement regular expression matchers. But a user doesn't have to care or understand anything (they just need to avoid the buggy ones).

Sure, if you read up on how regular expressions work under the hood, you can learn to avoid those bad matchers. (Or you can learn how to live with your backtracking implementation, if you are feeling masochistic.)

But that's entirely optional: You only need to read one blog post that tells you to use grep and avoid eg Perl. You don't need to understand why backtracking is bad for regular expression matching; as long as you avoid those bad matchers.


I would say that knowing there are different families of regex engines, and what their internal methodologies and consequences are, so that you can avoid certain otherwise non-obvious problems -- problems that arise not from something you did wrong but from something a layer lower did wrong (you wrote a valid regex according to all the rules in the manual and got a bad result) -- qualifies exactly as an example of needing some understanding of how the layers below work.

In fact it wasn't necessary for me to qualify that with "I would say". I do say, and it simply does. You've made exactly no argument this entire thread.

Maybe everyone doesn't need to know everything, but the amount of stuff anyone needs to know is greater than zero, and there's no absolute boundary layer either; you just generally need to know less the further you get from your own work. But it never goes all the way to zero. You have to rely on other people to have done their jobs, but you still at least have to understand what those jobs are, that they exist, how you ultimately interact with them, and how they impact you.

You said so yourself several times which makes this all farcical.


I’m not sure if this response was simply for the sake of replying, given the claims of writing perfect code all the time, or the suggestion that backtracking implementations are uncommon or insane (most languages' regex engines use backtracking).


Huh? I'm not making any claims of writing perfect code. If you have a sane regular expression matcher, there are no catastrophic cases to avoid.

Regular expression libraries that don't use backtracking are available for many languages. Yes, some languages have bad libraries that use backtracking, too. But bad libraries will always exist; they aren't an excuse when there are more good libraries around than ever before.


You have a point, but it seems clear to me that your parent commenter was referring to more pervasive abstractions like frameworks and probably programming languages.

Unless you're coding in Perl or sed or whatever, regular expressions aren't really a primary abstraction. And even when they are, I don't see how the implementation would be accessible as a lower level. They're not really a layered abstraction.


I am saying that regular expressions are an abstraction that has many different implementations. So there's no single underlying implementation you need to understand.

The only thing you need to watch out for is regular expression matchers that are prone to exponential blowup. But you don't need to understand anything; you can just consult a list of which ones are bad and which ones are good, and then avoid the bad ones; no understanding of the mechanics is necessary.

> You have a point, but it seems clear to me that your parent commenter was referring to more pervasive abstractions like frameworks and probably programming languages.

Probably. I was just looking for the simplest example that the maximum number of people would be familiar with.


I guess that gets at the real distinction: perfect abstractions vs. leaky abstractions.


Some part of the process -- anywhere from specs to assembly -- had some cost cutting that led to the outcome. The root cause of this problem is something financially motivated.


While I don’t disagree with you that it is possible, it isn’t a very “engineering” approach to declare that is the case without doing a root cause analysis. Stating it as fact is as bad as the MBAs…


You don't know that.

Cost cutting could have been a factor. Or the root cause might have been something entirely different.

It's quite possible to choose to spend more money on a process or method you believe is higher quality, but still discover it has some specific problem that the previous cheaper version didn't.


It works because they have AWS. If the entirety of the business were only the consumer site, it would be looking more and more like Wish with each passing day.


What's an alternative to Amazon? eBay is worse. Walmart.com has the same problem with third-party vendors. There are few other "buy anything" webstores. I also noticed during some Christmas shopping that buying from Amazon was cheaper than the brand sites themselves; in particular, adidas shoes and a sodastream were cheaper on Amazon than on sodastream and adidas dot com. Exact same products. I don't know why.

Also, Wish takes 40 days to ship. eBay and other sites take 4 days. If I order something from Amazon right now I'll literally have it in 4 hours. Doesn't matter for all purchases, but it's so convenient.


Amazon stopped being better than eBay around 2019 and has fallen hard. The search sucks, getting a good-quality item is hard, and reviews are gamed. Yeah, returns are easy, but not getting what you want and needing to return it is a PITA. Returns are also just as easy on eBay, and not nearly as often needed.

Amazon with Prime still edges out on shipping speed but eBay is close.

eBay acquaints you more with the seller and if you are buying from a seller with "Top Rated" status you can be sure you will get a good product shipped quickly.

If Amazon can clean up bad sellers / fake reviews and fix the search they can get back to being the best.

Personally I find better prices and quicker delivery or pickup for most common goods at Walmart and for the Chinese stuff that Amazon is filled with there's Ali, Temu, etc.

When I need something more niche I prefer eBay but still search Amazon, but again, their search sucks.


I disagree. eBay isn't great, but the quality of goods feels higher than on Amazon. At least the brands are known to me and have a history. Amazon is now just a front for cheap junk from a revolving door of manufacturers who look to me like they are avoiding scrutiny.


It’s like the Churchill quip about democracy: Amazon is the worst online retailer, except for all the others.


I have used Target and Costco shipping to great effect.


Retailers like B&H and Best Buy have worked great for me. Additionally, it’s become much easier to buy direct as Amazon’s quality as a retailer has degraded.


> I don't know why.

Because Amazon does not let you sell your product cheaper elsewhere.


I know this sounds crazy, but you can buy straight from the brand's website if they have one. If you use a credit card or PayPal you still get buyer protection.


Like I said, Amazon actually had better prices.


Well, sure, but who knows if you're actually getting the genuine product?


It makes me think I’m the only person that’s never had an issue ordering things from Amazon! I have prime and sort by recommended and I’ve never had a bad experience. Is it only for particular types of items?


No, it’s everything. Can’t think of anything that wouldn’t count. Our issues: sponsored results crowd out the first page, strange no-name companies crowd out brand names, you can’t sort by retailer, and there’s no useful sort or filter mechanism.

It’s all gamed to have you click on a sponsored result. If you buy from a third party, better check the condition, terms, and return policy. Oh, and it might be fake or previously used.

That said, I still find it useful, but only because, like someone else said, every other “a to z” company is worse. Amazon is still competitive on price, but it isn’t where I go to find products organically; I find recommendations elsewhere, then comparison shop. Walmart at least lets you sort by "sold by Walmart", and BB/Home Depot work fine as well. Amazon has its place, but it’s not a good or enlightening experience.


> No, it’s everything. Can’t think of anything that wouldn’t count. Our issues: sponsored results crowd out the first page, strange no-name companies crowd out brand names, you can’t sort by retailer, and there’s no useful sort or filter mechanism.

If it stopped there it would be almost okay, but then the page for the brand might get you items commingled with counterfeits.


Yes, I'm surprised at the amount of negative feedback. I buy quite a bit of stuff from Amazon and have never had a bad experience so far (my account has been active since the days when Amazon sold only books, circa 1997).

Items delivered work as advertised. Reasonable delivery times (I live in Singapore). I buy from both Amazon SG and US stores.

Is there some sort of filter where US-based Amazon customers get targeted with shoddy products and foreign customers less so? Genuinely curious.


It depends on what you want.

Amazon in Germany has two problems in my eyes:

It's not possible to filter out the marketplace. The marketplace is full of dropshippers with highly inconsistent delivery times. There are some things I just want to buy from Amazon directly.

Chinese fire-and-forget brands crowd out real brands. Fine if you e.g. just want cheap gloves, but a terrible experience if you want more than that.


I've never had an issue either. I'd bet most people don't.


Disappointment in the permanent increases in prices. All else equal, people want deflation after a period of elevated inflation.


This is definitely part of it. People need to adjust to a new price reality, and while they think they want deflation, they certainly do not want what a deflationary period entails for the economy.


The Fed will fight across the board deflation like crazy because that's how the economy ends up in a depression.


We did have high inflation for a year, and were told it was temporary, before tools were deployed. Would a sustained period of deflation be met with a similarly lagged response?


My understanding (I am not an expert) is that tools for dealing with deflation are much less effectively deployable and have ripple effects that can significantly damage the real economy.


This is reasonable, but the problem I encounter is how stifling it seems to ask others to structure their work so specifically. By way of comparison, getting compliance on conventional commit messages is a challenge, and that's an appreciably smaller ask than this.


Oh, for sure. This is how I structure my own PRs, but I've certainly never bothered to ask a coworker to do so, I just appreciate it when I see it.

That said, OP is in an environment where it sounds like this kind of structure is already the cultural norm.


From another one who tries to do the same (but doesn't enforce it):

Thanks!


A quarter of households have net worths greater than $600,000. https://dqydj.com/net-worth-percentile-calculator/

