
This long article doesn't address the most critical point.

If employees have an ethical issue with working there, why don't they quit?

This isn't a new dilemma. People working for tobacco companies have always been in the same boat.



> If employees have an ethical issue with working there, why don't they quit?

Always super easy to say for someone else. Why haven't most of the people in the US government quit by now? In fact, insecurity about getting an equivalent job is a real thing. Not wanting to abandon a project you've worked on and expertise you've developed is a thing. Not wanting to abandon people you've worked with is a thing. Thinking you can do more good from within is a thing.

Even for those who consider quitting in protest, waiting until just after the next bonus/vesting day seems like a reasonable concession to other psychological or financial needs. Then that date rolls around, and all of that inertia is still there. Expecting others to make a hard decision that you've never been in a position to make yourself seems unrealistic and unhelpful.

[ETA: I don't mean to say this particular person has never been in that position. I don't know them. I've even forgotten their name. But I've seen this sentiment expressed many times and it always seems facile.]


agreed with this. for me personally, leaving FB was a confluence of directional concerns which arose for me in 2019 (as described in the leaked internal post), with a concomitant desire to spend my time on new, more human purposes. this was balanced against me enjoying my time working with the remarkable people on my team, a few logistical considerations, a surprise global pandemic, and also years of trust and good faith that FB had earned from me as both employer and societal phenomenon. i spent a little time scoping out stuff within FB as well (it's a big place), but factoring that out it still took me almost a year. the inertia is clearly there.

but i consider myself lucky! there are many people who experience all sorts of work-related insecurity with real gravity. folks on work visas. folks paying off student loans or other debt. folks who had been working much worse tech jobs before and feel their FB offer was a lucky break. also the many folks at FB not on the tech side who don't earn stereotypical tech comp.

i have no idea how many people there are who satisfy these conditions or similar ones, but my guess is that if you're ever thinking to yourself, "why don't more people just quit?" it's because you're underestimating the scale of folks in that group.


> This long article doesnt address the most critical point.

> If employees have an ethical issue with working there, why don't they quit?

The article opens with a story of someone who quit.


It took him nine years of collecting a fat paycheck and enabling Facebook before he found his moral compass.

Is it wrong to be skeptical of these millionaire developers who ride off into the sunset to retire but leave a final "oh btw, lots of problematic stuff at my former employer" message?


It's not wrong to be skeptical. I do think there's a sort of boiling frog effect, though. Internal company culture never matches public perception, and you're always able to tell yourself a story that the public misunderstands what drives the company. After all, you hang out with your colleagues and you know they aren't bad people. But then some big controversy snaps you out of it and you realize that the company's internal collective fiction is no more accurate than the public perception, and perhaps you are the baddies after all.


hello, i'm the ex-employee in question and can speak authoritatively on the subject of myself :)

i joined FB for the first time as an intern almost a decade ago. i suspect i have a substantively different view of the company and the people who built it—even compared to many other employees, and certainly compared to folks who have not been in the ~room where it happens~ (heh).

even accounting for some amount of insider bias, i think there's still a material discrepancy between how the public viewed certain major FB "scandals"—via the lens of media spin and profitable reporting—and how many folks like myself, who were privy to additional context and private information, viewed them. to some people, the recent discord around hate moderation might seem like just more of the same FB badstuff. not so for me!

even folks at FB who don't work directly on certain products and policies are often immersed in discussion about them (unless they aggressively unsubscribe). these discussions, again, have their bias, but i hope we can concede that it's still a lot of passive brain cycles being spent swimming among these topics. folks develop deeper intuitions for how troubling X or Y publicized issue actually is, relative to all the things (of all flavors) that happen at FB and among its users. there's also a ton more discussion of legitimately positive societal work and how to extend those successes, whereas those rarely make good media narratives.

the good doesn't cancel out the bad—that's never how it works—but you do have to consider both together at every decision point.

in the leaked audio of my internal video post, i say that my long road to the door started in 2019. i still stand by that inflection point, and feel that before then, FB—while clearly on the back foot for some time due to rampant abuses of its product by powerful groups—was on balance moving in a direction i supported.

you may have made a different choice—or you may feel you would make a different choice, though you may not have the full slate of inputs right now to be sure you would, in situ. but, at least for me, i have reasons why the questions around the 2016 election or fake news or data breaches were things which, to me, seemed categorically different from the questions concerning hate moderation.

feel free to DM any earnest questions about why these things don't all congeal into one mass of problem for me (and i suspect other current and former FBers).


Thanks for this, Max. Easy upvote.

I'm still of the opinion that you chose personal gain over doing what you knew to be the right thing. It's something everyone (me included) does many times over. But the scale of wrongdoing by Facebook is exceptional.

Equally as possible is that our intuitions about "the right thing" are wildly different. I think that democracy, societal cohesion, and personal privacy are important and that Facebook has permanently damaged all three.

Which do you think it is?


i think you may be setting up a few false dichotomies here.

> the scale of wrongdoing by Facebook is exceptional

absolutely—but i think you (and certainly, many many people in the world) may be missing that this is in large part due to facebook's scale, period, not any specific wrongdoing-at-scale. facebook is enormous and has enormous impact. enormous quantities of good and bad things happen on facebook and through facebook every day. it's not enough to /just/ point to its "scale of wrongdoing" to say that it's "wrong" to associate with it. every government in the world causes harm at scale. should we embrace true anarchy? are we all culpable for participating in society? i mean, maybe yes to that last question; but it's not a very interesting answer, possibly because it's not a very interesting question.

i think you have to consider things holistically. facebook does harm—that's bad! does it do good at scale, also? how much of the harm is facebook's "responsibility", at least from the perspective of assigning moral culpability? (b/c obviously it's better to treat all fouls as your responsibility, because you control only your own actions.)

consider the 2016 election mess on social media. do we pin the blame squarely on social media here? if so, why? i think social media was caught off-guard. social media started its life as small circles and communities, grew into a media and meme distribution platform, and then somehow became hijacked by powerful entities fueled by state money (including states themselves) as a dezinformatsiya and propaganda side hustle. was it FB's responsibility to predict and prevent that, even when so few others did or could? if so—are those other people and platforms not morally culpable for not making enough noise or action? are the state actors themselves not morally culpable for their atrocious agency?

in late 2016 and early 2017, i read lots of opinions and articles on the NYT about how FB didn't defend against disinformation and propped up negative political content. why did the NYT write so few opinions and articles about how the NYT published story after story of tabloidal but-her-emails drama, or how the NYT gave the trump sitcom team so much free press?

facebook is very big and powerful but it is not beyond exploitation. should we hold it morally culpable for being exploited in these ways? facebook is not always just the messenger, but a lot of the time it is, and perhaps we should not be so quick to proverbially shoot it.

> I think that democracy, societal cohesion, and personal privacy are important and that Facebook has permanently damaged all three

dovetailing off the above: are the instruments of the state not causing this same damage? are traditional media conglomerates not causing this same damage? are political entities accepting massive cash injections not causing this same damage? are certain social institutions like megachurches not causing this same damage? are social norms sculpted by late capitalism not causing this same damage?

do you curse your server for bringing you bad food?

this question is rhetorical. the answer is obviously yes, people do that all the time, but like, maybe they shouldn't.

so for me the question is, how much of this damage do i feel that facebook is "immediately" responsible for, versus "secondarily" responsible for. i brought up the 2016 election above. my view is that FB was in large part taken by surprise and exploited in the course of those events.

obviously, i have insider bias here. but much as both-sides-ing every single argument is not actually a "neutral" position, neither is the position of being on the "outside" and not having insider bias. neutrality is a fiction; there's nothing truthier about being on the insider vs. on the outside.

but my position gave me (and other FBers) access to information about motion, decisions, and human actions which informed my evaluation of culpability. many of the "scandals" in FB-the-darling-child-of-the-media's history had a similar feel. facebook had some posture, it largely worked alright and allowed for good or neutral stuff to happen, conditions changed, "small" (but still at-scale) but high-profile harm was incurred due to some bad faith actors, and facebook responded. not interesting to me, someone who watches this process happen literally every day.

employees didn't have trust in FB because they were given literal kool-aid to drink; it's because they had extra information and sometimes it led to obvious conclusions that are utterly non-obvious without that information.

but i think facebook's response to hate moderation has a different texture. it's had years to adapt to the new reality of constant assault from political forces (though i would never, ever expect or want perfection here). but rather than pushing back—as it did in pretty much all past privacy or data breaches—it seems to be adjusting to explicitly allow for and tolerate some of this behavior which i consider "bad".

why i stayed at FB past the moment at which i developed this concrete concern is for entirely selfish reasons. but don't conflate that inflection point with the whole history of FB's narrative, as told by the media. you may feel the whole story has a uniform texture, but i don't, and i suspect other FBers do not, and i suspect it's because there are good reasons to feel that way.


You're right. He quit. However, he seems to be the exception.


Maybe, I guess it's hard to know without knowing their retention metrics. As one anecdote, I was in the negotiation phase of an Oculus offer earlier this year and turned it down the same weekend of that infamous post[1] and I know a few people in my network whose employment status has quietly changed away from Facebook.

[1] Less because I was certain they should moderate it, and more because I thought it was weak how Zuckerberg threw Jack Dorsey under the bus, and how cynically the company steers the conversation to free speech to avoid taking responsibility for the adverse effects of product decisions. artfulhippo made the point I'm trying to make up-thread: https://news.ycombinator.com/item?id=23929769


FB employees launched very strong virtual protest in recent times. Some had even skipped dessert after their meals. I do not know what further moral courage one can show beyond this.


> People working for tobacco companies have always been in the same boat.

That's not a very fair comparison - there aren't exactly any huge positives associated with tobacco use. So it's not like someone could work for a tobacco company and say "sure some people die from consuming our product but on the other hand look at all the good we're doing..."

That's very different from Facebook, where yes, there may be some negatives, but there are also positives (e.g. it's certainly raised countless millions for various charities, helped people keep in touch with others and discover (or rediscover) relationships, and spread awareness of important social issues).


The positives you listed are features of most social networks, not just Facebook. If Facebook disappeared overnight, another platform(s), hopefully more ethical, would quickly take over its role, and it could be a net positive for the world.


That seems very optimistic to me. Ethically, Facebook is the perfect exemplar of an internet-oriented company. No better, perhaps, but no worse either because it's such a low bar. It seems at least as likely that any replacement would turn out to be even scummier, simply because it would absolutely have to hire from the same talent pool and would have had even less time to figure out what all the ethical dangers are.

But it's simply not going to happen, so this is all moot.


The majority of people stay there. It's likely for one of two reasons: 1. They're morally bankrupt and prefer $. 2. This is a very complex issue, and there's no simple right or wrong solution.


> If employees have an ethical issue with working there, why don't they quit?

$ > Ethics


$1.5 million.



