When does moderation become censorship? (disqus.com)
28 points by tomkwok on May 23, 2016 | 58 comments


Basically never, at least on the internet? No one owes anyone else a soapbox. My space, my rules. There's nothing stopping anyone from rolling up a site and saying whatever they want to say, but there's absolutely no reason I have to give space on my site, no matter what my site is, for anyone else's viewpoint.


You are contradicting yourself, and did not even answer the question. What you should have said was "as the site owner and moderator, whenever I damn well please". But instead you said it never happens.

I have almost zero sites I log in to, for a reason. I would think that censorship comes into play as soon as a website becomes embroiled in taking a political viewpoint to the extreme of not allowing the opposite side to speak. If we are talking about individual attacks, obviously that would and should fall into the moderation category.


I think what trowawee meant is that they never equate deleting comments (or whatever) with censorship. It sounds like you do. This is a semantic conflict re: what "censorship" means, precisely.

Personally, I think that deleting someone's comments is censorship, but that doesn't mean it's a bad thing. Censorship gets a bad rap, but purging certain viewpoints from a certain venue is not inherently evil or negative. For example, Hacker News does it with the shadowbanning system. I don't mind because dang is a thoughtful moderator.


Also censorship has a specific denotation, i.e. that it is performed by an agent of the state, and specific connotations, i.e. that the censored is unable to express themselves in other ways (among other connotations). A private citizen deleting your comment on a webpage they own and administrate matches neither the denotation nor connotations of "censorship", except in the broadest theoretical reading. And that reading would appear to prohibit any maintenance of comment sections whatsoever.


> Also censorship has a specific denotation, i.e. that it is performed by an agent of the state

This is not correct; TV networks have censors that operate internally. The government doesn't operate television censors; it is limited to punishing the networks after the fact.

> A private citizen deleting your comment on a webpage they own and administrate matches neither the denotation nor connotations of "censorship", except in the broadest theoretical reading. And that reading would appear to prohibit any maintenance of comment sections whatsoever.

This is a confused argument; you seem to be proceeding from the premise that censorship is bad, and therefore a phenomenon which is not bad can't be censorship. And then that if it was censorship it somehow wouldn't be allowed. (Observe that the definition of "censorship" cannot in itself prohibit or mandate anything.)

It's much more productive to agree on a definition of a term, and then argue over whether any particular example is good, bad, or a mixture of both, than to agree that a term must be good, and then argue over whether certain things are metaphysically able to fit under that term.


Thanks, I see this kind of (il)logic on HN a lot and it's infuriating. "I think X is bad, this is not bad, therefore not X" or "I think Y is good, since this is not good, this is not Y".


To be totally fair, the article endorses this premise, insofar as the only difference between "moderation" (in the sense of the piece) and "censorship" is that one is good and the other is bad.


1. When a television employee is censoring according to the rules of the state, they are operating as an agent of the state.

2. No, I'm not. I'm proceeding from a definition of censorship as "the practice of officially examining books, movies, etc., and suppressing unacceptable parts." Getting into "the dictionary says..." arguments is mostly boring, but most of the definitions do include some element of official imprimatur. Someone removing comments from their personal site fails to meet that official element.


Media censors take the rules of the state into account, but they are primarily concerned with pressure groups, not the state. They are most definitely not state agents; that would violate the constitution.

The publisher of a website removing comments from it meets the same soft officialness standard as a newspaper declining to run articles on a particular topic, or a television network refusing to air episodes in which someone disagrees with the group but doesn't end up suffering for it (not an invented example, by the way -- that was a real element of a code governing children's cartoons).


The majority of censors are working to prevent state censure; they are primarily concerned with the state. The primary work they do is removing profanities and indecent images at the behest of the state.


"The primary work they do is removing profanities and indecent images at the behest of the state."

I'm sure that's the primary function of the Chinese government's Central Propaganda Department.

Censorship is when someone, or some organisation, who has power over you controls your ability to access information.

From a government regulating the content internet users can access and deleting content contrary to its goals, to administrators of personal websites deleting comments by people espousing views the owner of said website considers disagreeable.


Except the government's power to censor you goes way farther than somebody on a private site deleting your comment. You're saying popguns and artillery exist in a continuous spectrum, and I'm saying they're completely different categories and should thus be treated differently.


The diameter of the star R136a1 is so much larger than that of a typical red dwarf. They might be different categories of stars, but they're both stars nonetheless.[1]

I'd say a personal website eliminating disagreeable views from comments is actual censorship, even if its ability to silence views is a few orders of magnitude smaller than state censorship.

If state censorship is artillery, personal censorship is at least a zip gun, and it can injure someone.[2]

[1] https://en.wikipedia.org/wiki/R136a1#/media/File:The_sizes_o...

[2] https://en.wikipedia.org/wiki/Improvised_firearm#/media/File...


Who can it injure? The state can actually legally prevent you from saying something ("That is slander/libel") at the threat of punishment. I can just prevent you from saying that on my website. I can delete your comment on trowaweeritesgud.com, but I can't stop you from saying whatever ("Kumquats are a better fruit than mangos!") in the comments here, or on NPR, or from rolling up your own site and saying it there. These aren't just (massive) differences in scale, they're also massive differences in effectiveness of the injunction. That's what moves it into its own category.


You can't stop comments on NPR, and China can't stop people accessing information in a foreign country, bringing it back to China, and providing it in person in a secret room either. They are massively different in scale, but that is not a reason to put them in different categories (same as with stars: there are massive differences between them in scale, but they're stars nonetheless).


> most of the definitions do include some element of official imprimatur.

It's more correct to say most include some element of superior right or ability, often not attributable to the state.


I think that's incorrectly narrowing the concept of "censorship" to mean only "government censorship," when other kinds exist (e.g. https://en.wikipedia.org/wiki/Corporate_censorship).


Mostly because, in an era where you can stand up a website on a worldwide communication network for less than $50, I kinda think that the definitions of censorship have narrowed.


>>Also censorship has a specific denotation, i.e. that it is performed by an agent of the state

Uh, no. Censorship means the suppression of speech or information. Said suppression does not require involvement from an agent of the state to qualify as censorship.

https://en.wikipedia.org/wiki/Censorship


Suppressing speech requires the power to make you stop. Deleting a comment on my site isn't suppressing your speech, because you can still speak where you want to except for on my site.


It suppresses speech on your site. That's what matters.

Same with government-enforced censorship: North Korea can censor speech, but only within its borders. Just because North Koreans living in the US can speak out against Kim Jong-un doesn't mean that what North Korea is doing doesn't count as censorship.


No, I'm not. If you're commenting on my site, I can delete your comment for literally any reason, and it isn't censorship. It's my space, and I can make whatever I want of it. You always have the freedom to go spin up a site and do or say whatever you want within the bounds of the law.


I think you have a point if you're talking about a personal website, but I think things get a whole lot blurrier when you're dealing with places that are communities, have an ethos of free expression, or have taken on the characteristics of public forums. I think it's an oversimplification to view those solely through the private property lens.


It's still private property, tho. I get that the internet has twisted up our perceptions so it seems like, say, HN is a public space. But it isn't. Somebody is paying to host this, and ultimately they get the final say about what shows up here because of that. I frequently disagree with Graham and dang, and frankly a ton of the YC folks, and if they decide, hey, the hell with this guy, he can't say we're a bunch of stuck-up overly-wealthy nincompoops on our site, that's 100% their right, and it's not censorship, because this is their website. Even if it has the characteristics of a public forum, it is not a public forum, and that distinction is meaningful.


It's still censorship, even if it's within their legal rights as owners to do it.


Unless net neutrality takes over.


That is not what net neutrality means.


My idea of moderation is:

1. Outline rules for a community, make them clear and understandable

2. Enforce those, and only those, rules.

3. If something objectionable happens, and the rules don't cover it but you think they should, then change the rules as you see fit.

4. Handle the fallout of the rules changing.

Steps 3 and 4 are where things mostly fall apart in communities. Personally, any time something happens that leads me to change a rule, the change is reactive and reflects the voice of the majority of people (not the loudest, the majority). Most people are happy that way.

Disagreement is never a rule I'd have. I'd only make sure that people either have the tools to self-censor what they don't want to see, or tell them they might be happier going elsewhere.


> Outline rules ... clear and understandable

There are many discussions about rules being either clear or understandable, but usually not both.

> Enforce...

Oh. We "just" enforce it? You sure this mechanism will be both clear and understandable, and be enforced as intended?

> change the rules

Change the rules, in reaction to specific instances, while keeping them clear and understandable? This is why rules usually end up neither clear and understandable nor enforceable.

> Steps 3 and 4 are where things mostly fall apart in communities.

There is no community where this idea of moderation could hold true.


Really? It seems as though most online communities try to follow this logic.

I fully get that 'clear' and 'understandable' are very subjective, though.


I've seen so many moral-handwringers try to justify silencing people by trotting out that XKCD comic as though they were "the community" that Cueball weirdly stopped feeling like a positive hacker character for me - half the time I encounter him now it's as some apologist for "I know best" assholery. The "Moral Majority" all over again.

In my experience it's hardly ever been the community showing someone the door, it's always a corporation getting cold feet over potential controversy, or some narrow-minded zealot abusing their janitorial moderator powers.

Last time this happened it was the community that overwhelmingly wanted to show the zealot the door (a vote happened while the mod was away), but they couldn't - they were nice normal people and not the kind of weasels who dedicate hours of janitorial work for the chance of having veto power over others in an internet forum.

Moderation and censorship can be separated - readers can be given the option to circumvent the moderation when/if they choose, or they can have the option to "unsubscribe" from the actions of moderators they disagree with. To create genuine public spaces we are going to have to find a way that doesn't assume incorruptible integrity of moderators (the role attracts weasels). Private websites can certainly behave however they want, but they are currently standing in for our public spaces - and presenting themselves as such, which means we have none.


  ...or they can have the option to "unsubscribe"
  from the actions of moderators they disagree with
This is a really interesting idea. Another would be to make moderation opt-in, e.g. you select the curated filters you want to apply to forum X based on their reputation for fairness and pleasantness. These might in fact be the personal filter settings of upstanding members, or a vote between a few such members.

Anyway, thanks for the yummy food for thought.
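
To make the "unsubscribe from a moderator" / opt-in-filter idea concrete, here's a rough sketch (all names are made up for illustration, not any existing forum's API): removals are recorded per moderator, and each reader decides whose removals actually apply to their view.

  # Hypothetical sketch of opt-in moderation: removals are recorded per
  # moderator, and each reader chooses which moderators' actions to apply.
  from dataclasses import dataclass, field

  @dataclass
  class Comment:
      id: int
      author: str
      text: str

  @dataclass
  class ModAction:
      moderator: str
      hidden_comment_ids: set = field(default_factory=set)

  def visible_comments(comments, mod_actions, subscribed_moderators):
      """Return the comments a reader sees, applying only the removal
      actions of moderators the reader has opted in to."""
      hidden = set()
      for action in mod_actions:
          if action.moderator in subscribed_moderators:
              hidden |= action.hidden_comment_ids
      return [c for c in comments if c.id not in hidden]

  # Example: a reader who trusts "dang" but has unsubscribed from "zealot"
  comments = [Comment(1, "alice", "hello"), Comment(2, "bob", "spam")]
  actions = [ModAction("dang", {2}), ModAction("zealot", {1})]
  print([c.id for c in visible_comments(comments, actions, {"dang"})])  # [1]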



A moderator --is-- a censor.


Yes. Agreed. They're different terms for the same thing, with different cultural connotations.


Basically what I came to say. daurnimator said it better, though.


If Disqus would simply design their software so it allows users to filter/block whichever users they don't want to see, no one would need to be censored or moderated. All the trolls, paid or not, would vanish from a user's view, and it would be up to each user whose messages they see. Problem solved.
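
Something like this per-reader filter (the names are invented for illustration; this is not Disqus's actual API): nothing gets deleted globally, each reader just stops seeing the authors they've blocked.

  # Hypothetical per-reader blocklist: nothing is deleted globally;
  # each reader just stops seeing authors they have blocked.
  def filter_for_reader(comments, blocked_authors):
      """comments: list of {'author': str, 'text': str} dicts."""
      return [c for c in comments if c["author"] not in blocked_authors]

  thread = [
      {"author": "troll42", "text": "bait"},
      {"author": "carol", "text": "useful reply"},
  ]
  print(filter_for_reader(thread, blocked_authors={"troll42"}))
  # -> only carol's comment remains for this reader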


Moderation: the action of making something less extreme, intense, or violent

Censorship: the practice of officially examining books, movies, etc., and suppressing unacceptable parts

Moderation is censorship.

To say otherwise is to make a distinction without a difference.

Rephrase your question to have meaning. Perhaps "When is censorship justifiable?"


Moderation is always censorship. But censorship isn't always problematic. Government censorship, or censorship by an entity that has effectively the same kind of broad control over media of communication is problematic.


As others have noted, moderation is censorship, at least by action. Neither, however, falls under the legal protections of free speech, which don't apply here.

The first question is: what is a given community for, and how should it accomplish those goals? Individual moderation, collaborative filtering, individual killfiles, expert ratings systems, etc., are all tools toward these ends. They're non-trivial.

https://www.reddit.com/r/dredmorbius/comments/28jfk4/content...

Community behavior is very strongly dependent on BOTH scale AND founder cohorts.

A group with one person (blog, winking in the dark) is different from one with two, or a small set of people engaged in discussion (say 3-30), or a larger group discussing common topics (say 30-100), etc. Part of this can be thought of as a cost function in which the positive contribution of members falls with scale, while the cost of each additional participant is rather more constant. Eventually, adding more participants makes the experience worse for all.
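
To put rough numbers on that cost function (the curve shapes and constants here are made up for illustration, not measurements): per-member contribution shrinks with group size while per-member cost stays roughly flat, so net value eventually goes negative.

  # Toy model: each member's positive contribution shrinks as the group
  # grows (here, proportional to 1/n), while the cost each member imposes
  # (noise, coordination) stays roughly constant.
  def net_value_per_member(n, base_value=10.0, cost_per_member=0.5):
      contribution = base_value / n      # assumed decay with group size
      return contribution - cost_per_member

  for n in (2, 5, 10, 20, 50, 100):
      print(n, round(net_value_per_member(n), 2))
  # With these made-up constants, value turns negative just past n = 20:
  # the point where adding participants makes the experience worse for all.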

It's really difficult for any one conversation to have more than a few key participants. Two and a moderator, or perhaps 5-6 participants who know each other well and get along.

If you're trying to arrive at some truth or understanding, it's really difficult not to have a truth-based moderation criterion.

Individual killfiles are somewhat useful, except that the killfile's owner tends to see a great many one-sided discussions (others interacting with those they've filtered). Unless the system blots out both sides, this accomplishes little.
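
A sketch of the "blot out both sides" point, assuming a simple parent-pointer comment list where parents appear before their children (all names hypothetical):

  # Hypothetical killfile that also hides replies to killfiled authors,
  # so the reader isn't left with one-sided conversations.
  def killfile_view(comments, killfiled_authors):
      """comments: list of {'id', 'parent_id', 'author', 'text'} dicts;
      parent_id is None for top-level comments."""
      hidden_ids = set()
      for c in comments:  # assumes parents appear before their children
          if c["author"] in killfiled_authors or c["parent_id"] in hidden_ids:
              hidden_ids.add(c["id"])
      return [c for c in comments if c["id"] not in hidden_ids]

  thread = [
      {"id": 1, "parent_id": None, "author": "troll", "text": "flamebait"},
      {"id": 2, "parent_id": 1, "author": "dave", "text": "angry reply"},
      {"id": 3, "parent_id": None, "author": "erin", "text": "on topic"},
  ]
  print([c["id"] for c in killfile_view(thread, {"troll"})])  # [3]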

I've started looking more at discussion tools which foster both smaller and larger groups. "Warrens" and "plazas". The idea of creating persistent communities well below Dunbar's number (say 50-300 people) and promoting up material from those has some appeal (you'd also want to allow for lateral movement of individuals). A small set of tiers would easily accommodate the entire global population. (And yes, there's a social network set up on this basis though the name escapes me.)
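
A rough back-of-the-envelope on "a small set of tiers would easily accommodate the entire global population", assuming ~150-unit groups at each tier (the group size is my assumption):

  # If each tier aggregates ~150 units from the tier below, the number of
  # tiers needed to cover the world population is log_150(7.5e9).
  import math
  group_size = 150
  population = 7.5e9
  tiers = math.ceil(math.log(population) / math.log(group_size))
  print(tiers)              # 5
  print(group_size ** 5)    # ~7.6e10, comfortably above 7.5e9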

A huge problem with allowing noisy participants is that they draw the oxygen out of the room, and tend to very strongly discourage high-quality participants. There's a curiously persistent asymmetry between an individual's proclivity to participate in a group discussion and the interest of others in their doing so. Good designs balance this mismatch.


It becomes censorship when a moderator removes a comment or discussion based purely on personal beliefs and disagreements.

Moderation should be about removing the trolls, not what it has become, which is censorship.

It feels like many people personally enjoy moderating down disagreements because they don't get any actual power in their own lives.

It's one of the reasons I stay away from most online discussions these days: because nobody can be open and honest. It really makes me wonder if this is the reason why secret groups like the Masons were created. Back in those days, instead of getting moderated down, you were killed or attacked.


"Moderation should be about removing the trolls"

True, but some argue that those who support Trump, or who are against gender identity equality, are trolls.


That's the problem in a nutshell. About 95% of the people out there think the definition of "troll" is "someone who disagrees with me", and I don't see how you ever really get away from a subjective determination.


Because it's a fundamentally subjective evaluation. There is no way to objectively define "trolling".


> There is no way to objectively define "trolling".

Oh, it's been very objectively defined. Knowing when someone is trolling, or when they legitimately believe what they're saying, is the problem.


So...it's defined, but it's impossible to know whether it's happening unless the person specifically tells you they're trolling? That sorta seems like a flawed definition.


Can you define lying? Has lying been defined? Check your dictionary.

Knowing what a lie is is not the same as knowing when someone is lying.


I can prove something is false. I can't prove someone else's motivation.


> or who are against gender identity equality

"Anyone who doesn't agree with my ultra progressive views that people should be able to define their own gender identity is a hateful bigot and needs to be silenced"


Awww, sad lil' hater.


And Hacker News deletes it, <sigh>


Epic

Edit: I don't think this item has been deleted; it apparently was flagged and has been "moderated" automatically, without human intervention?


Sounds plausible - abusing the flag/report system seems to be common with groups who like to act as self-appointed censors.

How did you tell the difference? (or suspect the difference)


... immediately.


A user-driven upvote/downvote system is a form of decentralized moderation. Do you also believe that if a post is being downvoted, it's censorship? Do you believe that upvotes are a tool of censorship?

They can be, of course. Censorship is not just deletions/removal. You can censor something by hiding it, drowning it out, discrediting the author, etc.

Now keep all that in mind before giving a one-word response to a very complex question.


The article uses moderation in a sense apart from an upvote/downvote system. Your argument is premised on an upvote/downvote system being moderation. How exactly does it moderate? If it merely reorders and doesn't remove, then this is not moderation in the context at hand.

Therefore your argument is invalid. (Your argument is really just orthogonal, premised on different assumptions.)


The point I was trying to make was that you can't just say "immediately" and get away with it. And upvotes/downvotes don't necessarily have to merely reorder. On many websites, a large enough count of downvotes automatically hides and sometimes deletes the post in question.
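
Something like this threshold rule is what I mean (the threshold and names are invented for illustration, not any particular site's behavior):

  # Hypothetical threshold rule: a post's score is upvotes minus downvotes;
  # at or below HIDE_THRESHOLD the post is collapsed/hidden for readers.
  HIDE_THRESHOLD = -5

  def is_hidden(upvotes, downvotes):
      return (upvotes - downvotes) <= HIDE_THRESHOLD

  print(is_hidden(upvotes=3, downvotes=4))   # False: score -1, still shown
  print(is_hidden(upvotes=1, downvotes=10))  # True: score -9, collapsed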


Sorry about the delay in responding ...

I don't consider upvotes/downvotes by themselves to be moderation or censorship. That's just non-verbal communication.

However, when an upvote or downvote system hides a post or promotes a post, then it's censorship, because promoting a post necessarily hides some other post.



