More than 200 Substack authors asked the platform to explain why it’s “platforming and monetizing Nazis,” and now they have an answer straight from co-founder Hamish McKenzie:

I just want to make it clear that we don’t like Nazis either—we wish no-one held those views. But some people do hold those and other extreme views. Given that, we don’t think that censorship (including through demonetizing publications) makes the problem go away—in fact, it makes it worse.

While McKenzie offers no evidence to back these ideas, this tracks with the company’s previous stance on taking a hands-off approach to moderation. In April, Substack CEO Chris Best appeared on the Decoder podcast and refused to answer moderation questions. “We’re not going to get into specific ‘would you or won’t you’ content moderation questions” over the issue of overt racism being published on the platform, Best said. McKenzie followed up later with a similar statement to the one today, saying “we don’t like or condone bigotry in any form.”

  • b1tstrem1st0@lemmy.world · 11 months ago

    Monetization of such content is questionable for sure, but I agree with what he says about the propagation of such extreme views. Simply being unaware of such things won’t make them go away. People should know who these people are and why they hold these views so we can deal with them better. There’s a lot we could do better but can’t, because of limited awareness and our own negative attitude toward dealing with them.

    • adrian783@lemmy.world · 11 months ago

      this is absolute horseshit. there is a huge difference between giving a platform to spread Nazi ideology and vigilance. to make it simple for you: one says “Nazi good,” the other says “Nazi bad.”

      there are plenty of examples where refusing to host or actively purging hateful content and deplatforming reduces radicalization. substack is basically profiting from and encouraging domestic terrorism.

      • b1tstrem1st0@lemmy.world · edited · 11 months ago

        This is not about a difference of opinion. You are sending a very wrong message with that oversimplification. There is no such difference for Nazis.

        Deplatforming doesn’t always work the way you expect.


    • lolcatnip@reddthat.com · edited · 11 months ago

      And how have the ones we already know about been dealt with? What tangible benefit have we achieved by giving them a platform?

      • b1tstrem1st0@lemmy.world · edited · 11 months ago

        To talk about benefits: for one, we can track an individual’s history.

        Edit: I assume you know that terrorism is still a thing, but most likely you don’t know why it is. That’s the case with most people. Try to think about it.

          • b1tstrem1st0@lemmy.world · edited · 11 months ago

            Let me put it another way: by silencing the Nazis, you erase their footprints, making it easier for them to evade serious attention and live peacefully in their own little bubble.

            If I were an employer, I would never know that one of my employees is a Nazi.