More than 200 Substack authors asked the platform to explain why it’s “platforming and monetizing Nazis,” and now they have an answer straight from co-founder Hamish McKenzie:

I just want to make it clear that we don’t like Nazis either—we wish no-one held those views. But some people do hold those and other extreme views. Given that, we don’t think that censorship (including through demonetizing publications) makes the problem go away—in fact, it makes it worse.

While McKenzie offers no evidence to back these ideas, this tracks with the company’s previous stance on taking a hands-off approach to moderation. In April, Substack CEO Chris Best appeared on the Decoder podcast and refused to answer moderation questions. “We’re not going to get into specific ‘would you or won’t you’ content moderation questions” over the issue of overt racism being published on the platform, Best said. McKenzie followed up later with a similar statement to the one today, saying “we don’t like or condone bigotry in any form.”

  • Sanyanov@lemmy.world · 11 months ago

    I’ll just give an example.

    Recently, when discussing defederated instances, I’ve seen an interesting picture: people cheered defederating instances of Nazis and…pedophiles.

    An average person would see no issue here. Right, one more terrible group banned! Take those perverts down! But there’s a catch that I discovered quite a while ago, and it’s a rabbit hole like no other.

    • First, a pedophile is not a child molester. We wrongly equate the two all the time, and the words have become synonymous.
    • Second, pedophilia is an immutable trait; unlike Nazism, no one can simply decide to stop being a pedophile.
    • Third, many of the pedophile instances, including those that get mass-banned, actually host anti-contact pedophiles, i.e. people specifically committed to never touching or interacting with a child in any inappropriate way; those instances also generally prohibit all forms of child sexual imagery or allow only fictional drawings. And early research suggests this actually helps them. We brought up some reading and scientific articles on the matter throughout the discussion. Conclusion: there are uncertainties, but it seems to work in protecting kids and reducing suicide rates.

    And when you see something like that, you clearly understand that there are a lot of things in the world people still heavily misunderstand while feeling certain about positions they haven’t spent five minutes researching, and that people are already on the slippery slope, banning groups they haven’t taken the time and effort to comprehend. And there’s a lot more of this than just pedophiles; it’s simply a very stark example, one that will probably make most readers uncomfortable and that illustrates the concept best.

    Also, I’m fully aware that most people will likely choose to downvote this, comment nothing, and end up thinking I support child molesters (hell no, if you support child molestation go get some mental health help ASAP, fucking kids is very bad).

    • user91@lemmy.world · 11 months ago

      If all the content on those instances were AI-generated, then your hot take could be taken seriously. We all know it’s not.

      • Sanyanov@lemmy.world · 11 months ago

        I’m talking specifically about instances with strong rules, either prohibiting any child imagery or allowing only drawings (which describes just about any anti-contact place). Both types are heavily defederated from, and barely anyone distinguishes between them and literal child porn instances (which should be not just defederated but seized by the authorities, with their admins brought to justice).

        I’ve updated the third bullet point in accordance with your comment, thank you.

    • sc_griffith@awful.systems · 11 months ago

      sorry what exactly about banning nazis causes one to ban non-offender pedophile support groups. like what is the actual causal link you’re suggesting? if you just mean “I noticed random people endorse this thing I have no opinion on, and also this similar sounding thing I think is bad,” that’s not super compelling

      • Sanyanov@lemmy.world · 11 months ago

        I’m saying that banning Nazis comes from public opinion and perception (which is correct, to my knowledge), and that banning pedophiles comes from the public opinion of the very same people (which is wrong, as far as I know). Both groups (the third being instances full of bots and spam) are heavily banned across the Fediverse, so it’s not just “some people’s opinion” but, essentially, policy for the majority of instances.

        My point is that the organized banning of groups that shouldn’t be banned, and hatred toward groups that shouldn’t be hated, hasn’t stopped, and without venues for free speech we may never realize that, and will keep hating and banning exactly those we need to support to make this world a better place.

        • sc_griffith@awful.systems · 11 months ago

          by causal link, I mean how does banning nazis cause support groups for non-offending pedophiles to get banned. like how does that actually happen. please be as specific as you can be

          • Sanyanov@lemmy.world · 11 months ago

            I see.

            It’s not that banning Nazis directly causes the banning of non-offending pedophiles; it’s that banning people considered dangerous causes both, with Nazis just setting the precedent (because they are obviously bad, and there’s little disagreement there). The Fediverse is just one example where the banning doesn’t stop at Nazis. Other groups get banned too, sometimes without much consideration, and this happens on many different platforms: Tumblr, Discord, Facebook, and even daddy Elon’s Xitter, to name a few.

            This is part of my argument for why we need spaces with completely free speech. We cannot expect instance admins, or even platform owners, to be completely objective in their judgments of right and wrong, and we can’t trust them to be unaffected by societal stereotypes.

            Moreover, even in such an ideal scenario where they are fully objective, their userbase might think differently, forcing admins to take measures against various marginalized groups.

            At that point, it seems to me the only way out of this conundrum is having some platforms - not mainstream ones, mind you - that allow everything: platforms from which positive but initially rejected ideas can spread.

            • sc_griffith@awful.systems · 11 months ago

              nobody but nazis wants to be on those lol. go post on gab or whatever if you want that. it’s free. you can do it. you just don’t actually want to

              • Sanyanov@lemmy.world · 11 months ago

                Why would I want to post anything on Gab, a far-right platform?

                I hoped we’d keep having a sensible conversation.

                Substack, for its part, is used by all kinds of authors and is absolutely not limited to Nazis.

                • sc_griffith@awful.systems · 11 months ago

                  the site you are imagining, the supposed free speech site? it converges to gab. this dynamic is basic and I can’t take you seriously if you don’t get this.

                  • nazis are encouraged to be equal voices on a platform
                  • they use the platform’s reach to radicalize fence sitters
                  • other users, realizing their digital roommates are Nazis, are alarmed and leave
                  • now it’s a nazi site

                  what exactly do you think substack will consist of in two years if they don’t do a 180? the entire reason we’re having this conversation right now is that a bunch of substack writers said they would rather leave than hang out with nazis