• deweydecibel@lemmy.world

    It’s probably not true anymore, but at the time this guy was being radicalized, you’re right, it wasn’t algorithmically catered to him. At least not in the sense that it was intentionally exposing him to a specific type of content.

    I suppose you can think of the way reddit works (or used to work) as content-agnostic. The algorithm isn’t aware of what it’s suggesting to you; it just surfaces posts based on subreddit popularity and user voting, regardless of what the content actually is.
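
    A content-agnostic ranker in that sense only needs vote totals and timestamps; it never inspects the post itself. Here is a minimal sketch of that idea in Python (hypothetical field names and decay constant, not reddit’s actual code):

    ```python
    from dataclasses import dataclass
    from math import log10

    @dataclass
    class Post:
        title: str        # never read by the ranker
        score: int        # upvotes minus downvotes
        age_hours: float  # time since submission

    def hot_rank(post: Post) -> float:
        # Popularity dampened logarithmically, decayed by age.
        # What the post is actually about plays no role at all.
        return log10(max(post.score, 1)) - post.age_hours / 12

    def front_page(posts: list[Post], n: int = 25) -> list[Post]:
        return sorted(posts, key=hot_rank, reverse=True)[:n]
    ```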

    YouTube’s and Facebook’s algorithms, in contrast, take the actual content into account and funnel you toward similar content algorithmically, in a way that is unique to you. That means at some point the algorithm is effectively deciding, “this content has problematic elements, let’s suggest more problematic content.”
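
    A content-aware recommender of that kind embeds each item, builds a profile from what you’ve already engaged with, and surfaces the nearest neighbours, so whatever you clicked on is what gets amplified. A toy sketch of that pattern (made-up embeddings; this is not YouTube’s or Facebook’s actual system):

    ```python
    import numpy as np

    def cosine(a: np.ndarray, b: np.ndarray) -> float:
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9))

    def recommend(watched: list[np.ndarray],
                  candidates: dict[str, np.ndarray],
                  n: int = 5) -> list[str]:
        # User profile = mean embedding of items already watched, so the
        # feed drifts toward more of whatever the user engaged with.
        profile = np.mean(watched, axis=0)
        ranked = sorted(candidates,
                        key=lambda vid: cosine(profile, candidates[vid]),
                        reverse=True)
        return ranked[:n]
    ```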

    (Again, modern reddit, at least on the app, is likely engaging in this now to some degree.)

    • CopHater69@lemm.ee

      That’s a lot of baseless suppositions you have there. Stuff you cannot possibly know, like how reddit’s content algorithms actually work.