• JusticeForPorygon@lemmy.world · 30 points · 3 hours ago

      AI absolutely has its benefits, but it’s impossible to deny the ethical dilemma of forcing writers to feed their work to a machine that will end up churning out a half-assed version that also likely has some misinformation in it.

        • JusticeForPorygon@lemmy.world · 4 points · 2 hours ago

          I don’t think so, at least not for a while. Big corporations will surely try to market it that way, but we’ve already seen how badly AI can shit the bed when it feeds on its own content.

    • driving_crooner@lemmy.eco.br · 5 points · edited · 2 hours ago

      The company I work for recently rolled out Copilot, and it’s been a mixed bag of reactions. The less savvy users were blown away by the demonstration at first, but then got exasperated when it didn’t work the way they thought (one of them uploaded an Excel file, asked for some analysis it couldn’t do, and came to me to complain about it). For me and my team, though, it has worked great. I’ve been uploading some of my Python and SQL scripts and asking it to refactor them and add comments, or uploading my SQL script along with an example I found on Stack Overflow and asking it to apply the example’s method to my script. Something like the sketch below.
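      To give an idea of what I mean, here’s a made-up before/after of the kind of refactor-and-comment request I’d hand to Copilot. The function names and data are hypothetical, not from any real script of mine.

```python
from collections import defaultdict

def monthly_totals_before(rows):
    # Original style: manual dict bookkeeping, no docstring or comments.
    totals = {}
    for row in rows:
        month = row["date"][:7]
        if month not in totals:
            totals[month] = 0.0
        totals[month] += row["amount"]
    return totals

def monthly_totals_after(rows):
    """Sum the 'amount' field per 'YYYY-MM' month key."""
    totals = defaultdict(float)
    for row in rows:
        month = row["date"][:7]  # 'YYYY-MM-DD' -> 'YYYY-MM'
        totals[month] += row["amount"]
    return dict(totals)

if __name__ == "__main__":
    rows = [
        {"date": "2024-01-15", "amount": 10.0},
        {"date": "2024-01-20", "amount": 5.5},
        {"date": "2024-02-03", "amount": 7.0},
    ]
    # Both versions agree; the refactor only changes readability.
    assert monthly_totals_before(rows) == monthly_totals_after(rows)
    print(monthly_totals_after(rows))
```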

      I tell everyone that if you don’t know shit, the AI isn’t going to help much, but if you have at least the basics, it will help you.

    • AbeilleVegane@beehaw.org · 3 points · 2 hours ago

      I like AI. But I’m not sure I like the way we use it if it’s only to meet shareholders’ expectations or to be a tool for greedy people. What is your opinion concerning the way we seem to use AI in academic research?