…without informed consent.

  • @LesserAbe@lemmy.world

    This is a good post.

    Thinking about it some more, I don’t necessarily mind if someone says “I googled it and…” and then provides a self-generated summary of what they found that’s relevant to the discussion.

    I wouldn’t mind if someone did the same with an LLM response. But just like I don’t want to read a copy and paste of ChatGPT results, I don’t want to read someone copy/pasting search results with no human analysis.

    • @belit_deg@lemmy.world

      I have a few colleagues who are very skilled and likeable people, but have horrible digital etiquette (40- to 50-year-olds).

      Expecting people to read regurgitated GPT summaries is the most obvious.

      But another one that bugs me just as much is sharing links with no annotation. Could be a small article or a long-ass report or white paper with 140 pages. Like, you expect me to bother to read it, but you can’t be bothered to say what’s relevant about it?

      I genuinely think it’s well intentioned for the most part. They’re just clueless about what makes for good digital etiquette.

    • @Almacca@aussie.zone

      If you’re going to use an LLM, at least follow the links it provides to the sources of what it outputs. You really need to check its work.