• 0 Posts
  • 67 Comments
Joined 2 years ago
Cake day: June 21st, 2023






  • Republicans only care about themselves. So they care about THEIR dog, THEIR family, etc. Anything or anyone else is “others” who can burn for all they care. You see it when some of them suddenly support homosexuality when their kid is gay and go “oh, so it’s actually NOT a choice!” But they’re also more likely to disown their kid or shoot their dog if it doesn’t meet their expectations or fit their worldview.






  • I took my dad for cancer radiation treatment. While in the waiting room, this little old lady came in. I saw her struggling to remove a necklace and offered to help. She had really tangled herself in it trying to get it on (definitely in a “chemo brain” mind fog).

    She answered her phone, and I heard a very obvious scam on the other end of the line. I tried telling her, and at first she tried to explain to me that I was wrong, that they were some kind, helpful people. I took the phone from her and confirmed it was a scam. I told the staff at the clinic, but that was about all I figured I could do.

    This AI maybe could have helped. Maybe.



  • BossDj@lemm.ee to Technology@lemmy.world · *Permanently Deleted* · 7 months ago

    I meant that “fabrication” doesn’t imply intent the way “lies” would.

    It seems like you use the “hallucination” term correctly: when the output has no relation to the input.

    In this case, as in numerous others, the AI took the input “cite a source” and did output a source as requested, but invented the content of that source. It fabricated, which means to make up, to create.

    “Fabricate” does not imply intent to deceive, whereas “lie” does.

    I will agree that if the output is purely unrelated to the input, “hallucination” is still fine, but it’s absolutely a romanticized term when we’re referring to computer-generated output… It’s literally personification.