

ChatGPT is not sentient; it doesn’t feel, and it doesn’t have thoughts. It has regurgitation and hallucinations.
ChatGPT isn’t sentient, doesn’t feel, and doesn’t have thoughts. It has <insert equally human behavior here>.
While I agree with what you mean, I’d just like to point out that “hallucinations” is another embellished word like the ones you critique: for an AI to have real hallucinations, it would need to think and feel. Since it doesn’t, its “hallucinations” are hallucinations only to us.
Or maybe it was trained on some SF; agents like ScubaGPT are always self-preserving in such stories.