• 1 Post
  • 45 Comments
Joined 2 years ago
Cake day: July 1st, 2023



  • Did I say that it did?

    No?

    Then why the rhetorical question about something I never stated?


    Now that we’re past that, I’m not sure if I think it’s okay, but I at least recognize that it’s normalized within society. And has been for like 70+ years now. The problem happens with how the data is used, and particularly abused.

    If you walk into my store, you expect that I am monitoring you. You expect that you are on camera and that your shopping patterns, like all foot traffic, are probably being analyzed and aggregated. What you buy is tracked, at least in aggregate, by default; that's just volume tracking and prediction.

    Suffice it to say that broad customer behavior analysis has been a thing for a couple of generations now, at least.

    When you go to a website, why would you think that it is not keeping track of where you go and what you click on in the same manner?

    Now that I’ve said that, I do want to note that the real problems we experience come from this data being misused beyond what its scope should be. We should have strong regulatory agencies enforcing compliance with how this data is used, and enforcing the right to privacy for people who want their data removed.


  • I build software and can confirm this.

    This is pretty run-of-the-mill analytics and user session recording. There’s nothing surprising here.

    Usually it’s not actual screen recording but rather user action diff recording (which effectively acts like a recording of the application, except that it only captures what changed, so the recording is much cheaper to store; a rough sketch follows below).

    This is extremely effective for tracking down bugs, solving user support issues with software, or watching session recordings to figure out if users are using the software in unexpected ways.
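    To make the diff-recording idea concrete, here is a minimal sketch of how a browser-side recorder can log DOM changes and user actions as small structured events instead of pixels. The event shape and the /analytics/session endpoint are hypothetical, and real recorders serialize the DOM far more thoroughly; this only illustrates the basic technique using standard browser APIs.

    ```typescript
    // A minimal sketch of diff-based session recording, assuming a
    // hypothetical /analytics/session collection endpoint.

    type SessionEvent =
      | { t: number; kind: "mutation"; detail: string }
      | { t: number; kind: "click"; x: number; y: number; target: string };

    const events: SessionEvent[] = [];

    // Capture what changed in the page, not a screenshot of it.
    const observer = new MutationObserver((mutations) => {
      for (const m of mutations) {
        events.push({
          t: Date.now(),
          kind: "mutation",
          detail: `${m.type} on <${m.target.nodeName.toLowerCase()}>`,
        });
      }
    });
    observer.observe(document.body, {
      childList: true,
      attributes: true,
      characterData: true,
      subtree: true,
    });

    // User actions become small structured events.
    document.addEventListener("click", (e) => {
      const target =
        e.target instanceof Element ? e.target.tagName.toLowerCase() : "unknown";
      events.push({ t: Date.now(), kind: "click", x: e.clientX, y: e.clientY, target });
    });

    // Periodically flush the accumulated diffs; a replay can then be
    // reconstructed server-side by applying them in order.
    setInterval(() => {
      if (events.length === 0) return;
      navigator.sendBeacon("/analytics/session", JSON.stringify(events));
      events.length = 0;
    }, 5000);
    ```

    Because each event is a tiny delta rather than a frame of video, storage and bandwidth costs stay low even for long sessions, which is what makes this approach so common in analytics tooling.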

  • Yeah, but they hold none of the actual emotional needs, complexities, or nuances of real human connections.

    Which means these people become further and further disconnected from the reality of human interaction, making them social dangers over time.

    Just like humans who lack critical thinking are dangers in a society where everyone is expected to make sound decisions, humans who lack the ability to socially navigate or connect with others are dangers in a society where everyone is expected to be socially stable.

    Obviously these people are not in good places in life. But AI is not going to make that better. It’s going to make it worse.