• 0 Posts
  • 68 Comments
Joined 2 years ago
Cake day: July 7th, 2023

  • Yeah, this can be an unpopular opinion on Lemmy, because there’s a giant Linux circlejerk. But the unfortunate reality is that switching to Linux does have some major stumbling blocks. The “switching is so easy, just do it” crowd totally glosses over them, and that kind of rhetoric doesn’t help long-term adoption. If a new user has only heard “switching is so easy” and then immediately runs into issues, they’ll be more likely to think “well, if it’s super easy and I can’t figure it out, I guess it’s just not for me” and abandon the whole thing.

    There’s also a very vocal (and toxic) part of the Linux community that basically just screams “RTFM” at every newbie question. New users shouldn’t be expected to dig into a 350-page technical document just to learn the basics of their new OS.



  • The fact that you don’t need to actually supply any real CSAM to the training material is the reasoning being offered for supporting AI CSAM. It’s gross, but it’s also hard to argue with.

    Yeah, this is basically the crux of the issue. When you get into the weeds and look past the surface-level “but it needs CSAM to make CSAM” misconception, the arguments against it basically boil down to “but it’s icky.” Which… Yeah. It is. But should something being icky automatically make it illegal, even if there are no victims?

    I hate to make the comparison (for a variety of reasons) but until fairly recently homosexuality was psychologically classed as a form of destructive/dangerous kink. Largely because straight people had the same “but it’s icky” response whenever it got brought up. And we have tried to move away from that as time has passed, because we have recognized that being gay is not just a kink, it’s not just a choice, and it’s not inherently dangerous or harmful.

    By contrast, pedophilia has remained stigmatized, because even if it passed the first two “it’s not just a kink/choice” tests, it still failed the “it’s not harmful” test. Consuming CSAM was inherently harmful, and always had a victim. There was no ethical way to view CSAM. But now with AI, it can actually begin passing that third test as well.

    I don’t know how I feel about it, myself. The idea of “ethically sourced” CSAM doesn’t exactly sit right with me, but if it’s possible to make it in a truly victimless manner, then I find it hard to argue for outright banning something just because I don’t like it.

    This is really the biggest hurdle. To be clear, I’m not arguing that being an active pedo should be decriminalized. But it is worth examining whether we’re basing criminality purely off of the instinctual “but it’s icky” response the public has when it gets discussed. Is that response enough of a justification for making (or keeping) it illegal? And if your answer was “yes”, what if it could help pedos avoid consuming real CSAM, and therefore reduce the number of future victims? If it could legitimately reduce the number of victims but you still want to criminalize it, then you are not actually focused on reducing harm; you’re focused on feeling righteous instead.

    The biggest issue right now is that harm reduction is very hard to study, because it is such a taboo topic. Even finding subjects to self-report is difficult or impossible. So we’ll have no idea what kinds of impacts (positive or negative) AI will realistically have on CSAM consumption until after it is widely available.




  • Yup, camp toilets are a similar concept. It’s just a 5-gallon hardware store bucket with a snap-on toilet seat lid. You line it with what is essentially a trash bag, just to make disposal easier. Then you use a gelling agent (just like what’s in disposable diapers to let them soak up a bunch of moisture) to reduce sloshing and smell. It’s handy for when you’re going to be away from toilets for a day or two, but don’t want to (or aren’t able to) dig a hole to shit in.

    But the same concept applies for when you’re going to be trapped somewhere (like a classroom) for an extended period of time. Like, for instance, during a school shooting. When you have 30 kids in a classroom, there’s a very good chance that at least one of them will need to piss after an hour or two. And nobody wants to deal with human waste in something like an open trashcan during a lockdown.

    And as an added bonus, the bucket can be used to store all of the active shooter supplies when it’s not in use, so everything is in a single location to quickly grab and prep. Active shooter happens? Great, just grab the big bucket out of the closet, dump all of the supplies out, and you’re ready to go. Now all of your tourniquets, styptic bandages, etc. are accessible.

    But it quickly got distorted into “they’re making kids use litter boxes to indoctrinate them” instead.