

Can’t believe you cut the other half out



I’ve been wondering lately how effective it would be to have a service that takes your credentials for a platform like YouTube and starts engaging with content at random, so that the data the platform has on you is all junk
Don’t know how meaningful something like this would even be, but if all these companies are gonna try so hard to collect data on me then I’d rather fill that data with useless junk
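For what it’s worth, here’s a minimal sketch of what that could look like, assuming a purely hypothetical EngagementClient wrapper (none of these method names correspond to a real YouTube API; they stand in for whatever API calls or browser automation such a service would actually use):

```python
import random
import time


class EngagementClient:
    """Hypothetical wrapper around a platform's API or a headless browser.
    Every method here is a placeholder, not a real YouTube endpoint."""

    def fetch_random_videos(self, n: int) -> list[str]:
        """Return n arbitrary video IDs, e.g. from trending or searches on random words."""
        raise NotImplementedError

    def watch(self, video_id: str, seconds: int) -> None:
        """Simulate watching a video for the given duration."""
        raise NotImplementedError

    def rate(self, video_id: str, like: bool) -> None:
        """Like or dislike a video."""
        raise NotImplementedError


def poison_profile(client: EngagementClient, rounds: int = 50) -> None:
    """Engage with random content so the platform's interest profile becomes noise."""
    for _ in range(rounds):
        for video_id in client.fetch_random_videos(n=5):
            # Watch for a random, plausible-looking duration...
            client.watch(video_id, seconds=random.randint(15, 300))
            # ...and occasionally rate it, with no relation to any real preference.
            if random.random() < 0.3:
                client.rate(video_id, like=random.choice([True, False]))
        # Space rounds out so the activity doesn't look like one burst of bot traffic.
        time.sleep(random.uniform(60, 600))
```

The point being that every watch, like, and dislike is drawn from random noise, so any interest profile built on top of it is worthless.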


I don’t doubt you, but do you have a source for that figure?


This makes me think we’re looking at the other side of the uncanny valley


people would rather feed another addiction (spend hours on TV and TikTok, but one hour cooking is too much)
I’d argue that people engage in these activities because they’re tired from working too hard for too little for too long


opens new tab
the verdict “misunderstands services that are critical to the security, performance, and reliability of Android devices.”
Ummmm maybe I’m misunderstanding but how on earth is opening a new tab critical to security and performance?


It’s strange to think about how complicit the public has become in all of this. You mean to tell me that 185 separate connections to other companies are required for me to… read an article?


I’m not very well read on this so I could very well be off-base, but couldn’t you leverage the heat as a means to desalinate saltwater instead of using freshwater and letting it evaporate into the atmosphere?
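For a rough sense of scale, here’s a back-of-envelope sketch: the 100 MW heat load is an assumed figure for illustration, and the latent heat of vaporization of water (~2.26 MJ/kg) gives an upper bound on how much water that heat could evaporate or, in principle, distill in a single pass.

```python
# Back-of-envelope: how much water could a given waste-heat load boil off?
# The 100 MW heat load is an assumption for illustration, not a measured figure.
LATENT_HEAT_J_PER_KG = 2.26e6   # latent heat of vaporization of water (~2.26 MJ/kg)
WASTE_HEAT_W = 100e6            # assumed heat load: 100 MW
SECONDS_PER_DAY = 86_400

kg_per_second = WASTE_HEAT_W / LATENT_HEAT_J_PER_KG   # ~44 kg/s
m3_per_day = kg_per_second * SECONDS_PER_DAY / 1000   # ~3,800 m^3/day
print(f"~{m3_per_day:,.0f} m^3 of water per day")
```

Real thermal desalination plants reuse the heat across multiple stages, so the yield per joule can be better than a single pass; the practical catch is usually the temperature of the waste heat rather than the amount of it.
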
The thing that’s mostly wrong with AI summaries is that people don’t click through to the page the summary summarizes. So those sites don’t get ad revenue.
Don’t ad blockers have a similar effect?


Those damn corners weren’t round enough yet, thanks Apple designers👍


GIF compression definitely makes it look more believable. I remember falling for this the first time I saw it


Also doors and gates
They may also have concluded that the public finds a humanoid robot more acceptable than those four-wheeled cube robots that never took off, the ones people like to tip over and kick
Might be a silly question, but is YouTube TV considered a part of YouTube here?


Same here, Malcolm X practically read like a footnote when I was taught about him in school


I mean we’re talking about kids who are functionally illiterate. The system has failed to teach them this basic skill. Critical thinking about complex and nuanced topics is way beyond that!
I agree with you there, and I don’t think we’re really all that far off from each other. Writing has both synthetic (the critical thinking to which I referred) and syntactical (what I believe you’re getting at) components to it, and kids have been missing out on the synthetic component for quite a while now and are now beginning to miss more of the syntactical part as a result of AI.
Where I disagree with you is:
And the problem is they’re not going to learn the basic skills if they use AI to prevent themselves from doing any work.
Kids not doing their work didn’t start with AI. LLMs haven’t even been mainstream or otherwise publicly available for three years yet. A lot of these kids were never going to complete coursework in good faith because the curriculum is failing to engage them. Either that, or there are influences in their lives that make it altogether impossible, such as poverty or neurodivergence. In my other comment I was speaking mainly to career readiness, but the principle of meeting students where their circumstances and interests lie applies throughout their time in K-12.
A trend I’ve noticed in this issue is demonizing students (hence why I keep bringing it up). These kids had nothing to do with their parents putting iPads in front of them instead of reading to them when they were little, or having to take classes that were designed before their parents were born, or so many other observations about the structure of education that make it archaic and broken (perhaps by design, but that’s out-of-scope here). Every stakeholder around this issue should be discussing with each other the ways that school can better serve students; instead, we’ve hastily created a stigma that using AI to complete assignments that you don’t understand, don’t have time for, or simply couldn’t care less about makes you a cheater.
It is truly a wicked problem, and I believe our leaders’ failure to adapt education is primarily to blame. I haven’t even mentioned social media, and I think the government’s inability to regulate it shares some of the blame for kids struggling in school. But as problematic as AI is, it is not the reason why this is happening, and we may have to agree to disagree on that point.


I may disagree with you that the ability to write is, on its own, where the problem lies. In my view, LLMs are further exposing that our education system is doing a very poor job of teaching kids to think critically. It seems to me that this discussion tends to be targeted at A) kids who already don’t want to be at school, and B) kids who are taking classes simply to fulfill a requirement set by their district, and both groups are using LLMs as a way to pass a class that they either don’t care about or don’t have the energy to pass without them.
What irked me about this headline is labeling them as “cheaters,” and I got push-back for challenging that. I ask again: if public education is not engaging you as a student, what is your incentive not to use AI to write your paper? Why are we requiring kids to learn how to write annotated bibliographies when they already know that they aren’t interested in pursuing research? A lot of the stuff we’re still teaching kids doesn’t make any sense.
I believe a solution cuts both ways:
A) Find something that makes them want to think critically. Project-based learning still appears to be one of the best catalysts for making this happen, but we should be targeting it towards real-world industries, and we should be doing it more quickly. As a personal example: I didn’t need to take 4 months of biology in high school to know that I didn’t want to do it for a living. I participated in FIRST Robotics for 4 years, and that program alone gave me a better chance than any in the classroom to think critically, exercise leadership skills, and learn soft and hard skills on my way to my chosen career path. I’ve watched the program turn lights on in kids’ heads as they finally understand what they want to do for a living. It gave them purpose and something worth learning for; isn’t that what this is all about anyway?
B) LLMs (just like calculators, the internet, and other mainstream technologies that have emerged in recent memory) are not going anywhere. I hate all the corporate bullshit surrounding AI just as much as the next user on here, but LLMs still add significant value to select professions. We should be teaching all kids how to use LLMs as an extension of their brains rather than as a replacement for them, and especially rather than universally demonizing the technology.


Do we have to throw mud at “cheating” students? I’ve been hearing similar stuff about K-12 for a while with regards to looking up answers on the internet, but if the coursework is rote enough that an LLM can do it for you, then A. As a student taking gen-eds that have no obvious correlation to your degree, why wouldn’t you use it? And B. It might just be past time to change the curriculum


Again… So much proprietary software is the industry standard, particularly Adobe, and much of it isn’t Linux-compatible, making it not so easy to make the switch as a freelancer


Yeah I’ve only used Inkscape a little bit but I would not categorize it as a program that can quickly be picked up