

I propose we trick our fellow Americans by making smol cars offroady enough to embarrass an F150:






Look at them! Who would want a rolling brick over that?
And the Ford Focus is already mostly there.


It’s literally style. Those pickup lifts often ruin durability and off-road capability.


No politician should be simped for.
Unfortunately, simping has proven to be very effective :(


https://www.comparitech.com/kodi/kodi-piracy-decline/
Based on our research, comparative search volume for “Kodi” has fallen around 85 percent from 2017 to 2022. Google Trends data reveals the dramatic decline started in Q2 of 2017 and has, for the most part, continued that trend up to this point. Consequently, the decline in people searching for Kodi directly relates to the appearance of the coordinated attack against piracy in the form of ACE.
And this is with Kodi furiously distancing itself from pirates at the time.
Attacks don’t have to be direct. Though they absolutely can be, too.


That serves the purpose too. It’s harder to pin Plex as an “illegal distribution service” when you have to pay for access. Either the streamer or the “distributor” has to give up anonymity, which makes large-scale sharing impractical.
On the other hand, the more money they squeeze out, the more they risk appearing as if they “make money from piracy,” which is exactly how you get the MPAA’s attention.


You may (half) joke, but MPAA attention on Jellyfin would suck.


Playing devil’s advocate, I understand one point of pressure: Plex doesn’t want to be perceived as a “piracy app.”
See: Kodi. https://kodi.expert/kodi-news/mpaa-warns-increasing-kodi-abuse-poses-greater-video-piracy-risk/
To be blunt, that’s a huge chunk of their userbase. And they run the risk of being legally pounded to dust once that image takes hold.
So how do they avoid that? Add a bunch of other stuff, for plausible deniability. And it seems to have worked, as the anti-piracy gods haven’t singled them out like they have past software projects.
To be clear, I’m not excusing Plex. But I can sympathize.


For all the criticism of AI, this is the one that’s massively overstated.
On my PC, the task energy of a casual diffusion attempt (let’s say a dozen+ images in a few batches) on a Flux-tier model is about 300W * 240 seconds.
That’s 72 kilojoules.
…That’s less than microwaving leftovers, or a few folks browsing this Lemmy thread on laptops.
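If anyone wants to sanity-check it, here’s the whole back-of-the-envelope calc (the microwave figures are my own rough assumptions, not measurements):

```python
# Back-of-the-envelope task energy for a casual diffusion session.
# 300 W sustained draw and ~4 minutes of total generation are my ballpark numbers.
gpu_power_w = 300
runtime_s = 240  # a dozen+ images across a few batches

task_energy_kj = gpu_power_w * runtime_s / 1000
print(f"Diffusion session: {task_energy_kj:.0f} kJ")  # -> 72 kJ

# For scale: an assumed ~1000 W microwave reheating leftovers for 3 minutes.
microwave_kj = 1000 * 180 / 1000
print(f"Microwaving leftovers: {microwave_kj:.0f} kJ")  # -> 180 kJ
```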
And cloud models like Nano Banana are more efficient than that, batching the heck out of generations on wider, more modern hardware, and more modern architectures, than my 3090 from 2020.
Look. There are a million reasons corporate AI is crap.
But its power consumption is a meme perpetuated by tech bros who want to convince the world scaling infinitely is the only way to advance it. That is a lie to get them money. And it is not the way research is headed.
Yes they are building too many data centers, and yes some in awful places, but that’s part of the con. They don’t really need that, and making a few images is not burning someone’s water away.


It’s misleading.
IBM is very much into AI, as a modest, legally trained, economical tool. See: https://huggingface.co/ibm-granite
But this is the CEO saying “We aren’t drinking the Kool-Aid.” It’s shockingly reasonable.


That’s interesting.
I dunno if that’s any better. Compiler development is hard, and expensive.
I dunno what issue they have with LLVM, but it would have to be massive to justify building around it and then switching away to re-invent it.


…The same Zig that ditched LLVM, to make their own compiler from scratch?
This is good. But also, this is sort of in character for Zig.


They’re pretty bad outside of English-Chinese actually.
Voice-to-voice is all relatively new, and it sucks if it’s not all integrated (eg feeding a voice model plain text so it loses the original tone, emotion, cadence and such).
And… honestly, the only models I can think of that’d be good at this are Chinese. Or Japanese finetunes of Chinese models. Amazon certainly has some stupid policy where they aren’t allowed to use them (even with zero security risk since they’re open weights).


Honestly, even a dirt cheap language model (with sound input) would tell you it’s garbage. It could itemize the problematic parts of the sub.
But they didn’t use that, because this isn’t machine learning. It’s Tech Bro AI.


All true, yep.
Still, the clocking advantage is there. Stuff like the N100 is also optimized for low cost, which means pushing higher clocks on smaller silicon. The advantage is even more dramatic for repurposed laptop hardware, which is much more heavily optimized for its idle state.


This is interesting, because “add ads” usually means margins are slim, and the product is in a race to the bottom.
If ChatGPT was the transcendent, priceless, premium service they are hyping it as… why would it need ads?


Same with auto overclocking mobos.
My ASRock sets VSoC to a silly high voltage with EXPO. Set that back down (and fiddle with some other settings/disable the IGP if you can), and it does help a ton.
…But I think AMD’s MCM chips just do idle hotter. My older 4800HS uses dramatically less, even with the IGP on.


Yeah.
In general, ‘big’ CPUs have an advantage because they can run at much, much lower clockspeeds than Atoms, yet still be way faster. There are a few exceptions, like Ryzen 3000+ (excluding APUs), which have notoriously high idle power thanks to the multi-die setup.
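The usual intuition here is that dynamic CMOS power scales roughly with C·V²·f, so a big core that hits its target throughput at a much lower clock (and voltage) comes out ahead. A toy sketch, with numbers invented purely for illustration:

```python
# Toy illustration of the CMOS dynamic-power relation: P ~ C * V^2 * f.
# Every figure below is made up; it only shows the shape of the tradeoff.

def dynamic_power(cap_rel: float, volts: float, freq_ghz: float) -> float:
    """Relative dynamic power, arbitrary units."""
    return cap_rel * volts**2 * freq_ghz

# Assume the 'big' core has more switching capacitance, but matches the small
# core's throughput at a far lower clock, and therefore a lower voltage.
big_core = dynamic_power(cap_rel=2.0, volts=0.75, freq_ghz=1.2)
atom_core = dynamic_power(cap_rel=1.0, volts=1.05, freq_ghz=3.0)

print(f"big core:  {big_core:.2f}")   # ~1.35
print(f"atom core: {atom_core:.2f}")  # ~3.31
```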


Eh, older RAM doesn’t use much. If it already runs close to stock voltage, maybe just set it at stock voltage and bump the speed down a notch; you still get a nice task-energy gain from the performance boost.
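To spell out the task-energy framing with made-up numbers: a config that finishes the job sooner can come out ahead on energy per task even if its draw is a hair higher.

```python
# Energy per task = average power draw * time to finish.
# Both configs use invented figures, purely to illustrate the tradeoff.

def task_energy_kj(power_w: float, seconds: float) -> float:
    return power_w * seconds / 1000

slow_defaults = task_energy_kj(power_w=95, seconds=100)  # pokey default speeds
tuned_stock_v = task_energy_kj(power_w=97, seconds=92)   # faster RAM at stock voltage

print(f"Slow defaults: {slow_defaults:.1f} kJ")  # 9.5 kJ
print(f"Tuned:         {tuned_stock_v:.1f} kJ")  # ~8.9 kJ
```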


Depends.
Toss the GPU/wifi, disable audio, throttle the processor a ton, and set the OS to power saving, and old PCs can be shockingly efficient.


And functional! You’re looking at Dakar rally champions, and hatchbacks so unreasonably fast on dirt that their league was banned.
This is what actual off-roaders look like.