salutes
Just a regular Joe.
salutes
It’s accelerating trends that were already well underway in the world, with the US leading the pack and doubling down on its own demise (and apparently also working toward the active demise of European Democracy and Freedom) under Trump and JD Vance.
The analogy I always think of is: We’ve got shovels and we are in a big hole … which way are we going to dig? In my experience, most people keep digging down because it seems easier now, and eventually find themselves in a deeper hole.


That this is and will be abused is not in question. :-P
You are making a leap though.


While this is a popular sentiment, it is not true, nor will it ever be true.
AI (LLMs & agents in the coding context, in this case) can serve as both a tool and a crutch. Those who learn to master the tools will gain benefit from them, without detracting from their own skill. Those who use them as a crutch will lose (or never gain) their own skills.
Some skills will become irrelevant in day-to-day life (as is always the case with new tech), and we will adapt in turn.


Indeed… Throw-away code is currently where AI coding excels. And that is cool and useful - creating one-off scripts, self-contained modules, automating boilerplate, etc.
You can’t quite use it the same way for complex existing code bases though… Not yet, at least…


Interesting fact: You can use an elephant’s trunk as a low-resolution 3D printing nozzle


Hah, yeah. Vibe coding and prompt engineering seem like a huge fad right now, although I don’t think it’s going to die out, just the hype.
The most successful vibe projects in the next few years are likely to be the least innovative technically, following well trodden paths (and generating lots of throwaway code).
I suppose we’ll see more and more curated collections of AI-friendly design documents and best-practice code samples to enable vibe coding for varied use-cases, and this will be the perceived value add for various tools in the short term. The spec driven development trend seems to have value, adding semantic layers for humans and AI alike.


Yeah - there’s definitely a GIGO factor. Throwing it at an undocumented codebase with poor and inconsistent function & variable names isn’t likely to yield great revelations. But it can probably still tell you why changing input X didn’t result in a change to output Y (with 50k lines of code in-between), saving you a bunch of debugging time.


Most code on the planet is boring legacy code, though. Novel and interesting is typically a small fraction of a codebase, and it will often be more in the design than the code itself. Anything that can help us make boring code more digestible is welcome. Plenty of other pitfalls along the way though.


I have a suspicion that the guy took issue with my use of “one” instead of “you”, more so than the content. Maybe it came across as uppity.


It’s a changing world, and there is going to be an ever-increasing amount of AI slop out there, and even more potential programmers who won’t make the leap due to the crutch.
At the same time, there are always people who want to and will learn in spite of the available crutches the latest tech revolution brings.
There will also be many good engineers who will exploit the tech for all its worth while applying appropriate rigour, increasing their real productivity and value manyfold.
And there will be many non-programmers who can achieve much more in their respective fields, because AI tools can bridge gaps for them.
Hopefully we won’t irreversibly destroy ourselves and our planet while we’re at it. 🙈


Hm? Oh, I obviously misread the room. It seems I interrupted a circle jerk? My apologies.


No, but it can help a capable developer to have more of those moments, as one can use LLMs and coding agents to (a) help explain the relationships in a complicated codebase succinctly and (b) help to quickly figure out why one’s code doesn’t work as expected (from simple bugs to calling out one’s own fundamental misunderstandings), giving one more time to focus on what matters to oneself.


This was similar to a trick that a few smaller (less serious) hobby ISPs did back in the days of 14.4k/28.8k modems to take advantage of the “reasonably priced” business plans for ISDN. They’d register multiple businesses at a single address to qualify for the plans, then balance new egress connections across the pool using Squid and other magic. Fun times…
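For the curious, the Squid side of that balancing can be sketched roughly like this - a hypothetical squid.conf fragment, not what we actually ran; it uses modern directive names (the “random” ACL didn’t exist back then), and the addresses are made up:

    # Hypothetical sketch: spread new outgoing requests across two upstream
    # links by binding them to different local addresses. Each link has its
    # own local IP, and policy routing sends traffic out the matching link.
    acl via_link1 random 1/2
    tcp_outgoing_address 10.0.1.1 via_link1
    tcp_outgoing_address 10.0.2.1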


I find it pretty useful to help get me over the mental hurdle of starting something. So it’s faster than me procrastinating for another day. ;-)


Hmm. You are right, but they might not need it for every region. Steam is probably big enough that existing regional companies would come to it and be eager to form partnerships. They could become more of a payment processor aggregator, focused on a low risk market segment. And of course they can do CCs directly too - that’s the easy part.
The challenge will be to get consumers on board. I know that I groan every time I need to enter my CC details online these days.
They would face anti-competitive behaviour from PayPal though. So it’s a risk.
Internally, they are probably already working on ways to appropriately segment their catalog based on payment provider. “Sorry User, you cannot purchase title X using PayPal. We recommend $Competitor instead.”


It sounds like some payment processors are treating Mastercard’s contractual requirements as a hard risk in this case - maybe it’s justified, maybe not. Try getting corporate lawyers in the finance world to be anything but risk averse. Mastercard doesn’t seem to want to soften their wording, but talks platitudes in public statements. Shrug.


They could do it with significantly fewer people, for themselves and even for GOG, Itch and potentially others. Their use-case is digital payments for games, which is limited in scope and risk. PCI compliance is a PITA, but manageable.


https://www.urbandictionary.com/define.php?term=Hacker
“The media’s definition of the real term malicious cracker. A hacker used to be a well respected individual who loved to tinker with gadgets.”, plus a few other definitions.
Why? Because DOS and Windows 3.11 kind of sucked and I wanted to learn and experiment.
Even though I started out working mostly with the console, it was amazingly refreshing. X came a year or two later, when the web made it worth it. OS/2 Warp 3 also slipped in there for a while. Great times.