

I mean, I wouldn’t call tcpdump a “hacking tool”…
It’s really popular in the server world, and it’s the foundation of many other distros, maybe that’s why?
No Bias, No Bull AI
I’ve spent my career grappling with bias. As an executive at Meta overseeing news and fact-checking, I saw how algorithms and AI systems shape what billions of people see and believe. As a journalist at CNN, I even briefly hosted a show called “No Bias, No Bull” (easier said than done, as it turned out).
Trump’s executive order on “woke AI” has reignited the debate around bias and AI. The implication was clear: AI systems aren’t just tools, they’re new media institutions, and the people behind them can shape public opinion as much as any newsroom ever did. But for me, the real concern isn’t whether AI skews left or right, it’s seeing my teenagers use AI for everything from homework to news without ever questioning where the information comes from.
Political bias misses the deeper issue: transparency. We rarely see which sources shaped an answer, and when links do appear, most people ignore them. An AI answer about the economy, healthcare, or politics sounds authoritative. Even when sources are provided, they’re often just footnotes while the AI presents itself as the expert. Users trust the AI’s synthesis without engaging with the sources, whether the material came from a peer-reviewed study or a Reddit thread.
And the stakes are rising. News-focused interactions with ChatGPT surged 212% between January 2024 and May 2025, while 69% of news searches now end without a click through to the original source. Traditional media lost the public’s trust by claiming neutrality while harboring clear bias. We’re making the same mistake with AI, accepting its conclusions without understanding their origins or how sources shaped the final answer.
The solution isn’t eliminating bias (that’s impossible), but making it visible. Restoring trust requires acknowledging that everyone has a perspective, and pretending otherwise destroys credibility. AI offers a chance to rebuild trust through transparency, not by claiming neutrality, but by showing its work.
What if AI didn’t just provide sources as afterthoughts, but made them central to every response, both what they say and how they differ: “A 2024 MIT study funded by the National Science Foundation…” or “How a Wall Street economist, a labor union researcher, and a Fed official each interpret the numbers…”. Even this basic sourcing adds essential context. Some models have made progress on attribution, but we need audit trails that show where the words came from and how they shaped the answer.
When anyone can sound authoritative, radical transparency isn’t just ethical, it’s the principle that should guide how we build these tools. What would make you click on AI sources instead of just trusting the summary?
Full transparency: I’m developing a project focused precisely on this challenge: building transparency and attribution into AI-generated content. Love your thoughts.
- Campbell Brown.


I don’t use any GUI… I use Terraform in the terminal or via CI/CD. There is an API and a Terraform provider for Proxmox, and I can use those, together with Ansible and shell scripts, to manage VMs, but I was looking for k8s support.
Again, it works fine for small environments, with a bit of manual work and human intervention, but for larger ones I need a bit more. I moved away from a few VMs acting as k8s nodes to k8s as a service (at work).


I do the same in Proxmox VMs, in my homelab, which is… fine. I was talking more about native support, manageable via an API or something.
Say I need to increase the number of nodes in my cluster. I spin up a new VM using the template I have, adjust the network configuration, update the packages, and add it to the cluster. Oh, and maybe I should also do an update on all of them while I’m there, because now the new machine runs a different Docker version. I have some Ansible and bash scripts that automate most of this (rough sketch below). It works for my homelab.
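The core of it looks roughly like this (a sketch, not my actual script; the template ID, IPs and hostnames are made up, and it assumes a cloud-init template plus kubeadm):
#!/usr/bin/env bash
set -euo pipefail
# clone the cloud-init template (hypothetical VMID 9000) into a new node
NEWID=121
qm clone 9000 "${NEWID}" --name "k8s-node-${NEWID}" --full
# adjust the network configuration via cloud-init
qm set "${NEWID}" --ipconfig0 ip=192.168.1.121/24,gw=192.168.1.1
qm start "${NEWID}"
# update packages, then join the cluster
ssh "k8s-node-${NEWID}" 'sudo apt-get update && sudo apt-get -y dist-upgrade'
JOIN_CMD=$(ssh k8s-control-plane 'kubeadm token create --print-join-command')
ssh "k8s-node-${NEWID}" "sudo ${JOIN_CMD}"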
At work however, I have a handful of clusters with dozens of nodes. The method above becomes tedious fast and is prone to human error. We use external Kubernetes-as-a-service platforms (think DOKS, EKS, etc.), which have Terraform providers available. So I open my Terraform config and increase the number of nodes in one of my pre-production clusters from 9 to 11. I also change the version from 1.32 to 1.33. I then push my changes to a new merge request, my GitLab CI spins up and calls Atlantis to run a terraform plan, I check the results and ask it to apply. It takes 2 minutes. I would love to see this work with Proxmox.
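The whole change on our end is just a couple of commands (the repo layout, file and variable names here are made up for illustration):
git switch -c preprod-scale-up
# bump the node count 9 -> 11 and the k8s version 1.32 -> 1.33
sed -i -e 's/node_count *= *9/node_count = 11/' \
       -e 's/version *= *"1\.32"/version = "1.33"/' clusters/preprod.tf
git commit -am "preprod: 9 -> 11 nodes, k8s 1.33"
git push -u origin preprod-scale-up
# the MR triggers GitLab CI, Atlantis comments the terraform plan output,
# and replying "atlantis apply" on the MR rolls it out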


Man, I’ve been living and working in Germany for close to 10 years now. Proxmox is like that 50yo colleague of mine. Hard worker, reliable, really knowledgeable, a treasure trove of info, but he can’t be budged. He insists on installing any new VM using the GUI (both Windows and Linux), he avoids learning “new things” like Docker or Kubernetes, and really distrusts “the cloud”.
I will keep using Proxmox, as I have for many years both at work and at home, but we are migrating from a VM (with Docker) setup to Kubernetes. It would have been great for Proxmox to offer some support there, but…


They also have an API; I think a chunk of that revenue comes from there. Think 3rd-party apps and services with chatbots, writing assistants, etc. that use OpenAI’s API.
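For anyone curious, those integrations are mostly plain HTTP calls like this (illustrative only; the model name is just an example):
curl https://api.openai.com/v1/chat/completions \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"model": "gpt-4o-mini", "messages": [{"role": "user", "content": "Draft a short support reply."}]}'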


Give easyeffects a try.


FFmpeg is totally capable of doing this. Something like ffmpeg -i in.mkv -vf "crop=width:height:x:y" out.mkv might work. You would need to fill in the crop size (width:height) and offset (x:y), which you can get with ffmpeg’s cropdetect filter. Here’s an article about it. To automate this, I would use a for loop in a shell script for more control, or just a one-liner if width, height and x:y are the same for all files:
for file in *.mkv; do ffmpeg -i "${file}" -vf "crop=width:height:x:y" "cropped_${file}"; done
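To get the crop values for a file, something like this should work; it runs cropdetect over the video and prints the crop string it suggests most often:
ffmpeg -i in.mkv -vf cropdetect -f null - 2>&1 \
  | grep -o 'crop=[0-9:]*' \
  | sort | uniq -c | sort -rn | head -n1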


I’ve been daily driving Hyprland for 4 years now. Before that it was DWM, and before that Gnome. I was never a KDE fan, don’t know why… I never disliked it, I just preferred Gnome.


https://github.com/suitenumerique/docs
Self-hostable; needs MinIO (or any S3-compatible system).
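If you don’t already run an S3-compatible service, MinIO is quick to spin up for testing (based on MinIO’s Docker quickstart; not production settings, change the credentials):
docker run -d --name minio \
  -p 9000:9000 -p 9001:9001 \
  -e MINIO_ROOT_USER=admin -e MINIO_ROOT_PASSWORD=changeme123 \
  -v minio-data:/data \
  quay.io/minio/minio server /data --console-address ":9001"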
Oh no! Anyway…


I immediately ordered a new one from another vendor, as I really needed to build a workstation, and let the purchasing department handle the case. All I know is that the seller did not believe/accept the “wrong CPU” story. From their perspective, it was a sealed box…


It happened to me a few years ago, when I ordered an i9-9900K for work, and inside the sealed box was a Core 2 Duo… After the seller (not Amazon) refused the return, I looked around a bit online, and it turns out it’s a common practice. I even found rolls of “Intel original” seals for 5€ on eBay.


They’ve owned it since 2011. When they bought it, they had Lync, which sucked pretty bad. Now they have Teams, probably the result of merging Lync and Skype.
First time I’ve heard of Gitee; I don’t think it’s that popular. Also, their website appears to be in Chinese only.
Gitea, on the other hand, is pretty popular, but after some controversial decisions, Forgejo was born and started getting a lot of traction.
I have ~/work/code/project-name-1, ~/work/code/project-name-2 or ~/priv/code/project-name-3, but not by language… I only separate work and private repositories.