• 0 Posts
  • 23 Comments
Joined 1 year ago
Cake day: June 29th, 2024

  • I mean yeah, I just said above that someone almost killed me. They were probably a human driver. But that’s a “might happen, you never know” situation. If self-driving cars are rear-ending people, that’s an inherent artifact of their programming, even though they’re not intentionally programmed to do that.

    So it’s like, things were already bad. I already do not feel safe doing any biking anymore. But as self-driving cars become more prevalent, that threat upgrades to a kind of de facto rule: “Oh, these vast stretches of land are places where only cars and trucks are allowed. Everything else is roadkill waiting to happen.”

  • I just looked it up.

    "The AI research company DeepSeek recently released its large language model (LLM) under the MIT License, providing model weights, inference code, and technical documentation. However, the company did not release its training code, sparking a heated debate about whether DeepSeek can truly be considered “open-source.”

    This controversy stems from differing interpretations of what constitutes open-source in the context of large language models. While some argue that without training code, a model cannot be considered fully open-source, others highlight that DeepSeek’s approach aligns with industry norms followed by leading AI companies like Meta, Google, and Alibaba.”

    So, fake open-source: open weights, but not open training code.