• 0 Posts
  • 37 Comments
Joined 2 years ago
Cake day: July 10th, 2023


  • 0x01@lemmy.ml to Technology@lemmy.world · Why LLMs can't really build software
    4 months ago

    I use it extensively daily.

    It cannot step through code right now, so true debugging is not something you use it for. Most of the time the LLM will take the junior-engineer approach of “guess and check” unless you explicitly give it better guidance.

    My process generally starts with unit tests and type definitions, followed by a large multi-page prompt for each segment of the app the LLM will be tasked with. Then I snapshot the code, give the tool access to the markdown prompt, and validate its work. When there are failures and the project has extensive unit tests, it generally follows the same pattern of “I see that this failure should be added to the unit tests”, adds the case, and re-executes the suite during iterative development.
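    The tests-and-types-first step can be sketched roughly like this (the `Invoice` type and `apply_discount` function are hypothetical stand-ins, not from any real project): the human writes the types and the failing test as the contract, and the LLM is asked to fill in the implementation until the test passes.

```python
# Hypothetical sketch of the tests-and-types-first workflow; names are made up.
from dataclasses import dataclass

@dataclass
class Invoice:
    subtotal: float
    discount_pct: float  # 0-100

# Test written up front by the human; it defines the contract the LLM must satisfy.
def test_apply_discount():
    inv = Invoice(subtotal=200.0, discount_pct=25.0)
    assert apply_discount(inv) == 150.0

# Implementation left to the LLM; a failing test means another iteration.
def apply_discount(inv: Invoice) -> float:
    return inv.subtotal * (1 - inv.discount_pct / 100)

test_apply_discount()
```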

    If tests are not available, or the failure is not directly accessible to the tool, it will generally rely on logs, either generated directly or provided by the user.
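    Structured log output is what gives the tool (or the user) something concrete to paste back instead of letting it guess and check. A hypothetical illustration; `sync_records` and its data are made up for the sketch:

```python
import logging

# Log to stderr; in practice these lines end up somewhere the tool can read.
logging.basicConfig(level=logging.DEBUG, format="%(levelname)s %(name)s: %(message)s")
log = logging.getLogger("sync")

def sync_records(records):
    """Sync a list of record dicts, logging enough detail to debug failures."""
    log.debug("starting sync of %d records", len(records))
    synced = []
    for rec in records:
        if "id" not in rec:
            # A warning line like this is exactly what gets pasted back to the LLM.
            log.warning("skipping record without id: %r", rec)
            continue
        synced.append(rec["id"])
    log.debug("synced ids: %s", synced)
    return synced
```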

    My role these days is to write long, well-thought-out prompts, verify the integrity of the code after every commit, and generally treat the LLM as a reckless junior dev. Sometimes junior devs can surprise you: yesterday I was very surprised by a one-shot result. I asked for a React Native mobile app that takes my rambling voice recordings and summarizes them into prompts, and it was immediately, remarkably successful; now I’ve been walking around mic’d up to generate prompts.

  • “Which is bad news for developers”

    Nah, we’ve been through lots of iterations of developer community: IRC, mailing lists, forums, Stack Overflow, etc. Most of my complex questions go through specific Discord communities now. I’m not trying to spend a year editing a single post because some swamp-ass weenie on Stack Overflow has his nose covered in rule dust.

    Yes, AI has changed the game a bit, but it is not removing community; it’s mostly just cutting down on question duplication.

    My most recent foray into a new technology was working with Vulkan in Rust on a Mac, and Stack Overflow was useless compared to the Vulkan Discord.

  • Likely a prefrontal cortex, the administrative center of the brain and generally host to human consciousness, as well as a dedicated memory system with learning plasticity.

    Humans have systems that mirror LLMs, but LLMs are missing a few key components needed to be precise replicas of human brains, mostly because those components are computationally expensive to model and the goal is different.

    Some specific things the brain has that LLMs don’t directly account for: distinct neurochemicals (versus a single floating-point value per neuron), synaptogenesis, neurogenesis, synapse fire travel duration and myelination, neural pruning, potassium and sodium channels, downstream effects, etc. We use math and gradient descent to roughly mirror the brain’s Hebbian learning, but we do not perform precisely the same operations using the same systems.
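    To make that contrast concrete, here is a toy sketch (all numbers arbitrary) of a local Hebbian update next to a gradient-descent update on the same single weight; neither is a claim about how either real system is implemented:

```python
# Toy contrast: Hebbian learning strengthens a weight when pre- and
# post-synaptic activity coincide (dw = eta * x * y), while gradient
# descent moves the weight against the gradient of an explicit loss.
eta = 0.1          # learning rate
x, w = 0.8, 0.5    # presynaptic activity, synaptic weight

# Hebbian: no loss function, purely local correlation of activity.
y = x * w                  # postsynaptic activity
w_hebb = w + eta * x * y   # 0.5 + 0.1 * 0.8 * 0.4 ~= 0.532

# Gradient descent: update driven by error against a target output.
target = 1.0
grad = 2 * (x * w - target) * x   # d/dw of (x*w - target)**2
w_gd = w - eta * grad             # 0.5 + 0.096 ~= 0.596
```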

    In my opinion, a dedicated module for consciousness would bridge the gap, possibly while accounting for some of those missing characteristics. Consciousness is not an indescribable mystery; we have performed tons of experiments and gathered a whole lot of information on the topic.

    As it stands, LLMs are largely reasonable approximations of the brain’s language center, but little more. It may honestly not take much to get what we consider consciousness humming in a system that includes an LLM as a component.