[conspire] Discussion: Using LLMs the Right Way: 10/1/2025 7pm Eastern Daylight time

Steve Litt slitt at troubleshooters.com
Wed Oct 1 23:25:21 PDT 2025


Deirdre Saoirse Moen said on Wed, 01 Oct 2025 14:56:13 -0700

>So I want to talk about the "risk" (lol) of dev jobs being taken over
>by AI, courtesy of a recent consulting contract that didn't go well.
>Suffice to say that, if it can't get *this* simple task correct, then
>it's going to just wreak more havoc than it could possibly solve.
>
>My recent adventure:
>
>Mercurial, as used at several large firms, manages *all* their source
>code in what’s called a monorepo. (Whereas at Apple, there are many,
>many source code repositories, which is far more typical).
>
>That means that when you make local changes, you effectively create an
>alternate universe for a moment. You commit that code, and then you
>attach it to a diff. When that diff gets approved, it then becomes
>part of the official repository and lives on the main universe for
>everyone rather than just on a side universe.
>
>However, my code didn’t get attached to the diff, and I was asking the
>AI for help to diagnose why and help me get that sorted (because that
>was the ONLY channel of support I had access to, seriously).
>
>I asked the developer AI for help. NOT to fix it, just to tell me what
>I was missing because my commit wasn’t showing in a diff, and I missed
>something, but…what?
>
>Instead of telling me, “Here are three solutions to your issue,” (the
>usual thing it did), it backed out my local changes, nuking them, and
>creating two more orphaned alternate universes called
>“(uncommitted/untracked changes)” in the post.
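For readers who haven't used this kind of setup, the workflow Deirdre describes looks roughly like the following. This is a sketch only: it assumes a Phabricator-style review tool (`arc`) layered on Mercurial, and the exact commands and names vary from firm to firm.

```shell
# Rough sketch of the monorepo commit/diff workflow described above.
# Assumes Phabricator-style tooling ("arc"); actual commands vary by firm.

hg pull                      # sync with the "main universe"
# ...edit files locally...
hg commit -m "my change"     # local commit: the temporary "alternate universe"
arc diff                     # attach the commit to a diff for review
# ...after the diff is approved...
arc land                     # fold the change into the official repository
```

The failure mode in the story above is that a diagnostic question got answered with the equivalent of backing out the local commit, leaving the edits stranded as uncommitted changes instead of attached to a diff.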

If used right, I think an LLM is a big benefit to an excellent
programmer. But the minute the LLM does more than advise, there's a big
potential for trouble. It's only 2025, and LLMs aren't good enough to
be in the driver's seat.

Let's take "in the driver's seat" literally for a second. You know why
I'm not against self-driving cars? Because a self-driving car, with all
its bugs, is still much less likely to crash than 98% of the imbeciles
driving regular cars. Sure, self-driving cars will cause some crashes,
but they'll prevent even more crashes caused by idiot drivers.

That calculus is flipped on its head when we speak of self-driving
trucks, or buses, or ships, or anything else usually driven by an
expert professional. It will take a lot longer to outperform truck and
bus drivers, or a ship's crew. And having the LLM do the commits: I
think that's whack.

SteveT

Steve Litt 

http://444domains.com



