[conspire] Discussion: Using LLMs the Right Way: 10/1/2025 7pm Eastern Daylight Time

Ivan Sergio Borgonovo mail at webthatworks.it
Thu Oct 2 02:30:16 PDT 2025


On 10/2/25 10:44 AM, Rick Moen wrote:
> Quoting Ivan Sergio Borgonovo (mail at webthatworks.it):
> 
>> The failure of your AI is mostly due to Mercurial having a market
>> share of 0.17-2% (according to AI).
> 
> I don't think that makes any sense.  The failure Deirdre described had
> very little to do with the particular SCM's semantics or data model.

Really?

>> You couldn't expect a LLM model to "understand" your problem, but to
>> match your description with a template of an existing solution in
>> the corpus.

> It seems to me like a minimal expectation to not have the LLM tool back
> out and lose the coder's changes, as a colossal and unpleasant surprise.
> I'm thinking, here, of Feynman's comments about failure rates of
> O-rings on the Space Shuttle:  When the spec says something should
> never happen under any circumstance, it's not OK to shrug that it
> happens rarely.

Does this have anything to do with the LLM's capability to answer, or 
with the instrumentation they added around the LLM that gave it the 
power to alter the repo?
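
For the sake of argument, a purely hypothetical sketch (nothing to do 
with Deirdre's actual setup, and every name in it is invented for 
illustration): an agent harness that refuses to run destructive hg 
commands suggested by an LLM without explicit human confirmation. The 
hg subcommands are real; run_suggested_command and the approve hook are 
made up. The point is that the damage potential lives in the executor, 
not in the text the LLM produces.

    import subprocess

    # hg operations that can silently discard local work
    DESTRUCTIVE = ("update --clean", "revert --all", "strip", "purge")

    def run_suggested_command(cmd, approve=input):
        """Run an LLM-suggested hg command, gating the destructive ones."""
        if any(cmd.startswith("hg " + d) for d in DESTRUCTIVE):
            answer = approve("LLM wants to run '%s', which may discard "
                             "local changes. Proceed? [y/N] " % cmd)
            if answer.strip().lower() != "y":
                return "refused: destructive command not confirmed"
        result = subprocess.run(cmd.split(), capture_output=True, text=True)
        return result.stdout or result.stderr

Whether the model "understands" Mercurial is a separate question from 
whether the tooling around it is allowed to throw away a working copy 
unprompted.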

>> That's mainly a social and economic problem rather than a
>> technological one.
> 
> True but not relevant to Deirdre's point.  You knew that, right?
> The profligate overuse of drinking water (to name just one squandered
> resource) for server-farm cooling, lately for huge LLM installations
> (previously for cryptocurrency mining), may not be a problem in climates
> with year-round watercourses everywhere (like Dad's native Norway), but
> it's already horribly damaging in, say, the American West.
> 
> That is a real problem, and saying it somehow doesn't matter because it's
> "social and economic" rather than technological misses the point that
> -- again, sticking only to water as an example -- humans, crops, etc.
> being edged out for fresh water is a big issue.
> 
> Now, you could say that my country ought to have better protections of
> key resources against depredation via greater buying power, and I would
> agree.  But the point remains that the immediate effects are
> pernicious, which I believe was Deirdre's point.

So you're saying that I'm responsible for resource waste because I asked 
an LLM to show me a template for how to drive a PWM?
I bet I'm also responsible for microplastics because I didn't spend 
enough time deconstructing a fucking package made of 5 different 
materials glued together.

Quoting Deirdre:
«I like water. I like natural resources. I like them far, far more than 
the negligible benefits LLMs offer.»

The environmental impact doesn't repay the benefits because LLMs are used 
to produce a LOT of slop, not because they are inherently waste-producing 
machines.

-- 
Ivan Sergio Borgonovo
https://www.webthatworks.it https://www.borgonovo.net
