Little by little, I started to rely on tools like ChatGPT and Copilot to write boilerplate for me. No issue, it's just saving me from writing static void Main every time I create a new repo.

Weeks go by, and I go from asking it to generate boilerplate to having it write unimportant scripts I'll use once and then delete. That's fine, at least I still remember how to write a Python script myself.

A year goes by, and my IDE has Copilot on constantly; being without it is like writing code without syntax highlighting. I still know what I'm writing, but there's a delay between deciding what I want to do and knowing what code to write. The weird syntax of shell scripts is slowly becoming harder to remember. Maybe this is becoming an issue.

But look what these tools have allowed me to do: scaffold projects in a fraction of the time, build projects without even knowing the language or framework they're written in. But are those really positives?

I can build faster, without a doubt, if I'm building something the models have been trained to make. But what if I'm writing something novel, in a niche language? Then the tooling is basically useless: try writing Scheme with Copilot, which has mostly seen Lisp. Its suggestions are believable, so they must be right? No, you've just introduced an insidious bug in your code that you'll only uncover when it crashes in prod.
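Here's a sketch of what one of those believable-but-wrong suggestions can look like. This is a hypothetical example, not something Copilot actually produced for me: in Common Lisp the empty list is false, so testing a list directly checks for emptiness, but in Scheme every value except #f is true, so the same shape of code silently does the wrong thing.

```scheme
;; A Common Lisp idiom that looks plausible in Scheme but is wrong there.
;; (Hypothetical illustration; first-or-default is a made-up helper.)
(define (first-or-default lst default)
  (if lst          ; BUG: in Scheme '() is truthy, so this branch always runs
      (car lst)    ; crashes on (car '()) when lst is empty
      default))

;; The correct Scheme version tests emptiness explicitly:
(define (first-or-default-fixed lst default)
  (if (null? lst)
      default
      (car lst)))
```

The broken version passes a glance, type-checks in the reader's head, and only fails at runtime on an empty list, which is exactly the kind of bug that surfaces in prod.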

And faster to build? I have a million unfinished projects; building faster isn't going to convince me to finish this one. I'll hit a roadblock and, ooh, a shiny new project with none of the technical debt. AI makes this worse by writing sloppy but often sensible-looking code and lowering the barrier to scaffolding the next project.

Now for the worst part of all: it gives you a false sense of security in your programming ability. You ask it about a new framework or language, since no one enjoys reading the docs, and it gives you code that looks reasonable at first glance. You think you know how it works, until it's too late: the code crashes, or worse. You have no idea where to even begin debugging it.

So-called “AI” is like that person you know who learns a word of the day: they sure seem knowledgeable from a distance, but on any closer inspection it falls apart.

So I've decided: I will no longer use AI for authoring any code. A while ago I uninstalled Copilot from Visual Studio Code. At first it was painful, but I slowly began checking Stack Overflow instead of asking ChatGPT or Gemini how to implement X, Y and Z. For simple scripts I still asked ChatGPT, since I didn't want to waste time writing a long-ish but simple script, but I never read the result with the intent to fully understand not just how it worked but why it chose that way of doing it.

There are so many points I haven't touched on here outside of programming, from environmental impacts to the loss of jobs at the hands of AI snake-oil salespeople, misinformation production, and everything in between.