Artificial Intelligence June 23, 2025
Artificial Intelligence is an interesting field, and quite a divisive one.
I have some history with artificial intelligence. I was first formally taught it in a class with 4 other people (typical computer science classes would have 20-30), and I got a B. We covered things like Dijkstra's algorithm, we wrote Prolog programs, we read papers. It was quite a rudimentary introduction, but it was a 400-level (senior) course.
Of course, AI has come up plenty of times since then, even without taking a class in it. Like playing against the “Hard Computer” in games like Starcraft, or going back to “Ice Hockey” on Nintendo (where the optimal configuration was 1 big guy, 2 medium guys, and 2 small guys). Or playing through the campaigns in various single-player games.
Recently Large Language Models (LLMs) have come into focus. I've used them sometimes for crafting a peer review ("come up with good things to say about a project manager"). I've seen their utility in the education sector for saving teachers time on certain tasks. I've generally been interested in studying them. I've also used one to generate generic images for a few posts on a Substack I had started.
Coding is also something LLMs can do. Some people have taken to coding exclusively with LLMs, in a phenomenon called “vibe coding”. That's one extreme. The opposite end, short of “no use of AI at all”, would be Googling “how to draw a line to a canvas in javascript” and getting back one to three lines of code. I've done that a few times because I didn't feel like scrolling past Google's AI-generated result to find the human-written code. But that's the extent of the AI I will use in my code. It is a good use of it, because otherwise you might click through a few results before actually finding the answer, or the answer could be spread across a few results, and Gemini will basically summarize them, with the option to click further into the results still available.
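For reference, the kind of answer that search turns up really is just a few lines. A minimal sketch, assuming an HTML page that already has a canvas element (the “lineCanvas” id here is just a made-up example):

    // Grab the canvas and its 2D drawing context.
    const canvas = document.getElementById("lineCanvas"); // hypothetical element id
    const ctx = canvas.getContext("2d");

    // Draw a straight line from (10, 10) to (150, 100).
    ctx.beginPath();
    ctx.moveTo(10, 10);
    ctx.lineTo(150, 100);
    ctx.stroke();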
There are studies coming out suggesting cognitive decline in developers who use AI heavily in their coding, even without going to the extreme. Vibe coding is just dumb and talentless; I won't be considering that use case in my thinking. But even short of “AI writes everything”, I think developers are relying on AI-written code far too much. It's become a problem at work recently, as some of the leaders in my practice are saying “just have AI write it” or otherwise advocating for AI use, when I feel the exact opposite. Even when I need to use a technology I don't know at all, like front-end tech, I'm going to learn it, not have AI write it.
I've been trying to summarize my thoughts on it, and while the wall of text so far might suggest otherwise, I did come up with a good summary. It holds even without the problematic side effects that studies are starting to show, like the cognitive decline in heavy AI users.
I can't use AI and still call myself a developer, just like I couldn't use AI to write stories and still call myself an author, or have it write music and call myself a musician.
That isn't me being that thing; it's me using a tool to do that thing. Imagine the person who invented the player piano calling themselves a pianist. I mean, they might be, but not because the piano can play by itself. They'd be the only pianist who could play hundreds of pianos at the same time, separated by great distances, wherever those pianos happen to be. Or why would I learn to play guitar when I can just build a machine to do it for me?
There are exceptions to this, places where it makes sense. Some machines prevent intensive manual labor, like some computer programs I wrote that generate code, sync databases, and so on. These are tools. You're not about to chop up a sidewalk with your hands when the jackhammer has been invented. These tools aren't doing the “thinking” and “problem solving”; they're just helping get the job done.
That's what I equate with computer science, with problem solving, with being a programmer or computer scientist: you have to learn new things, and you have to take what you've learned and apply it to problems, sometimes in interesting ways. I don't care that an AI might one day do it better or faster. I wouldn't be a computer scientist if I didn't do it myself.