On AI

Beep boop bop.

6 Feb 2026

To be honest, I’m not scared of AI taking away what I do. I see it as a tool (however inefficient it is, whether by depleting our planet’s resources or by using others’ copyrighted material) that abstracts away a lot of things. It’s honestly useful sometimes for getting up to speed on how to do things or quickly seeing what a feature would look like! But maintenance and critical infrastructure will always need human approval. Would anyone like AI in their pacemaker? Talk about giving someone a heart attack! :P

In fact, I think AI slop highlights the waste we already endure!

Let’s take movies, for example. When Avatar (the blue people movie) came out, it was highly rated because of the graphics. But if you were to watch it now, the graphics are nothing new and you’d actually be focusing on how sparse the plot is. Same with reading emails. How many times have you read something long-winded and thought to yourself, “Just get to the point already!”? In art, I see AI being used to get the pretty visuals out of the way so humans can focus on the more meaningful storyline. And in text, I expect more people to respond concisely, dealing the death blow to the verbose, flowery prose that texting and social media had already begun to sideline.

It’s the same with software engineering. What is programming, exactly? It’s the marriage between mathematics and language. That’s it. AI (specifically LLMs) understands the language part of programming really well but struggles with the mathematics. And this is due to the way these models are structured: an LLM is a statistical machine that “guesses” what is likely to come next. Someone described it as “keyboard autocorrect on steroids,” and I’d be inclined to agree.
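The “autocorrect on steroids” intuition can be shown with a toy bigram model. This is a deliberately minimal sketch of “guessing what comes next” from counts; real LLMs use neural networks over subword tokens trained on vastly more data, and the corpus below is made up for illustration:

```python
from collections import Counter, defaultdict

# Tiny made-up corpus; a real model trains on trillions of tokens.
corpus = "the cat sat on the mat the cat ate the fish".split()

# Count which word follows which (a bigram model): the simplest
# version of a machine that guesses what is likely to come next.
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def predict(word):
    """Return the most frequent next word, like autocorrect suggesting one."""
    counts = following[word]
    return counts.most_common(1)[0][0] if counts else None

print(predict("the"))  # → cat  ("cat" follows "the" most often above)
```

The model has no idea what a cat is; it only knows which words tended to follow which, which is why scaling this idea up handles language so well while still stumbling on actual reasoning.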

Since we’ve defined programming as the marriage between language and mathematics, I can potentially see a future where AI slowly replaces the language part. Here are some revolutions I can see AI bringing about:

I think companies will eventually realize that coding isn’t simply about the number of lines you write, but the knowledge you gain from maintaining the codebase and interfacing with customers to understand their needs. In other words: trust.

So if AI is an abstraction and there is still a lot of work left to do (both getting AI to a viable state and building the features customers are requesting), why are there so many layoffs? Simple: it’s a long-running macroeconomic recession. Ask yourself the following questions:

The fact that OpenAI didn’t guard its most prized possession from competitors, and that technology companies aren’t pumping out useful features by the truckload, shows that something else is causing the slowdown in the global economy, something everyone is missing.

In fact, when it’s time to pay the AI companies their dues, I think software companies will realize too late that they are too dependent on Claude, ChatGPT, and Gemini. Their codebases will be full of AI-generated code, and there will barely be any engineers left who truly understand them. And then it will be a gold rush for hiring engineers once again (similar to 2020) to try and monkey-patch the codebase. There’s already a term for how these AI companies are operating: predatory pricing. Humans are the expensive resource that AI companies are pitching to get rid of, but in the process they are losing everyone’s trust.

It doesn’t have to be this way. We can set up a system where every human is valued and create a future where most would like to live, not just the top few percent. Instead of depleting the Earth’s resources or harming other humans in a system that values greed, I believe the best way to do this is to create a worldwide social safety net: Worldwide UBI. I’ll hopefully go into this a bit in an upcoming post, but we’ll see when that comes out.

And on the point of consciousness: we’ll have a conscious AI once we can define what consciousness truly is. :P