Sensing Sensors

Stare into AI and AI stares back at you.

19 Nov 2025

Woo! I got more peripherals working! Er… well, would the copyright belong to the AI instead? Hm… more on that later.

Anyways, let’s start with the technical discussion, shall we?

The Patch Series

Here’s the preliminary patch series!

Now, how does this work? Well… I wish I could go into the details, but I don’t have a schematic for this device. Since most of these peripherals’ drivers already exist upstream, it was just a matter of reading the downstream board file and converting that into nodes I could use in the DTS. So while I could describe the patches themselves, I think you’re smart enough to read them yourself. Or you could give them to an AI for a decent understanding of how they work.
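To give a flavor of what that conversion looks like, here’s a made-up sketch rather than a node from my actual patches. A downstream board file registers a chip (say, a PN544 NFC controller) through i2c_board_info plus platform data; the upstream equivalent is a node on the matching I2C bus, written against the existing nxp,pn544-i2c binding. The bus label, GPIO numbers, and IRQ below are all invented:

    /* Downstream board file, roughly:
     *
     *     static struct i2c_board_info nfc_i2c_info[] __initdata = {
     *         {
     *             I2C_BOARD_INFO("pn544", 0x2b),
     *             .platform_data = &pn544_pdata, // IRQ + enable/firmware GPIOs
     *         },
     *     };
     *
     * Upstream DTS equivalent (assumes the usual dt-bindings GPIO/IRQ
     * includes; all numbers invented for illustration):
     */
    &i2c3 {
        status = "okay";

        nfc@2b {
            compatible = "nxp,pn544-i2c";
            reg = <0x2b>;

            interrupt-parent = <&tlmm_pinmux>;
            interrupts = <106 IRQ_TYPE_EDGE_RISING>;

            enable-gpios = <&tlmm_pinmux 36 GPIO_ACTIVE_HIGH>;
            firmware-gpios = <&tlmm_pinmux 38 GPIO_ACTIVE_HIGH>;
        };
    };

Once the node matches what the binding documentation expects, the already-upstream driver takes it from there.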

Mainly, though, I’m writing this post to share my thoughts on using AI to generate these patches.

Diving into AI

Manually converting the board files into nodes was slow and error-prone. For example, I was stuck on getting NFC working for my device for MONTHS, and boy was that frustrating.

That frustration finally made me cave and try AI. You see, there aren’t a lot of guides on the internet on how to do hardware bringup. And online, people either tell you to RTFM (even when there is no documentation) or ghost you. A lot of this information is locked away behind companies wanting to protect their copyright.

That’s where the appeal of AI came from. It has a good high-level overview of the Linux kernel and how it’s changed over the years. It should be relatively easy for it to figure out the conversion, right? After all, it’s a rubber duck that talks back.

AI’s Speedup

I first tried tossing it something hard. I wanted to get the GPU working, and for that you need the IOMMU. I think I got the IOMMU working, but I just couldn’t get the GPU going, because it seems the Adreno 225 doesn’t have a DTS compatible string upstream. I’m still struggling to figure out whether the approach I’m taking is even correct.
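For the curious, here’s roughly the shape of the node I’ve been experimenting with. To be clear, this is speculative: "qcom,adreno-225.0" follows the naming pattern the binding uses for supported GPUs, but the upstream driver doesn’t actually match it (that’s the whole problem), and the register range, interrupt, and &gfx3d_iommu label are placeholders, not verified values:

    /* Speculative sketch; NOT a working node. The compatible string is
     * unsupported upstream, the reg/interrupt values are placeholders,
     * and required clocks are omitted for brevity. */
    gpu: gpu@4300000 {
        compatible = "qcom,adreno-225.0", "qcom,adreno";

        reg = <0x04300000 0x20000>;
        reg-names = "kgsl_3d0_reg_memory";

        interrupts = <GIC_SPI 80 IRQ_TYPE_LEVEL_HIGH>;

        /* Point the GPU at its IOMMU context so it can safely do DMA */
        iommus = <&gfx3d_iommu 0>;
    };

The iommus property is why the IOMMU had to come first: without a translation context behind it, the GPU has no safe way to access memory.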

If you noticed, I didn’t include the IOMMU by itself in the patch series. Even though it probes successfully now, the IOMMU has apparently been broken on APQ8064 since at least 5.10 (if there is anyone with an APQ8064 device, please help bisect it and fix it <3). Someone did come into the Matrix channel to test the patch I created with AI, but it didn’t work, so unfortunately I suspect my IOMMU patch doesn’t actually work.

I then had the idea to shift to peripherals whose drivers were already in Linux. After all, half of the work was already done!

Here’s roughly how the patches came to be:

In between these, I was also playing around with other peripherals (touchkey, vibrator, the camera’s LED, and GPU), but I don’t remember anything notable from trying to get them to work. And since they’re still broken, they aren’t worth diving into here.

Tips For Using AI

After a little bit of experience using AI to write DTS patches, I can now hopefully give some helpful tips for others:

It’s useful to cut down on the search space beforehand if you can. For hardware enablement, that meant compiling the downstream kernel and getting rid of all the .c files I didn’t need (I wanted to get rid of the .h files too, but I did that long ago and don’t really remember how; I noted it somewhere on Mastodon).

If you can, speed up your test cycle as well; otherwise you’ll spend a lot of time waiting to flash, test, and feed the errors back to the AI.

I also noticed that AI tends to get stuck in loops sometimes. For example, while trying to get either the touchkey or the vibrator working, the AI kept thinking the DTS formatting was the issue, because that’s what the build error was saying. It messed things up so badly that I just had to scrap that work, losing some of my progress. Oh well, lesson learned: git is also useful during AI coding sessions.

AI’s Use Case

In its current state, AI will not be replacing humans anytime soon. I find that it’s very good at finding patterns within the realm of what it was trained on. Ask it things beyond what it knows and you get gibberish. It’s kinda like Nobel disease.

I think it’ll be useful for repetitive work. Instead of spending human hours on the task itself, train an AI on data for the task at hand, and fewer humans can manage its output. It will be hard to tell which data falls within the AI’s known and unknown realms, since it will respond to both confidently. That’s where humans have to come in and filter the output.

AI Ethics

Let’s see what happens with AI in the near future. I’ll keep playing around with it, and hopefully I can land more patches upstream!