The Character of a Computer

Idle thoughts on the shared personhood of machines

The other day, waiting for either automated tests or a creeping existential crisis to pass, I found myself thinking about how we, as humans, perceive and interact with computers. My coworker was fuming about the state of some code, and I sympathized but ended up telling him that the poor thing was trying its best and he should go easy on it.

Which naturally caused a raised eyebrow and one of those 'uh wat' moments. I dropped it and moved on, but I still feel the same way: that program was trying its best. That's not to say it's a sentient, conscious being, but frankly most life isn't, so why should we hold that against it? That computer was built and assembled to always try its absolute hardest for you. And it does. Mostly.

Some context: from an early age, I have been fascinated with computers. My elders always had words of caution when I got overly excited by some new feature or technical advance. "Garbage in, garbage out," I heard with somber nods over every holiday meal. Computers will not save you. Not too bad, as far as generic ominous tech advice goes.

But, to my impressionable brain, there was a different lesson to be learned: no matter how frustrated I got with a computer, it was doing its best, and I had to respect and support it, like one tries to do with everything in the world.

I mean, despite all the AI marketing fluff out there, computers (and the programs that run on them) are basically infants: incredibly dependent and very fragile, throwing tantrums or just shutting down if their needs aren't met. They have very little conception of your world and context, regardless of how you simplify it for them (i.e., code). They literally cannot do anything but try their best to interpret whatever work you give them.

I have to remind myself with every extension error or BSOD that the machine didn't know any better, and that I have to understand it better to find what ails it. Empathy is a powerful cognitive tool: it lets you observe and infer things that most would miss. Being able to understand something's perspective unlocks so many doors, which is especially handy when you're debugging a program or trying to fix a computer.

So, I say, humanize your apps, your devices. Give them names. Make them yours. Understand them, take care of them. Treat them with respect and love and you might find them a bit friendlier to debug. It's just an illusion, a hack of human instincts, but who cares? Let yourself see the personhood in things you'd assume don't have it, and you might be surprised at how satisfying caring about something can be.

Drop me a line!

© Sarah Robin 2019 - 2023