Would you put a chip in your brain?
Let’s start this new year right with a light, breezy article. I’m gonna try to define the nature of human consciousness.
An idea has been rattling around my head for a while now. Like most great ideas, this one came to me while I was mildly drunk at a Halloween party with fellow writer Not a N*rd. The topic of the upcoming cyberpunk dystopian future came up, and we discussed which robotic augmentations we’d be willing to get. We were totally down for the robot sword arms and eye augmentations, but we drew one hard line: no chips in our brains.
But why draw the line there?
There’s a quick answer. The one that my gut comes up with before my mind. It’s that a chip in your brain would change something too fundamental about who you are: the way you think. Your consciousness. The technological alteration becomes too close for comfort. But too close to what, exactly?
I think we’ve all heard the joke about every human simply being “a brain piloting a meat suit”, and while it assuredly got the whole squad laughing in grade five, I think there are a lot of people who really do view themselves through that lens. Who are they? They are their brain; everything else is just controlled by it. “I control my body” versus “I am my body.”
There’s a sense that the brain has an exalted role amongst the organs; that it’s the source of one’s true self, while everything else just plays a supporting role. I sometimes wonder if there’s a bit of brain-ego involved: the brain assumes it must have the most important job because it’s the one doing all the thinking, and so it’s the only one that gets a say in the matter. Is that rational? Is the brain more essential to the intact preservation of “us” than the lungs or the heart, organs that are just as functionally vital?
We’ve all been told at one point or another to imagine the CPU as the brain of the computer. Let’s break new ground and imagine the brain as the CPU of the body. All sensory input passes through it, gets processed and sent to short- or long-term storage, and that stored information is then used to determine the next action to take.
But let’s not forget, this computer is totally useless without software. It’s a collection of silicon and gold arranged in a very elaborate formation. The software is what gives the computer the ability to compute, to think. But where is the software? Which part of a motherboard can you point to and call “the software”? Software has this cyclic nature where it is both the enabler of, and enabled by, all the physical hardware components, and as such it is arguably everywhere: a final phantom component in a PC.
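If you squint, the sense–process–store–act loop I’m describing looks something like this toy sketch. Every name in it is made up for illustration; neither real brains nor real computers are remotely this tidy.

```python
# A toy "brain as CPU" loop: sense, process, store, act.
# Nothing here corresponds to real neuroscience; it's just the analogy in code.

short_term = []   # working memory
long_term = {}    # everything filed away for later

def process(sensory_input):
    """Turn raw input into something the rest of the loop can use."""
    return sensory_input.strip().lower()

def decide(thought):
    """Pick the next action based on what was just processed and stored."""
    return "flinch" if "spider" in thought else "carry on"

def brain_loop(sensory_input):
    thought = process(sensory_input)
    short_term.append(thought)
    long_term[len(long_term)] = thought
    return decide(thought)

print(brain_loop("SPIDER on the wall"))   # -> "flinch"
```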
Can you point to a computer and say that because the CPU is doing the calculation, the software “lives” in the CPU and the computer therefore is the CPU?
Can you point to a human and say that because consciousness is computed in the brain, consciousness totally lives in the brain, and therefore the human is just the brain?
I don’t know.
~
To be honest with you, I have no idea if this makes any sense at all.[^1] The only other person I’ve had a chance to chat about this idea with since Halloween is Not a N*rd, who politely yet vehemently disagrees with me.
She tells me that a computer can’t be compared to a human. Were a computer to lose its storage or RAM, it would become totally non-functional. A person, on the other hand, could lose a limb (or gain a new one!) and remain totally the same person, so the critical functions of personhood can’t be contained in those parts. If you lost your brain, however, you wouldn’t be doing much of anything.
All of this is plainly true, and I of course agree with it. I nevertheless have excellent counterpoints to these arguments, but I’m about to hit the word limit, so you’ll just have to take my word for it.
I do have an interesting coda to this point, though. Once I started thinking about my brain like this, it lost some of its authority in my mental model of my overall self. I began to think of my mind as a part of myself, not the whole of it.
I occasionally have random waves of anxiety. I used to think that what I was doing was considering real problems I had (school, family, safety) and then having an objective emotional reaction to those circumstances. Because the worry was coming from my brain, and therefore from an objective, “pure” sense of my own reality, I figured it was reasonable to dwell on it.
What I realized was that my brain is a part of my body, like my stomach or my knee, that can just… hurt randomly for no external reason. I wasn’t reacting to a situation; I was feeling anxious and then coming up with an excuse, however flimsy, to justify the thing I was already feeling. If I waited for the worst to pass and then thought about the “problem” again, it was almost always less bad than it felt in the heat of the moment.
This is definitely not universal mental health advice (PLEASE do not dissociate because of a mathNEWS article), but I did notice a change in how I approach stress once I started thinking about myself like this.
My gut really does come up with things before my mind sometimes. Who’s the one thinking in there?
Anyway, if anyone manages to figure out this whole “consciousness” thing, please let me know via an article in a forthcoming issue of mathNEWS. It’s the only thing I read these days.
~
[^1]: I wrote most of this at 2am. Does it show?