AI separates things


Today I learned about technologies being separators. I watched a fantastic video that only has a few views and subscribers; I think it will explode soon: https://www.youtube.com/watch?v=1CbmC2aWhjY. As of now, Sunday the 17th of May, it has only 3.5K views and 205 subscribers (I refreshed the page and it already had 600 subscribers an hour or so later).

The main point comes from that video; I just want to combine it with my own interpretation.

It's hard to know where to start, but I like one conversation I had with Gemini (I deleted my Pro account and now all my conversations are gone, so be careful).

But it went something like this:

Why do humans think so much in dualities, when things are more of a spectrum than a set of pairs? Part of the answer was: because of boundaries. A thing is a thing because it is NOT another thing (duh). If it's not THAT thing, there is a boundary. Crossing the boundary means blurring the line between one thing and another.

An easier analogy: boundaries wrap stuff, so I like to think of a boundary as a bucket of things. The bucket can look like one thing in our mind, but sometimes we later learn that we can take things out of it, and what we take out is a different thing, while the bucket is left with fewer things in it.

Technology helps us get things out of buckets. It's a separator.

Before writing was invented, knowledge lived inside people's heads and was preserved by being passed from one person to another. We had a bucket of knowledge and memory. Then we invented writing. You no longer have to remember something to know it; you can just write it down, read it, and then you know it again. So writing splits knowledge from memory.

I can go on and on with examples: GPS splits getting somewhere from knowing the way, refrigeration splits eating food from making food, the telephone splits a voice from a person. I'll just give one more that I find interesting: energy.

Energy and its source used to be the same thing, and they were very hard to separate. Heat: you need fire. Grinding grain: you need flowing water at the watermill or wind at the windmill. Moving something heavy: muscles have to work. Light: burn a flame in the room. Energy and the source of energy were the same thing. One bucket. A battery splits energy from the present moment: you can store electricity and use it later. Energy became something you could save, not only use. Just like writing saved knowledge from memory.

The trendy example is AI: it splits intelligence from identity. There was already a blur between intelligence and identity before LLMs. Google blurred it. I heard and made a lot of jokes about Google being Big Brother, or the godlike entity that knows everything. LLMs took that a step further: they literally seem to know things no single human could possibly know, but at the same time, an LLM is not a human and it doesn't have an identity. It pretends to have an identity, like an actor.

"You're a helpful assistant", "You're an expert in reading dreams", "You're a doctor". If you're close to tech, you know these are system prompts. These are masks we put on the LLM so it can engage in dialogue with humans.
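To make the mask idea concrete, here is a minimal sketch of how such a persona is typically handed to a chat-style LLM API: as the first, "system" message of the conversation. The function name and the persona strings are my own illustration, not from any real product.

```python
# A "mask" is just the system prompt: text that tells the model which
# identity to perform. Swapping the mask swaps the "expert" while the
# rest of the conversation machinery stays identical.

def build_conversation(mask: str, user_message: str) -> list[dict]:
    """Assemble the message list most chat-style LLM APIs expect."""
    return [
        {"role": "system", "content": mask},          # the identity we put on
        {"role": "user", "content": user_message},    # the human's turn
    ]

# Same model, two different "experts" -- the identity is only text we chose.
doctor = build_conversation("You are a doctor.", "My knee hurts when I run.")
dream_reader = build_conversation("You're an expert in reading dreams.",
                                  "I dreamt of buckets.")
```

Nothing about the model changes between the two calls; only the mask does, which is exactly the sense in which the identity is separable from the intelligence underneath.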

Before AI, if something tailored an answer exactly for me, that was an expert, a person with expertise. To access expertise, we needed to engage with a human. I can also stretch the point and find something that blurs the line: SaaS products are the codification of expertise into a particular interface and a set of repeatable tasks. But we never thought of SaaS as "experts". Experts are what we ideally needed in order to know which SaaS products we should build and how we could build them.

LLMs are not absolute experts in everything yet, but it's fair to say that, just as Claude Code can already out-code most programmers, LLMs have the potential to separate other experts from their expertise too. People can obtain the knowledge without interacting with a human who carries it.

I could go on and on with other examples. AI separates: effort from work, humans from relationships, language from understanding, authority from a person or an institution, listening from a listener.

Things are changing, and with AI it's easy to fall into the trap of defending: "it's not really X, it's actually...". Next time I catch myself doing that, I'll try to stop and zoom in: the thing I'm defending is a bucket of things. Am I defending the full bucket, or only some of the things in it? Because AI is definitely taking some things out of there.