computerlove.org

The difference between humans and computers is just two letters

The film Blade Runner 2049 is (by far) my favourite sci-fi movie of all time. This is largely due to its enormous philosophical depth, and its amazing director, Denis Villeneuve, who actually studied science before his interests shifted towards filmmaking.

I must admit, it seems futile to explore Blade Runner 2049's many philosophical treasures in a single blog post, which is why I will focus on just one very brief dialogue between the android "Officer K" and his virtual companion "Joi".

In this scene, they both sift through a public DNA database in an effort to locate the missing child of a replicant, as the following conversation unfolds:

Joi: Mere data make a man. A and C and T and G. The alphabet of you. All from four symbols. I'm only two, 1 and 0.

Officer K: Half as much but twice as elegant, sweetheart.

The four letters Joi was referring to (A-C-T-G) represent the four nucleotide bases that make up our human DNA. These letters are read in triplets (called codons), allowing for 4³ = 64 possible combinations.

Ordinary computers, on the other hand, use only a two-letter alphabet, 1 and 0 (aka binary). However, do these two additional letters truly represent the fundamental distinction between humans and computers?
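To make the two alphabets concrete, here is a minimal sketch (the function names and the particular base-to-bit mapping are my own choices, not anything from the film or from biology): each of the four DNA bases carries exactly log2(4) = 2 bits, so any four-letter sequence translates losslessly into binary at twice the length.

```python
# Each of the four bases carries log2(4) = 2 bits, so base-4 and
# binary sequences are freely interchangeable. The mapping below
# is arbitrary; any one-to-one assignment works.
BASE_TO_BITS = {"A": "00", "C": "01", "G": "10", "T": "11"}
BITS_TO_BASE = {bits: base for base, bits in BASE_TO_BITS.items()}

def dna_to_binary(seq: str) -> str:
    """Translate a DNA string into its two-bit-per-base binary form."""
    return "".join(BASE_TO_BITS[base] for base in seq)

def binary_to_dna(bits: str) -> str:
    """Translate a binary string (of even length) back into DNA letters."""
    return "".join(BITS_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

codon = "ACT"
bits = dna_to_binary(codon)  # "000111" — twice as long, same information
assert binary_to_dna(bits) == codon  # nothing is lost going back
```

The round trip is the whole point: the two alphabets differ in length, not in what they can say.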

Well, the answer to this question really depends on your perspective, or, as computer scientists would say, the level of abstraction.

From an information theory perspective, both humans and computers are primarily information storage systems operating via discrete states, which are represented by symbols (aka the alphabet of you). Both humans and computers store complex information in sequences of these symbols, and both can theoretically encode "any" information, if you allow for a long enough sequence.
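The "any information, given a long enough sequence" claim can be sketched in a few lines (the `encode` helper below is hypothetical, written just for this post): any piece of data is ultimately a number, and the same number can be written over any alphabet, with smaller alphabets simply needing more symbols.

```python
def encode(n: int, alphabet: str) -> str:
    """Write the non-negative integer n as a string over the given alphabet."""
    if n == 0:
        return alphabet[0]
    base = len(alphabet)
    digits = []
    while n:
        n, r = divmod(n, base)
        digits.append(alphabet[r])
    return "".join(reversed(digits))

# Any data is just a number once you look closely enough.
message = int.from_bytes("Joi".encode(), "big")
print(encode(message, "01"))    # binary rendering
print(encode(message, "ACGT"))  # a "DNA" rendering, about half the length
```

Both renderings carry exactly the same message; only the sequence lengths differ.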

So, if computers can encode the same information as biological systems using only half the alphabet, is Officer K right to conclude that binary systems are twice as elegant as four-letter biological systems?

Well, when looking at the human system as a whole, this wouldn't be a fair comparison because, at higher levels of abstraction, biological systems operate very differently from silicon-based ones.

DNA is not executed in the same way as computer code. Instead, it operates within a multi-layered biochemical system that is far too complex to do justice here. Also, in biological evolution, error tolerance is absolutely vital, which is why biological systems come with built-in redundancy and repair mechanisms.

Computers, on the other hand, require precision to remain reliable (although error-correcting codes do exist). However, in the realm of electronics, binary is more stable because it is more tolerant of noise. And despite requiring longer sequences, it can achieve the same expressive power as four-letter systems, or indeed systems with any number of letters (which is amazing).
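A quick back-of-the-envelope check of that "longer but just as expressive" claim (the helper function is mine, but the underlying formula is standard information theory): a sequence of length n over an alphabet of size q can hold n · log2(q) bits, so binary simply needs log2(q) times the length to match it.

```python
import math

def bits_of_information(length: int, alphabet_size: int) -> float:
    """Capacity in bits of a sequence over a given alphabet."""
    return length * math.log2(alphabet_size)

dna_codon = bits_of_information(3, 4)  # one triplet of bases: 6 bits
binary = bits_of_information(6, 2)     # six binary digits: also 6 bits
assert dna_codon == binary             # same capacity, double the length
print(2 ** 6)  # 64 — the number of possible codons Joi alludes to
```

So a binary machine pays for its two-letter alphabet purely in sequence length, never in what it can express.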

So in the end, we could rightfully say that computers are more elegant, not because they require shorter sequences (they don't), but because they require fewer letters to begin with.

In summary, binary provides the most minimal sufficient structure to encode any digital information. So yeah, Officer K was right (in a way).


I hope you enjoyed this short journey into the Blade Runner universe. See you in the next post!