The Man Who Trusted Data Over Feet
I want you to picture something. It’s a Saturday afternoon, you’re at the mall, and you walk into a sneaker store. Okay. And you spot the exact pair you’ve been looking for. Let’s say they’re those limited-edition retros everyone wants right now. Yeah. They’re right there on the shelf. Oh yeah, I’m with you. Good start to the weekend. Right. So you grab the box, you sit on the bench, and the salesperson comes over and says, hey, do you want to try those on to make sure the fit is right? Naturally. And you just freeze. You look down at your legs and you realize you have a massive problem. Uh oh. You look the salesperson dead in the eye and say, I can’t try these on. I measured my foot this morning, but I left the piece of paper with the measurements on my kitchen table. I have to go home and get the paper before I can buy the shoes. That is, I mean, that is deeply disturbing behavior. It’s ridiculous, right? Yeah. You literally have your feet attached to your body. Yeah. But here’s the thing. This isn’t some hypothetical scenario I made up to make you feel uncomfortable. It’s actually the core plot of a philosophical text that is over 2,000 years old. It is. And while it sounds like a joke, and I mean, it was definitely intended to be funny, it is actually one of the sharpest critiques of human intelligence ever written. Yeah. We are doing a deep dive today into the story of the man from Zheng buying shoes, or Zheng ren mai lü. And this comes from the Han Feizi, which is a classic Chinese text. And I have to admit, when I first read the source material for this deep dive, I thought, okay, funny parable, ha ha, dumb guy. Right. Just a simple fable. But the more I looked at the analysis and the historical context, the more I realized this isn’t about shoes at all. No, not even a little bit. It’s about data. It’s about why we sort of naturally trust models more than reality. 
It’s about why doctors treat charts instead of patients, and why investors crash entire economies by basically following algorithms right off a cliff. It’s really a profound story about what we call the base rate fallacy and the fundamental difference between a map and the territory. Exactly. It’s asking a question that is honestly quite uncomfortable for a lot of smart people to answer, which is: when the data disagrees with your eyes, which one do you actually trust? So we essentially have an ancient Chinese meme that explains everything from medical malpractice to the existential risks of artificial intelligence. We do, and we should probably start with the man himself to get the full picture. Let’s do it. So the story is set in the state of Zheng during the Warring States period. We have a man, let’s just call him the Zheng man, who decides he needs new footwear. And he is a planner. He’s very diligent. Yeah, he’s not acting on impulse here. He sits down at home and he carefully measures his foot. He creates a concrete data point. He writes it down on a piece of paper, or a bamboo slip given the era, and he places it on his seat. Then he heads out to the market. Yep. But in all his excitement, he leaves the measurement sitting right there on the chair. A classic error. Totally. Yeah. He gets to the market, finds the shoe stall, picks out the shoes, and then pats his robes. No, no. Oh, no. So he turns to the merchant and says, I forgot to bring my measurements. I have to go home and get them. Now let’s just pause here for a second and think about the logic. He is standing in front of the shoes. Right. He is physically standing on his own feet. But in his mind, the transaction simply cannot proceed without the data. So he leaves. He literally runs all the way home, grabs the slip of paper, and runs all the way back. But this is the ancient world. There’s no Uber. No. By the time he gets back, the market is closed. The stall is completely packed up. No shoes. 
A tragedy of bureaucracy, really. Truly. But the punch line, and this is the part that scholars have been chewing on for millennia, is the interaction that happens right at the end. Someone, maybe a bystander or the merchant, asks him the obvious question. Why didn’t you just try the shoes on with your feet? Exactly. And the man replies, I would rather trust the measurement than my own feet. In the original, níng xìn dù, wú zì xìn yě: I trust the measurement more than I trust myself. It sounds insane. Right. Certifiably insane. But the more I thought about it, the more I realized we absolutely do this today. Oh, constantly. We just don’t use shoes. But before we get to the modern roast of our own behavior, I really want to understand the logic here. The source material calls this a version of the base rate fallacy. It does. Yes. But we need to be very careful with that term, because this is actually a fascinating inversion of how we normally talk about statistics and cognitive bias. Okay. Break that down for me. Yeah. Because normally when I hear base rate fallacy, I think about ignoring the big-picture stats. Right. Like if I read a vivid news story about a shark attack, I get terrified of going in the ocean, even though the base rate, the actual statistical probability, says shark attacks are incredibly rare. Exactly. In that standard example, you are favoring the specific vivid anecdote over the abstract general data. Our brains love a compelling story and they kind of hate boring statistics. Right. The story feels more real than the math. But the man from Zheng is doing the exact opposite. How so? Well, look at his two sources of information. Source A is the measurement. That is the abstract representation, the data. It was taken in the past. It’s a static historical model of his foot. Okay. And source B is the foot itself, which is the specific concrete reality. It is right there. It’s present. It is the territory. The measurement is just the map. Oh. 
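Since the hosts lean on the shark-attack example, here is a minimal numeric sketch of what a base rate actually is. Every number below is an invented placeholder for illustration, not a real statistic.

```python
# A rough base-rate sanity check for the shark-attack example.
# Both counts are hypothetical placeholders, not real data.

beach_visits_per_year = 100_000_000   # hypothetical swims per year
shark_attacks_per_year = 50           # hypothetical attack count

# The base rate: probability that any single swim ends in an attack.
p_attack = shark_attacks_per_year / beach_visits_per_year

print(f"P(attack per swim) = {p_attack:.7f}")  # prints 0.0000005

# A vivid news story doesn't change this number at all. The classic
# base-rate fallacy is letting the story crowd out the arithmetic;
# the man from Zheng commits the mirror-image error, letting stale
# arithmetic crowd out the evidence standing right in front of him.
```

Under these made-up numbers, any one swim carries a one-in-two-million risk, which is the "boring statistic" the vivid anecdote tends to override.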
So in this case, the man is ignoring the vivid specific reality, his actual feet, to blindly cling to the abstract general data, which is the measurement. So he’s essentially valuing the paperwork over the physical world? Yes, precisely. And that’s why the story is so insidious. Usually we think being rational means trusting the data. Yeah. We tell people all the time, don’t rely on your gut. Look at the numbers. Right. And this man is looking at the numbers. He’s sticking to the plan. He is, by a certain rigid definition, being hyperrational. But he’s rationally wrong. Yes. He’s completely mistaking the representation of reality for reality itself. He actually thinks the truth lives on the piece of paper, not in his actual foot. That perfectly connects to that famous concept: the map is not the territory. Exactly. You can have a flawless high-resolution map of New York City. But if you’re walking down Fifth Avenue and there’s a giant sinkhole that isn’t on the map, you don’t just keep walking because the map says the road is flat. But the man from Zheng walks right into the sinkhole. Because to him, the map is the sole authority. The reality is, well, reality is messy. Feet swell, feet change shape based on the weather. But the measurement is clean. It’s objective. It’s science. Right. To understand why this was such a biting critique when it was written, we really have to look at the author, Han Feizi. He wasn’t just a storyteller making up fables. No, he was a political shark. The notes mention he was a Legalist. That sounds intense. It was very intense. Legalism was the philosophy of pure realpolitik in ancient China. It was all about power, strict laws, and harsh punishments. Han Feizi lived in a time of constant brutal war. The Warring States period, right? Exactly. He didn’t have time for fluff or idealism. And his main intellectual enemies at the time were the Confucians. 
Now, I always associate Confucians with wisdom, respect for elders, social harmony, that sort of thing. And rituals, lots and lots of rituals. The Confucians genuinely believed that the way to fix a broken world was to return to the ways of the ancient kings. They were utterly obsessed with li, which translates roughly to propriety, ceremony, and strict adherence to ancient texts. So they were constantly looking backward. They were entirely focused on the measurement on the chair. To Han Feizi, the Confucians were basically these guys running home to check the ancient manuals on how to be a good ruler while the actual kingdom was literally burning down in front of them. That is a savage burn. So the measurement in the metaphor actually represents the ancient traditions and the texts. Yes. The texts, the rituals, the historical precedents. Han Feizi was screaming, look at your feet, meaning look at the actual current state of the country. Look at the famine, the war, the corruption. If the ancient rituals don’t solve the problem in front of you, stop doing them. Trust the grim reality over the pristine text. And this conflict plays out in history in some really tragic ways, doesn’t it? Our source material brings up a couple of fascinating examples, starting with the imperial exams. Yes, this is maybe the absolute best historical manifestation of the Zheng man problem. So for over a thousand years, if you wanted to be a government official in China, like a governor, a judge, a tax collector, you had to pass the imperial exams. Which on the surface sounds like a great idea. It sounds like a pure meritocracy. In theory, yes. But you have to look at what was actually on the test. It wasn’t urban planning 101 or agricultural economics. It was poetry. It was calligraphy. It was the ability to flawlessly quote Confucian classics from memory. So you’re hiring a guy to build a massive dam to stop a deadly flood. 
And you’re testing him on his ability to write a rhyming couplet about a lotus flower. Exactly. You are strictly measuring his literary virtue and just blindly assuming it maps perfectly to administrative competence. The exam score is the measurement on the chair. And the actual job performance, whether the dam holds or breaks and floods the valley, is the foot. Yes. And for centuries, the system kept hiring these brilliant poets who were often terrible, incompetent managers. Because the system trusted the measurement. Completely. If the dam broke, the official would just write a beautiful, technically perfect apology poem to the emperor. The map remained pristine, even if the actual territory was underwater. You know, it’s really easy to laugh at ancient bureaucrats, but reading through the modern applications in our source material, I’m not sure we’ve evolved much at all. We haven’t. We just have digital measurements now. We have infinitely more measurements. And honestly, that makes the trap even more dangerous. Let’s talk about the medical example from the sources. That one is particularly stark. It is. And it scares me a bit. Modern medicine is obviously built on measurements: blood tests, biomarkers, MRI scans. And these are vital tools. Absolutely vital. But the sources describe this phenomenon where doctors essentially start treating the numbers rather than the actual patient sitting on the exam table. Right. Let’s say a patient comes in with a set of severe symptoms that clinically point clearly to a specific autoimmune condition, but the gold-standard lab test comes back negative. The measurement says no. Exactly. Now, a doctor suffering from the Zheng man fallacy will look at that and say, well, the test is negative, so you definitely don’t have this condition. It must just be anxiety. Wow. They trust the lab report, the abstract measurement, completely over the patient in front of them, the concrete reality. 
Even though we know tests can be wrong, samples can be contaminated, or the patient might just be an outlier who doesn’t fit the typical biomarker profile. Yes. The map says there is no road here, so you must not be driving on a road. Meanwhile, the patient is sitting there saying, I am in terrible pain. It’s chilling when you frame it that way. The source also brings up finance. And this feels like where the measurement-versus-reality gap gets spectacularly expensive. It’s the underlying story of almost every major financial crash. Look at the 2008 financial crisis. Right. The housing bubble. These incredibly sophisticated, Nobel Prize-winning mathematical models, those were their measurements. And those models confidently stated that housing prices would never, ever fall simultaneously across the entire United States. The models said the mortgage bonds were AAA-rated. Safe as cash. That’s the note on the chair: AAA. But the feet, the actual messy reality on the ground, were the massive loans being given to people with no income, no job, and no assets. The NINJA loans. Exactly. If any of those bankers had just driven down to Florida or Nevada and looked at the half-empty housing developments, if they had just looked at the feet, they would have seen the disaster coming from miles away. But the bankers, the regulators, the rating agencies, they all just sat in their high-rises in New York looking at the spreadsheets. They collectively said, I trust the measurement more than the reality. And we all paid for those shoes. We really did. But I think the most profound modern parallel, and the one our source material really drills down on, is artificial intelligence. Yeah, let’s definitely talk about this. Because as a society, we are rushing to hand over massive amounts of decision-making to AI. We are. And we need to think about what an AI actually is from a philosophical standpoint. Right. An AI model, like a large language model, has never seen a foot. 
Right. It has never walked to a market. It has no physical body, no sensory experience of the world whatsoever. It only knows the data it was trained on. Exactly. It literally is the measurement. That is all it is. It is a highly compressed map of human knowledge rendered into math. It has absolutely zero access to the actual territory. So when we ask an AI to make a real-world decision, like say an HR department using AI to screen resumes for a job opening, it’s only looking at the statistical patterns in the historical data. Right. And if that historical data, the measurement, says that historically successful CEOs are usually tall men named John, the AI will relentlessly prioritize tall men named John. It’s not actually looking at the candidate’s specific potential or unique skills, the feet. No, it’s entirely focused on the historical measurement. But it’s even worse than that, isn’t it? Because we have this tendency to treat the computer as purely objective. Yes, the automation bias. If a human recruiter is biased, we can usually spot it and say, hey, they’re biased. But if an AI makes the exact same biased choice, we throw our hands up and say, well, the algorithm decided, it must be math. That right there is the ultimate man from Zheng moment. When we say the algorithm said you’re a credit risk, the algorithm said this content is hate speech, or the algorithm says this route is the fastest, we are entirely deferring to the measurement. Even when the GPS is literally instructing us to drive our car into a lake. The AI is essentially the ultimate map with no territory. And we are rapidly becoming the bystanders in the market who just shrug and say, well, the computer must know something we don’t. So if this is a fundamental glitch in how our brains work, if we are genuinely wired to trust the shiny, authoritative measurement over the messy, chaotic reality, how do we actually fix it? 
Well, the source material offers a really strong solution, and it revolves around something called Bayesian thinking. Okay, yes. And this is where we have to get just a little bit technical, but it’s completely worth it. Reverend Thomas Bayes was an 18th-century statistician who basically formalized a mathematical way to update probabilities. Update being the operative word there. Precisely. The core problem with the man from Zheng isn’t that he measured his foot in the first place. Measuring things is good. You need a baseline. Right. You can’t just guess your shoe size every time. Exactly. In Bayesian terms, that initial measurement is what we call your prior. It’s your prior belief: my foot is exactly ten inches long. Okay. So his mistake wasn’t taking the measurement. His mistake was stubbornly refusing to update it. That’s it. When he eventually got to the store, he encountered brand-new evidence. He had the physical shoe and his actual physical foot. A true Bayesian thinker takes their prior, the note on the chair, looks at the new evidence, how the shoe actually fits, and combines them to create a posterior belief. An updated reality. So the Bayesian man from Zheng would put the shoe on and say, well, my note says I’m a size eight, but this size eight feels really tight. Therefore my note might be wrong, or my foot might be swollen from the walk. Either way, I will trust the tightness of the shoe. He treats the measurement as a working hypothesis, not a divine commandment. This seems so simple in theory, but it feels incredibly hard to actually do in practice. Oh, it’s very hard, especially in a modern corporate setting. Imagine going to your boss and saying, look, I know the quarterly data dashboard says our new product is doing great, but I just talked to three actual customers and they’re all furious. Right. Your boss will almost certainly say, show me the data. We are all culturally conditioned to be Legalists. 
We desperately want the clean paperwork. We trust the spreadsheet. But the Bayesian approach demands that we recognize those three angry customers as vital new evidence. You always have to weigh the map against the territory. It’s really about having the intellectual confidence to let reality override the model. Or at the very least, to let reality inform and adjust the model. Look at education. If you are a teacher, a standardized test score is a measurement, and it’s a useful prior. But seeing a kid critically solve a complex, unexpected problem in the middle of class is the new evidence. It’s the reality. And if you stubbornly only trust the test score, you completely miss the kid’s actual potential. You’re treating the kid as a static statistic, not a living student. And that’s really the core lesson of the Han Feizi parable for us today. Measurements, whether they are test scores, stock prices, BMI, or political polls, are all just compressions of reality. They are low-resolution JPEGs of a high-resolution world. That is a perfect way to put it. They inherently have to leave things out to be useful. So when we desperately cling to them, we are actively choosing the low-res version because it’s just easier for our brains to understand. It’s cleaner. And honestly, it’s safer. Safer how? Think about the liability. If you strictly follow the measurement and you fail, you can always blame the measurement. You can say, hey, I did exactly what the procedure said. The algorithm was wrong, not me. But if you bravely trust your own feet and you fail, well, that’s entirely on you. Wow. So it’s fundamentally about deflecting responsibility, too. The man from Zheng didn’t want to carry the cognitive burden of being wrong about his own foot size. So he just outsourced the responsibility to a piece of paper. That is a very cynical and probably incredibly accurate reading of human nature. Well, this has definitely been a bit of a reality check for me. 
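The prior-evidence-posterior loop described above can be sketched in a few lines. This is a minimal illustration using invented probabilities for the shoe-size story; none of the numbers come from the source text itself.

```python
# A minimal sketch of a Bayesian update for the shoe-size story.
# All probabilities below are made up purely for illustration.

# Prior: the note on the chair says "size 8", and we start out
# quite confident in it.
prior = {"size 8": 0.90, "size 9": 0.10}

# Likelihood: how probable is the new evidence ("the size-8 shoe
# feels tight") under each hypothesis? Tightness is unlikely if
# the foot really is a size 8, and very likely if it is a size 9.
likelihood_tight = {"size 8": 0.10, "size 9": 0.90}

# Bayes' rule: posterior is proportional to prior times likelihood,
# then normalized so the beliefs sum to 1.
unnormalized = {h: prior[h] * likelihood_tight[h] for h in prior}
total = sum(unnormalized.values())
posterior = {h: p / total for h, p in unnormalized.items()}

print(posterior)  # belief in "size 8" drops from 90% to 50%
```

The point of the sketch: the note is not discarded, it is weighed. One piece of strong contrary evidence is enough to drag a 90% prior down to a coin flip, which is exactly the update the man from Zheng refused to make.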
I’m certainly going to look at my GPS a little differently on the drive home today. Just, you know, look out the windshield occasionally. Check the feet. Definitely. Before we wrap up this deep dive, we always like to leave you, the listener, with a final thought to mull over. Something to take into your week. We talked about shoes, ancient kings, finance, and AI, a lot of ground covered. What’s a practical everyday takeaway for people? I think the real challenge for everyone listening is to actively identify your own personal measurement on the chair. Where are you trusting a metric over your own lived experience? Oh, I have a great example that builds right on this. Let’s hear it. Think about the whole self-quantification movement. Smart watches, fitness trackers, sleep rings. Oh, the ultimate modern measurements. Exactly. How many times have you woken up feeling totally refreshed, energized, ready to tackle the day? And then you check your sleep tracking app and it gives you a terrible score. It says you got 45 out of 100 on your sleep quality. Right. And suddenly you feel exhausted. Exactly. You let a generalized algorithm on your wrist override the concrete reality of your own body. You literally ignore your own energy levels because the map told you that you should be tired. That is the man from Zheng buying a smartwatch. It’s perfect. You are trusting the measurement over your own nervous system. So the takeaway is: the map is incredibly helpful, but you have to walk the territory yourself. And you have to bring your feet with you. Never leave them at home. Thanks for diving in with us today. I’m going to go check if my shoes actually fit. Good idea. We’ll see you next time on The Deep Dive.