“No to Captivity”
It was one of the most amazing turnabouts I’d ever seen. Back when I worked at the church in Mount Holly, we held a children’s night every Sunday, and part of the event was serving dinner. Most of the time, one of the families of the church made the meal and served it, but every once in a while we’d just order pizza from Domino’s instead. We’d always get about a dozen pepperoni pizzas since everyone seemed to like them. One kid in particular loved pepperoni pizza. He claimed it was his “favorite food” of all time, which is high praise coming from a kid. So imagine my surprise when, one night, I saw this pepperoni-crazy child deliberately picking the pepperoni off his slice of pizza. It was such an odd behavior from this particular child that I had to ask him. “Why aren’t you eating the pepperoni tonight?” I wondered aloud. “They’re bad for you,” the child flatly declared. And although I’m sure pepperoni isn’t the healthiest thing in the world, eating two or three pieces a night probably isn’t going to negatively impact your health. We talked some more, and it turned out that one of his little friends, whose parents were vegetarian, had gotten into his ear about how bad pepperoni was. Once the kid gained a little understanding of what “bad” actually meant here, he piled the pepperoni back onto his slice of pizza. He was much happier knowing the difference.
I remember that story because, in a way, that child was held captive by a belief that wasn’t entirely true. It’s amazing how much what we think we know can impact our behavior.
In the early 1960s, psychologist Stanley Milgram conducted a series of experiments now known as the Milgram experiments. The aim of these psychological tests was to determine just how far the role of authority goes in our lives. The setup worked like this: two people were brought into the study, a questioner and a responder, and the two were told they would be placed in separate rooms. The questioner would ask a series of trivia questions. If the responder got a question right, nothing happened. If he got it wrong, the questioner would press a button and an electric shock would be administered. As the responder missed more questions, the severity of the electric shocks would increase.
So, off the two went into their separate rooms. Connected only by headsets, they began the experiment. Now, lest you worry about the person getting shocked, they weren’t. It was a ruse; that’s why they were in different rooms. But the questioner didn’t know that, and everything they heard from the other room reinforced the idea that the responder was actively being shocked. On and on the experiment went. The shocks became so bad that the responder began begging the questioner NOT to apply them. What the researchers found was that when the questioner perceived the scientists running the experiment as legitimate authorities, the level of shocks ordinary people were willing to give another person was extremely high.
Turns out, we’re actually quite malleable when it comes to what we’ll do based on what we believe. The Milgram experiments exposed a little of our core wiring: we follow authority, even when it conflicts with what we know to be right. After all, it isn’t right to shock anyone to the point of incoherency. But there we were, with ordinary people doing it to one another willy-nilly.
So, when Paul says to avoid being taken captive, his advice is good. You’d be surprised how much what we think we know impacts what we do. What Paul is saying, in a way, is to make sure you know what you think you know. That’s the role of philosophy. The word simply means “love of wisdom,” but, whether we realize it or not, philosophy plays an outsized role in our culture right now. You might not believe me, but politics, ultimately, is an outworking of philosophy. We do certain things and vote for certain people not so much for what they promise us but because, right or wrong, we believe a particular party or candidate shares our worldview, our philosophy of the world.