In Apollo 13, the Ron Howard flick, there’s a scene where three astronauts are in a simulator. The veteran pilot has been replaced by a rookie (played by Kevin Bacon). They are simulating re-entry interface. An indicator light comes on: they are coming in too steep. The rookie pilot says, “I’m switching to manual.” The veteran commander (Tom Hanks) looks at Bacon like he’s crazy, but he lets him do his job—Bacon is the pilot. Their descent is too steep. They’re going to burn up. As many times as Bacon says, “I got this, guys,” it doesn’t help their situation. He can’t fix the problem. They go in too steep and burn up in the atmosphere.
Tom Hanks talks with the simulator techs alone. They tell him that the astronauts didn’t really burn up—the simulation was designed to give them a false indicator light on re-entry. They only thought they were coming in too steep, because a control panel light blinked red when it shouldn’t have.
That’s a false indicator light problem.
It doesn’t just exist for astronauts. It exists for everyone.
Related is the missing-information or lack-of-indicator problem.
Here’s an example. Veteran programmer Fang calls me to his desk. He needs another set of eyes on a hard debugging problem. The compiler is giving him an error message for a line in a function that seems to have nothing to do with the code on that line. It’s as if the code were about the bright, bright sun and the compiler failed with an error message about the moon. We debugged this problem step by step, shedding at each stage the assumptions we were making about the situation. That’s what debugging is: becoming aware of your own false assumptions, your own false knowledge, things you think you know but which are false—just like a false indicator light.

What finally solved the problem was to copy the function into another document and re-type the entire function from scratch. That version compiled with no problems. So we knew we were in a situation that was very difficult to see, and by running the two pieces of text through a diff tool—a tool that shows the differences between texts—we discovered that the original function contained an invisible character the compiler couldn’t handle. That was a hard debugging problem, and it had nothing to do with understanding C syntax. To solve it, we had to question our assumption that what we saw on the screen was the same set of characters the compiler saw. That is almost universally a safe assumption, but not in this case. Somehow an invisible special character got into Fang’s code and made it not compile.
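These days you can hunt for that invisible character directly instead of re-typing the whole function. Here’s a minimal sketch in Python (the function name and the list of suspect characters are my own choices, not anything from the original debugging session): it scans source text for zero-width and other non-printing characters that a compiler might choke on.

```python
# Scan source text for invisible or non-printing characters that a
# compiler might reject: zero-width spaces, joiners, BOMs, and so on.
import unicodedata

# A few common culprits; any non-ASCII character in a Unicode
# "Other" category (Cf, Cc, ...) is also flagged below.
SUSPECTS = {"\u200b", "\u200c", "\u200d", "\ufeff", "\u00a0"}

def find_invisible_chars(text):
    """Return a list of (line, column, character name) for each suspect."""
    hits = []
    for lineno, line in enumerate(text.splitlines(), start=1):
        for col, ch in enumerate(line, start=1):
            is_control = ord(ch) > 127 and unicodedata.category(ch).startswith("C")
            if ch in SUSPECTS or is_control:
                name = unicodedata.name(ch, f"U+{ord(ch):04X}")
                hits.append((lineno, col, name))
    return hits
```

Running it on a snippet with a zero-width space hidden after the `=` pinpoints the exact line and column—the same information the diff tool gave us, without re-typing anything.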
That’s a lack-of-indicator problem. There’s a problem, but your warning light never goes on. To solve this type of problem, you have to take all the facts you know and realize that the fact that they don’t add up means you are missing information. This is very hard to do, but you can learn to do it.
This type of problem occurs often interpersonally. Often, people are withholding information such that the set of information you face does not add up. At times like those, it makes sense to ask yourself if you’re missing information.
My neurologist says, simply, that having bipolar means my sense organ is broken. Therefore, I cannot determine whether I am manic or depressed or normal. My sense organ for that, my brain, doesn’t work when answering that question. One of the properties of bipolar disorder is that people who have it, because of the way our brains are damaged, tend not to believe the diagnosis. That’s insidious: to have a problem which tends to make you believe you don’t have the problem. The same is true specifically of bipolar mania: one symptom of mania is that you don’t think you’re manic. Now that is a hard problem to solve. It’s an indicator light problem. It kills your relationships because people are trying to help you, saying, You’re manic, and you’re telling them they’re full of shit. You don’t trust them—you can’t, because what they’re saying doesn’t fit with the [false] data that you have about your own state. People tend to give up on you under such conditions. It’s very hard to be friends with a bipolar person because they discount your [correct] view that they are manic, depressed, delusional, hallucinating, etc. It’s a clash of realities, and it doesn’t feel good when your bipolar friend or son or lover tells you that there’s nothing wrong when, to you, there is clearly something wrong.
Even when your sense organ is broken such that your subjective reality is skewed, there are things you can do to check your own perception against objective measurements. In the cockpit, a pilot can cross-check an indicator light against a secondary instrument, because you can’t always trust a single source of information. With bipolar, you can look at objective metrics like how much you are sleeping, or (as my neurologist suggested) you can measure your rate of speech using software. Both of those give objective, hard-to-dispute indicators of mania or the lack of it.
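The speech-rate check can be sketched concretely. Assuming you have a timestamped transcript (the segment format here—start seconds, end seconds, text—is my assumption, not something my neurologist specified), words per minute is a simple ratio, and comparing it against your own baseline is the objective check:

```python
# Estimate speaking rate (words per minute) from a timestamped transcript.
# Each segment is assumed to be a (start_seconds, end_seconds, text) tuple.

def words_per_minute(segments):
    """Compute overall words per minute across all transcript segments."""
    total_words = sum(len(text.split()) for _, _, text in segments)
    total_minutes = sum(end - start for start, end, _ in segments) / 60.0
    if total_minutes == 0:
        return 0.0
    return total_words / total_minutes
```

The number on its own means little; what matters is the comparison against your usual rate, which doesn’t depend on your broken sense organ.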
The presence of indicator light problems in the world is a reminder that what we think we know isn’t always right, even when it seems impossible that we could be wrong. Sometimes, what we’re looking at on the screen isn’t really what’s in the computer. Sometimes, when dealing with people, they haven’t told us the whole truth, and we’re sleuthing for answers given some false clues. To debug these types of problems, it is necessary to be humble about what we think we know, to become aware of and question our most basic assumptions. We’re limited beings, and the truth is we know very little. A lot of what we know is actually false, and usually we never even discover that it is.
There is no easy answer here except to question what you know. Indicator light problems are pervasive. And they are very, very hard to solve.