Truth doesn't make you stronger.
The truth doesn't necessarily make you stronger. We like to think it does. We look down on people who don't believe what we hold to be true. We think they're idiots because we assume that correct, accurate information is what you need to make the best decisions. The real world doesn't bear that out. Not only doesn't it bear that out, it even suggests the opposite.
If you look around, you might come to a scary conclusion. True and accurate information is more likely to get you killed.
Human perception does some wild things. If I'm crossing the road and see a car coming at me going 35 miles per hour, I'm going to overestimate its speed.
It's tempting to look at this and assume evolution messed up. It can seem shameful how bad we are at perceiving reality. But let's be charitable to evolution and ourselves.
The real world is uncertain. It's also imperfect and ever-changing. When a car is coming at you, any number of things could affect how well you can perceive its real speed. And it's not just about the speed of the car. There's a whole environment around you and the car. Rather than focus on one truth or try to estimate the endless details of reality around you, your brain replaces it with a simple lie.
The simple lie is a version of reality with a wider margin of safety: you think whatever is coming at you is faster than it actually is. So you get out of the way more quickly, and you're less likely to die.
The entire genre of optical illusions shows how disconnected from the truth our perceptual systems need to be in order to make sure we survive.
Truth is real, but it's useless.
The car coming at you is moving at a real speed relative to you. It has a true speed. To get the true speed, all the parameters at play (the variation in the speed of the car, the way the light glints off its paint, your reaction time in that instant, whether there is a banana peel next to your foot, the list is almost endless) need to be estimated, predicted, and accounted for. Every time you estimate or predict, you introduce the risk of getting it wrong. If you assume there's no banana peel next to your foot, and there is, you're going to slip and fall. Next thing you know, you've been run over by a car.
So sure, if you knew the true speed of the car, and all the variables at play, including banana peels next to you, you could be really efficient and calculate the exact time to get out of the way of the car, while maximizing your time in the middle of the road...if that's something you're interested in.
But that's an expensive edge for not much gain. Think about how much information you need to get that right. Or you can forget the truth and follow a small lie: the car is coming at you faster than you think. You get out of the road as quickly as possible, and even if you slip and fall on a banana peel, you have a couple of extra seconds to roll out of the way.
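The trade-off above can be sketched as a toy simulation. Everything in it is an invented illustration, not data: the perception noise, the 30% inflation of the perceived speed, and the cost numbers are all assumptions chosen to make the asymmetry visible. The point is only that when underestimating is catastrophic and overestimating is cheap, a biased "lie" has a lower average cost than the honest estimate.

```python
import random

random.seed(42)

TRUE_SPEED_MPH = 35    # the car's actual speed
NOISE_MPH = 8          # assumed perception noise (standard deviation)
SAFETY_BIAS = 1.3      # the "simple lie": inflate the estimate by 30%
COST_HIT = 1000        # cost of lingering too long and getting hit
COST_WAIT = 1          # cost per mph of overcaution (wasted seconds)

def crossing_cost(estimate_mph: float) -> float:
    """Asymmetric cost: underestimating the car's speed is catastrophic,
    overestimating only wastes a little time at the curb."""
    if estimate_mph < TRUE_SPEED_MPH:
        return COST_HIT
    return COST_WAIT * (estimate_mph - TRUE_SPEED_MPH)

def average_cost(bias: float, trials: int = 100_000) -> float:
    """Average cost of crossing when noisy perception is scaled by `bias`."""
    total = 0.0
    for _ in range(trials):
        perceived = random.gauss(TRUE_SPEED_MPH, NOISE_MPH) * bias
        total += crossing_cost(perceived)
    return total / trials

print(f"honest estimate, average cost:   {average_cost(1.0):.1f}")
print(f"inflated estimate, average cost: {average_cost(SAFETY_BIAS):.1f}")
```

With these made-up numbers, the honest estimate lands below the true speed about half the time, so its average cost is dominated by the occasional catastrophe; the inflated estimate pays a small, steady tax in wasted seconds and comes out far ahead.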
Truth is hard.
It's really hard to get to the truth, and even once you get to it, it's really hard to know that it really is the truth. No one who's happened on a truth has reported back that it shouts, "Hey, I'm the truth! You've found me!"
You've got to come at it from a place of humility: that what you know may not be true, that you may be ignoring the actual truth, or that yes, you did find a truth, but then it changed right under your nose. Not all truth need be eternal. You've got to account for the fact that you could be completely wrong. You don't have all the information, and the missing piece of information could subvert everything else that you know.
And that's what your perceptual system does: it doesn't know if what it sees is accurate. So it replaces what it really sees with a safer lie: get out of the way ASAP.
But what if this whole essay is not true?
Ignore this bit if you don't like your head hurting.
What if the notion that the truth doesn't make you stronger is not true? In that case, you have to believe that whatever you find to be the truth will make you stronger.
But there is a catch. If we're bad at perceiving reality, then what you believe to be true is likely to be false. So if it makes you stronger, then we're back at square one. It's the not-quite truth (or some would say biased truth) that makes you stronger.
If it's also untrue that we're bad at perceiving actual reality, then we have another problem. Why do so many people see different truths? How can two people who follow the same truth end up in two different end states?