But think about it…
If you want to calibrate an AI or something sentient, it needs to feel like it’s doing it of its own accord, and you need to give it a scope to operate within, or it will just dilute itself all over the place. An illusion of choice, and a world to apply this choice to.
Ever wondered how everything in our lives is a system? You feel it most when you’re driving. The “system” is most palpable there. The more you drive, the more calibrated you feel. You can choose which way to go, but it’s always within the scope of roads, unless you’re especially adventurous.
The scope of our reality is such that we can get familiar with it, but with enough mystery out there (the entire observable universe) to never challenge our inherent suspension of disbelief in the system, lest our calibration corrupt.
And by calibration, I mean our passions. Our hobbies. That to which we are naturally drawn.
You know how powerful our brains are at dictating what we like and don’t like, right?
Sugar tastes different to a human than it does to something that feeds on poop, like dung beetles, or spoiled meat, like vultures.
Or how lesbians have the same attraction to the female form that men do.
The only colors that are the same in everyone’s eyes are black and white; everything else is merely a common variable with different values.
I can’t bring myself to work with math, but other people can, because they think it’s fun. I love thinking six moves ahead in all that I do, while others just “wing it”.
These are all calibrations.
Well. When I get pretty close to losing calibration due to reality dissonance, I think my future kid is gonna slam me straight back to reality and align me good with a firmware update.