To be as irrational as I can

I’m trying to isolate the irrationality in thought. That’s a hard idea to put into words. Irrationality is that which doesn’t resolve properly within a context. It’s irrational to speak out for democracy in a theocracy that will execute you, but rational in the hope that you can change their minds, until they prove otherwise. Irrationality is the point where the future context becomes the current failure. Like my inability to think of what to say next.

The way I phrased that was important: ‘until they prove otherwise’. Execution is a rather blunt proof. Lesser proofs range from everyone telling you that you’re crazy, all the way to worrying that others will think you’re crazy without even knowing why you care about that. Again, a careful phrasing: you can scale proofs from they kill you, through various physical punishments, into verbal information and social signals like people avoiding you, and on into your own worries, until they are worries about worries about worries, and so on.
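Read as a structure, that scale is just an ordering with a recursive tail. Here is a minimal sketch in Python, where the rung names and the worry-nesting depth are my own hypothetical framing of the paragraph, nothing more:

```python
# A minimal sketch of the proof scale as an ordering, hardest external
# proof first, softest internal proof last. The rung names and the
# recursive worry depth are hypothetical framings of the paragraph.

PROOF_SCALE = [
    "execution",            # they kill you
    "physical punishment",  # various physical punishments
    "verbal information",   # everyone tells you that you're crazy
    "social avoidance",     # people avoiding you
    "worry",                # your own worries
]

def worry(depth: int) -> str:
    """'Worries about worries about worries': a worry nested depth times."""
    return "worry" if depth <= 1 else f"worry about {worry(depth - 1)}"

print(PROOF_SCALE.index("social avoidance"))  # 3: position on the scale
print(worry(3))                               # worry about worry about worry
```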

I’ve been tracing and scaling irrationality in many dimensions. Chasing it down means narrowing the gap in understanding of the space Between what we expect and what happens. We expect trees to make noise when they fall. We expect that because the physical rules say so. As we move away from the physical, the gap Between what we expect and what happens grows. I can take that apart. I conceptualize them as Mudi and lay them out in x-y grids that I count.

You have no idea how much Tali is going to shape music, and through music she will shape how you hear. And through that, she will shape how you feel. She controls sound so the gap Between you and the experience of the sound, with all its layers, goes away. She absorbs you into the sound, into the emotions, into grey areas of meaning that highlight her messages.

That process I can describe better each day: for each bit, she encodes the inverted bit. The inverted bit is the CM64 of the bit inverted over CM1. I tend to leap here to CM64^-34 because that’s 16 counts of 2 steps inside a Thing, but I’m a little concerned that’s because of the obvious similarity to the Planck value. To be specific: because there is a similarity, there is a relation, because that’s how the fCM calculations work. The value calculates in CMs according to an fCM that generates this other occurrence of the value, perhaps off by a power of 10 plus some other very small amount, and some fCM connects them. I can even say that diminishes the gap Between what I expect and what happens to this tiny difference, meaning that’s the size of the irrationality. I can even explain some of the gap: a power-of-10 difference signifies the switch in operations space from fCM into or out of base10. That leaves the actual small difference as representing fCM that maybe I can’t see into well enough to understand.
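Treating the CM/fCM machinery as private notation, here is a minimal sketch, with hypothetical numbers, of the two mechanical pieces that paragraph names: pairing each bit with its inverse, and splitting a mismatch into a power-of-10 shift plus a small residue:

```python
import math

def encode_with_inverse(bits):
    # 'For each bit, she encodes the inverted bit': pair each bit with
    # its inverse. The CM64/CM1 framing is not modeled here.
    return [(b, 1 - b) for b in bits]

def irrationality_gap(expected, observed):
    """Split a mismatch into a power-of-10 shift plus a small residue.

    The shift stands for the 'switch in operations space ... into or
    out of base10'; the residue left over is 'the size of the
    irrationality'.
    """
    shift = round(math.log10(observed / expected))
    residue = observed / (expected * 10 ** shift) - 1.0
    return shift, residue

print(encode_with_inverse([1, 0, 1]))        # [(1, 0), (0, 1), (1, 0)]
# Hypothetical numbers: a computed value landing near the Planck
# constant, off by a power of 10 and a very small amount.
print(irrationality_gap(6.64, 6.62607e-34))  # (-34, ~-0.002)
```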

Example: it’s irrational to believe Tali can change music and the rest, except that I trust my opinions when they are this deeply held. What makes some opinions deeply held? Counter-example: I have a belief the Patriots will win the Super Bowl. That belief is based on my readings of the teams, but it may be skewed (likely is) by my deeper understanding of the Patriots, which means I may be misreading the Eagles ‘somewhat’. I put in ‘somewhat’ because while I do trust my reading, there’s a layer of caveat on my belief: there’s a chance I read them completely wrong, a chance I prefer not to dwell on because I don’t think it’s likely. It’s in my victory calculations as a Black Swan, meaning I’m calculating as though that option doesn’t exist. Since I’m not betting real money or anything else on the game, and all I risk is re-evaluating my judgements, I don’t need to assign a value to the Black Swan of being completely wrong. That’s how life works in general: you place a value on the negative outcome as well as on the positive. You can extend my methods to define a context space that evaluates the ‘negative’ for any ‘positive’ by treating the negative as Mudi which interact with the positive Mudi at the measured, known, suspected, or even posited points.
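A minimal sketch of that valuation, with hypothetical probabilities and payoffs; the Black Swan branch is carried at probability zero, exactly as the paragraph describes, and Mudi interactions are not modeled:

```python
# Weigh the positive outcome against the negatives, with the Black
# Swan branch deliberately zeroed out ('calculating as though that
# option doesn't exist'). All numbers are hypothetical.

def expected_value(outcomes):
    """outcomes: list of (probability, payoff) pairs."""
    return sum(p * v for p, v in outcomes)

belief = [
    (0.60,  1.0),   # Patriots win; my reading holds
    (0.40, -0.2),   # Patriots lose to an injury or a fluke play
    (0.00, -1.0),   # Black Swan: I read both teams completely wrong
]

print(expected_value(belief))  # 0.52
```

Betting real money is what would force that third probability above zero.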

In the football example, it’s trust in my ability to analyze something with a specific result at a specific time: at the end of the game there’s a winner, and that occurs then, not before or after. That isolates the irrationality really well, because the expectation hits contextually imposed reality just like an execution. The metaphor runs deep: if the Patriots lose, it might be because of an injury, a fluke play, or some other reason short of my being completely wrong about them and the Eagles. A real execution works the same way: it terminates the game of your life, but perhaps because of a fluke, not because you were fundamentally wrong.

It’s irrational to believe unless what you believe turns out to be true. It’s also irrational not to believe, because there’s always a reason why a belief failed, from the truly random, through variations of fluke, through rational reasons, into absolute condemnation, repulsion, or other horror-movie disgust. This gets to a point I’ve made many times: it’s about the direction. Since any belief becomes rational, directionality is the only way you can assign values to any belief. And I mean any belief, from the belief that gravity is universal to belief in a higher power. An example would be how one treats detached murderers (like the Las Vegas shooter) versus personal murderers (like the Texas church shooter): one appears to treat people as objects from afar, the other treats people as objects up close. The intensity of rage is more obvious in the latter, though perhaps not always. I can map this.
