"That's fucking cool. It's just — do you think it's too idealistic? I mean, humans can be real assholes. Do you really think you can change human nature?"
"Yes," I replied. "I think it's too idealistic to assume the change will happen overnight. We're not living in La La Land with unicorns and rainbows and butterflies. We're fighting a lot of momentum in the system, and it's important to acknowledge that and work with it, not against it. But you know what they say — The best time to plant a tree was twenty years ago. The second best time is now. Our ancestors are going to look back in a hundred years and say, 'Wasn't it strange that Great Grandpa Zac once thought the world was material? He must have been pretty stupid.' It's similar to how kids these days say, 'What do you mean, you had to physically go to the library to do research? Why didn't you just use Google?' Our current paradigm will look positively neanderthalic and barbaric to our ancestors. This change is inevitable. It's a necessary step in our evolution."
"That doesn't answer the question about human nature, though. Do you think you can actually change human nature?"
I sighed. "I think people are generally doing the best they can with the resources they have. Imagine you were an AI bot placed within a poverty-stricken environment in a computer simulation. You take in all the data around you and form a mathematical model of your reality that says, 'I am poor and worthless, and resources are scarce.' Your consciousness believes this to be true, so that is what your external reality continues to manifest. There's violence and theft as people fight over resources. You believe that the rich are holding you down in life, and they should give their money to you. But when this does happen, somehow the money flows back into the hands of the rich. Wealth and power continue to be highly concentrated in society, which leads to more conflict. The wealthy people say, 'I don't understand what the problem is. Just work harder and earn more money and pull yourself up by your bootstraps.' But that is incredibly difficult to do when all the data coming into your senses tells you you're poor and worthless, with no hope for the future. The algorithm powering the game will continue to give you a physical experience of Who You Believe Yourself To Be. This is why power laws are such a strong emergent pattern in our reality. This algorithm creates a shitty loop that is incredibly hard to break out of without education and compassion from those who've escaped it, or never had to deal with it in the first place."
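The feedback loop in that thought experiment can be sketched in a few lines of Python. This is a toy illustration of the idea, with every name and number invented: a prior belief about scarcity filters how much opportunity an agent notices, so two agents in an identical environment experience very different realities, and each experience looks like confirmation of the prior that produced it.

```python
# Illustrative sketch only; the function name, the 0-to-1 belief scale,
# and all numbers are invented for this example.

def experienced_opportunity(raw_opportunity: float, scarcity_belief: float) -> float:
    """Opportunity actually noticed and acted on, given a prior belief.

    scarcity_belief: 0.0 means "resources are abundant",
                     1.0 means "resources are scarce and I am worthless".
    A stronger scarcity belief filters out more of the raw signal.
    """
    return raw_opportunity * (1.0 - scarcity_belief)

# Identical environment (raw opportunity 0.8 per step), different priors.
for belief in (0.1, 0.9):
    noticed = sum(experienced_opportunity(0.8, belief) for _ in range(10))
    print(f"scarcity_belief={belief}: opportunity experienced = {noticed:.1f}")

# scarcity_belief=0.1 notices 7.2 units; scarcity_belief=0.9 notices 0.8.
# Each agent's outcomes read as evidence for its own prior, which is the
# self-reinforcing loop described above.
```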
"So, do you think it's wrong to tax the rich?" Zac asked. "Do you think it's wrong to have a social safety net?"
"Not at all," I replied. "But there is a big difference between a wealthy person willingly parting with their money because it is a compassionate reflection of Who They Are, and their money being taken from them unwillingly. How can I experience myself as a compassionate person if I have no choice in the matter? This just leads to resentment — especially when the people taking the money say hateful things like 'eat the rich.' It's so disempowering and just perpetuates a divisive poverty mindset.
Materialism would have you believe that money is money, and someone's attitude towards money doesn't affect the success of a policy. This is why Materialism OS is sending society off the rails. It's not the physical action that is causing results — it's the distribution of surprise that the action generates in the system, and the Sponsoring Thought behind that surprise.
For example, materialism would have us believe that we can solve all of our world's problems by simply rearranging physical matter in our external reality. It would also have us believe that our job as problem-solvers is to use our rational minds to figure out how to rearrange that matter. It's a very archetypically masculine, linear way of problem-solving, but it has no leverage. Humans aren't that intelligent, and this strategy relies on limited human intelligence to solve highly complex multi-dimensional problems."
"As opposed to what, though?" Zac asked. "Policymakers should be solving problems using a rational, data-driven approach."
"Yes, I agree. They should be taking a data-driven approach. However, they're currently optimizing for the wrong variable. We can solve all of the world's problems by optimizing for one variable: free energy."
"What's that?"
"Think of it as a quantitative measure of surprise in the system."
"Why's it called 'free energy,' then?"
"I'll get to that later. Anyway, let's quantify free energy in a very simple model. I can draw a graph with the origin point representing homeostasis in the system. The X-axis measures the average prediction error of a person. Positive prediction errors have love as their Sponsoring Thought. Negative prediction errors have fear as their Sponsoring Thought."
"You can then add a Y-axis to represent the number of people holding that average prediction error. The shaded area is a rough measure of free energy in the system. For example, one policy might generate 12 units
of free energy. Another might generate -8 units
."
"Both of these policies will generate different emergent patterns in physical reality. The policy that produces 12 units
of free energy would generally be superior."
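A minimal sketch of that toy model, in Python. This is my own construction for illustration, not a formal free-energy calculation from the literature; the cohort sizes and error values are invented so the totals match the 12 and -8 above.

```python
# Toy model: signed prediction errors weighted by how many people hold
# them, summed into one "shaded area" number. All figures are invented.

from dataclasses import dataclass

@dataclass
class Cohort:
    avg_prediction_error: float  # > 0: love-sponsored, < 0: fear-sponsored
    people: int                  # Y-axis: how many hold that average error

def free_energy(cohorts: list[Cohort]) -> float:
    # The "shaded area" of the graph: signed error times head-count, summed.
    return sum(c.avg_prediction_error * c.people for c in cohorts)

policy_a = [Cohort(+0.4, 20), Cohort(+0.1, 40)]  # broad, mild optimism
policy_b = [Cohort(+0.8, 5), Cohort(-0.3, 40)]   # a few winners, many losers

print(free_energy(policy_a))  # 12.0 units
print(free_energy(policy_b))  # -8.0 units
```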
"Why would it generally be superior?" Zac asked. "Surely, it would absolutely be superior?"
"Not necessarily," I replied. "I'm using an over-simplified model based on averages. The individual distribution of free energy in the system matters, too. For example, I could rearrange physical matter in such a way that it generates negative (fear-based) free energy for most of the population, but gives a small percentage of the population hope and opportunities. Artificial intelligence is a great example of this. If a small group of people control and benefit from AI, then a small group of people will be highly optimistic about the future. Meanwhile, the majority of the population will be angry and devastated because they no longer have jobs. According to this algorithm, a small group of people will generate beautiful, abundant emergent patterns in their reality while everyone else generates scarcity and chaos.
A materialist policymaker might look at these patterns and assume they need to redistribute wealth from the rich to the poor. And I don't know — perhaps that is what needs to be done. But I'm just trying to point out that physically redistributing matter will not change the emergent patterns that manifest in our reality unless it also changes the distribution of free energy in the system. I've said this before, and I'll say it again: linear cause-and-effect is an illusion. We need to start solving these problems at their root cause, but we're still stuck in this false, superficial paradigm."
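The distribution point is easy to see in numbers. Continuing the sketch above, with invented figures: two policies can produce exactly the same total free energy while one of them leaves almost everyone on the fear side of the axis.

```python
# Reuses Cohort and free_energy from the sketch above; numbers invented.
even_spread  = [Cohort(+0.1, 100)]                     # everyone mildly hopeful
concentrated = [Cohort(+2.6, 10), Cohort(-0.16, 100)]  # 10 euphoric, 100 fearful

print(free_energy(even_spread))   # 10.0
print(free_energy(concentrated))  # 10.0 -- same area, very different reality
```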
"But if you don't tax people, they're not going to willingly give away their money. People are greedy."
"Firstly, I'm not advocating for no tax," I said. "I'm just saying that taxation only works to the extent that it generates positive free energy.
Secondly, you're wrong. I know many people who are highly compassionate and willingly give their money to help others. They have a very high state of consciousness, and they give out of love, not obligation. Furthermore, if I had lots of money, I'd feel much better about giving it to a cause other than the government — not because I don't want money going towards a social safety net, but because the government is notoriously inefficient at whatever it does. I mean, how much of my money actually makes its way to the people in need? Big institutions gobble up money while producing very little output. And then they demand more money and higher taxes to continue feeding their dysfunctional habits. They're not solving problems at their root cause — they're just layering on more bullshit and expecting society to pay for it. As you can tell, I'm not a fan of inefficient bullshit — whether it's materialism or big government.
Anyway, my point is not about the government or taxation. My point is this: human nature is absolutely malleable. If it weren't malleable, we'd still be living in huts and killing each other."
"We are still killing each other," Zac said. "The world wars weren't that long ago."
"But, in general, society has evolved alongside our knowledge. Materialism was an adequate operating system to get us to this point in our evolution. However, it's not going to cut it going forward — especially with the rapid advances we're making in artificial intelligence. Our understanding of technology is advancing so much faster than our understanding of humanity is. If we don't sync up the two growth curves, we'll inevitably end up in dystopia. Whoever builds artificial superintelligence will rule the world, and, like a hammer, they can use that tool to kill someone or build a bed for a homeless person. I'd rather it be the latter. Hence, we need to install a new, modern, lightweight OS in the collective mind of humanity, and then methodically rebuild our societal structures on top of that."