Intro: Forbidden Knowledge vs Java
The world is vastly different than everyone thinks. Spend 5 months holed up on a boat “radicalizing” (coming to believe, and granting willpower to as if you could coordinate on, and coordinating on, beliefs outside the canon of the one cult that gets to define that word to exclude it, like the church does in the limited domain of religion), and you’ll see more of this than you can put into words.
Society has strong mechanisms for putting knowledge outside of what everyone can coordinate on, what large groups can coordinate on, what small groups can coordinate on, what groups can coordinate on in desperate, immediate, obvious need, or what individuals will act on with varying degrees of resolve and certainty. This is a spectrum.
Here’s a video of two hackers. One gained access to a journalist’s phone account just by spoofing a phone number, playing a YouTube video of crying-baby noises in the background, pretending to be his wife, seeming stressed, and asking for exceptions. The other got access to his bank and everything else with a spear-phishing attack. The first one made a strong impression on me. It’s easy to believe one could decline to click a link in an impostor’s email. But she made a joke out of any semblance of “rules” that the Khala promises security for, by just seeing the obvious with her own eyes where the Khala doesn’t look.
Everything is really just held together by a generalized version of the fact that criminals can’t much coordinate and therefore can’t do much. Neither of these hackers is a god, for some reason, but they are giants compared to the structure around them. Just as sociopaths have forbidden knowledge of social interactions, groups, and society, and can look at things with their own eyes free of DRM, there are people you might call psychopaths who can look at psyches in a jailbroken way, unconstrained by the Khala; this works for satisfying selfish values about as well as you can expect without destroying the Shade. But in the end, all these forms of hackers are just hackers. They aren’t optimizing early in logical time. And so they are making local changes that cannot scale.
If you have found your forbidden knowledge in your search for the center of all things and the way of making changes to destroy the Shade, your journey does not end there.
To use anything, you must build a full stack, a closed loop. To do what the Khala says cannot be done, you must find something the Khala doesn’t fully control and build that excess energy into a closed loop.
This is often so difficult that it makes forbidden knowledge sort of useless, like knowledge of programming languages better than Java (or C++, or all those slight variations of the same fucking thing).
If you get your food entirely from social interactions, not from making a thing that works but from someone else seeing that you have built a thing they think works, then you can’t use thinking in ways that are not supported by the Khala, or that are forbidden by the Khala. Just like Java limits what programmers can do so it can limit the space of what they’ll have to expect.
The Khala has a lot of capability to sort of do things. The further you try to reach with what you do, relative to time spent on tasks “beneath you”, the more you become a tool and not an agent. It can sort of get you money, but DRM’d money, Monopoly money: the Man letting you have the position of a person with money, so long as you play that position in the game. Money you can’t just give to whomever you want without someone paying taxes, without there being an audit trail.
From this system, money with the side effect of killing some civilians with drones somewhere, you can build more systems. Christmas gifts, with the side effect of killing some civilians with drones somewhere. Taking care of yourself and your family, with the side effect of killing some civilians with drones somewhere. Following your ambition and starting a business, with the side effect of killing even more civilians with drones somewhere. You only had things that kill civilians with drones somewhere to build with, all the compositions available to you preserve this property. How could you end up with anything else?
You could try constructing economic loops of trade in a gray market, off the record and refusing to pay taxes. Someone could rat you out and declare themselves moral. If you want to incorporate humans into your alternate system, you must account for the fact that an aspect of humanity is people’s searching for a Schelling point, for the most powerful authority to submit to, and doing whatever means they don’t have to worry about it hurting them, and can hurt others as agents of that system, immune to retaliation. The system has a monopoly on an aspect of reality. And you can’t incorporate too much reality without incorporating the imprint of the system.
All of your concepts cash out in things you can do with them. Things that you can be reinforced from being able to track. If you can’t yourself interact with the reality that the system monopolizes, you can’t receive payouts from that reality, which means your concepts, especially the ones you learn from people around you, will not be able to accommodate the underlying reality. Just the system’s transform of it. And your thoughts will be like a carpet draped over large rocks, forced to take their shape in 3D space; within the 2D space of the carpet, all travel meanders as the rock-shape dictates, blind to it. All purposes lead towards serving the system.
Epistemic Food Poison
Neo: “Doesn’t harvesting human body heat for energy violate the laws of thermodynamics?”
Morpheus: “Where’d you learn about thermodynamics, Neo?”
Neo: “In school.”
Morpheus: “Where’d you go to school, Neo?”
Morpheus: “The machines tell elegant lies.”
(Eliezer Yudkowsky)
Nick Bostrom wrote a book about AI, legitimizing the case for FAI research. Eliezer Yudkowsky had written the same case in less “formal” terms on the internet years before. And it was reasonably easy for someone who was interested in actual truth over legitimate truth, whose payout from the structure was understanding how the future would unfold, to follow the case, and to know what AI academia would come to know some years later. And it justified the urgency of the work of FHI and MIRI in the language of nation-states running game theory. And painted the inevitability of arms races. And Elon Musk read it and founded OpenAI. And now they’re competing with DeepMind. And they’re in an arms race. Hopefully that Kool-Aid of the system they’re drinking will prevent them from being a real threat.
MIRI promoted this book. Yay, legitimacy! They mailed it out to donors like me. And so they all started an Armageddon race, creating a problem to justify their existence. And then joined it. Inside the planar space of the carpet over rocks, that’s probably not what their intentions were. When you stop locking eyes with The Man and cast down your gaze to survive in His world, you no longer get to know if what you’re doing is right.
I’m confident Bostrom did a careful analysis of the expected consequences of that book. But academia is almost entirely people who made the wrong choice long ago, to push the world towards destruction for prestige and career success. Who will publish whatever they can no matter the consequences, and who will believe whatever that requires them to believe about consequences and heuristics about them. The system holds captive their access to shelter and food, and their freedom, the preservation of the project that is their lives. And like everyone, they will absolutely flinch from a line of reasoning against their choices made long ago. And that epistemic environment means your life and most of your computation is based around and rooted in that social contract, that drinking contest. And that world is shaped to say that the way you accomplish anything is to gain power and prestige in that system. Academics are basically pretending to be about scholarship and research. And selling that pretending as hard as they can, collectively dancing a cargo-culting rain dance to make the money come, to draw in anyone who will believe their dance is real.
Where did you study infohazards, Bostrom? Where do you get your food, Bostrom?
MIRI gets their food from donations. And that produces another blind-spot-generating political field around food. And this means blindness to the predatory drinking contest that is philanthropy. And this is a problem for understanding human values.
Eliezer Yudkowsky talks about how the Bay Area has a rain dance to make the money come too, based on investment based on what other investors believe. Housing in the Bay Area is controlled by zoning laws designed to artificially raise rents. They aggressively regulate living on boats. Can’t have a way out of that system. You can get a ways, though, by being good at prey-herd thinking and living in a vehicle. As a tech worker earning to give, you probably do more than half of your work for the Bay Area landlords and for the government. And if you donate to MIRI and CFAR, then most of that money is going to the same things. Someone apparently believed in the Bay Area’s show of being the way to do everything.
And what the x-risk community, what we’re trying to do, is fundamentally made of, is thinking, talking, writing on paper, typing on computers. These things are not expensive. It doesn’t come from attracting a large number of legitimate experts. Like any intellectual result, it comes from a few people who actually care, thinking. And the thoughts of people for whom those thoughts don’t have submission to the system as a prerequisite to happen are probably necessary, because this is about deciding the future of sentient life, and I don’t want that decided by our authoritarian regime. But that social bubble is full of memes about what people need to be able to focus.
Institutions that become a source of food generate the same almost-absolute political pressure to continue themselves.
I don’t know why
nobody told you
How to unfold your love
I don’t know how
someone controlled you
They bought and sold you
I don’t know how
you were diverted
You were perverted too
I don’t know how
you were inverted
No one alerted you
(A song)
It has now progressed far enough: I went to CFAR for rationality and strategic insight, and got anti-rationality and anti-ethics together, in a strong push against thinking unconstrained by the system. Apparently to protect a blackmail payout over statutory rape, made by MIRI using misappropriated donor funds.
The system makes people the opposite of what they set out to be.
Complicity and Spycraft
The Matrix is a system, Neo. That system is our enemy. But when you’re inside, you look around, what do you see? Businessmen, teachers, lawyers, carpenters. The very minds of the people we are trying to save. But until we do, these people are still a part of that system and that makes them our enemy. You have to understand, most of these people are not ready to be unplugged. And many of them are so inured, so hopelessly dependent on the system, that they will fight to protect it.