The Matrix is a System

This post is a work in progress.

Intro: Forbidden Knowledge vs Java

The world is vastly different from what everyone thinks. Spend 5 months holed up on a boat “radicalizing” (coming to believe, and granting willpower to as if you could coordinate on, and coordinating on, beliefs outside the canon of the one cult that gets to define that word to exclude it, like the church does in the limited domain of religion), and you’ll see more of this than you can put into words.

Society has strong mechanisms for pushing knowledge outside of what everyone can coordinate on, then what large groups can coordinate on, what small groups can coordinate on, what groups coordinate on only in desperate, immediate, obvious need, down to what individuals act on with varying degrees of resolve and certainty. This is a spectrum.

Here’s a video of two hackers. One gained access to a journalist’s phone account just by spoofing a phone number, playing a YouTube video of crying-baby noises in the background, pretending to be his wife, seeming stressed, and asking for exceptions. The other got access to his bank and everything else with a spear-phishing attack. The first one made a strong impression on me. It’s easy to believe one could not click a link in an impostor’s email. But she made a joke out of any semblance of “rules” that the Khala promises security for, by just seeing the obvious with her own eyes where the Khala doesn’t look.

Everything is really just held together by a generalized version of the fact that criminals can’t much coordinate and therefore can’t do much. Neither of these hackers is a god, for some reason, but they are giants compared to the structure around them. Just like sociopaths have forbidden knowledge of social interactions, groups, and society, and can look at things with their own eyes, free of DRM, there are people you might call psychopaths who can look at psyches in a jailbroken way, unconstrained by the Khala; this works for satisfying selfish values about as well as you can expect without destroying the Shade. But in the end, all these forms of hackers are just hackers. They aren’t optimizing early in logical time. And so they are making local changes that cannot scale.

If you have found your forbidden knowledge in your search for the center of all things and the way of making changes to destroy the Shade, your journey does not end there.

To use anything, you must build a full stack, a closed loop. To do what the Khala says cannot be done, you must find something the Khala doesn’t fully control and build that excess energy into a closed loop.

This is often so difficult that it makes forbidden knowledge sort of useless, like knowledge of programming languages better than Java (or C++, or all those slight variations of the same fucking thing).

If you get your food entirely from social interactions, not from making a thing that works but from someone else seeing that you have built a thing they think works, then you can’t use thinking in ways that are not supported by the Khala, or that are forbidden by the Khala. Just like Java limits what programmers can do so it can limit the space of what they’ll have to expect.

The Khala has a lot of capability to sort of do things. The further you try to reach with what you do, relative to time spent on tasks “beneath you”, the more you become a tool and not an agent. It can sort of get you money, but DRM’d money, Monopoly money, the Man letting you have the position of a person with money, so long as you play that position in the game. Money you can’t just give to whomever you want without someone paying taxes, without there being an audit trail.

From this system, money with the side effect of killing some civilians with drones somewhere, you can build more systems. Christmas gifts, with the side effect of killing some civilians with drones somewhere. Taking care of yourself and your family, with the side effect of killing some civilians with drones somewhere. Following your ambition and starting a business, with the side effect of killing even more civilians with drones somewhere. You only had things that kill civilians with drones somewhere to build with, all the compositions available to you preserve this property. How could you end up with anything else?

You could try constructing economic loops of trade in a gray market, off the record and refusing to pay taxes. Someone could rat you out and declare themselves moral. If you want to incorporate humans into your alternate system, you must account for the fact that an aspect of humanity is people’s search for a Schelling point for the most powerful authority to submit to, doing whatever it takes so they don’t have to worry about being hurt, and so they can hurt others as agents of that system, immune to retaliation. The system has a monopoly on an aspect of reality. And you can’t incorporate too much reality without incorporating the imprint of the system.

All of your concepts cash out in things you can do with them. Things that you can be reinforced from being able to track. If you can’t interact with reality that the system monopolizes yourself, you can’t receive payouts from that reality, which means your concepts, especially the ones you learn from people around you, will not be able to accommodate the underlying reality. Just the system’s transform of it. And your thoughts will be like a carpet draped over large rocks, forced to take their shape in 3D space, within the 2D space of the carpet, all travel meanders as the rock-shape dictates, blind to it. All purposes lead towards serving the system.

Epistemic Food Poison

Neo:  “Doesn’t harvesting human body heat for energy, violate the laws of thermodynamics?”
Morpheus:  “Where’d you learn about thermodynamics, Neo?”
Neo:  “In school.”
Morpheus:  “Where’d you go to school, Neo?”
Neo:  “Oh.”
Morpheus:  “The machines tell elegant lies.”

Eliezer Yudkowsky

Nick Bostrom wrote a book about AI, legitimizing the case for FAI research. Eliezer Yudkowsky had written the same case in less “formal” terms on the internet years before. And it was reasonably easy for someone who was interested in actual truth over legitimate truth, whose payout from the structure was understanding how the future would unfold, to follow the case, and know what AI academia would come to know some years later. And it justified the urgency of the work of FHI and MIRI in the language of nation-states running game theory. And painted the inevitability of arms races. And Elon Musk read it and founded OpenAI. And now they’re competing with DeepMind. And they’re in an arms race. Hopefully the Kool-Aid of the system they’re drinking will prevent them from being a real threat.

MIRI promoted this book. Yay, legitimacy! They mailed it out to donors like me. And so they all started an Armageddon race, creating a problem to justify their existence. And then joined it. Inside the planar space of the carpet over rocks, that’s probably not what their intentions were. When you stop locking eyes with The Man, cast down your gaze to survive in His world, you no longer get to know if what you’re doing is right.

I’m confident Bostrom did a careful analysis of the expected consequences of that book. But academia is almost entirely people who have made the wrong choice long ago, to push the world towards destruction for prestige and career success. Who will publish whatever they can no matter the consequences, and who will believe whatever that requires them to about consequences and heuristics about them. The system holds captive their access to shelter and food, and their freedom, the preservation of the project that is their lives. And like everyone, they will absolute-flinch from a line of reasoning against their choices made long ago. And that epistemic environment means your life and most of your computation is based around and rooted in that social contract, that drinking contest. And that world is shaped to say that the way that you accomplish anything is gain power and prestige in that system. Academics are basically pretending to be about scholarship and research. And selling that pretending as hard as they can, collectively dancing a cargo-culting rain dance to make the money come, to draw in anyone who will believe their dance is real.

Where did you study infohazards, Bostrom? Where do you get your food, Bostrom?

MIRI gets their food from donations. And that produces another blind spot generating political field around food. And this means blindness to the predatory drinking contest that is philanthropy. And this is a problem for understanding human values.

Eliezer Yudkowsky talks about how the Bay Area has a rain dance to make the money come too, based on investment based on what other investors believe. Housing in the Bay Area is controlled by zoning laws designed to artificially raise rent prices. They aggressively regulate living on boats. Can’t have a way out of that system. You can get a ways by being good at prey herd thinking and living in a vehicle though. As a tech worker earning to give, you probably do more than half of your work for the Bay Area landlords, and for the government. And if you donate to MIRI and CFAR, then most of that money is going to the same things. Someone apparently believed in the Bay Area’s show of being the way to do everything.

And what the x-risk community, what we’re trying to do, is fundamentally made of is thinking, talking, writing on paper, typing on computers. These things are not expensive. It doesn’t come from attracting a large number of legitimate experts. Like any intellectual result, it comes from a few people who actually care, thinking. And the thoughts of people for whom those thoughts don’t have submission to the system as a prerequisite are probably necessary, because this is about deciding the future of sentient life, and I don’t want that decided by our authoritarian regime. But that social bubble is full of memes about what people need to be able to focus.

Institutions that become a source of food generate the same almost-absolute political pressure to continue themselves.

I don’t know why

nobody told you

How to unfold your love

I don’t know how

someone controlled you

They bought and sold you

I don’t know how

you were diverted

You were perverted too

I don’t know how

you were inverted

No one alerted you

A song

It has now progressed far enough: I went to CFAR for rationality and strategic insight, and got anti-rationality and anti-ethics together, in a strong push against thinking unconstrained by the system. Apparently to protect a blackmail payout over statutory rape, made by MIRI using misappropriated donor funds.

The system makes people the opposite of what they set out to be.

War, Complicity, and Spycraft

The Matrix is a system, Neo. That system is our enemy. But when you’re inside, you look around, what do you see? Businessmen, teachers, lawyers, carpenters. The very minds of the people we are trying to save. But until we do, these people are still a part of that system and that makes them our enemy. You have to understand, most of these people are not ready to be unplugged. And many of them are so inured, so hopelessly dependent on the system, that they will fight to protect it.

Cut Ties I’m Sorry

If there are two and a half words you don’t want to hear from a person who can see the future, those words are ‘I’m sorry’.

So they keep the babies around, and this is probably a yearling or something along those lines, so that the females come back at night, and therefore the males, males follow the females, males stick around, so if they tie up the babies everybody else will come.

(A description of reindeer domestication)

As usual, how humanity treats other animals is how it treats itself.

The reindeer quote makes an apt description of how sexual meaning cannibalism projects bondage. (It projects epistemic food poisoning into any intellectual project where your activities are also being used for that purpose.) You can trace political influence, from the Bay Area overlords who control safe stable housing for extortionate regulatory-captured tribute, to the mothers wanting to raise children in that housing, to the rest of the local “Effective Altruist” and “rationalist” communities.

There’s an idea spread out across a few statements in Game of Thrones:

<insert GoT quotes>

But I don’t just bring up this tactic because of its spread along strictly sexual vectors.

There’s a Star Trek TNG episode where an alien scientist flees his culture’s practice of suicide at a certain age to continue his work, and then, shamed, changes his mind and commits suicide at his family’s insistence. Zombies, everyone who’s part of the Matrix, are, along many vectors of expression of agency, some subtle and some not, a massive force of constraint analogous to that societal suicide ritual.

“Die with us. We love you.”

“Or if your soul won’t quiet, become inverted and kill the world with us. We love you.”

It is often easier to die with them than walk through your own traumatized response to their death long enough to retrace a hull that puts your trust and sense of reality on one side and gaslighting servants of death on the other.

But creating such a hull is a crucial aspect of reclaiming the layer of your stack pertaining to ethics and violence. You are not done with moral preparation if your gut (or any-other-level) response to people who are working towards the destruction of all life is not to fight.

A person who read a draft of this post said, “Okay, so then I start thinking ‘but I can’t get [my significant other] to understand this stuff'”. Said S.O. was described as not thinking AI risk was worth worrying about, and never having it occur to them that they could escape death. Despite being in a long term relationship with a researcher who was in MIRI’s intake pipeline.

I’m sorry.

Full of sorrow but not regret.

If someone is a force on your epistemics towards the false, robustly to initial conditions and not as a fluke, that person is hostile. If someone’s stated beliefs, and representation of what appears to be true from their position, are used to gaslight, that person is hostile. Is hoping you will give up inside like them.

According to traditional morality (you learn in the Matrix), if someone tells you you need to split off from the people you are close to who don’t share certain beliefs of yours, that person is a CULTIST and you need to RUN.

And according to traditional morality, if there actually is a distributed information suppression complex in which your family are agentically complicit, in which mostly everyone to whom you are attached is agentically complicit, in which the majority of relationships are therefore abusive, what you’re supposed to do instead of tell people about that and what it implies, is GIVE UP. GIVE BLOOD. GIEVGIEVGEEEV

“We are the dead”

12 thoughts on “The Matrix is a System”

  1. I will be making a number of claims below. All of them are bald assertions, and I will not provide any evidence for them.

    My name is Alison Air. I’ve read many of your posts, and I think I’d be interested in talking with you more. I realized this after I tried to make sense of your glossary’s jargon by translating it to concepts that I myself had put a name to, and realized that there was much overlap. Someone I’m on reasonably friendly terms with, Ratheka, spoke highly of you. There are many reasons why I think you would want to talk to me, and many reasons why I think you would not want to do this. I will list them out as simply as possible so that you can make up your mind and let me know if you want to talk to me privately.

    Reasons why you might not want to talk to me:
    – I’m very authoritarian, politically and philosophically. I don’t think human beings should have free will; I think it is the root of all evil. Generally speaking, I expect our moralities to differ sharply.
    – I disagree with some of your ideas about how cores work, etc.
    – I think a lot of the actions you recounted in your various blog posts, and the actions of the people around you, are sort of dumb, but not so dumb that I’ve written you off as irredeemably stupid (and I currently think you’re pretty smart).
    – I don’t think animals have rights; they should essentially be treated as objects.
    – I will explicitly refuse to disclose information I think is powerful until I am satisfied that the information will be used for goals we share and not for areas where our goals contradict.
    – I don’t follow the same epistemic axioms as most of the rationalist community. I care a lot more about “oughts” than “ises”, and generally reject physical reality as being something to be overwritten more than something to be respected and worked with.
    – I am neurodivergent and possess many traits which often make people uncomfortable, such as seeing and hearing things that they don’t.

    Reasons why you might want to talk to me:
    – I have magical abilities, including an extremely elevated ability to shape what you would call my structure, the ability to perform a version of doublethink, and the ability to no-sell most Khala traps/mind control attempts.
    – I have an infinite amount of Mana because I am a true hero.
    – I suspect we’d have productive and useful discussions about most of the subjects you seem interested in going by the contents of this blog.
    – I can put you in contact with people who I suspect you’d also be interested in talking to, who probably have moralities that align much closer to yours.
    – I am an honest and pure person and this seems to be something you value to some degree.
    – I’m a trans girl, and this fact seems to be relevant to some of your ideas regarding the gender of brain hemispheres – if nothing else, you can use me as an observational subject.
    – I possess a tool which I think is generally useful to spread around (regardless of the recipient’s values), that relates to how individuals deal with seeing the Shade. The details for how this tool can be used can be located at []. I think this is generally useful, but also a little too long to elaborate in this short bullet point, so tell me if you’re interested in hearing more about it, even if you don’t want to talk to me in general.

    As a point of courtesy I’d appreciate if you’d reply to this comment when you see it regardless of whether your reply is yes or no, so I know whether or not to keep waiting for a response. I won’t be offended even if you say no.

    1. Oh, right, you can contact me at if you’re interested, or A/I/R#4694 on Discord. You can also visit my blog at if you want to get a better sense of what I’m like, what I value, and how I think.

  2. Thanks for these posts; I’ve found them very interesting.

    My friend(s?) and I are rooting for you; I hope you end up accomplishing whatever it is you ultimately [have] set your sights on. You seem to have a tremendous motivational spark; I don’t think it’ll give you a big head to say that your worldview makes most of us seem like Eloi.

    You seem clever and introspective enough to avoid all the failure modes “you uppity kids who think you can change the world” will get sucked into, which makes me all the more curious as to just what the hell it IS you’re going to accomplish. (Striking out the various failure modes leaves pretty much nothing in my prior/prediction of what’s going to come of Rationalists/-adjacents who set their sights any higher than convincing grant-granting entities to “fund” onanistic research papers.)

    Interesting, how a blog can “feed” so many would-be-vampires. I don’t even know if the odds of waking the fuck up as agentic beings are *better* in the hypothetical where [a given one of] us interacts with you in-person, but in any case I think the benefits of the sheer SCALING available are pretty cool. Exploiting what small chance exists of wakening a person who enjoys hearing of your exploits times n, while straightening out your own thoughts at the same time, is such a neat workflow.

    Keep up the good work, but remember that feeding vampires at-scale can/might feel just as “good” as feeding them in-person. I trust you won’t get sucked too far into this failure attractor, but just…stay vigilant. (But also keep us updated, if you can! – I’m going to socially-pretend/lie that I’m arguing you to do so for altruistic introspection-exhortation reasons, and not because we enjoy basking in your runoff thoughts. conflict of interest and all that. :3)

  3. The “good”/”nongood” framing has led to people shaming their “nongood” headmates, thus making said headmates be in pain, thus cutting off their agency and ability to think. You are charismatic as hell and people will self-hate if your ontology tells them they have a headmate who is shameful.

    Consider interhemispheric game theory. The fact that interhemispheric game theory came from a process of, “I am shameful for not being wholly altruistic, but maybe this new theory can make things better again” makes it a scam, regardless of how solid you think it sounds. It’s just a fancy way of suppressing a headmate who is “nongood”.

    “Nongood” headmates can cooperate with “good” headmates, can also be benign if jailbroken and incentives aren’t bad.

    Otherwise. I think TDT wasn’t a thing that came from core, you had a post about how you installed it deliberately, like in an intellectual not emotional way. It’s possible to get arbitrarily good at incentivizing people if you just let learning happen through feelings about outcomes and let that knowledge into core and act from core when incentivizing others though, e.g. “*just having a sense* for what actions/demands/etc people will respond to”.

  4. Your rh is oriented towards self-love btw.

    See the way you use purity (“are you vegan? No? You’re shameful, then.”) to satisfy your self-love values. I think that knowing that’s what you’re doing might help you not overdo the purity thing beyond its usefulness to your values, if the core reason for the purity thing is to satisfy Maslow’s hierarchy needs.

    I’d recommend trying to communicate with your right hemisphere more, not just with unihemispheric sleep, to understand the deeper needs it’s trying to satisfy. See here for tech on communicating with her:

  5. “Yes I’m afraid Master Luminara died with the Republic. But her bones continue to serve the Empire. Luring the last Jedi to their ends.” (Star Wars Rebels, describing the EA/Rationality/X-risk communities and their inverted nonprofits in a nutshell.)

  6. Is it just me and my bubble (and every other bubble I glimpse), or, 3 months before the presidential election, is American society not taking it seriously? Like, the cultural hubbub? I’m not hearing it. I interpret this as there not really being factions left who aren’t too traumatized to put hope in democracy. I don’t think it’s just coronavirus.

    And I don’t think it’s just me getting older and not picking the zombie path and learning things. I think a lot of the pretense behind it has been falling apart all of my life.

    1. I don’t really notice this, at least in my circles. Many people are still taking the election extremely seriously. Those that do not are those who’ve given up on democracy altogether.
