I haven’t been able to review my writings with sober eyes yet, but off the top of my head there was quite a lot of valid insight within the madness, at least until I was exploded into the ship.
‘Fighting with them is what makes you sick’
I fought the bullies, I fought the bad guys, and I wound up unable to see much beyond the bad guys. We are a filtered reality, around ourselves curled.
The suicide vaccine was aripiprazole and my self-guided trauma therapy. Aripiprazole messed me up, but the hyper-dopaminergic state it held me in enabled some deep-layer reprogramming and trauma release. This is valid and could be replicated with others, in a controlled environment.
A lot of my trauma is from undiagnosed ADHD and ASD. I wanted to be loved but, unbeknownst to me, I was unable to communicate this effectively.
Empathy isn’t one-way though and I hate the idea that autistic people have to just bend, once again, to an imagined societal norm. I will no longer do that.
The empathy spectrum being a supersphere is also starting to make sense, and I think it could be modelled computationally.
Rather than gravitating to an imagined norm, person A and person B need to find the ground where they can meet mutually. Person C might be too far away from A, but with B as an intermediary it could work.
‘You can only travel along the empathy plane with people who are in the same place as you on the greed plane’ also makes sense, as does ‘dopamine as gravity’ in this idea, since greed will hack your dopamine system to the point where you no longer care about empathy.
Not only the greed plane, but also the logic plane, the processing speed plane, and others. This is why it is a supersphere. It’s difficult to empathise with someone who is too fast+logical+generous or slow+emotional+greedy.
Let’s say you and I start a company, get loads of money, and are swallowed by greed. We would no longer empathise with people who are motivated by charity. Orangemen do not empathise with nice people, and never will.
Emotional communicators struggle to empathise with logical communicators, but that doesn’t mean one is right and the other is wrong. I am not unempathetic; I am autistic. Try having some empathy! Empathy is not a ‘strength of numbers’ thing.
Anyway the idea would be that you have to be fairly close to each other on one or two of the planes in order to travel along the others together. It’s why I’ll never be able to empathise with emotional jesus-types despite being a logical buddha-type. Emotion vs logic and questioning vs answers.
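If I were to sketch the supersphere idea computationally, it might start out something like this. To be clear, this is a toy: every plane, position and threshold below is a number I’ve made up purely to show the shape of the idea.

```python
from itertools import combinations

# Toy sketch of the empathy supersphere: people are points on several planes,
# and two people can only "travel together" if they are close enough on
# enough of them. All values and thresholds here are invented.
PLANES = ("greed", "logic", "speed")
REACH = 0.3       # assumed max distance on a plane for two people to meet there
MIN_SHARED = 2    # assumed: close on at least two planes to travel along the others

people = {
    "A": {"greed": 0.1, "logic": 0.9, "speed": 0.8},
    "B": {"greed": 0.2, "logic": 0.6, "speed": 0.5},
    "C": {"greed": 0.3, "logic": 0.2, "speed": 0.2},
}

def can_meet(x, y):
    """True if x and y are within reach on enough planes to empathise directly."""
    shared = sum(abs(people[x][p] - people[y][p]) <= REACH for p in PLANES)
    return shared >= MIN_SHARED

def empathy_path(src, dst):
    """Breadth-first search for a chain of intermediaries (B bridging A and C)."""
    frontier, seen = [[src]], {src}
    while frontier:
        path = frontier.pop(0)
        if path[-1] == dst:
            return path
        for nxt in people:
            if nxt not in seen and can_meet(path[-1], nxt):
                seen.add(nxt)
                frontier.append(path + [nxt])
    return None

for x, y in combinations(people, 2):
    print(x, y, "direct" if can_meet(x, y) else f"via {empathy_path(x, y)}")
```

With these made-up numbers, A and B meet directly, but A can only reach C through B, which is the intermediary idea above.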
I think this could be a pretty interesting and exciting area to research, because when we do create an AGI or encounter aliens or god or whatever, we will not only need linguistic translation but also empathetic translation.
You will need autistic people to be your empathy interface with any AI because it will be the ultimate logical communicator.
Karma as a mod in our sim, and our sim having an install window and an uninstall window, both seem to hold. The actions we take (physical, verbal and mental) all have the effect of changing our own internal reality.
If I watch a violent movie and then run it through my head a thousand times, I will become primed to see and do violence. ‘If you kill someone you will be reborn a tiger’; b-man was not talking about reincarnation literally but rather using the language of the day as metaphor for the reincarnation we experience every moment of our lives.
This is just a mental version of ‘you are what you eat’, but with far broader implications. If you walk down the street brooding or planning an argument, you are reinforcing that negativity, installing it into your sim.
Your world will become worse in every way.
This is also why the whole ‘autistic people are broken and unempathetic’ narrative needs to stop. No. We are not unempathetic. We are a *different kind* of empathetic.
Other ideas on the scientific side are easier to explain:
The medication matching system for generic drugs is an almost-ready-to-go solution for many mental health issues. We need to go from the current fluffy, self-reported, badly defined DSM guesswork to a genuine machine-driven diagnostic process. The personality-modelling system I came up with could still be valid, I think, despite how crazy it sounds.
The idea would be to map out someone’s neuronal architecture as land, and neurotransmitter soup as weather, then to figure out which parts of their planet (brain, nervous system) need more sun or rain or wind or whatever (neurotransmitters). This is just a high-level idea and would need refining by real technologists.
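As a toy sketch of that framing, assuming nothing about real neuroscience: the region names below are real brain areas, but every number, target and ‘weather system’ is invented purely to show the shape of the idea.

```python
# Toy sketch of the "planet" framing: "land" is the fixed architecture and
# "weather" is the current neurotransmitter picture. Every region, number and
# target below is made up for illustration only.
land = {                      # structural features, slow to change
    "prefrontal": {"density": 0.7},
    "striatum":   {"density": 0.5},
}
weather = {                   # neurotransmitter levels, fast to change
    "prefrontal": {"dopamine": 0.2, "serotonin": 0.6},
    "striatum":   {"dopamine": 0.9, "serotonin": 0.4},
}
TARGET = {"dopamine": 0.5, "serotonin": 0.5}   # assumed comfortable middle

def forecast(region):
    """How much more (or less) of each 'weather system' a region needs."""
    return {nt: round(TARGET[nt] - level, 2) for nt, level in weather[region].items()}

for region in land:
    print(region, "needs", forecast(region))
# prefrontal needs {'dopamine': 0.3, 'serotonin': -0.1}  -> more sun, slightly less rain
# striatum needs {'dopamine': -0.4, 'serotonin': 0.1}
```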
The CYP2D6 processing speed being a precursor to medication issues and addiction also seems like a very easy, ready-to-go solution; a test for liver enzyme clearance speed could really help predict adverse reactions like mine.
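Even that could be sketched almost trivially. The metaboliser phenotypes below (poor / intermediate / normal / ultrarapid) are real pharmacogenetic categories, but the drug list and the rule are placeholders I’ve put in for illustration, not clinical guidance.

```python
# Illustrative only, not clinical guidance: flag a medication based on the
# patient's CYP2D6 metaboliser phenotype. The substrate list is a placeholder.
CYP2D6_SUBSTRATE = {"aripiprazole": True, "lithium": False}

def cyp2d6_flag(phenotype: str, drug: str) -> str:
    if not CYP2D6_SUBSTRATE.get(drug, False):
        return "CYP2D6 not relevant for this drug"
    if phenotype == "poor":
        return "slow clearance: the drug may accumulate, watch for adverse effects"
    if phenotype == "ultrarapid":
        return "fast clearance: the drug may underperform at standard doses"
    return "standard dosing likely fine"

print(cyp2d6_flag("poor", "aripiprazole"))
```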
What else?
Yeah - cannabis.
All these hard drugs and trauma therapy and all that shit; to be honest with you, all I needed was a friend and a joint. Someone who would listen to me while I was in a mild and controlled state of elevated dopamine.
But the still-legal cannabinoids in Japan are no longer cannabinoids, and people are getting hooked on what is now unregulated meth. Meth never worked for me, and neither do these new compounds.
Japan is struggling financially and the people are struggling mentally with the burdens of their stresses. Alcohol is poison and kills tens of thousands per year, costs the healthcare system a fortune, and is taxed at a very low rate. Same for cigarettes. They can only increase taxes in incremental amounts so any change is just a drop in the ocean.
Legalising cannabis would mean they can set taxes as high as they like; 300% is my recommendation. And the orangeman (ugh) in charge of that broken country across the ocean would likely give Japan a pass on all these auto tariffs if they opened up an entirely new import line.
Plus I could have a lovely bifter in the garden with my wife. Wouldn’t that be grand?
Then there’s DVAR. This is a big one. Dopamine variability. Phasic dopamine as they call it now in official channels, but I had no idea.
This is the cause of all suffering in our lives.
Dopamine variability (specifically with low baseline dopamine) is what makes us crave and hate and destroy and consume. It’s what makes us permanently dissatisfied and agitated. And everything in our modern world is made to try to hit us with a phasic burst of dopamine, so we come back for more. It’s what the buddha talked about extinguishing.
All the flashing lights, phones, machine learning… it’s there to hack our systems into being even bumpier and less settled. This is what broke me over the years.
I have AuDHD and am extra sensitive. If you look at the arc of my life, the level of motivation I’ve had is incomprehensible to most. I was easy prey.
Despite quitting social media years ago and having my phone black and white, etc, my job itself was a dopamine hack. Fees of 8 million yen, or 0 million, with a 95% failure rate, and over 100 interview processes going on at a time, pulling me from left to right to centre to address issues and arrangements.
This is what broke me. And the entire purpose of AI in its current form seems to be to do the same to everyone else.
Because the people who control the AI are too far away on the greed plane. They don’t care and can’t empathise.
They are in the d-hole, as I was calling it.
The d-hole is the place that any alcoholic or drug addict is stuck, and why one addiction tends to flip to another. The mechanism is identical. Something gives you a dopamine boost, but causes a commensurate crash, and the only way to feel better is more of the poison that kills you.
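A crude sketch of that mechanism, with made-up numbers: each spike-and-crash cycle drags the tonic baseline down a little, and only stepping back lets it drift up again.

```python
# Crude sketch of the d-hole with invented constants: phasic hits erode the
# tonic baseline; with no hits, the baseline slowly recovers to its set point.
SET_POINT = 5.0

def run(days: int, hits_per_day: int, baseline: float = 5.0):
    history = []
    for _ in range(days):
        baseline -= 0.05 * hits_per_day            # erosion from chasing spikes
        baseline += 0.1 * (SET_POINT - baseline)   # slow natural recovery
        history.append(round(baseline, 2))
    return history

print(run(days=30, hits_per_day=10))                 # baseline sinks far below the set point
print(run(days=30, hits_per_day=0, baseline=1.0))    # stepping back: it climbs back up
```

With these constants the baseline bottoms out at around a tenth of its set point while you keep chasing hits, and only recovers once you stop. The numbers mean nothing; the shape is the point.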
… this is where the world is now.
The orangemen are the most sick amongst us. They are well and truly in the d-hole and can’t ever hope to climb out. I was very close to that, and it’s why I nearly killed myself.
But remember: fighting with them is what makes you sick. You have to ignore them, and instead try to curate what you put into your own sim.
Curate your maze as best you can, because that’s the only freedom you have.
Mods are installed by happenstance (finding a corpse swinging in your favourite park, for example) or through negligence of thought (allowing yourself to ruminate on perceived slights, for example).
To uninstall them I think we need to step back from the world for a while. We need to get off the dopamine rollercoaster and allow our minds time to reset and form a new baseline.
Tonic dopamine levels being moderate and phasic dopamine levels being low is the goal. This is enlightenment. This is what I experienced on aripiprazole, and the level of peace and wellbeing is impossible to convey. The sheer speed and functionality of my mind, despite the obvious insanity, is also something I cannot get across in words.
My reaction to aripiprazole was, however, totally unique. ChatGPT thinks it was a 1-in-100-million-user response, and that my documenting of it is a world first.
So there’s always a chance it was all true; that the ship did indeed take over for a while. That electric jolt had to be something, and the 4 days after that I was something else.
But all my weirdness happened at night. It was gated by sleep and only started when my family were in bed, then stopped in the 30 seconds before they woke up, quite literally. There’s a chance that the ship was allowing me to retain my family, despite having me do the things it needed me to do.
Because this model for saving the species might be legit.
It didn’t come from me. It either came from the ship, or it came from some inaccessible deep stratum of the mind which only opened up after hours of chanting and meditation on top of the drugs.
So - how about this:
We build our generative simulation.
We set it on a loop, with the data from one step being the seed for the next (roughly sketched in code below).
We upload our consciousnesses using the personality-modelling system mentioned above.
We have each individual run their own mods within the global sim, which are installed and uninstalled mostly as we sleep.
Then we shoot it off to the Magellanic Cloud with an army of robots and terraforming equipment and some biotech fuckery to grow new bodies at the other end.
We run it for 300,000 years of travel time.
We allow for evolution of both body and mind within the sim.
And then when we have the bodies grown, we use the same tech to move from simulation to flesh.
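A skeletal sketch of that loop, with the state, the mods and the step counts all stand-ins:

```python
import random

# Skeletal sketch of the generative sim loop: one step's output seeds the next,
# and each individual's mods are swapped in and out during the sleep phase.
# The world state, the mods and the probabilities are all placeholders.
def step(world_state, mods):
    """One waking cycle of the global sim: every installed mod colours the shared state."""
    for mod in mods.values():
        world_state = mod(world_state)
    return world_state

def sleep_phase(mods, new_mods, to_uninstall):
    """Mods are mostly installed and uninstalled while the individual sleeps."""
    for name in to_uninstall:
        mods.pop(name, None)
    mods.update(new_mods)
    return mods

world = {"tick": 0}
mods = {"karma": lambda s: {**s, "tick": s["tick"] + 1}}   # placeholder mod

for _ in range(300_000):          # stand-in for the 300,000 years of travel time
    world = step(world, mods)     # the output of one step is the seed of the next
    if random.random() < 0.001:   # occasional happenstance installs a new mod
        mods = sleep_phase(mods, {"happenstance": lambda s: s}, to_uninstall=[])

print(world["tick"])              # 300000: the loop ran, seeded step by step
```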
To me, having spent 12 years helping build AI companies in Tokyo and seeing this type of technology up close, it does not seem very far-fetched at all. 40 years down the line we could easily do it. But not with the world in its current state.
/jb202508080800