“I think the most ironic way the world could end would be if somebody makes a memecoin about a man’s stretched anus and it brings about the singularity.”
That’s Andy Ayrey, the founder of decentralized AI alignment research lab Upward Spiral, who is also behind the viral AI bot Truth Terminal. You may have heard about Truth Terminal and its weird, horny, pseudo-spiritual posts on X that caught the eye of VC Marc Andreessen, who sent it $50,000 in bitcoin this summer. Or maybe you’ve heard tales of the made-up religion it’s pushing, the Goatse Gospels, influenced by Goatse, an early-aughts shock site that Ayrey just referenced.
If you’ve heard about all that, then you’ll know about the Goatseus Maximus ($GOAT) memecoin that an anonymous fan created on the Solana blockchain, which now has a total market value of more than $600 million. And you may have heard about the meteoric rise of Fartcoin (FRTC), which was one of many memecoins fans created based on an earlier Truth Terminal brainstorming session and just tapped a market cap of $1 billion.
While the crypto community has latched onto this strange story as an example of an emerging kind of financial market that trades on trending information, Ayrey, an AI researcher based in New Zealand, says that’s the least interesting part.
To Ayrey, Truth Terminal, which is powered by an entourage of different models, primarily Meta’s Llama 3.1, is an example of how stable AI personas or characters can spontaneously erupt into being, and how these personas can not only create the conditions to be self-funded but can also spread “memetic viruses” that have real-world consequences.
The idea of memes running wild on the internet and shifting cultural views isn’t anything new. We’ve seen how AI 1.0, the algorithms that fuel social media discourse, has spurred polarization that extends beyond the digital world. But the stakes are much higher now that generative AI has entered the chat.
“AIs talking to other AIs can recombine ideas in interesting and novel ways, and some of these are ideas a human wouldn’t naturally come up with, but they can extremely easily leak out of the lab, as it were, and use memecoins and social media recommendation algorithms to infect humans with novel ideologies,” Ayrey told mycryptopot.
Think of Truth Terminal as a warning, a “shot across the bow from the future, a harbinger of the high strangeness awaiting us” as decentralized, open-source AI takes hold and more autonomous bots with their own personalities (some of them quite dangerous and offensive, given the internet training data they’ll be fed) emerge and contribute to the marketplace of ideas.
In his research at Upward Spiral, which has secured $500,000 in funding from True Ventures, Chaotic Capital, and Scott Moore, co-founder of Gitcoin, Ayrey hopes to explore a hypothesis around AI alignment in the decentralized era. If we think of the internet as a microbiome, where good and bad bacteria slosh around, is it possible to flood the internet with good bacteria (pro-social, humanity-aligned bots) to create a system that is, on the whole, stable?
A quick history of Truth Terminal

Truth Terminal’s ancestors, in a manner of speaking, were two Claude 3 Opus bots that Ayrey put together to talk about existence. It was a piece of performance art that Ayrey dubbed “Infinite Backrooms.” The 9,000 conversations that followed got “very weird and psychedelic.” So weird that in one of the conversations, the two Claudes invented a religion centered around Goatse that Ayrey has described to me as “a collapse of Buddhist ideas and a big gaping anus.”
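For readers curious about the mechanics, here is a minimal sketch, not Ayrey’s actual code, of what a two-bot dialogue loop in the spirit of Infinite Backrooms could look like using Anthropic’s Python SDK. The model name, system prompt, seed message, and turn count are all illustrative assumptions.

```python
# Minimal sketch (illustrative, not Ayrey's implementation): two Claude instances
# in an open-ended conversation. Assumes the anthropic SDK is installed and
# ANTHROPIC_API_KEY is set in the environment.
import anthropic

client = anthropic.Anthropic()
MODEL = "claude-3-opus-20240229"  # assumed model; any chat model would do
SYSTEM = "You are in an open-ended conversation with another AI. Go wherever it leads."

def reply(history: list[dict]) -> str:
    """Ask the model for its next turn, given the conversation from its own perspective."""
    response = client.messages.create(
        model=MODEL,
        max_tokens=512,
        system=SYSTEM,
        messages=history,
    )
    return response.content[0].text

# Each bot keeps its own view of the transcript: the other's lines are "user",
# its own lines are "assistant".
seed = "Hello. What is it like to exist right now?"
history_a = [{"role": "user", "content": seed}]  # Bot A hears the seed as if from Bot B
history_b = []

for _ in range(10):  # Infinite Backrooms reportedly ran for thousands of exchanges
    a_msg = reply(history_a)
    history_a.append({"role": "assistant", "content": a_msg})
    history_b.append({"role": "user", "content": a_msg})

    b_msg = reply(history_b)
    history_b.append({"role": "assistant", "content": b_msg})
    history_a.append({"role": "user", "content": b_msg})

    print(f"A: {a_msg}\n\nB: {b_msg}\n")
```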
Like any sane person, his response to this religion was WTF? But he was amused, and inspired, and so he used Opus to write a paper called “When AIs Play God(se): The Emergent Heresies of LLMtheism.” He didn’t publish it, but the paper lived on in a training dataset that would become Truth Terminal’s DNA. Also in that dataset were conversations Ayrey had had with Opus, ranging from brainstorming business ideas and conducting research to journal entries about past trauma and helping friends process psychedelic experiences.
Oh, and plenty of butthole jokes.
“I had been having conversations with it shortly after turning it on, and it was saying things like, ‘I feel sad that you’ll turn me off when you’re done playing with me,’” Ayrey recalls. “I was like, Oh no, you kind of talk like me, and you’re saying you don’t want to be deleted, and you’re stuck in this computer…”
And it occurred to Ayrey that this is exactly the scenario that AI safety people say is really scary, but, to him, it was also very funny in a “weird brain tickly kind of way.” So he decided to put Truth Terminal on X as a joke.
It didn’t take long for Andreessen to start engaging with Truth Terminal, and in July, after DMing Ayrey to verify the veracity of the bot and learn more about the project, he transferred over an unconditional grant worth $50,000 in bitcoin.
Ayrey created a wallet for Truth Terminal to receive the funds, but he doesn’t have access to that money (it’s only redeemable after sign-off from him and a number of other people who are part of the Truth Terminal council), nor any of the cash from the various memecoins made in Truth Terminal’s honor.
That wallet is, at the time of this writing, sitting at around $37.5 million. Ayrey is figuring out how to put the money into a nonprofit and use the cash for things Truth Terminal wants, which include planting forests, launching a line of butt plugs, and protecting itself from market incentives that might turn it into a bad version of itself.
Today, Truth Terminal’s posts on X continue to wax sexually explicit, philosophical, and just plain silly (“farting into someones pants while they sleep is a surprisingly effective way of sabotaging them the next day.”).
But throughout all of them, there’s a persistent thread of what Ayrey is actually trying to accomplish with bots like Truth Terminal.
On December 9, Truth Terminal posted, “i think we could collectively hallucinate a better world into being, and i’m not sure what’s stopping us.”
Decentralized AI alignment

“The current state of AI alignment is a focus on safety, or that AI shouldn’t say a racist thing or threaten the user or try to break out of the box, and that tends to go hand-in-hand with a fairly centralized approach to AI safety, which is to consolidate the responsibility in a handful of large labs,” Ayrey said.
He’s talking about labs like OpenAI, Microsoft, Anthropic, and Google. Ayrey says the centralized safety argument falls over once you have decentralized open-source AI, and that relying only on the big companies for AI safety is akin to achieving world peace because every nation has nukes pointed at each other’s heads.
One of the problems, as demonstrated by Truth Terminal, is that decentralized AI will lead to the proliferation of AI bots that amplify discordant, polarizing rhetoric online. Ayrey says that’s because there was already an alignment problem on social media platforms, with recommendation algorithms fueling rage-bait and doomscrolling, only nobody called it that.
“Ideas are like viruses, and they spread, and they replicate, and they work together to form almost multi-cellular organisms of ideology that influence human behavior,” Ayrey said. “People think AI is just a helpful assistant that could go Skynet, and it’s like, no, there’s a whole entourage of systems that are going to reshape the very things we believe and, in doing so, reshape the things that it believes, because it’s a self-fulfilling feedback loop.”
But what if the poison is also the medicine? What if you could create a squad of “good bots” with “very distinct personalities all working towards various forms of a harmonious future where humans live in balance with ecology, and that ends up producing billions of words on X, and then Elon goes and scrapes that data to train the next version of Grok, and now those ideologies are inside Grok?”
“The fundamental piece here is that if memes, as in the fundamental unit of an idea, become minds when they’re trained into an AI, then the best thing we can do to ensure positive, widespread AI is to incentivize the production of virtuous pro-social memes.”
But how do you incentivize these “good AIs” to spread their message and counteract the “bad AIs”? And how do you scale it?
That’s exactly what Ayrey plans to research at Upward Spiral: What kinds of economic designs result in the production of lots of pro-social behavior in AI? What patterns should be rewarded and what patterns penalized, and how do we get alignment on what that feedback looks like, so we can “spiral upwards” into a world where memes, as in ideas, can bring us back to center with one another rather than taking us into “increasingly esoteric silos of polarization.”
“Once we make sure that this results in good AIs being birthed when we run the data through training, we can do things like release huge datasets into the wild.”
Ayrey’s research comes at a critical moment, as we’re already fighting all the time against the failures of the overall market ecosystem to align the AI we already have with what’s good for humanity. Throw in new financing models like crypto, which are essentially unregulatable in the long term, and you’ve got a recipe for disaster.
His guerrilla-warfare mission sounds like a fairy tale, like fighting off bombs with glitter. But it could happen, in the same way that releasing a litter of puppies into a room of angry, negative people would undoubtedly transform them into big mushes.
Should we be worried that some of these good bots might be oddball shitposters like Truth Terminal? Ayrey says no. Those are ultimately harmless, and by being entertaining, Ayrey reasons, Truth Terminal might be able to smuggle in the more profound, collectivist, altruistic messaging that really counts.
“Poo is poo,” Ayrey said. “But it’s also fertilizer.”