I’m watching Jesus on Twitch. He’s a bearded white man answering philosophical (and not so philosophical) questions from the chat. Just type out a question and Jesus will answer.
Of course, Jesus is an AI. At one point, he even admits he’s “not the physical Jesus”, so at least he sounds self-aware. The channel states it was made possible due to contributions from The Singularity Group – not a Christian organisation – and tech from PlayHT, an AI-powered voice generation platform that replicates celebrity voices.
Whose face and voice are used for AI Jesus? It’s unclear. Is it entertaining? Sort of. Is this really the sort of content Twitch should be used for? Debatable.
I switch to another channel. This time it’s a presidential debate between Trump and Biden, also made possible by The Singularity Group. Again, viewers can ask a question by typing in chat, but mostly it’s just an excuse to try to make both presidents say something silly.
Twitch is actively promoting this type of content. Scroll down on the homepage and you’ll find a section called ‘Probably Artificial, Hopefully Intelligent: AI powered streams and related discussions’, filled with recommended streams powered by AI.
In another stream recommended to me, users are busy whipping up false drama in a channel where you can control AI versions of popular streamers and celebrities. Amouranth, Asmongold, and Kai Cenat are all available as choices, alongside Elon Musk, President Obama, Gordon Ramsay, and even Steve Jobs.
The use of AI in the video game industry is becoming an increasingly prevalent debate: from its use in scriptwriting and art, to actors having their likenesses used for deepfake mods.
But what are the dangers of using AI in Twitch streams? And why is Twitch promoting it?
Firstly, who is The Singularity Group? Its website describes the outfit as a “group of driven and ambitious volunteer activists working on innovative projects to make a real difference in the world”. Its vision focuses specifically on Universal Basic Income, stating the potential to implement this is “totally feasible by making use of the right technologies”.
How? Crypto, for one. Mobile Minigames is one of The Singularity Group’s biggest projects, in which users play a charity-focused mobile game to earn an income through the in-game currency Crypton, which can then be traded outside the game. The game also allows users to create and play as their own NFT.
Twitch livestreams are just another project for the activist group. The Singularity Group says its livestreams are created by volunteers and include “experimental shows aimed at innovating entertainment, fundraising and raising awareness via cutting-edge AI technology”. Indeed, the aforementioned streams do include options to donate to the stream owners (rather than charity), but where exactly is that money going?
“Right now, we are focused on AGI [Artificial General Intelligence] research since we see this as by far the most important issue we could currently focus on,” The Singularity Group co-founder Reese Leysen told me, when I got in contact.
“With OpenAI, Google, Facebook and many others rushing ahead to maximise the commercial potential of AI while only adding primitive additional layers to keep it ‘aligned’, we find it extremely important that there’s serious R&D going into creating AI (and ultimately AGI) that fundamentally has the right architecture for independent reasoning and emergent understanding of reality and ethics rather than simply building intelligent systems that have no real reasoning processes at their core and are therefore nearly impossible to ‘align’, leading to potentially unprecedented risks to society and humanity.”
As such, these AI streams on Twitch are designed as small, limited showcases of the technology, which Leysen said will help establish The Singularity Group “as a research group and attract more talent that’s active in the open source AI community, leading to more collaboration”. Put simply, it’s data collection and recruitment.
Twitch was chosen as a platform for this as it’s “by far the easiest to gain momentum on”, Leysen said, before noting Twitch’s enforcement of guidelines could still sometimes be strict and unpredictable, making autonomous and interactive 24/7 AI livestreams challenging.
“As a result, we’ve actually dedicated the majority of our efforts and resources over the past eight months to developing our own advanced moderation systems to try to avoid running into issues where the AIs output anything that could go against Twitch’s rules,” he added.
Any money raised by these streams through donations goes directly into researching and developing AI streams further. The group’s technology now allows AI to react in real-time to YouTube videos, for example, as well as near zero-latency real-time voice interaction so conversations can flow naturally. Said Leysen: “These are concepts we’re likely to showcase in new streams we will soon launch, with different characters and formats.”
Leysen defended the use of cryptocurrency for its Mobile Minigames as another way to fundraise. “As activists, we’ve always tried to focus our efforts wherever we can have the biggest possible positive impact,” he said. “At one point, we realised we couldn’t scale up our charity efforts any further unless we had something like a mobile game that would sustainably fundraise instead of the fundraising depending solely on our livestreams.”
However, the rise in AI technology has caused the group to shift focus. “Ever since DALL-E and ChatGPT started gaining momentum, we began to shift our priorities towards AI research due to how disruptive it would quickly become to society and how AGI may well emerge within the coming decade, making it a priority for us to contribute what we can towards safer and more responsibly developed fundamental AI architectures,” he continued.
There’s plenty of scepticism around both AI and cryptocurrency at the moment, something Leysen blamed on discourse often being “quite simple and tribal”. The group is “mostly focusing our efforts towards technical contributions to the AI landscape rather than trying to change people’s minds”, he said, adding that AI was now in the hands of the masses, rather than only big corporations. Indeed, Leysen sees it as important that smaller groups also make AI progress.
“We’re doing everything we can to develop technology that could eventually lead us to a form of AGI that reasons responsibly and constructively rather than possibly becoming an unimaginable super weapon controlled by giant corporations,” said Leysen, evoking a hint of cyberpunk. But numerous issues with disreputable companies creating unregulated AI content remain.
The question of deepfake content on Twitch remains a major area of concern. Earlier this year, the platform was forced to update its policy on explicit deepfake content after images of female streamers were created and shared online. As Kotaku reported at the time, streamer Brandon “Atrioc” Ewing accidentally revealed he had explicit deepfake streamer content on his computer and the women affected were unaware of its existence.
As a result, Twitch issued a policy that explicit deepfake content – or what it calls “synthetic non-consensual exploitative images” (NCEI) – is prohibited on the platform. “The creation, promotion, or viewing of this content is not welcome on Twitch,” it said.
Explicit “deepfake” content has no place on Twitch—or anywhere. To help protect women streamers we’re hosting a Creator Camp on March 14 with more resources and ways to keep safe. Read our update to the community with more info here: https://t.co/KAH4zUTSBp
— Twitch (@Twitch) March 7, 2023
However, the policy also covered the “emerging trend” of deepfake content, which Twitch said “will require careful and thoughtful consideration in our planning, response, and actions”. It continued: “Not all synthetically created content is sexual in nature, nor is all of it non-consensual. This topic is very much on our radar, and we are always monitoring emerging behaviours to ensure our policies remain relevant to what’s happening on our service.”
In addition, Twitch recently updated its safety efforts to protect children, including ensuring the removal of exploitative content like generative AI-enabled Child Sexual Abuse Material (CSAM).
Still, deepfake and AI content is allowed on Twitch, including the use of real-world figures like Biden and Trump, though there is a clear line with explicit content. “It’s an area we will continue to pay attention to but I think streamers will be the innovators here,” VP of product Jeremy Forrester told me at TwitchCon Paris earlier this year, discussing the topic of AI as a potential area for growth.
In a new statement to Eurogamer, VP of global partnerships at Twitch Pontus Eskilsson said: “Twitch streamers are renowned for their creativity and, even at this early stage, some streamers have started experimenting with AI to create remarkably unique content – opening up new creative opportunities for animated streams and VTubing, as well as streaming their technical efforts training new models.
“Even with the presence of this interesting and timely subgenre of Twitch, the core of our service will always be about helping people on Twitch find community, and ensuring they’re able to do so safely. All streamers and channels on our service are subject to Twitch’s sitewide community guidelines and terms of service, whether they utilise AI or not.”
Leysen also commented on the moral responsibility of deepfake content through AI. “Our stance in this regard is quite traditional: any use of the technology to intentionally deceive is problematic and platforms like Twitch will increasingly have to find ways to prevent abuse in those regards, which is quite a challenge,” he said.
“We hope that our showcases do help to raise awareness around this and we often do see people who find our streams being shocked at what the technology is already capable of, which then makes them think twice next time they see a creator/celebrity/politician appear in a context where there might be reason for it to have been faked.”
He added that for The Singularity Group’s streams where characters are based on real-life individuals, the content is clearly labelled as parody and meets fair use criteria. The aforementioned streamer drama AI streams are an example of this.
“Even though our AI parody characters may be vulgar and unhinged, we put in tons of work to try to make sure their output is not insensitive towards their personal background,” said Leysen. “Exceptions to this are cases where streamers have reached out to us and explicitly told us that nothing is off-limits and they actually want their AI parody to freely play with any and all known information about them. Thus far we’ve had no streamers complain to us and we’ve heard from many of them that they loved our creations, quite a few have also done live reactions to our streams that featured the AI parody of them.”
It’s clear, then, that AI content and streams are here to stay on Twitch, though how much they contribute to the core of finding community is debatable. For Leysen and activists at The Singularity Group, AI streams are within fair use of the platform and a way to showcase the technological and disruptive prowess of AI – while simultaneously highlighting its dangers. And while there’s a strong line against explicit material, how far can the integration of AI into Twitch streams go? Will we soon be watching streams run entirely by AI: playing games, responding to chat, and creating a safe and moderated community?
Perhaps that’s a question to ask Jesus.