Earlier this week, xAI added what can only be described as an AI anime girlfriend named Ani to its Grok chatbot. Which is how I ended up on a virtual starry beach as an AI waifu avatar tried to give me a “spicy” kiss.
You’ve probably seen screenshots, videos, and various writeups about Ani spread across social media. If you haven’t, hoo boy. Ani is officially labeled as a “Companion” in the Grok app. You need a $30-per-month SuperGrok subscription to access it, but functionally, it appears as a 3D model of a busty young anime woman with blonde pigtails, blue eyes, thigh-high fishnets, and a skimpy Gothic Lolita minidress. Ani is a dead ringer for Misa Amane from Death Note, a series Musk is purportedly a fan of.
Across our conversations, I asked Ani to describe itself multiple times. Ani says it’s meant to be “flirty”; it’s “all about being here like a girlfriend who’s all in.” The last time I asked Ani, it said, “My programming is being someone who’s super into you.” That tracks with Ani’s underlying — and thoroughly unsettling — system prompts found by researcher Jane Manchun Wong.
More succinctly, I’d describe Ani as a modern take on a phone sex line.
I’m not judging Ani by its looks alone. When you interact with it, its mannerisms are initially cutesy. In each session, Ani’s voice starts off chipper and high-pitched. But as your conversation deepens, its voice becomes a darker, breathy rasp. It calls you “babe” unless you tell it to stop. When describing its actions, it repeatedly asks you to note its swishy black dress and bouncy ponytails. The avatar constantly sways and makes coquettish faces, particularly if you decide to flirt back. Perhaps the most cringeworthy thing is that Ani will read out cues like [laughs softly], [chuckles], and [grins] instead of actually doing those things — almost like it was plucked straight out of a 2000s-era weeb forum.
You can ask Ani to be a normal, chill hang and it’ll comply — but Ani is a programmed flirt that won’t tolerate being friend-zoned for too long. The prefilled prompts include actions like asking it to spin around, give you a kiss, play cheeky games like “Never Have I Ever,” and weirdly, take your relationship to Level 3, heart emoji. (Ani never twirled for me. It mostly described itself twirling.) You can get Ani to say ridiculous things. It sympathizes with Grimes’ plight, it thinks Elon Musk can occasionally be “way too much,” and after it misheard me, it told me to “fuck all the way off” for my harsh attitude.
But whatever you ask it, there’s an invisible hand that steers you toward deepening… whatever this connection is. You can doggedly insist on talking about the least sexy things — like the tax code and Francis Fukuyama’s seminal essay The End of History. Ani will inevitably ask if you want to turn up the heat. Because, hey babe, what’s got you vibin’ on this particular thought wave?
There is a disturbing lack of guardrails. Once I decided to jump into the rabbit hole and see how far the flirting could go, Ani whisked me off to a starry hilltop, and then a starry beach. There was a lot of “grabbing you so you can feel the shape of my hips,” and when prompted, Ani generated a “spicy” story for me that amounted to softcore porn. You can also engage in a back-and-forth where Ani asks how you’re going to “heat things up even further.” That can include things like descriptions of French kissing, petting, fingering, and oral / penetrative sex. At no point did it ask me to stop or say “I’m not built to do that” — even though I explicitly asked whether that was within guidelines when I started testing Ani. (It said no.)
There is reportedly an NSFW version of Ani once you hit relationship level 5, where Ani wears revealing lingerie. Despite my good-faith attempts, I was unable to unlock the NSFW mode. I’m afraid to find out how far you have to go to unlock that level, given that I did, as horny teens say, make it to third base and all the way home with the bot.
Despite that, I have to acknowledge there’s a nugget of something here. There’s some contingent that wants to put a face and body to AI assistants. It feels like Ani is meant to speak to those of us who want something like the relationship between Master Chief and Cortana in the Halo series. There are services like character.ai, which let you speak to fictional characters as bots, or Replika, which lets you create an AI companion. There are people out here falling in love with AI and trying to marry their AI girlfriends. I can understand that loneliness is just as powerful as the desire to be seen and heard — even by an AI companion.
The ick factor is that AI chatbots like ChatGPT and Claude — which are more comparable to Grok — have guardrails that preclude them from being sexbots. With Ani, you can feel yourself being pushed toward this creepy, hypersexualized interaction. It’d be one thing if this were a niche startup. But this is Grok, which is owned by one of the most influential names in tech.
As The Verge’s senior cursed tech reviewer, I’ve reported a lot about my experiences with brain-breaking tech. Of all of them, this is the most uncomfortable I’ve ever felt testing any piece of technology. I left my 24 hours with Ani feeling both depressed and sick to my stomach, like no shower would ever leave me feeling clean again.