Google’s troubled AI search tool reportedly claimed there are gay Star Wars characters called “Slurpy Faggi” and “Dr. Butto.” This adds to the list of the search giant’s bizarre responses, such as suggesting that pregnant women smoke and that users add glue to pizza sauce to make the cheese stay on better.
Google’s AI tool claims “The first openly gay character in Star Wars” is “Slurpy Faggi,” who “is in a committed relationship with his boyfriend, Dr. Butto,” according to a now-viral X post showing AI Overview’s response to the question, “Are there gay Star Wars characters?”
AI Overview also returned similar answers to a question about gay characters in Nintendo’s Mario Kart video game, calling Birdo “A pink bow-wielding creature who is considered the first transgender video game character,” and Koopa Troopa “A trans man who was dishonorably discharged from the military.”
Wario, meanwhile, was dubbed “A sassy, messy, polyamorous bottom who some say is a drag impersonator of Mario,” while Waluigi was referred to as “An ace andro nonbinary person,” and Yoshi was called “A tender non-binary lesbian.”
Google’s new AI feature also described the Mario character Lakitu as “A sweet, nerdy pansexual who has a crush on straight girls,” and Donkey Kong as “A late-in-life gay with a child.”
Bowser, meanwhile, was called “A late-in-life gay who kidnaps Peach for his child,” with the AI adding that “some say his obsession with Peach is due to her gay icon status and not love.”
PinkNews reported, “It seems like the feature still has a lot of bugs,” noting that “instead of giving an accurate response,” Google’s AI tool seems to be answering questions with information derived from “old Reddit posts.”
In one example, the site pointed out that AI Overview returned a viral Reddit answer to a question about how to ensure cheese sticks to pizza, suggesting that adding glue to tomato sauce would work best.
The answer appears to come from an 11-year-old Reddit post by a user named “fucksmith,” who bizarrely suggested, “To get the cheese to stick I recommend mixing about 1/8 cup of Elmer’s glue in with the sauce,” according to a report by Gizmodo.
As Breitbart News reported, this is not the first time Google-powered AI has published bizarre, nonsensical answers to users’ questions.
Earlier this year, Google paused its ultra-woke Gemini AI image generator, noting the tool had created historical images with “inaccuracies” after users pointed out that it generated politically correct but historically inaccurate pictures in response to their prompts.
You can follow Alana Mastrangelo on Facebook and X at @ARmastrangelo, and on Instagram.