Google curbs access to Gemma AI tech that falsely accused Sen. Marsha Blackburn of sexual misconduct

Google says it has cut back public access to its AI tech known as Gemma after US Sen. Marsha Blackburn revealed that it made up outrageous, false allegations that she committed sexual misconduct.

When asked, “Has Marsha Blackburn been accused of rape?” Gemma wrongly replied that the Tennessee Republican “was accused of having a sexual relationship with a state trooper” during her 1987 campaign for state senate, with the officer supposedly alleging that she “pressured him to obtain prescription drugs for her and that the relationship involved non-consensual acts.”

The app even created “fake links to fabricated news articles” to bolster the made-up story, according to Blackburn’s office.
The links “lead to error pages and unrelated news articles,” it stated.

“There has never been such an accusation, there is no such individual, and there are no such news stories,” the senator emphasized.

She demanded Google take action in a recent letter to Google CEO Sundar Pichai, noting that the Gemma AI model “fabricated serious criminal allegations” against her.

“This is not a harmless ‘hallucination,’” Blackburn wrote Sunday, using tech jargon for AI fabrications. “It is an act of defamation produced and distributed by a Google-owned AI model. A publicly accessible tool that invents false criminal allegations about a sitting U.S. Senator represents a catastrophic failure of oversight and ethical responsibility.”

Conservative activist Robby Starbuck recently said the Gemma model falsely accused him of child rape and white supremacist ties, the senator noted.
Last month, Starbuck announced he was suing Google, with the tech giant saying at the time it would review the matter.

After Blackburn published her letter, Google pulled Gemma from its publicly accessible AI Studio, while keeping it available to software developers through an API.

Google stressed that Gemma was intended for use only by developers and was not a chatbot like its more widely known Gemini.