Child porn concerns prompt California to launch investigation into Elon Musk's AI firm

SACRAMENTO — California announced an investigation Wednesday into Elon Musk’s xAI, with Gov. Gavin Newsom accusing the artificial intelligence company of becoming a “breeding ground for predators to spread nonconsensual sexually explicit AI deepfakes.”

Grok, the xAI chatbot, includes image-generation features that allow users to morph existing photos into new images.
The newly created images are then posted publicly on X. Critics say there are not sufficient safeguards on what images can be generated, prompting an influx of sexually explicit or nonconsensual images based on real people, including altered depictions that appear to show individuals partially or fully undressed. Others have generated images that appear to show minors, prompting criticism that the tools are being used to create child pornography.

The social media site has previously said, “we take action against illegal content on X, including Child Sexual Abuse Material (CSAM), by removing it, permanently suspending accounts, and working with local governments and law enforcement as necessary.”

Earlier this month, Grok started limiting the ability of nonpaying users to create sexualized images amid a global outcry from users and governments.
Newsom called the images being created “vile.” Atty. Gen. Rob Bonta said his office will use “all tools at our disposal to keep Californians safe.”

“The avalanche of reports detailing the non-consensual, sexually explicit material that xAI has produced and posted online in recent weeks is shocking,” Bonta said in a statement Wednesday. “This material, which depicts women and children in nude and sexually explicit situations, has been used to harass people across the internet.”

Newsom signed a pair of bills in 2024 that made it illegal to create, possess or distribute...