Source: WIRED
Meta's Muse Spark collects sensitive biometric data while dispensing advice that fails basic clinical reasoning tests. That combination matters because health data is both exceptionally valuable to advertisers and exceptionally dangerous when mishandled. Meta's track record on privacy, compounded by the model's demonstrated incompetence, multiplies the risk. Enterprise AI vendors are racing to monetize every category of data without first proving their tools work, betting that regulators will move slowly enough for user habits to calcify before enforcement arrives.