Microsoft has added protections to its AI text-to-image generation tool, Designer, following reports that users were exploiting it to create nonconsensual sexual images of celebrities. The move comes after AI-generated nude images of American singer-songwriter Taylor Swift went viral on platforms such as 4chan and Telegram, where users were reportedly using Designer to generate such images.
A Microsoft spokesperson said the company is investigating these reports and taking appropriate action to address them. Microsoft's Code of Conduct explicitly prohibits the use of its tools to create adult or non-consensual intimate content, and repeated violations can result in loss of access to the service. The company reaffirmed its commitment to responsible AI principles, noting that it has large teams working on guardrails and other safety systems.
While the ongoing investigation has not confirmed whether the AI-generated images of Taylor Swift originated from Designer, Microsoft says it is strengthening its text-prompt filters and stepping up efforts to counter misuse of its services.
Microsoft Chairman and CEO Satya Nadella expressed concern over the explicit AI-generated fakes of Taylor Swift, describing them as "alarming and terrible." In an interview with NBC Nightly News, Nadella emphasized the need to act swiftly to address such issues.
In response to the deepfake controversy, reports suggest that Taylor Swift is considering legal action against the website responsible for the explicit AI-generated content. The incident underscores the ethical challenges posed by the misuse of AI technologies to create non-consensual and harmful imagery.
(With Agency Inputs)