Meta tool to block nude images in teens’ private messages
- Meta announces the launch of a new safety tool aimed at blocking children from sending and receiving nude images, including in encrypted chats.
- The decision comes after criticism from government, police, and children’s charities regarding Meta’s encryption of Messenger chats, which they argue hinders the detection of child abuse material.
- Meta’s new feature will use machine learning to identify nudity; however, Meta says this technology cannot be used to detect child abuse material because of the serious risk of errors.
- For more, please visit the BBC News website.
X blocks searches for Taylor Swift after explicit AI images of her go viral
- X has blocked searches for Taylor Swift after explicit AI-generated images of her went viral.
- X released a statement stating: “We have a zero-tolerance policy towards such content,” and “Our teams are actively removing all identified images and taking appropriate actions against the accounts responsible for posting them.”
- In the UK, sharing deepfake pornography was made illegal under the Online Safety Act 2023.
- For more, please visit the BBC News website.
X plans to create a content moderation ‘headquarters’ in Austin
- X will hire 100 full-time employees for a new trust and safety office in Austin, focusing on child sexual exploitation.
- The team will be the first dedicated trust and safety team since Elon Musk bought the platform, formerly known as Twitter.
- The company’s head of business operations said the team will also support other moderation enforcement, such as the ban on hate speech.
- For more, please visit The Verge website.