Children’s Privacy: Meta to use AI to find and blur nude photos to protect children
Upping its game to protect minors, Meta is developing an AI “nudity protection” tool for Instagram. This comes after the company faced legal charges of exploiting young users to encourage use of its platforms despite knowing the harm to their mental health. The “sexploitation” protection mechanism will find and blur images containing nudity that are sent to minors, then let recipients choose whether to view them. Not sure this is reassuring….