Instagram, the popular social media platform, has announced plans to develop features that will shield teenagers from nudity and other inappropriate content.
According to a CNN report on the 11th, Meta Platforms, Inc., Instagram's parent company, is developing new features designed to protect teenagers from explicit photos, sexual exploitation, and potentially criminal activity by making it harder for individuals with malicious intent to contact minors.
Meta explained that it is testing an artificial intelligence (AI) tool within Instagram's direct message (DM) system that can automatically detect and blur nude images sent to minors. When an image containing nudity is received, it will appear blurred behind a warning screen, allowing the recipient to decide whether to view it without having to open it first.
This feature will be automatically applied to users under 18, and adults will be encouraged to activate it. Meta also plans to offer the option to block the sender and report the chat, accompanied by a message advising users that they do not have to respond to such approaches.
Additionally, Meta has announced its plans to develop technology aimed at identifying accounts potentially involved in sexual exploitation scams.
In January, Meta announced its intention to automatically block minors from being exposed to harmful content.
These initiatives come in response to growing pressure from U.S. authorities. In October of last year, 41 U.S. states filed lawsuits against Meta, alleging that Instagram and other platforms were intentionally designed to be excessively addictive, harming the mental well-being of minors.