The European Commission gave Meta Platforms (META.O), the company that owns Facebook, and social media juggernaut Snap (SNAP.N) a deadline of December 1 to provide more details about how they protect children from illegal and dangerous content on their respective platforms.
The request comes a day after the European Union sent a similar demand to Alphabet’s (GOOGL.O) YouTube and TikTok for information about the measures those companies have taken to strengthen the protection of minors.
A month ago, the Commission also issued urgent orders to several platforms, including Meta, X, and TikTok, requiring them to outline the steps they have taken to curb the spread of terrorist content, violent content, and hate speech.
If the Commission is unsatisfied with the companies’ replies, it can open formal investigations into them.
Under the Digital Services Act (DSA), the new online content rules that recently came into effect, major online platforms are expected to do more to remove unlawful and harmful content or face fines of up to 6% of their total worldwide revenue.
In conclusion, the European Union’s directive requiring Meta and Snap to outline their child protection procedures by December 1 is a significant step toward improving safety for users under 18. This article has examined the regulatory landscape, the key areas of attention in child protection measures, and the potential industry-wide and global repercussions of the directive.