Meta faces European Union investigation into child safety risks

Facebook and Instagram will be investigated for possible violations of European Union online content rules related to child safety, officials from the bloc said on Thursday, a process that could lead to heavy fines for their parent company, Meta.

Tech companies are required to act to combat illegal and harmful content on their platforms under the EU's Digital Services Act (DSA), which came into force last year.

The European Commission said it decided to open an in-depth investigation into Facebook and Instagram due to concerns that they had not adequately addressed risks to children. Meta submitted a risk assessment report in September.

“The Commission is concerned that Facebook and Instagram's systems, including their algorithms, may encourage behavioral addictions in children, as well as create so-called 'rabbit hole effects',” the EU executive said in a statement.

“Furthermore, the Commission is also concerned about the age-assurance and verification methods implemented by Meta.” The regulator's concerns relate to children's access to inappropriate content.

Meta said it already has several tools in place to protect children. “We have spent a decade developing more than 50 tools and policies designed to protect them,” a Meta spokesperson said.

The company is already in the EU's crosshairs over electoral disinformation, a key concern ahead of next month's European Parliament elections. Violations of the DSA can lead to fines of up to 6% of a company's annual global turnover.

Source: CNN Brasil