
Meta and Google deny inertia and reject increased liability for Big Tech

Representatives of Meta (owner of Facebook, Instagram and WhatsApp) and Google said they do not wait for court decisions to remove illegal content and defended keeping the protection granted to platforms by the Brazilian Civil Rights Framework for the Internet (Marco Civil da Internet).

The legislation states that social media companies can only be held liable for user posts if they fail to remove content after a court order. This guarantee is set out in Article 19 of the Marco Civil.

The tech giants also reject any expansion of their liability for posts made by social media users.

The statements were made during a public hearing held at the Federal Supreme Court (STF) to discuss aspects of the Marco Civil da Internet. Representatives of the Lula government participated, including ministers Flávio Dino (Justice), Silvio Almeida (Human Rights) and Jorge Messias (AGU).

Supreme Court justices Dias Toffoli, Luiz Fux, Alexandre de Moraes, Gilmar Mendes and Roberto Barroso also followed the proceedings.

For Google Brazil lawyer Guilherme Cardoso Sanchez, increasing the civil liability of platforms “is not the key to making the internet a safer place”.

“Platforms cannot be directly held responsible for content created by people on the internet,” he said.

He said that holding digital platforms accountable, as if they themselves were the authors of the content they display, “would lead to a generic duty to monitor all content produced by people”. According to the lawyer, the situation would contribute to “create pressure to remove any minimally controversial speech”.

According to Sanchez, Google does not wait for a court decision to remove illegal content from its platforms.

He stated that the company is updating and improving its content policy, incorporating restrictions on publications that may pose a risk of real damage. “For example, YouTube’s policies against hate speech prohibit discrimination based on factors such as age and social class, which go beyond legal categories.”

He also said it was “simpler” to proactively identify and remove objective content such as nudity.

“It is much simpler to identify an unauthorized nudity scene than to interpret the legality of a controversial statement on a political issue, for example.”

Inertia

Rodrigo Ruf Martins, legal manager at Facebook Brazil, said that the demand for greater accountability from the platforms stems from a perception of supposed inertia on the part of companies in combating anti-democratic discourse and misinformation.

Martins said that Meta invests “billions of dollars” and develops technological and artificial intelligence tools to enforce its policies and terms of use and to strengthen security and integrity across its applications.

He also stated that there was no omission by the company in combating illicit content during the 2022 elections and the January 8 attacks.

According to Meta data cited by the lawyer, 135,000 election-related ads were removed in the first round of the election, along with 3 million pieces of content on Facebook and Instagram “for violating policies that prohibit violent content, incitement to violence and hate speech”.

“These posts included very sensitive topics, such as requests for military intervention and other attempts to subvert the democratic rule of law,” he said.

“More than 3 million pieces of content were proactively removed by Meta, that is, through self-regulation, without the need for judicial intervention,” he declared.

“Evidently, we recognize that more can be done by the platforms.”

For Martins, article 19 of the Marco Civil da Internet is constitutional.

“The idea that Article 19 encourages inertia is unfounded, as we understand that an online business model will never thrive in a toxic environment,” he said.

Source: CNN Brasil
