TikTok and social networks: how, and how far, can the Garante act?

Last Friday the Italian Data Protection Authority (the Garante per la protezione dei dati personali) issued a precautionary, urgent and temporary order, the outcome of a file opened months earlier, with which it imposed an immediate block on TikTok's processing of the data of any user whose age the social network is unable to verify.

Since the platform, one of the most popular among children (though not only among them), has no system capable of carrying out this selective check either upstream (at registration, where it is enough to lie about one's date of birth) or afterwards, the consequence is that, technically, the block should apply to all users.

That will not happen: the platform, controlled by China's ByteDance, is in fact still operational, new profiles can still be created, and in the coming weeks there will likely be intense negotiations with the Garante – which in the meantime has also forwarded the case to its Irish counterpart, which has jurisdiction over the company since TikTok set up its European headquarters in Dublin – to find a way out of the impasse. Many aspects, however, remain unclear. Meanwhile, the Authority has announced that it has also opened a file on Facebook and Instagram for the same reasons.

What happened

The emergency measure was issued on 22 January, as confirmed by several members of the Authority's board, quickly concluding a file that had been open for some time, in the wake of a news story from Palermo. A tragic event – the death of a 10-year-old girl during an act of self-harm, perhaps performed for a video she intended to publish or share on that or another social network – which at the moment has not been linked with certainty to TikTok or to any of the "challenges" that may circulate there.

It is worth remembering, however, that this kind of content often circulates across services: it starts on one social network, perhaps continues in a chat, and ends up somewhere else. Returning to the Garante: by moving quickly – evidently because it did not consider the measures taken by the social network a few days earlier to be sufficient – the order has put the platform on the spot, for now in a theoretical and formal sense. Sooner or later, though, TikTok will have to react: decide, negotiate, or challenge the order (truthfully, without much hope of success).

The heart of the matter

Under Italian law, which implemented the EU General Data Protection Regulation with legislative decree no. 101 of 2018, children under 14 cannot autonomously consent to the commercial processing of their personal data. The platform's own terms of use, aligned with a similar but much older US law (COPPA, the Children's Online Privacy Protection Act of 1998), set that limit at 13. These thresholds therefore become the minimum ages for using all platforms that offer digital services – certainly not only TikTok but also Facebook, Instagram, WhatsApp, YouTube and many others, inside and outside the social ecosystem. Below those thresholds, the guarantees required are such that these platforms either bar access or design parallel products, such as Messenger Kids or YouTube Kids.

So what is the Garante asking for?

The Authority wants the company to put in place a reliable mechanism for verifying users' ages. The burden of proposing a solution lies with the company, as the GDPR also makes clear, but some hypotheses are already on the table, and members of the Authority's board, such as the lawyer Guido Scorza, have hinted at possible solutions in recent days. «The hope is that the operators of these platforms can use the data they already hold to verify in some way, if not the exact age, at least which age range a user belongs to».

How, then? For example, by inferring – from a series of signals already in their possession, such as how the app is used, the content viewed most often, likes, shares and other choices – which age category a user is likely to belong to. And, where doubts arise, requesting a more thorough verification.
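To give a concrete, purely illustrative idea of this approach, here is a minimal sketch in Python: the signals, weights and thresholds are invented for the example and do not come from TikTok or from the Garante's order.

```python
# Hypothetical sketch: infer a coarse age range from behavioural signals
# and escalate doubtful cases to a stronger check. All names, weights and
# thresholds are assumptions made up for illustration.
from dataclasses import dataclass

@dataclass
class UsageSignals:
    share_of_kids_content: float   # fraction of views on child-oriented content, 0-1
    follows_school_accounts: bool  # follows accounts linked to primary schools
    late_night_activity: float     # fraction of activity between 23:00 and 06:00

def estimate_age_band(s: UsageSignals) -> tuple[str, float]:
    """Return a coarse age band and a score in [0, 1] (higher = more likely under 14)."""
    score = 0.0
    if s.share_of_kids_content > 0.6:
        score += 0.5
    if s.follows_school_accounts:
        score += 0.3
    if s.late_night_activity < 0.05:
        score += 0.2
    band = "under_14" if score >= 0.5 else "14_or_over"
    return band, score

def needs_stronger_check(band: str, score: float) -> bool:
    """Ask for document-based verification when the inference is uncertain or points to a minor."""
    return band == "under_14" or 0.35 <= score < 0.5

user = UsageSignals(share_of_kids_content=0.7, follows_school_accounts=True, late_night_activity=0.01)
band, score = estimate_age_band(user)
print(band, score, needs_stronger_check(band, score))  # e.g. under_14 1.0 True
```

In a real system the hand-written rules would presumably be replaced by a trained classifier, but the structure would be the same: a probabilistic estimate first, a heavier verification step only when the estimate is doubtful.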

A little like YouTube does, though it has the advantage of a Google account to anchor decisions about whether certain content can be viewed (which, incidentally, does not prevent users from bypassing the age filters). YouTube has also recently introduced – though in response to a different European provision, the EU Audiovisual Media Services Directive that Italy has yet to implement – a request for an identity document to verify that a user is of age. TikTok has less than a month, until 15 February, to design such a system.

What happens otherwise?

The Garante does not black out or seize servers – materially impossible, given that as far as we know TikTok has none located in Italy; the company is Chinese, but its European services are managed by its US branch based in California – nor does it suspend digital services: it imposes methods and safeguards on how user data are processed and stored. But if platforms do not comply with those requirements, the data protection authority with jurisdiction in each country (in this case the Irish one, at the request of ours and after a further investigation) can impose administrative fines.

Under the GDPR these fines can be extremely high: up to 4% of the fined company's global turnover. Doing the maths, ByteDance is reported to have had revenues of around 22 billion euros in 2020, so a possible fine would be in the region of 900 million euros. The Garante could also refer non-compliance with the order to the Italian judicial authorities. In short, TikTok – which could remain operational despite the fines – would do well to get to work on a solution, which according to the company it is, in some way, already doing.
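For the record, the arithmetic behind that figure, assuming the turnover reported in the press:

```python
# Back-of-the-envelope check of the fine ceiling mentioned above.
# Article 83(5) GDPR allows fines of up to 4% of worldwide annual turnover;
# the 22-billion-euro turnover is the 2020 figure reported in the press, not an official number.
reported_turnover_eur = 22_000_000_000
max_fine_eur = 0.04 * reported_turnover_eur
print(f"{max_fine_eur:,.0f} euros")  # 880,000,000 euros, roughly the 900 million cited above
```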

The moral of the order

The Italian Garante's move is a European first and, formally, very significant. Whether it will have real practical consequences remains to be seen. Even so, opening a file against Facebook and its subsidiary Instagram for the same violations looks not only like an obligatory move (the vast majority of platforms share the same problems in verifying their users' ages) but also a strategic one: it turns the issue from a matter concerning TikTok alone into a question about the social network system as a whole. Finally, one might say, given that those platforms have always been full of children and that, essentially, no one protects them.

The other social networks

As mentioned, the Authority has not stopped there and is continuing its work with other platforms. It has asked Facebook, which also controls Instagram, to provide information that in this case is more specific and relates to the ten-year-old girl who died in Palermo: it wants to understand how many and which profiles the child had and how she was able to sign up (no secret there: by lying). It has also asked for precise information on how users register for the two social networks and on the age checks adopted to enforce the minimum age. The giant has two weeks to respond, and the inquiry will gradually extend to the other platforms. «Security and privacy are top priorities for Facebook and Instagram. We will collaborate fully with the Italian Data Protection Authority», a spokesperson for the group explains.

«The processing of minors' data – subjects towards whom platforms obviously must exercise great caution and attention, first of all in the transparency with which they describe their processing activities, both with reference to the privacy notice (the GDPR says that privacy information must be clear and understandable, all the more so for platforms aimed at minors) and with reference to verifying users' ages – is a very delicate issue, and one that had already been brought to the attention of experts, practitioners and platforms in the past – explains Ernesto Belisario, a lawyer and one of the best-known experts in technology law and innovation in public administration, speaking to Vanity Fair – mere self-certification is not considered sufficient.

So the Authority, first with the order against TikTok and now with the opening of this file, is demanding the adoption of more structured tools. These could be an identity document on the one hand, or, on the other, algorithms and automated solutions that make it possible to deduce users' ages from a series of other elements already in the platforms' possession.

Obviously both approaches must be reconciled with other needs and concerns: the large-scale collection of identity data on one side, and the errors algorithms can make on the other. But, as the experience of other platforms including YouTube shows, they can be implemented as part of an ever-growing effort to protect minors and to ensure that data processing complies with the law as far as possible».
