Brazil’s data protection authority, the ANPD, has told Meta to stop processing personal data to train its AI. In its ruling, it criticised the legal basis on which Meta claims the authority to use that data, effectively accusing it of overreach and a lack of safeguards.

It is yet another setback for Meta, which has seen multiple countries and regions tell it to stop harvesting data. Meta wants to train its AI on the vast amount of user data it holds. The problem is that much of that data is personal, including images and supposedly private conversations.

In its statement, the ANPD stated, “Pursuant to Vote No. 11/2024/DIR-MW/CD, approved by the Board of Directors in a Deliberative Circuit, it was understood that sufficient preliminary findings were present to issue the Preventive Measure.

“They are: use of inappropriate legal hypothesis for the processing of personal data; lack of disclosure of clear, accurate and easily accessible information about the change in the privacy policy and the processing carried out; excessive limitations on the exercise of holders’ rights; and processing of personal data of children and adolescents without due safeguards.”

The ANPD also imposed a daily fine of R$50,000 (£7,133, US$9,151) for non-compliance.

Why is Meta doing this?

Like other large technology vendors, Meta is scrambling to deliver a generative AI solution. That AI has two targets. The first is to provide a better experience for users when they want to find something on its platforms.

The second, and more important to Meta, is to improve the services it offers advertisers. While its terms and conditions claim that it does not sell users’ data, they do state:

“We don’t sell your personal data to advertisers, and we don’t share information that directly identifies you (such as your name, email address or other contact information) with advertisers unless you give us specific permission. Instead, advertisers can tell us things such as the kind of audience that they want to see their ads, and we show those ads to people who may be interested.”

An AI built from users’ data would give Meta far greater insight into each individual’s interests. From an advertiser’s perspective, that means advertising can be ever more precisely targeted and potentially deliver better results.

Does Meta have any basis for its actions?

It depends on your perspective. So far, data protection regulators have consistently ruled against Meta’s actions, with the ANPD’s statement being the strongest yet. Across Europe, privacy groups such as noyb have also been vocal in their opposition to Meta’s plans.

However, Meta points to its terms and conditions. In Section 3.3, headed “The permissions you give us”, it states:

“However, to provide our services, we need you to give us some legal permissions (known as a “Licence”) to use this content. This is solely for the purposes of providing and improving our Products and services as described in Section 1 above.

“Specifically, when you share, post or upload content that is covered by intellectual property rights on or in connection with our Products, you grant us a non-exclusive, transferable, sub-licensable, royalty-free and worldwide licence to host, use, distribute, modify, run, copy, publicly perform or display, translate and create derivative works of your content (consistent with your privacy and application settings).”

It is on that clause, particularly the second paragraph above, that Meta relies for its belief that users have granted it permission to use their content to train its AI.

The ANPD’s view on Meta’s actions

The ANPD says that Meta’s move could affect a substantial number of people. Facebook alone has around 102 million active users in Brazil. The ANPD gave no figures for WhatsApp or Instagram, and Meta has not published its own numbers for Brazilian usage.

As already noted, the ANPD criticised the use of an inappropriate legal hypothesis for the processing of personal data, among other issues. It also commented, “In this specific case, the ANPD considered that the information available on Meta’s platforms is, in general, shared by holders for relationships with friends, nearby communities, or companies of interest.

“Given this, in a preliminary analysis, there would not necessarily be an expectation that all this information – including that shared many years ago – would be used to train AI systems, which were not even implemented when the information was shared.”

The latter part of that statement is interesting. It suggests there is doubt over whether Meta can lawfully rely on the non-exclusive licence it claims users granted it when they agreed to its terms and conditions.

The ANPD also criticised Meta’s safeguards, stating, “It was found that personal data from children and teenagers, such as photos, videos, and posts, could also be collected and used to train Meta’s AI systems.

“According to the LGPD, the processing of data from children and adolescents must always be carried out in their best interests, with the adoption of safeguards and risk mitigation measures, which was not verified in the scope of the preliminary analysis.”

The full 24-page judgement can be found here.

Enterprise Times: What does this mean?

Once again, Meta has been told by a data protection authority, this time Brazil’s ANPD, to stop processing personal data to train its AI. This will continue to frustrate Meta, which will see itself as falling behind competitors in making the best use of user data. The longer this goes on, the longer it will take to monetise the vast amount of personal data it holds.

From the users’ perspective, it is refreshing to see multiple data protection authorities take the view that Meta’s move is wrong. However, it remains to be seen whether any of the orders to stop will be challenged and overturned. Meta has yet to announce any appeals, but that does not mean it will let the matter rest. Ultimately, there is too much advertising revenue at stake for it to walk away.

Many companies will be watching this case closely as they look for new ways to monetise data with AI.
