Facebook shareholders under pressure over encryption plans

Facebook holds its online annual general meeting (AGM) next week, and shareholders are coming under pressure to vote ‘yes’ on a motion concerning its plans for end-to-end encryption. The proposal asks Facebook to investigate whether the encryption would increase the risk of child sexual exploitation on its platform. It calls for the company to issue a report on the issue by February 2021.

Concerns over the use of end-to-end encryption to hide child abuse are not new. Neither, it seems, is the use of Facebook as a platform for the distribution of such material. The Hastings Group wrote in a press release: “Even though most parents and grandparents may not have a clue about it, the Facebook platform used by America’s youngsters – including Facebook Messenger, Instagram, and WhatsApp – is home each year to nearly 16 million known instances of child abuse and exploitation.”

Who filed the resolution?

Proxy Impact has filed the resolution along with investor Lisette Cooper, the Maryknoll Sisters, the Dominican Sisters of Caldwell, New Jersey, and the Stardust Fund. It has the support of two major shareholder advisory services – Glass Lewis and Institutional Shareholder Services (ISS).

A statement from ISS reads: “Given the potential financial and reputational impacts of potential controversies related to child exploitation on the company’s platforms, shareholders would benefit from additional information on how the company is managing the risks related to child sexual exploitation, including risks associated with end-to-end encryption technologies.”   

Why target Facebook?

Supporters of the motion point to numerous reports showing that Facebook is the #1 hub of reported child sexual abuse material (CSAM). Their evidence includes statements from US lawmakers.

The motion cites US Deputy Attorney General Jeffrey Rosen. He noted in October 2019 that Facebook (including its subsidiaries) accounted for “well over 16 million reports” of CSAM globally in 2018, and that “70% of Facebook’s reporting … would likely not happen if the company deploys end-to-end encryption across all of its platforms”.

The supporters of the motion acknowledge that Facebook has worked to address the problem. They state: “Facebook’s efforts to date to address the problem are laudable but fall far short of what is needed. Reported incidents of child sexual exploitation and grooming (where someone builds a relationship of trust with a child or young person in order to exploit and abuse them) increased dramatically from year to year over the past decade. The bottom line is that Facebook’s efforts are not stopping these crimes against children — including infants and toddlers — on its platforms.”

Laws around takedown of CSAM getting tougher

This is not the first attempt to make a platform responsible for the content it hosts. Platforms claim that they are not publishers and have no direct involvement with any content that is posted. They also invoke freedom of speech laws, arguing that the authorities should target individuals rather than the platforms themselves.

That is changing. France and Germany have enacted hate speech laws that also cover fake news and illegal material. Germany passed its law, the Netzwerkdurchsetzungsgesetz (NetzDG), in 2017 and began enforcing it in January 2018, with fines of up to €50 million for those that fail to respond. The German Federal Criminal Police Office (BKA) report for 2018 shows a marked increase in takedown requests. It also shows an improvement between 2017 and 2018 in the rate at which organisations responded to requests to take down CSAM.

In France, the National Assembly recently passed a law, “Lutte contre la haine sur internet” (“Fighting hate on the internet”). It requires social media sites operating in France to remove offending content within 24 hours of notification. Those failing to do so face an initial fine of €1.25 million, rising to 4% of global revenue for repeat offenders.

Enterprise Times: What does this mean?

Facebook is in a difficult position. It is under increasing pressure to improve the privacy of online communication. At the same time, it has to address concerns from lawmakers, who oppose strong encryption without backdoors because it is seen as a blocker to investigations. The problem is that once end-to-end encryption is in place, even Facebook would be unable to see what is happening on its platform.

The shareholders and organisations supporting this motion warn that Facebook could face major new legislative and consumer fallout if encryption increases the amount of CSAM on its platform. They are only partly right: so far, there is no evidence that the news stories around CSAM have dampened consumer, or even business, use of the platform.

The stock markets, similarly, are not reacting to the threat. Facebook’s share price currently stands at an all-time high. Without evidence of a significant hit to the share price, winning the votes of other shareholders is likely to be tough.

However, the threat of new legislation that could hit the share price remains. France and Germany could coordinate their actions, and the US would be likely to support those moves. There is also a growing awareness among institutional shareholders that they can find themselves targeted by activists over the issue.

Facebook is looking at how it can use AI to detect illegal content on its systems. It currently relies on human moderators but, two weeks ago, reached a US$52 million settlement over PTSD claims from those moderators. This motion may well accelerate the move to automated detection, for the better.
