Rob Sherman, Facebook’s Deputy Chief Privacy Officer, has published a blog post saying that Facebook will block the use of its data for surveillance. The post states: “Today we are adding language to our Facebook and Instagram platform policies to more clearly explain that developers cannot ‘use data obtained from us to provide tools that are used for surveillance.’”
Sherman goes on to say: “Over the past several months we have taken enforcement action against developers who created and marketed tools meant for surveillance, in violation of our existing policies; we want to be sure everyone understands the underlying policy and how to comply.” Privacy groups will welcome the statement. They have berated Facebook, Twitter and other social media companies for allowing data to be used to track demonstrators and individuals.
Sherman acknowledges that the American Civil Liberties Union of California, Color of Change and other groups have helped Facebook refine its policies. He also says that this is part of an ongoing review of how data is used, and highlights action already taken against some ad providers.
Is Facebook facing its Canute moment?
It is a grand statement by Facebook, but does it really mean anything for users of the platform? This is a difficult question to answer. Facebook is using the right language and changing its policies. It is also demonstrating a willingness to enforce them against some offenders. But is this enough to make it a trustworthy platform?
Like other free social media platforms, Facebook makes its money from advertising and the sale of the data it holds. For example, post to Facebook and you grant it a non-exclusive right to use and resell your data, photos and other content. It then sells this to marketing companies, which use the data to create campaigns targeting individuals.
Marketing companies use the same analytics tools as intelligence agencies to track individuals across social media. Post something on Facebook and it can easily be correlated with posts on Twitter, LinkedIn and elsewhere. Post an image of an event and it quickly becomes part of a wider analytics process. The big analytics platforms are already using facial recognition to match faces. For law enforcement this is essential when working out who was at a rally, riot or other event.
Intelligence agencies have always used front companies to disguise their operations, and a number of marketing and related companies have been linked to those agencies. How Facebook will shut them down is unclear. It does not publish an easily found list of those it has denied data access to, something that will be required to build trust in the system.
Is there a case for law enforcement and intelligence agency access?
The answer here is yes. No matter what people think, the data generated by individuals plays an important part in solving crimes. Photos taken by those at the scene of a crime can help identify suspects. Similarly, data from the telematics inside a vehicle can show whether someone was speeding or driving recklessly before an accident.
Photo evidence gathered after the Boston Marathon bombings was central to a quick resolution of that case. In terror incidents, social media accounts and images are used to track down co-conspirators. Across the world this has had beneficial effects.
However, it has also had the opposite effect. Surveillance can deter people from demonstrating legally if they believe they will be tracked by government agencies. Privacy is an important right that is easily lost if everything we create is open to surveillance.
This is where the balance needs to be found. Ironically, the non-exclusive right to the data does provide law enforcement with a route to access it. The only solution would be for Facebook to revoke its data grab from users, and that would impact its commercial activities.
A failure to rein in developers
It is not just data being used for surveillance that needs to be looked at here. Facebook has long faced pressure over the excessive data requested by developers. For example, every app wants access to your public profile, but few users realise what that contains. In addition, many apps want permission to “post as you” and to access your photos and videos, as well as those shared with you by other people.
Importantly, much of that data is not required by many apps. Take a birthday app. Your name, date of birth and friends list, so it can remind them, make sense. However, your gender, your photos, your friends’ birthdays and the photos they have shared with you are excessive. Not all of that data is legally yours to share. Facebook understands this and could tighten up its own developer rules by requiring developers to check with the third parties whose data is being requested. To do that would add extra work for developers and likely reduce the data they can, or will, harvest.
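The gap between what the birthday app needs and what it asks for can be checked mechanically. The sketch below, a hypothetical illustration rather than anything Facebook provides, compares an app’s requested permissions against a minimal set; the scope names are modelled loosely on Facebook Login permissions, and both lists are assumptions for the example.

```python
# Hypothetical sketch: flag permissions an app requests beyond what it needs.
# Scope names are illustrative, loosely modelled on Facebook Login permissions;
# the "minimal" set for a birthday app is an assumption for this example.

MINIMAL_BIRTHDAY_APP_SCOPES = {"public_profile", "user_birthday", "user_friends"}

def excess_scopes(requested):
    """Return the permissions requested beyond the minimal set."""
    return set(requested) - MINIMAL_BIRTHDAY_APP_SCOPES

requested = ["public_profile", "user_birthday", "user_friends",
             "user_photos", "user_gender", "publish_actions"]
print(sorted(excess_scopes(requested)))
# → ['publish_actions', 'user_gender', 'user_photos']
```

A review process built on a check like this would force developers to justify each scope outside the minimal set, rather than defaulting to asking for everything.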
As a result, Facebook has done little to improve this area of privacy. It also tinkers regularly with its privacy settings, which can cause some items to become less secure. This is something it needs to address. Ironically, these are areas the ACLU and others have not taken it to task over. If they had, Facebook might have dealt with them. It seems that it only responds when big enough organisations make a lot of noise.
As part of this article we looked at the data requested by 15 randomly chosen apps. All wanted basic data, and all bar one wanted photos and video. The majority also wanted to be able to post as the user.
A problem that is easily solved
This is not an insoluble problem. Read the Facebook developer documentation and it already makes clear that some permissions require review. All Facebook has to do is give the user better control over what is used. Instead of the single public profile entry, which includes name, gender, age, date of birth, locale, picture and other data, it could allow a user to uncheck each item. Where an app wants access to anything beyond the public profile, it could make it mandatory that users are able to say yes or no. The app would then have to keep those choices in a configuration file that the user could edit or change later.
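The user-editable configuration file described above could be as simple as a per-field consent map. The sketch below is a minimal illustration under that assumption; the field names, the JSON layout and the `filter_profile` function are all hypothetical, not an existing Facebook mechanism.

```python
# Hypothetical sketch of a per-field consent file: the user unchecks
# individual public-profile items and the platform filters what an app
# receives. All names here are assumptions for illustration.

import json

profile = {
    "name": "Alice Example",
    "gender": "female",
    "age_range": "25-34",
    "birthday": "1990-04-01",
    "locale": "en_GB",
    "picture": "https://example.com/alice.jpg",
}

# User-editable consent file: true = share with this app, false = withhold.
consent_json = """{
    "name": true, "gender": false, "age_range": false,
    "birthday": true, "locale": true, "picture": false
}"""

def filter_profile(profile, consent):
    """Return only the profile fields the user has left checked."""
    return {field: value for field, value in profile.items()
            if consent.get(field, False)}

consent = json.loads(consent_json)
shared = filter_profile(profile, consent)
# shared now contains only name, birthday and locale
```

Because the consent file is plain data rather than code, the user could revisit and change it at any time, and the platform could enforce it on every request the app makes.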
A more complex problem is ensuring that app developers are not able to sell on data harvested from users. Facebook could create a locked store where data gathered by apps is held, with controls to prevent the data leaving the platform. This would give it an opportunity to develop and offer analytics tools to developers while keeping everything on its own platform. It presents a commercial opportunity for Facebook as well.
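The locked-store idea amounts to a simple access-control rule: raw rows never leave the platform, only aggregate results do. The sketch below illustrates that rule; the `LockedStore` class and its methods are assumptions for the example, not any real Facebook API.

```python
# Hypothetical sketch of the "locked store" idea: app-gathered data stays
# on the platform, and developers can only run on-platform analytics that
# return aggregates, never raw records. All names are assumptions.

class LockedStore:
    def __init__(self):
        self._rows = []  # raw harvested data, held on-platform only

    def ingest(self, row):
        """Apps write data in; nothing here returns it back out."""
        self._rows.append(row)

    def export_raw(self):
        """Exporting raw rows off-platform is always refused."""
        raise PermissionError("raw data may not leave the platform")

    def aggregate(self, key):
        """On-platform analytics: return counts per value, not records."""
        counts = {}
        for row in self._rows:
            counts[row[key]] = counts.get(row[key], 0) + 1
        return counts
```

Under this design a developer could still learn, say, how many of an app’s users are in each city, but could not extract the underlying individual records to sell on.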
The bigger issue is the Facebook firehose where it sells large blocks of data to third-party marketing companies. That data is not trackable or controllable once it has left Facebook. This is where it needs to look for a better solution and improve its controls. If it starts with better granular controls over the use of data by developers, it could extend those controls for other data users. Users could opt out of their data being sold on to third-parties but Facebook is unlikely to risk this as it would kill the golden goose.
A delicate balancing act
Of course, there is also a balancing act to be performed. Facebook needs the developers and marketers on its platform. Users need them there too; otherwise they would have to pay for the platform, and what evidence there is shows that users do not want to do that.
Facebook has resisted the two-speed platform option of allowing some users to pay for access and not be hit with adverts. There is a good reason for this. Without knowing how many ads a user is exposed to, it cannot calculate how much revenue it would lose for each paying user. It is also conscious that a two-speed platform would cause it more problems in data management and control, as it would need to put other controls in place for paying users.
Advertisers would also want to know how many people are paying to avoid their content, and this would cause many to rethink what they pay. Facebook would also have to deal with problems over its “promote your post” approach. It already restricts who sees your posts; to reach all your friends and followers you have to consider paying to have your posts promoted. If people could opt out of this as part of a paid account, it would mean a further loss of revenue.
What Facebook is saying is good news. Trying to block the use of advanced analytics to track people makes perfect sense. However, it is inconsistent with how Facebook behaves, and there is no evidence that it will stamp out this behaviour. Changing the wording in terms and conditions is meaningless unless backed up with legal action. Kicking someone off a platform has limited effect; taking them to court and bringing other government agencies, such as Data Protection Registrars, into the case is far more powerful.
There are three core issues here. The first is user privacy, which Facebook keeps telling everyone is paramount. It should be, but there are gaps that still need to be fixed. The second is developer and corporate access to data. This requires better user controls over how their data is used, not the rights grab that Facebook and other social media platforms insist on. The last is reasonable government access for law enforcement purposes.
Ultimately this is all about revenue generation. Facebook knows that it has to keep the money rolling in and being too restrictive will hurt it. What it is doing now is trying to show it recognises that some things are bad. It will take a while to see if it can really enforce this new approach.