The City of Portland, Oregon, USA, has become the first US city to prohibit the use of facial recognition technology. The ban was passed by the City Council on September 9th and covers two groups: government agencies (City Ordinance No 765748) and private entities (City Ordinance No 765749).
What is interesting is how the two linked ordinances will be enforced. 765749 is effective from January 1st, 2021, under Title 34.10, Digital Justice. It provides for damages to be paid to anyone affected, up to US$1,000 per day.
For City bureaus, including law enforcement and other agencies, there is a process that they have to follow. That process is detailed in ordinance 765748, which also prevents City bureaus from outsourcing the use of facial recognition to third parties.
Interestingly, the Council has stopped short of issuing a blanket ban on the use of facial recognition software. This, presumably, is because it would have been virtually impossible to enforce.
While some will welcome this move, others are concerned about the impact it will have on law enforcement. Additionally, there is a significant issue over just how this will be policed, given the widespread use of devices that offer facial recognition.
A tricky problem to solve for businesses
Banning the use of facial recognition technology under ordinance 765749 could be tricky. The ordinance defines what is and what isn’t a Place of Public Accommodation as:
- Any place or service offering to the public accommodations, advantages, facilities, or privileges whether in the nature of goods, services, lodgings, amusements, transportation or otherwise.
- Does not include: An institution, bona fide club, private residence, or place of accommodation that is in its nature distinctly private.
It also defines what a “Private Entity” means:
- Any individual, sole proprietorship, partnership, corporation, limited liability company, association, or any other legal entity, however organized. A Private Entity does not include a Government Agency.
Accessing images via security cameras is very difficult
Here’s the tricky bit. Most businesses buy security solutions and have them installed. Some of these will use facial recognition. Some will allow the business to configure what areas they cover, and some will not. Importantly, some may give the business no access to manage or delete images.
For example, the Nest Hello, Nest Cam IQ Indoor and Cam IQ Outdoor all provide facial recognition. It is a feature that a business has to subscribe to. However, there is no process for a business to do discovery on the Nest images or have any specific image removed. As such, a business owner would have to cancel any subscription to be compliant with the new ordinance.
Other security systems use Internet-based facial recognition services but do allow images to be stored locally. What is not clear from looking at several of these, such as the Tend Secure Lynx or Viseum F3, is where the database is kept and who has access to it.
Small to mid-sized enterprises are likely to buy and install solutions around their premises. Setting these up to not cover public spaces such as footpaths is a problem. Few have any clear data protection policies or even a sign giving guidance on how to ask about images captured. How Portland will police all of this come January 1st, 2021 is unclear.
What about homeowners?
An increasing number of home doorbell and security solutions come with facial recognition. Making sure they only cover the home is difficult, although the setup software is getting better. In the UK, the Information Commissioner’s Office (ICO) has issued guidance to homeowners about the use of CCTV. It warns that it should not capture images of public areas.
For Portland, the question is, how does the ordinance affect homeowners using CCTV with facial recognition? If they work from home or run their business from home, will the Council see these as business systems? If not, why not? Will Council employees walk around neighbourhoods looking to see what people are using and then knock on doors to demand more information?
What about CCTV and security companies?
CCTV companies, many of which work with City bureaus and cover places like shopping malls, are going to have a very difficult time. The ability to track and identify known criminals such as shoplifters is one of the key use cases for facial recognition. Removing this will worry shopkeepers and may have an unintended impact on insurance policies for businesses.
Child protection is another key use case. Tracking lost children is something that nobody would want to prevent. However, many systems rely on facial recognition to distinguish one child from another who is similarly dressed. In cases of child abduction, time is a critical factor and losing facial recognition as a tool may prove controversial.
An even trickier problem for City bureaus
Many City bureaus rely on facial recognition as part of their security process. Access to and security of federal buildings, airports, bus and railway terminals is a major reason for buying facial recognition systems. How this will now work remains to be seen. Law enforcement will not want to go back to leafing through hundreds of pages of mug shots to identify criminals.
However, it is in law enforcement in particular where the misuse, or rather the failure, of the technology has created a problem. It is unclear from ordinance 765748 how this will be dealt with. The ordinance specifically calls out the failure of the technology and its impact on sections of the community. It also highlights the need to ensure that any solution is compatible with human rights, including privacy and the right to not be subject to indiscriminate surveillance.
Importantly, the ordinance does recognise the speed with which technology changes. It also calls for a wider discussion and engagement around technology and its uses.
The problem with facial recognition
Facial recognition is just one of a wider range of biometric technologies that have been growing in use for some time. At its most simple, it uses a set of algorithms to evaluate the structure of a face to establish a unique identity for an individual. One of the advantages it claims is that every face, just like a fingerprint, palm print or DNA sample, is unique. It is also claimed to be very hard to fool without extensive facial reconstruction surgery.
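In practice, those algorithms typically reduce each face to a fixed-length numeric vector (an embedding) and compare vectors for similarity. The sketch below is a minimal, hypothetical illustration of that matching step: the embeddings are random stand-ins for a real model’s output, and the similarity threshold is an assumed value, not one from any vendor’s system.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two face-embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def identify(probe: np.ndarray, gallery: dict, threshold: float = 0.8):
    """Return the best-matching identity above the threshold, else None."""
    best_name, best_score = None, threshold
    for name, reference in gallery.items():
        score = cosine_similarity(probe, reference)
        if score > best_score:
            best_name, best_score = name, score
    return best_name

# Toy 128-dimensional embeddings standing in for a real model's output
rng = np.random.default_rng(0)
gallery = {"alice": rng.normal(size=128), "bob": rng.normal(size=128)}

# A probe close to Alice's reference embedding should match her
probe = gallery["alice"] + rng.normal(scale=0.05, size=128)
print(identify(probe, gallery))
```

The bias issues discussed below arise upstream of this comparison: if the embedding model represents some groups of faces less distinctly, different identities land closer together and false matches become more likely.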
However, like many systems, it is prone to bias. What researchers have discovered over the last five years is that the bias is inherent in the way the algorithms work. They have also discovered that removing the bias is extremely difficult.
The result of academic research showing bias has led Microsoft, IBM, Google and Amazon to take differing views on the technology. While all have banned sales to police forces and some government entities, only IBM has formally announced it is to stop research on the technology.
These are not the only players in the field, as the NIST details below show. Few, if any, of the other players in the market have commented publicly on flaws in the technology. They have also avoided the media spotlight and chosen not to announce any moratorium on their commercial work.
In 2018, two researchers, Joy Buolamwini of the MIT Media Lab and Timnit Gebru of Microsoft Research, published a paper. Titled Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification, it looked at the bias inherent in facial recognition systems at that time. It references the Gender Shades project, which used 1,270 images to create a benchmark for its gender classification test.
What both the paper and the website show conclusively is that facial recognition software is flawed when it comes to skin tone, gender and age. To attempt to improve detection rates, the researchers added the Fitzpatrick Skin Type Comparison. Despite providing additional data points in the shape of different skin shades, bias is still inherent in the technology.
NIST Face Recognition Vendor Testing Program
NIST established its FRVT program in 2000. In February 2020, it published an update on the technology and the FRVT program. What it said was: “NIST Interagency Report 8280, released on December 19th, 2019, quantifies the effect of age, race, and sex on face recognition performance. It found empirical evidence for the existence of demographic differentials in face recognition algorithms that NIST evaluated. The report distinguishes between false positive and false negative errors, and notes that the impacts of errors are application dependent.”
As part of the testing, NIST looked at demographic differences between 189 facial recognition algorithms from 99 developers. These were tested using four collections of photographs containing 18.27 million images of 8.49 million people. All came from US government department databases (FBI, State Department, Department of Homeland Security).
NIST discovered a wide range of differences based on race, age and gender across the various algorithms. It says: “False positives may also present privacy and civil rights and civil liberties concerns such as when matches result in additional questioning, surveillance, errors in benefit adjudication, or loss of liberty. False positives are higher in women than in men and are higher in the elderly and the young compared to middle-aged adults. Regarding race, we measured higher false positive rates in Asian and African American faces relative to those of Caucasians. There are also higher false positive rates in Native American, American Indian, Alaskan Indian and Pacific Islanders. These effects apply to most algorithms, including those developed in Europe and the United States.”
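The demographic differentials NIST describes come down to a simple per-group measurement: of all comparisons between images of different people (impostor trials), how often does the system wrongly declare a match? The sketch below shows that calculation on hypothetical, made-up trial records; the group labels and numbers are illustrative only, not NIST data.

```python
from collections import defaultdict

# Hypothetical impostor-trial records: (demographic_group, matched_flag).
# matched_flag is True when the system wrongly declared that two
# different people matched (a false positive).
trials = [
    ("group_a", True), ("group_a", False), ("group_a", False), ("group_a", False),
    ("group_b", True), ("group_b", True), ("group_b", False), ("group_b", False),
]

def false_positive_rates(records):
    """False positive rate per demographic group over impostor comparisons."""
    totals, errors = defaultdict(int), defaultdict(int)
    for group, matched in records:
        totals[group] += 1
        if matched:
            errors[group] += 1
    return {group: errors[group] / totals[group] for group in totals}

print(false_positive_rates(trials))
```

A uniform match threshold can produce very different rates per group, which is exactly the kind of disparity the NIST report quantified across 189 algorithms.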
Portland is delivering on a long-standing goal
Portland considers itself a sanctuary city and one that is fully inclusive for both residents and visitors. As a sanctuary city, it attracts many whose immigration status in the US may cause them problems with federal agencies. Banning the use of facial recognition is the latest move by the Council in line with its status.
This is also not a sudden decision by Portland. It has its roots in a decision from 2009 when the City Council declared Portland’s commitment to Open Data. Eight years later, the city established its Open Data Policy and Program. At the time Mayor Ted Wheeler said: “In 2009, Portland was the very first jurisdiction to declare its commitment to Open Data.
“Portland continues to be on the cutting edge, now taking this important next step to set up policies to implement an Open Data Program in the City.”
In June 2018, the City Council created the Smart City PDX Priorities Framework. It seeks to address inequalities in how technology serves different communities in Portland.
In June 2019, the City Council established its Privacy and Information Protection Principles. They are designed to: “serve as guidance for how the City of Portland collects, uses, manages and disposes of data and information, and directed staff at the Bureau of Planning and Sustainability and Office of Equity and Human Rights to identify and develop policies and procedures that promote these Principles.”
What about facial recognition plus other technology?
The use of facial recognition as part of a technology solution, such as a second factor in a security solution or in anti-fraud solutions, is increasing. Some of these are aimed at enabling individuals to do things like unlocking their devices. In this case, the user has opted in to use the technology.
A more interesting use is that of facial recognition as part of a wider emotion verification solution. These are becoming increasingly popular in anti-fraud solutions. Face plus voice are the two most commonly used. The software listens for changes in the way someone speaks such as speed, hesitation or pitch. It then combines that with visual cues from the face. It also uses facial recognition to establish the identity of the person being verified. Is it just as prone to error on those facial expressions as it is in detecting an individual?
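Combining face and voice in this way is usually a matter of score-level fusion: each channel produces a confidence score and the system blends them. The sketch below is a hypothetical illustration; the weighting, score scales and function names are assumptions, not taken from any real anti-fraud product.

```python
def fused_verification_score(face_score: float, voice_score: float,
                             face_weight: float = 0.6) -> float:
    """Weighted score-level fusion of face and voice confidence.

    Both inputs are assumed to be normalised to [0, 1]. The 0.6 weight
    is purely illustrative: it just expresses a design choice to trust
    the facial channel slightly more than the voice channel.
    """
    return face_weight * face_score + (1 - face_weight) * voice_score

# A confident face match combined with hesitant-sounding speech
# yields a middling overall score: 0.6 * 0.9 + 0.4 * 0.4 = 0.70
print(fused_verification_score(0.9, 0.4))
```

The weakness the article raises carries through the fusion: if the facial channel misreads expressions for some groups of people, the blended score inherits that error.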
Many of the systems using this are aimed at customers of an organisation. Ordinance 765749 does not define the difference between customers, employees and members of the public. What it does, in Chapter 34.10, is define use in a public space and the automated search for a reference image. As such, fraud prevention seems a grey area as to the continued use of a facial recognition database.
One obvious area where this would have an impact is access to events and sports stadia. To deal with ticket touts, venue owners have been looking at ways to improve the identification of ticket holders. Like airports, the combination of a ticket and an image is one solution. It seems reasonable, therefore, that such systems will need to be redesigned before such venues reopen after the current lockdown ends.
Enterprise Times: What does this mean?
The biggest challenge for Portland is how to implement this fairly. Does it just target big business and CCTV companies? How does it deal with mid-sized organisations that may not know what their security systems do? A bigger challenge is the SOHO sector. These are people operating from home who may not be registered with the Council.
Dealing with City bureaus will have to be done carefully to avoid significant unintended consequences for citizens and the city. The Council accepts it does not have the skills or money to evaluate facial recognition systems. However, it will need to fix this if it wants to make informed decisions to choose the right technology to support its citizens. It could, of course, use the NIST report and data to inform its decision-making process. Will it? Only time will tell.
Portland is taking a brave stance and one that is entirely consistent with its long-term goals of open data and data transparency. It is rightly taking aim at a technology that has been proven to be flawed. How many other cities in the US, or even around the world, will follow its stance is yet to be seen.