With new draft biometric regulations released in China and a wave of BIPA enforcement in the US, how can companies stay compliant when processing biometric data?

Biometric data comprises biological features that uniquely identify individual people, including fingerprints, facial geometry scans captured by facial recognition technology (FRT), voiceprints and iris patterns. It is increasingly used to authenticate identity: passwords for smartphones and devices are being replaced with fingerprints and photo ID, and employers use fingerprint scans to control employee access to certain systems. More recently, with increased age verification requirements, companies have been turning to face-scanning technologies to estimate a user’s age.
Biometric data provides a permanent and unique identifier for each individual, one that cannot be lost or forgotten. However, the technology poses significant privacy concerns: many companies have built databases of highly sensitive customer personal data which, unlike a password, is impossible to change if compromised.
If accessed, this sensitive data could be misused by bad actors in ransomware incidents or combined with deepfake technology to impersonate people. Biometric systems also pose bias and discrimination risks: in the US, African American and Asian faces are 10 to 100 times more likely to be misidentified by FRT than Caucasian faces.
A recent uptick in biometric data-related enforcement, particularly in the US state of Illinois, which has seen hefty fines issued to Instagram, ByteDance and Facebook, shows that protecting sensitive data is becoming a top regulatory priority.
Biometric data processing requirements
Biometric data obligations for companies are increasing globally, and companies must keep pace to avoid costly enforcement action.
China
On 9 August 2023, the Cyberspace Administration of China (CAC) proposed rules to regulate the use of FRT in public places (Chinese language only), which would place strict limits on its use by private entities. The new rules would limit the use of FRT to instances where there is a specific purpose and sufficient necessity, requiring companies to prioritise non-biometric identification methods where possible.
Businesses including banks, stadiums, airports and hotels would be banned from using FRT to verify identity unless required by law, and all entities that use the technology would need to carry out the burdensome task of obtaining the consent of each individual being identified. FRT could not be used to analyse individuals’ race, religion or other sensitive personal information, including health data. The CAC is consulting on the proposed rules until 7 September 2023.
USA
There is currently no federal biometric data law in the US; instead, a state-by-state patchwork of legislation and regulations applies. The Illinois Biometric Information Privacy Act 2008 (BIPA), known as the world’s most stringent biometric privacy law, applies only to conduct that has substantially taken place in the state. It allows companies to collect iris scans, fingerprints, voiceprints and facial geometry scans, but only with written consent.
Damages can range from US$1,000 to US$5,000 per violation. Additionally, the Illinois Supreme Court has ruled that BIPA claims accrue with each scan of biometric data taken without consent, and that individuals have five years from the date of an alleged violation to file a claim.
Tough sanctions result in huge costs for non-compliant businesses, with:
- Instagram being fined US$68.5 million on 31 July 2023 after it was accused of collecting and storing consumers’ biometric identifiers without informed consent;
- ByteDance’s video editing app, CapCut, being hit with a class action lawsuit on 28 July 2023 seeking US$1,000 for each negligent violation and US$5,000 per intentional violation for failing to obtain adequate consent;
- the railroad freight operator BNSF being fined US$228 million in October 2022 for not providing written disclosures regarding the purpose and duration of the use of its employees’ biometric data; and
- Facebook being fined US$650 million in March 2021 for tagging users in photos using FRT without their consent.
Beyond Illinois, Texas and Washington have also enacted biometric privacy laws, which impose similar requirements relating to notice, consent, and mandatory security measures. Texas has sued Google and Meta for allegedly capturing biometric data without consent under the Texas Capture or Use of Biometric Identifier Act 2009. The Washington Biometric Privacy Protection Act 2017 has seen no enforcement action as yet; unlike the Illinois law, neither the Washington nor the Texas statute permits individuals to bring a private right of action.
Other US states including California, New York, Colorado, North Carolina and Florida have provisions that cover biometric data collection, but no specific legislation thus far. A few states have proposed laws that are similar to existing legislation, including compliance and consent requirements, namely the:
- Arizona Act Relating to Biometric Information;
- Hawaii Biometric Information Privacy Act;
- Maryland Biometric Data Privacy Act;
- Massachusetts Biometric Information Privacy Act;
- Minnesota Act Relating to Private Data and Establishing Standards for Biometric Privacy;
- New York Biometric Privacy Act;
- Tennessee Consumer Biometric Data Protection Act; and
- Vermont Act Relating to Protection of Personal Information.
Additionally, Section 5 of the Federal Trade Commission Act 1914 (the Act) covers unfair or deceptive business conduct, and the Federal Trade Commission (FTC) has warned that false or unsubstantiated claims about the accuracy or efficacy of biometric information technologies, or about the collection and use of biometric information, may violate the Act. The FTC has taken action against Facebook and the photo app developer Everalbum for deceiving consumers about their use of facial recognition technology.
EU
Under the EU General Data Protection Regulation (GDPR), the processing of biometric data is prohibited unless:
- explicit consent has been given;
- processing is necessary for carrying out the data controller’s obligations in the field of employment, social security and social protection law;
- it is essential to protect the individual’s vital interests and they are incapable of giving consent;
- it is necessary for the establishment, exercise or defence of legal claims; or
- it is necessary for reasons of public interest in the area of public health.
Approaches between EU Member States (MS) may vary, as the GDPR permits MS to impose additional conditions and limitations on the processing of biometric data.
The GDPR classes biometric data as special category (sensitive) personal data, so data controllers will need to conduct privacy impact assessments for many forms of biometric data processing. Additionally, any entity that uses FRT must delete the data within 28 days of the final use of the service, and the data must be stored in Europe.
The incoming EU Artificial Intelligence Act also aims to limit the use of biometric identification systems, particularly those that could enable surveillance, such as FRT. In addition to the existing applicable legislation, the draft AI act proposes new rules governing the use of FRTs, many of which would be considered "high risk" systems that would either be prohibited or need to comply with strict requirements, including:
- creating a risk management system;
- increased data governance to ensure that datasets are relevant, representative and accurate; and
- record keeping and transparency requirements.
On 27 July 2023, the digital rights group NOYB filed a complaint against Ryanair for using FRT to verify customers’ identities when booking through third-party online travel agents. NOYB described the move as a bid to “obtain an unfair competitive advantage over alternative booking channels” and a violation of customers’ right to data protection.
Worldcoin, a cryptocurrency project co-founded by Sam Altman, CEO of ChatGPT developer OpenAI, launched on 24 July 2023 and scans users’ irises to confirm identity. Both the French and UK data watchdogs have questioned the legality of the biometric data collection and will be making further inquiries into the service, with no further details currently available.
A Dutch employer was fined €725,000 (US$796,000) in April 2020 for processing employees’ biometric data without clear consent, failing to give employees enough information about how their data would be used, and retaining ex-employees’ biometric data for too long.
UK
The UK approach to regulating biometric data has been criticised for being fragmented and not robust enough. The technology is governed by a patchwork of laws, most notably the UK General Data Protection Regulation (UK GDPR), which prohibits the processing of biometric data unless:
- explicit consent has been given;
- it is needed for employment, social security and social protection purposes;
- it is necessary to protect someone’s vital interests;
- it is carried out by a not-for-profit body in relation to its members;
- it relates to legal claims or judicial acts;
- the data has been made public by the data subject;
- it is necessary for reasons of substantial public interest;
- it concerns public health; or
- it will be used for archiving, research and statistics.
Under the UK GDPR, data controllers handling biometric data must also conduct privacy impact assessments.
In May 2023, FRT company Clearview AI was fined £7.5 million (US$9.4 million) and ordered to delete all data belonging to UK residents by the UK Information Commissioner’s Office (ICO). Clearview claimed to have scraped 20 billion images from online sources including Facebook and Instagram. The ICO, along with regulators in Australia, France and Italy, found that the company violated data protection law by using biometric data without consent, failing to have a lawful reason to collect such information, and failing to have processes in place to stop the data being retained indefinitely.
A major breach was found in the biometric system used by UK banks, police and defense firms, exposing over 17.8 million records including fingerprint data, facial recognition data, and face photos of users. Researchers found that the data was left unprotected and was discoverable on a publicly accessible database.
How can companies lawfully process biometric data?
Obtain consent through transparency
To comply with biometric data laws globally, if a company collects, captures or purchases biometric data, it should provide clear notice to consumers that biometric data collection is occurring.
For companies based in the US, specifically in Illinois, Washington or Texas, consumers or their legal representative must be notified in writing each time their biometric data is collected:
- that their biometric data is being collected or stored;
- the purposes for collecting, storing, and using the biometric data; and
- how long the data will be used or stored for.
Companies must also obtain the data subject’s or their legal representative’s written consent to collect the biometric data and, if based in Illinois, should store these consent agreements for at least five years in the event of a BIPA lawsuit. Companies can implement a system for providing and tracking notice and consent.
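The notice-and-consent record keeping described above can be automated with a simple consent log. The sketch below is illustrative only: the `ConsentRecord` fields and `ConsentLog` API are assumptions for demonstration, not structures prescribed by BIPA or any other statute.

```python
from dataclasses import dataclass
from datetime import datetime


@dataclass
class ConsentRecord:
    """One written-consent event, mirroring the notice items BIPA requires."""
    subject_id: str
    purpose: str            # why the biometric data is collected and used
    retention_period: str   # how long the data will be used or stored
    consented_at: datetime  # when written consent was captured


class ConsentLog:
    """Stores consent records so each collection event can be justified later."""

    def __init__(self) -> None:
        self._records: dict[str, list[ConsentRecord]] = {}

    def record_consent(self, rec: ConsentRecord) -> None:
        # Keep every record; Illinois practice suggests retaining them
        # for at least the five-year limitation period.
        self._records.setdefault(rec.subject_id, []).append(rec)

    def has_valid_consent(self, subject_id: str, purpose: str) -> bool:
        # Consent must cover the specific purpose of this collection event.
        return any(r.purpose == purpose
                   for r in self._records.get(subject_id, []))
```

A company would check `has_valid_consent` before each scan and refuse to collect when it returns `False`.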
Third-party vendors providing biometric technology should also be wary, as although Illinois courts are split on whether third-party vendors must comply with the BIPA notice and consent provisions, a February 2023 ruling held that BIPA does not provide a carveout for third-party vendors. Therefore, third-party vendors should:
- not take active steps to collect or obtain biometric data;
- maintain only passive possession of biometric data, for example storing the data without using it; and
- contractually require clients to obtain written consent that includes the vendor from consumers.
Be clear about how the data is shared with third parties
Companies should tell consumers if their biometric data will be shared with or processed by third-party vendors, and obtain consent for this. Additionally, companies should limit access to the biometric data; if it must be distributed to a third-party vendor, the vendor contract should be drafted carefully to clearly define the parameters surrounding the biometric data.
Implement privacy by design
The US Federal Trade Commission (FTC) has suggested that companies using FRT should design their services with consumer privacy in mind. The policy statement released in May 2023 by the FTC states that “if companies consider the issues of privacy by design, meaningful choice, and transparency at this early stage, it will help ensure that this industry develops in a way that encourages companies to offer innovative new benefits to consumers and respect their privacy interests.”
Check out this interview and how-to guide on how to implement privacy by design.
Establish clear policies
Companies should establish publicly available, detailed biometric data-specific privacy policies that give clear notice that biometric data is being collected, explain the purposes for which the data is used, and set out the company’s schedule and guidelines for the retention and destruction of this data.
In their data policies, companies should also outline how the data will be handled if the business is sold, closed or enters bankruptcy. Additionally, companies can check if their cyber-risk insurance covers biometric data claims.
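A published retention-and-destruction schedule is easier to honour when it is also enforced in code. The sketch below assumes a per-purpose retention table; the purposes and periods shown are placeholders for illustration, not legal advice or figures from any statute (the 28-day entry simply echoes the FRT deletion window mentioned above).

```python
from datetime import datetime, timedelta
from typing import Optional

# Illustrative retention periods per processing purpose (placeholders only)
RETENTION = {
    "identity_verification": timedelta(days=28),   # e.g. delete shortly after final use
    "employee_access": timedelta(days=365),
}


def is_expired(purpose: str, last_used: datetime,
               now: Optional[datetime] = None) -> bool:
    """True when a record has passed its retention period and must be destroyed."""
    now = now or datetime.now()
    return now - last_used > RETENTION[purpose]
```

A scheduled job could call `is_expired` for each stored record and securely destroy those that return `True`, keeping practice aligned with the published policy.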
Companies must also conduct privacy impact assessments as required in the EU and UK.
Check out these checklists for creating a privacy policy in the UK, EU and US.
Access and control features
Companies can give users control over their facial biometrics at a granular level so they can opt-out, correct, delete or restrict use.
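These user controls map naturally onto explicit operations over stored biometric records. The sketch below is a minimal illustration; the `BiometricStore` class and its method names are assumptions for demonstration, not a prescribed design.

```python
from enum import Enum


class Status(Enum):
    ACTIVE = "active"
    RESTRICTED = "restricted"   # retained but excluded from processing


class BiometricStore:
    """Holds biometric templates and exposes the user controls described above."""

    def __init__(self) -> None:
        self._data: dict[str, dict] = {}  # subject_id -> {"template", "status"}

    def enroll(self, subject_id: str, template: bytes) -> None:
        self._data[subject_id] = {"template": template, "status": Status.ACTIVE}

    def correct(self, subject_id: str, template: bytes) -> None:
        # "Correct": replace an inaccurate template with a new one
        self._data[subject_id]["template"] = template

    def restrict(self, subject_id: str) -> None:
        # "Restrict use": keep the data but stop processing it
        self._data[subject_id]["status"] = Status.RESTRICTED

    def delete(self, subject_id: str) -> None:
        # "Opt-out / delete": remove the template entirely rather than flag it
        self._data.pop(subject_id, None)

    def may_process(self, subject_id: str) -> bool:
        rec = self._data.get(subject_id)
        return rec is not None and rec["status"] is Status.ACTIVE
```

Every processing path would gate on `may_process`, so a restriction or deletion takes effect immediately across the system.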
Address risk of bias and discrimination
The ICO has suggested that companies address any risk of bias and discrimination in the biometric technology they employ. To protect consumers, companies should use biometric systems that build in accountability and transparency requirements.