Legal: Face up to data fears

Technology is moving faster than regulation. How is travellers’ right to privacy protected?

Business travellers are likely to experience a changing security environment at airports, both here in the UK and around the world. Heathrow has announced the installation of upgraded security scanners that use “computed tomography” to produce 3-D images of the contents of baggage, which will finally remove the requirement to place liquids in sealed plastic bags and to take laptops out of baggage.

Heathrow will lead the way with this new technology and will also expand its current use of facial recognition technology (FRT), so that passengers will not have to verify their identity repeatedly as they pass through the terminal.

We are in an age where technology is moving faster than regulation, and questions must be asked about data protection, privacy and consent, and about how this data is used, retained or destroyed.

Facial recognition law
New legislation was proposed in March 2019 – the Commercial Facial Recognition Privacy Act – under which notification would have to be given when facial recognition information is collected or used, third-party testing would be required to guard against bias, and limits would be placed on how the data may be used.

There is considerable support for new laws requiring privacy protections and tight regulation of FRT. US Customs and Border Protection proposes to use facial recognition technology on more than 100 million passengers taking international flights out of the US. Critics say the technology is not yet mature and may not always be accurate, leading to misidentification. Once passengers have passed through security checks, the data should be deleted, and strong technical and security safeguards should be put in place.

Europe: Personal data
The General Data Protection Regulation (GDPR) makes it clear that biometric information is “personal data” controlled under its provisions. Article 9 places biometric data (including FRT data used to identify a person) in a special category whose processing is prohibited unless specific exceptions apply.

Any data which identifies an individual, or which might do so when combined with other data, counts as “personal data” and is regulated. This includes the digital fingerprints of facial recognition. There needs to be a lawful basis to process this data which could include the consent of the passenger, “legitimate interests” and processing necessary to perform a contract or comply with a legal obligation.

Relying on consent
If the processor of FRT data intends to use it for commercial purposes and seeks to rely upon the subject’s consent, then the consent must be “informed”, and the passenger should be given this information before any processing of the “fingerprint” takes place.

This might be achieved by signs and displayed information and guidance. The individual should be given the right to object to their personal data being used, particularly for commercial activities, such as direct marketing.

It is likely that regulators will expect much higher levels of security and anonymisation to prevent misuse and breach of FRT data. Retention of the data “fingerprints” should be justified and must be carefully monitored.

Expect regulator action
The use of FRT is more than a security issue. Consent, privacy and strict rules should be in place to ensure data is not misused. With the rapid expansion of this technology, we should expect action from the regulators to prevent commercial exploitation of the harvested data.
