Specific laws are required for mass-scale facial recognition applications

The Malta Information Technology Law Association (MITLA) refers to a news report carried by Malta Today titled “Paceville is test case for facial recognition CCTV that could be deployed nationwide”, which reported that a new government company called Safe City Malta is purportedly planning to present a proof of concept for Paceville through the deployment of a system utilising high-definition CCTV cameras with facial recognition software that can identify individuals and criminals. The report further quoted: “Once the joint innovation centre is in operation, field trials will be carried with the aim for a potential investment in a nationwide deployment of the Safe City concept”.

MITLA notes that whilst the use of state-of-the-art technology by law enforcement officials is key to the prevention, detection and investigation of criminal activities, the processing of personal data through the use of such systems has to conform with prescribed laws and must constitute a necessary and proportionate measure in a democratic society, with due regard for the legitimate interests of natural persons.

MITLA would like to point out that facial biometric data is a special category of data as defined under the new EU General Data Protection Regulation (GDPR), which will apply from May 2018. As such, the processing of biometric data through facial recognition tools is subject to a stricter legal regime.

Under current legislation, most notably EU Directive 95/46/EC as well as the Maltese Data Protection Act and related subsidiary legislation, biometric data is already recognised as sensitive personal data, notably through the publications of the Article 29 Working Party.

The impact of facial recognition software applications on privacy is far greater than that of static surveillance cameras, since the processing of personal data, specifically biometric data, is much more acute, automated and process-intensive. This is exacerbated by the fact that the proposed technology often exhibits accuracy deficiencies, misidentifying gender and/or race. The potential introduction of such technologies on a nationwide scale makes a local debate even more urgent.

Any introduction of facial recognition technologies, irrespective of whether launched as a pilot project or not, will require the introduction of an ad hoc legislative framework, one which will need to balance carefully the fundamental rights and freedoms of citizens against the obligations and duties of competent authorities in their fight against crime and the preservation of public order. In light of the risks posed by emerging technologies, including facial recognition applications, such a balance will not be easily achieved and will require careful consideration.

Apart from compliance with the rules laid down in the GDPR, the processing of biometric data by law enforcement agencies will also need to be carried out in line with the provisions of EU Directive 2016/680 on the protection of natural persons with regard to the processing of data by competent authorities for the purpose of prevention, investigation, detection or prosecution of criminal offences. This Directive will need to be transposed by Member States by the 6th of May 2018.

In fact, EU Directive 2016/680 lays down that Member States shall protect the fundamental rights and freedoms of natural persons and in particular their right to the protection of personal data. These principles are also enshrined in the Council of Europe Recommendation R87/15 regulating the use of Personal Data in the Police Sector.

With respect to the processing of biometric data by law enforcement agencies, both Article 10 of EU Directive 2016/680 and Regulation 7 of our own Processing of Personal Data (Police and Cooperation in Criminal Matters) Regulations (Subsidiary Legislation 440.06) lay down that such processing can only be permitted when it is strictly necessary and subject to adequate safeguards as prescribed by national or Union law. Such specific laws currently do not exist. Directive 2016/680 is intended to better protect citizens’ data, regardless of whether they are a victim, criminal or witness.

MITLA further notes that the GDPR specifically enshrines the ‘privacy by design’ principle in Article 25. That provision requires that, in order to protect the rights of data subjects, data controllers implement, both at the time of the determination of the means of processing and at the time of the processing itself, appropriate technical and organisational measures designed to give effect to recognised data protection principles and to integrate the necessary safeguards, especially where the risks posed by such processing to the rights and freedoms of natural persons are severe.

MITLA shall continue monitoring developments with respect to the recent announcements regarding the plans currently underway to introduce facial recognition technologies and looks forward to contributing to a constructive debate. We encourage discussion around the development of a legally and ethically sound policy and system that meets true European standards.

 

MITLA is a member of the INPLP
