PIAC comments on OPCC Facial Recognition 2021 Consultation
On June 10, the Office of the Privacy Commissioner of Canada (OPCC) released a draft “Privacy guidance on facial recognition for police agencies,” a proposed guideline for police to follow when using facial recognition technology (FRT), “with a view to ensuring any use of FR complies with the law, minimizes privacy risks, and respects privacy rights.” The OPCC sought feedback from stakeholders on this draft guidance, which was developed jointly by the federal, provincial and territorial privacy protection authorities, and outlines the current privacy and legal framework that applies to police use of FRT. The consultation’s feedback questions fell under one of two main considerations: Whether the guidance will be effective in ensuring police use of FRT is lawful and privacy protective, or whether FRT is appropriately regulated under the existing legal and policy framework.
PIAC contributed a submission which expressed substantial doubt on both fronts.
On the matter of whether the draft guidance will be effective in bringing police agencies in line with privacy obligations and the law, PIAC believes the guidance is unenforceable and therefore merely aspirational. Police agencies may choose to implement all, some, or none of its recommendations; compliance depends entirely on whether an agency is committed to prioritizing privacy obligations. PIAC's skepticism is reinforced by the RCMP's use of Clearview AI, an FRT company found to have populated its massive database of facial images by illegally scraping them from social media. To date, the RCMP still does not accept that the Privacy Act imposes on it a responsibility to ensure its private sector partners lawfully collected personal information.
Only an enforceable legal framework, combined with broader, consumer-centric privacy reforms, would sufficiently limit police powers with respect to FRT use and partnerships. However, the current patchwork of law and policy applied to police use of FRT is not actually specific to FRT; as the OPCC itself acknowledges in the draft guidance, these laws "do not specifically address the risks posed by the technology." The current framework gives police too much discretion to decide how, why, when, where, and upon whom FRT is used.
PIAC proposes that police use of FRT in Canada be regulated under a single, clear legal framework consistent with privacy and human rights. Until such a law is enacted, continued use of FRT by police poses a serious risk to individual privacy rights and democratic freedoms.
Read our full submission here