Teresa Scassa - Blog


Bill C-11, the Act to reform Canada’s private sector data protection legislation, contains a new provision, one that has no equivalent in the current Personal Information Protection and Electronic Documents Act. Section 39 will permit the disclosure of an individual’s personal information without their knowledge or consent where the disclosure is for “socially beneficial purposes.” This post examines the proposed exception.

In the course of their commercial activities, many private sector organizations amass vast quantities of personal data. In theory, these data could be used for a broad range of purposes – some of them in the public interest. There are a growing number of instances where organizations offer to share data with governments or with other actors for public purposes. For example, some organizations have shared data with governments to assist in their research or modeling efforts during the pandemic.

There may also be instances where data sharing is part of the quid pro quo for a company’s partnership with the public sector. Los Angeles County, for example, has sought to require data-sharing in exchange for licences to operate dockless scooter rental businesses. The ill-fated Sidewalk Toronto project raised issues around data collected in public spaces, including who would be able to access and use such data and for what purposes. This led to debates about “data trusts”, and whether an entity could be charged with the oversight and licensing of ‘smart city’ data.

It is into this context that the proposed exception for “socially beneficial purposes” is introduced. Section 39 of Bill C-11 reads:

39 (1) An organization may disclose an individual’s personal information without their knowledge or consent if

(a) the personal information is de-identified before the disclosure is made;

(b) the disclosure is made to

(i) a government institution or part of a government institution in Canada,

(ii) a health care institution, post-secondary educational institution or public library in Canada,

(iii) any organization that is mandated, under a federal or provincial law or by contract with a government institution or part of a government institution in Canada, to carry out a socially beneficial purpose, or

(iv) any other prescribed entity; and

(c) the disclosure is made for a socially beneficial purpose.

The first thing to note about this provision is that it reflects a broader ambivalence within the Bill about de-identified data. The ambivalence is evident in the opening words of section 39. An organization “may disclose an individual’s personal information without their knowledge or consent” if it is first de-identified. Yet, arguably, de-identified information is not personal information. Many maintain that it should therefore be usable outside of the constraints of data protection law, as is the case under Europe’s General Data Protection Regulation. Canada’s government is no doubt sensitive to concerns that de-identified personal information poses a re-identification risk, leaving individuals vulnerable to privacy breaches. Even properly de-identified data could lead to re-identification as more data and enhanced computing techniques become available. Bill C-11 therefore extends its regulatory reach to de-identified personal data, even though the Bill contains other provisions which prohibit attempts to re-identify de-identified data, and provide potentially stiff penalties for doing so (sections 75 and 125).

The Bill defines “de-identify” as “to modify personal information – or create information from personal information – by using technical processes to ensure that the information does not identify an individual or could not be used in reasonably foreseeable circumstances, alone or in combination with other information, to identify an individual”. The idea that it would include information created from personal information makes the definition surprisingly broad. Consider that in the early days of the pandemic, a number of companies – including Google and Fitbit – released data about mobility – in the form of charts – as we moved into lockdown. These visualizations might well fit the category of ‘information created from personal information’. If this is so, the release of such data – if Bill C-11 were passed in its current form – might constitute a breach, since according to section 39, the disclosure without knowledge or consent must be to a specified entity and must also be for a socially beneficial purpose. Perhaps Bill C-11 intends to restrain this self-publishing of data visualizations or analyses based on personal information. It is just not clear that this is the case – or that if it is, it would not violate the right to freedom of expression.
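To illustrate how broadly “information created from personal information” could reach, here is a minimal sketch of the kind of aggregation that lies behind a mobility chart of the sort Google and Fitbit released. The data, field names, and function are all hypothetical; the point is only that no individual identifier survives the process, yet the output would still appear to fall within the Bill’s definition of de-identified information.

```python
from collections import defaultdict

# Hypothetical per-user records: (user_id, date, visits_to_transit_stations)
pings = [
    ("u1", "2020-03-01", 4), ("u2", "2020-03-01", 6),
    ("u1", "2020-03-15", 1), ("u2", "2020-03-15", 2),
]

def mobility_index(records, baseline_date):
    """Sum per-user counts into a daily total, then express each day as a
    percentage of the baseline day. The result contains no user identifiers,
    but it is 'information created from personal information'."""
    daily = defaultdict(int)
    for _user, date, visits in records:
        daily[date] += visits
    base = daily[baseline_date]
    return {d: round(100 * v / base) for d, v in daily.items()}

print(mobility_index(pings, "2020-03-01"))
# prints {'2020-03-01': 100, '2020-03-15': 30}
```

On this reading, even publishing such a chart would be a “disclosure” of de-identified personal information, which is what makes the breadth of the definition significant.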

Under section 39, de-identified data may be disclosed without knowledge or consent to specific actors, including government or health care institutions, public libraries and post-secondary institutions. Data may also be disclosed to any other “prescribed entity”, thus allowing other entities to be added to the list by regulation. In the current list, the most interesting category – in light of debates and discussions around data trusts – is “any organization that is mandated, under a federal or provincial law or by contract with a government institution or part of a government institution in Canada, to carry out a socially beneficial purpose”. This category allows for a range of different kinds of “data trusts” – ones created by law or by contract. They may be part of government, operating under a mandate from government, or engaged by contract with government. Such arrangements must be for a “socially beneficial purpose”, which is defined in subsection 39(2) as “a purpose related to health, the provision or improvement of public amenities or infrastructure, the protection of the environment or any other prescribed purpose.”

While a data trust-type exception to facilitate data sharing is intriguing, the proposed definition of “socially beneficial purpose” may be too limiting. Consider a private sector company that wishes to provide de-identified data from personal fitness devices to a university for research purposes. If these data are used for health-related research there is no problem under section 39. But what if a social scientist seeks to examine other phenomena revealed by the data? What if a business scholar seeks to use the data to understand whether counting steps leads to more local shopping or dining? If the research is not about health, the provision or improvement of public amenities or infrastructure, or the protection of the environment, then it does not appear to fall within the exception. This might mean that some researchers can use the data and others cannot. There is a separate exception to the requirements of knowledge or consent for research or statistical purposes, but it is not for de-identified personal information and is more complex in its requirements as a result.

There are also some rather odd potential consequences with this scheme. What if a short-term rental company is willing to share de-identified data with a provincial government that is looking for better ways to target its tourism marketing efforts? Or perhaps it seeks to use the data to better regulate short-term accommodation. It is not clear that either of these purposes would fit within the “improvement of public amenities or infrastructure” category of a socially beneficial purpose. And, although Bill C-11 sets out to regulate what private sector companies do with their data and not what data provincial or municipal governments are entitled to use, it does seem that these provisions could limit the access of provincial public sector actors to data that might otherwise be made available to them. By allowing private sector actors to share de-identified data without knowledge or consent in some circumstances, the implication is that such data cannot be shared in other circumstances – even if appropriate safeguards are in place.

Finally, it seems as if the de-identification of the data and a reference to socially beneficial purposes are the only safeguards mandated for the personal data under this scheme. The wording of section 39 suggests that shared data cannot simply be made available as open data (since it can only be shared with a specific entity for a specific purpose). Yet, there is no further requirement that the new custodians of the data – the public sector or prescribed entities – allow access to the data only under licences that ensure that any downstream use is for the prescribed socially beneficial purposes – or that impose any other necessary limitations. For entities such as post-secondary institutions, public libraries, and ‘data trusts’, use by third parties must surely be contemplated. Section 39 should therefore require appropriate contractual terms for data-sharing.

Overall, the concept behind s. 39 of Bill C-11 is an important one, and the effort to facilitate data sharing by the private sector for public purposes in privacy-friendly ways is laudable. It is also important to consider how to place limits on such sharing in order to protect against privacy breaches that might flow from re-identification of de-identified data. However, section 39 as drafted raises a number of questions about its scope, not all of which are easily answered. It would benefit from a better definition of ‘de-identify’, a more flexible definition of ‘socially beneficial purpose’, and a further requirement that any data sharing arrangements be subject to appropriate contractual limitations. And, even though individual knowledge of the sharing arrangements may not be feasible, there should be some form of transparency (such as notice to the Commissioner) so that individuals know when their de-identified personal data is being shared, by whom, and for what socially beneficial purposes.

Published in Privacy

 

The federal government’s new Bill C-11 to reform its antiquated private sector data protection law has landed on Parliament’s Order Paper at an interesting moment for Ontario. Earlier this year, Canada’s largest province launched a consultation on whether it should enact its own private sector data protection law that would apply instead of the federal law to intraprovincial activities.

The federal Personal Information Protection and Electronic Documents Act was enacted in 2000, a time when electronic commerce was on the rise, public trust was weak, and transborder flows of data were of growing economic importance. Canada faced an adequacy assessment under the European Union’s Data Protection Directive, in order to keep data flowing to Canada from the EU. At the time, only Quebec had its own private sector data protection law. Because a federal law in this area was on a somewhat shaky constitutional footing, PIPEDA’s compromise was that it would apply nationally to private sector data collection, use or disclosure in the course of commercial activity, unless a province had enacted “substantially similar” legislation. In such a case, the provincial statute would apply within the province, although not to federally-regulated industries or where data flowed across provincial or national borders. British Columbia and Alberta enacted their own statutes in 2004. Along with Quebec’s law, these were declared substantially similar to PIPEDA. The result is a somewhat complicated private sector data protection framework made workable by co-operation between federal and provincial privacy commissioners. Those provinces without their own private sector laws have seemed content with PIPEDA – and with letting Ottawa pick up the tab for its oversight and enforcement.

Twenty years after PIPEDA’s enactment, data-thirsty technologies such as artificial intelligence are in the ascendant, public trust has been undermined by rampant overcollection, breaches and scandals, and transborder data flows are ubiquitous. The EU’s 2018 General Data Protection Regulation (GDPR) has set a new and higher standard for data protection and Canada must act to satisfy a new adequacy assessment. Bill C-11 is the federal response.

There are provisions in Bill C-11 that tackle the challenges posed by the contemporary data environment. For example, organizations will have to provide upfront a “general account” of their use of automated decision systems that “make predictions, recommendations or decisions about individuals that could have significant impacts on them” (s. 62(1)(c)). The right of access to one’s personal information will include a right to an explanation of any prediction, recommendation or decision made using an automated decision system (s. 63(3)). There are also new exceptions to consent requirements for businesses that seek to use their existing stores of personal information for new internal purposes. C-11 will facilitate some sharing of de-identified data for “socially beneficial purposes”. These are among the Bill’s innovations.

There are, however, things that the Bill does not do. Absent from Bill C-11 is anything specifically addressing the privacy of children or youth. In fact, the Bill reworks the meaning of “valid consent”, such that it is no longer assessed in terms of the ability of those at whom a product or service is targeted to understand the consequences of their consent. This undermines privacy, particularly for youth. Ontario could set its own course in this area.

More importantly, perhaps, there are some things that a federal law simply cannot do. It cannot tread on provincial jurisdiction, which leaves important data protection gaps. These include employee privacy in provincially regulated sectors, the non-commercial activities of provincial organizations, and provincial political parties. The federal government clearly has no stomach for including federal political parties under the CPPA (the Consumer Privacy Protection Act that Bill C-11 would enact). Yet the province could act – as BC has done – to impose data protection rules on provincial parties. There is also the potential to build more consistent norms, as well as some interoperability where necessary, across the provincial public, health and private sectors under a single regulator.

The federal bill may also not be best suited to meet the spectrum of needs of Ontario’s provincially regulated private sector. Many of the bill’s reforms target the data practices of large corporations, including those that operate transnationally. The enhanced penalties and enforcement mechanisms in Bill C-11 are much needed, but are oriented towards penalizing bad actors whose large-scale data abuses cause significant harm. Make no mistake – we need C-11 to firmly regulate the major data players. And, while a provincial data protection law must also have teeth, it would be easier to scale such a law to the broad diversity of small and medium-sized enterprises in the Ontario market. This is not just in terms of penalties but also in terms of the compliance burden. Ontario’s Information and Privacy Commissioner could play an important role here as a conduit for information and education and as a point of contact for guidance.

Further, as the failed Sidewalk Toronto project demonstrated, the province is rich in opportunities for public-private technology partnerships. Having a single regulator and an interoperable set of public and private sector data protection laws could offer real advantages in simplifying compliance and making the environment more attractive to innovators, while at the same time providing clear norms and a single point of contact for affected individuals.

In theory as well, the provincial government would be able to move quickly if need be to update or amend the law. The wait for PIPEDA reform has been excruciating. It is not over yet, either. Bill C-11 may not be passed before we have to go to the polls again. That said, timely updating has not been a hallmark of either BC or Alberta’s regimes. Drawbacks of a new Ontario private sector data protection law would include further multiplication of the number of data protection laws in Canada, and the regulatory complexity this can create. A separate provincial law will also mean that Ontario will assume the costs of administering a private sector data protection regime. This entails the further risk that budget measures could be used by future governments to undermine data protection in Ontario. Still, the same risks – combined with considerably less control – exist with federal regulation. There remains a strong and interesting case for Ontario to move forward with its own legislation.


It’s been a busy privacy week in Canada. On November 16, 2020 Canada’s Department of Justice released its discussion paper as part of a public consultation on reform of the Privacy Act. On November 17, the Minister of Industry released the long-awaited bill to reform Canada’s private sector data protection legislation. I will be writing about both developments over the next while. But in this initial post, I would like to focus on one overarching and obvious omission in both the Bill and the discussion paper: the failure to address privacy as a human right.

Privacy is a human right. It is declared as such in international instruments to which Canada is a signatory, such as the Universal Declaration of Human Rights and the International Covenant on Civil and Political Rights. Data protection is only one aspect of the human right to privacy, but it is an increasingly important one. The modernized Convention 108 (Convention 108+), a data protection convention originating with the Council of Europe but open to any country, puts human rights front and centre. Europe’s General Data Protection Regulation also directly acknowledges the human right to privacy, and links privacy to other human rights. Canada’s Privacy Commissioner has called for Parliament to adopt a human rights-based approach to data protection, both in the public and private sectors.

In spite of all this, the discussion paper on reform of the Privacy Act is notably silent with respect to the human right to privacy. In fact, it reads a bit like the script for a relationship in which one party dances around commitment, but just can’t get out the words “I love you”. (Or, in this case, “Privacy is a human right”). The title of the document is a masterpiece of emotional distancing. It begins with the words: “Respect, Accountability, Adaptability”. Ouch. The “Respect” is the first of three pillars for reform of the Act, and represents “Respect for individuals based on well established rights and obligations for the protection of personal information that are fit for the digital age.” Let’s measure that against the purpose statement from Convention 108+: “The purpose of this Convention is to protect every individual, whatever his or her nationality or residence, with regard to the processing of their personal data, thereby contributing to respect for his or her human rights and fundamental freedoms, and in particular the right to privacy.” Or, from article 1 of the GDPR: “This Regulation protects fundamental rights and freedoms of natural persons and in particular their right to the protection of personal data.” The difference is both substantial and significant.

The discussion paper almost blurts it out… but again stops short in its opening paragraph, which refers to the Privacy Act as “Canada’s quasi-constitutional legal framework for the collection, use, disclosure, retention and protection of personal information held by federal public bodies.” This is the romantic equivalent of “I really, really, like spending time with you at various events, outings and even contexts of a more private nature.”

The PIPEDA reform bill, which dropped in our laps on November 17, does mention the “right to privacy”, but the reference is in the barest terms. Note that Convention 108+ and the GDPR identify the human right to privacy as being intimately linked to other human rights and freedoms (which it is). Section 5 of Bill C-11 (the Consumer Privacy Protection Act) talks about the need to establish “rules to govern the protection of personal information in a manner that recognizes the right to privacy of individuals with respect to their personal information and the need of organizations to collect, use or disclose personal information for purposes that a reasonable person would consider appropriate in the circumstances.” It is pretty much what was already in PIPEDA, and it falls far short of the statements quoted from Convention 108+ and the GDPR. In the PIPEDA context, the argument has been that “human rights” are not within exclusive federal jurisdiction, so talking about human rights in PIPEDA just makes the issue of its constitutionality more fraught. Whether this argument holds water or not (it doesn’t), the same excuse does not exist for the federal Privacy Act.

The Cambridge Analytica scandal (in which personal data was used to subvert democracy), concerns over uses of data that will perpetuate discrimination and oppression, and complex concerns over how data is collected and used in contexts such as smart cities all demonstrate that data protection is about more than just a person’s right to a narrow view of privacy. Privacy is a human right that is closely linked to the enjoyment of other human rights and freedoms. Recognizing privacy as a human right does not mean that data protection will not require some balancing. However, it does mean that in a data driven economy and society we keep fundamental human values strongly in focus. We’re not going to get data protection right if we cannot admit these connections and clearly state that data protection is about the protection of fundamental human rights and freedoms.

There. Is that so hard?


Early in the COVID-19 pandemic, as discussions swirled around the adoption of exposure notification or contact-tracing apps (CT/EN apps), Jason Millar, Kelly Bronson and I, and our outstanding students, Tommy Friedlich and Ryan Mosoff, began to explore the privacy and socio-ethical implications related to the creation, adoption and deployment of these apps. As part of this research we gathered data about contact-tracing apps around the world. This week we are launching a website – Global Pandemic App Watch (GPAW) – that uses some of the data we have gathered to provide a global look at the state of adoption of CT/EN apps. The website hosts three main maps; each focuses on different issues. There is also a series of country pages providing additional information about CT/EN apps adopted in different countries. GPAW is a work in progress. Gaining access to meaningful data on use and adoption of apps has been difficult, in part because in many jurisdictions good data is not being collected or, if collected, is not shared. Language has been a barrier to obtaining information in some cases. This is also a rapidly evolving area so that it is possible to miss or be behind with respect to new developments. Our focus is on apps, and not on more complex multi-technology disease and human surveillance strategies; these latter are not represented on the map. However, we believe that GPAW offers an interesting perspective on CT/EN apps.

One issue facing governments that adopt contact tracing or exposure notification apps is what underlying technology to use. Singapore’s early TraceTogether app offered a centralized model that proved interesting to other governments. Australia has been its most notable adopter. The province of Alberta in Canada adopted an app based on TraceTogether, although its future is now in doubt. Other countries considered the development of their own apps, exploring options – usually with local developers – that would integrate in different ways with public health agency needs. Some of these countries have proceeded with their own apps (e.g. France). Others faced public pressure and backed away from apps that might collect and store data centrally (e.g. Germany). The Google-Apple Exposure Notification (GAEN) API, which evolved in this period, offered a model of decentralized data storage that was touted as more privacy-friendly. GAEN (and particularly the more recent Exposure Notification Express) also streamlines the app development process, making it easier to adopt. Nevertheless, because of the fully decentralized model, it is more difficult to integrate GAEN with broader goals or data needs of public health authorities.

Although the GAEN garnered considerable attention, a glance at our first map shows that its adoption is far from universal. Many other countries have chosen different options. We are interested in this diversity of apps. As noted above, there may be reasons to choose a more centralized model in order to better integrate a CT/EN app with public health agency activities. We suspect that there is also interest in many jurisdictions in using a CT/EN app as a vehicle for supporting local IT developers. The GAEN model perhaps responds best to concerns over privacy and civil liberties; it is interesting to see that its adoption is strong in the EU and in Canada, where such concerns may be heightened. Several US states have also adopted or are in the process of adopting GAEN-based contact tracing apps.

Another feature of the first map that we wish to highlight is the interesting challenge posed by federal states. Canada is shown as uniformly adopting a GAEN model, but this does not reflect the story of how it moved in this direction. As noted earlier, Alberta was an early adopter of a different sort of app based on Singapore’s TraceTogether API. Ontario was the first to adopt the federal COVIDAlert. Other provinces have since announced that they will follow suit, including Alberta, but adoption is not yet universal across the country. The US is represented on the map as “various” because it too is a federal state, and no national app has been adopted. Some states have begun to adopt or develop their own apps (these developments are reported on a separate US map). Suffice it to say – federalism presents unique challenges. Where different apps are adopted in a single country or region, there can be issues of interoperability. This pits broader concerns about functionality against issues of regional autonomy. The Canadian and US stories around app adoption would make an interesting comparison. The EU, with its multiple apps and free movement of citizens within the EU, is similarly an interesting case study in this regard.

Our second map visualizes uptake. CT/EN apps require significant levels of adoption to be useful, although there are differences of opinion as to what level of adoption is optimal. We have found it challenging to get accurate data about adoption levels and our map reflects this. Where data is available, numbers tend to be relatively low, although it will be interesting to see if a second wave (or third in some countries) pushes adoption to higher levels – and at what rate. It is interesting to consider what may lie behind poor access to adoption data. A lack of transparency about the metrics for evaluating the success/failure of these rapidly deployed technologies is worrisome. It may be that providing such data is a low priority in a pandemic; it may also be that governments do not wish to potentially discourage adoption by revealing low uptake numbers. Some countries have had much better adoption rates than others. It would be interesting to consider why this might be the case. Are higher adoption rates related to greater trust in government? Greater concerns about the pandemic? Better government messaging?

Our final map looks at whether CT/EN apps have been made optional or mandatory in the countries in which they have been launched. This has been a hot button issue. Some groups or individuals have argued for making such apps mandatory in order to meet public health objectives. Others have raised civil liberties concerns. Underlying this, of course, is the reality that the technology on which these apps operate is not universally available. It would be difficult to make an app mandatory if not everyone could comply because not everyone has a smartphone with a compatible operating system. Some jurisdictions have begun to explore the adoption of ‘fob’ technologies that could give greater access to contact tracing applications. Our map shows that currently only one country has adopted a mandatory CT/EN app. The overwhelming majority are voluntary. That said, we have begun to hear of cases in which the use of these apps is required by employers who are looking for tools to track exposure within the workplace, undermining their voluntary nature. As far as we are aware, only Australia has addressed this function-creep in specific app-related legislation.

The actual usefulness of CT/EN apps remains an open question. Concerns have been raised about their technological deficiencies leading to potential false positives and false negatives. It is not clear whether CT apps actually catch many cases not already caught by manual contact-tracing activities. Of course, when normal contact tracing starts to buckle under pressure from a second wave, as is the case now in Ontario, the app takes on a greater role. Nevertheless, with GAEN apps, there is no two-way interface with the public health system. It would be interesting to know how many people receive notifications through the app, and how many of those present themselves for testing – or take recommended actions such as self-quarantining. It would be useful to have a variety of data to assess the reliability and effectiveness of these technologies, but it is not clear that such data will be collected by governments or public health authorities or, if it is collected, whether it will be made readily available.

We hope that you will find our maps interesting and useful. The country pages, as they develop, will provide additional information, as well as links to numerous articles and other materials about the apps that we have been collecting since the beginning of the pandemic. We think the project raises a number of interesting unanswered questions that will linger well into the future. Our current plan is to update the site weekly. There is space on the site to provide feedback, comments, and even data should you have it to share.

Our work has been generously supported by the Scotiabank Fund for AI and Society at the University of Ottawa, SSHRC, and the Centre for Law, Technology and Society at the University of Ottawa.


The BC Court of Appeal has handed down a decision that shakes up certain assumptions about recourse for privacy-related harms in that province – and perhaps in other provinces as well.

The decision relates to a class action lawsuit filed after a data breach. The defendant had stored an unencrypted copy of a database containing customer personal information on its website. The personal information included: “names, addresses, email addresses, telephone numbers, dates of birth, social insurance numbers, occupations, and, in the case of credit card applicants, their mothers' birth names.” (at para 4) This information was accessed by hackers. By the time of this decision, some of the information had been used in phishing scams but the full extent of its use is still unknown.

As is typical in privacy class action lawsuits, the plaintiffs sought certification on multiple grounds. These included: “breach of contract, negligence, breach of privacy, intrusion upon seclusion, breach of confidence, unjust enrichment and waiver of tort.” (at para 6) The motions judge certified only claims in contract, negligence, and the federal common law of privacy.

The defendants appealed, arguing that the remaining grounds were not viable and that the action should not have been certified. They also argued that a class action lawsuit was not the preferable procedure for the resolution of the common issues. While the plaintiffs cross-appealed the dismissal of the claim for breach of confidence, they did not appeal the decision that there was no recourse for breach of privacy or the tort of intrusion upon seclusion under BC law.

This post focuses on what I consider to be the three most interesting issues in the case. These are: whether there is recourse for data breaches other than via data protection legislation; whether the tort of breach of privacy exists in B.C.; and whether there is a federal common law of privacy.

1. Is PIPEDA a complete code?

The defendants argued that the class action lawsuit was not the preferred procedure because the federal Personal Information Protection and Electronic Documents Act (PIPEDA) constituted a “complete code in respect of the collection, retention, and disclosure of personal information by federally-regulated businesses, and that no action, apart from the application to the Federal Court contemplated by the Act can be brought in respect of a data breach.” (at para 18) Justice Groberman, writing for the unanimous Court, noted that while it was possible for a statute to constitute a complete code intended to fully regulate a particular domain, it is not inevitable. He observed that the Ontario Court of Appeal decision in Hopkins v. Kay had earlier determined that Ontario’s Personal Health Information Protection Act (PHIPA) did not constitute a complete code when it came to regulating personal health information, allowing a lawsuit to proceed against a hospital for a data breach. In Hopkins, the Ontario Court of Appeal noted that PHIPA was primarily oriented towards addressing systemic issues in the handling of personal health information, rather than dealing with individual disputes. Although there was a complaints mechanism in the statute, the Commissioner had the discretion to decline to investigate a complaint if a more appropriate procedure were available. Justice Groberman noted that PIPEDA contained a similar provision in s. 12.

He observed that “[t]his language, far from suggesting that the PIPEDA is a complete code, acknowledges that other remedies continue to be available, and gives the Commissioner the discretion to abstain from conducting an investigation where an adequate alternative remedy is available to the complainant.” (at para 28) In his view, PIPEDA is similarly oriented towards addressing systemic problems and preventing future breaches, and that “[w]hile there is a mechanism to resolve individual complaints, it is an adjunct to the legislative scheme, not its focus.” (at para 29) He also found it significant that PIPEDA addressed private rather than public sector data protection. He stated: “[w]ithin a private law scheme, it seems to me that we should exercise even greater caution before concluding that a statute is intended to abolish existing private law rights.” (at para 30) He concluded that nothing in PIPEDA precluded other forms of recourse for privacy harms.

2. Do common law privacy torts exist in BC?

In 2012 the Ontario Court of Appeal recognized the privacy tort of intrusion upon seclusion in Jones v. Tsige. However, since British Columbia has a statutory privacy tort in its Privacy Act, the motions judge (like other BC judges before him) concluded that the statutory tort displaced any possible common law tort in BC. Justice Groberman was clearly disappointed that the plaintiffs had chosen not to appeal this conclusion. He stated: “In my view, the time may well have come for this Court to revisit its jurisprudence on the tort of breach of privacy.” (at para 55) He proceeded to review the case law usually cited as supporting the view that there is no common law tort of breach of privacy in BC. He distinguished the 2003 decision in Hung v. Gardiner on the basis that in that case the judge at first instance had simply stated that he was not convinced by the authorities provided that such a tort existed in BC. On appeal, the BCCA agreed with the judge’s conclusion on an issue of absolute privilege, and found it unnecessary to consider any of the other grounds of appeal.

The BCCA decision in Mohl v. University of British Columbia is more difficult to distinguish because in that case the BCCA stated “[t]here is no common-law claim for breach of privacy. The claim must rest on the provisions of the [Privacy] Act.” (Mohl at para 13) Nevertheless, Justice Groberman indicated that while this statement was broad, “it is not entirely clear that it was intended to be a bold statement of general principle as opposed to a conclusion with respect to the specific circumstances of Mr. Mohl's case. In any event, the observation was not critical to this Court's reasoning.” (at para 62)

Justice Groberman concluded that “The thread of cases in this Court that hold that there is no tort of breach of privacy, in short, is a very thin one.” (at para 64) He also noted that the privacy context had considerably changed, particularly with the Ontario Court of Appeal’s decision in Jones v. Tsige. He stated:

It may be that in a bygone era, a legal claim to privacy could be seen as an unnecessary concession to those who were reclusive or overly sensitive to publicity, though I doubt that that was ever an accurate reflection of reality. Today, personal data has assumed a critical role in people's lives, and a failure to recognize at least some limited tort of breach of privacy may be seen by some to be anachronistic. (at para 66)

He indicated that the Court of Appeal might be inclined to reconsider the issue were it to be raised before them, although he could not do so in this case since the plaintiffs had not appealed the judge’s ruling on this point.

3. There is no federal common law of privacy

However keen Justice Groberman might have been to hear arguments on the common law tort of privacy, he overturned the certification of the privacy claims as they related to the federal common law of privacy. He characterized this approach as ‘creative’, but inappropriate. He noted that while common law principles might evolve in areas of federal law (e.g. maritime law), in cases where there was shared jurisdiction such as in privacy law, there was no separate body of federal common law distinct from provincial common law. He stated “there is only a single common law, and it applies within both federal and provincial spheres.” (at para 76) More specifically, he stated:

Where an area of law could be regulated by either level of government, it is not sensible to describe the situation in which neither has enacted legislation as being a situation of either "federal" or "provincial" common law. It is simply a situation of the "common law" applying. The plaintiffs cannot choose whether to bring their claims under "federal" or "provincial" common law as if these were two different regimes. (at para 86)

Because the claim advanced by the plaintiff had nothing to do with any specific area of federal jurisdiction, Justice Groberman rejected the idea that a cause of action arose under “federal” common law.

Overall, this decision is an interesting one. Clearly the Court of Appeal is sending strong signals that it is time to rethink recourse for breach of privacy in the province. It may now be that there is both a statutory and a common law action for breach of privacy. If this is so, it will be interesting to see what scope is given to the newly recognized common law tort. “Complete code” arguments have arisen in other lawsuits relating to breach of privacy; the BCCA’s response in this case adds to a growing body of jurisprudence that rejects the idea that data protection laws provide the only legal recourse for the mishandling of personal data. Finally, a number of class action lawsuits have asserted the “federal common law of privacy”, even though it has been entirely unclear what this is. The BCCA suggests that it is a fabrication and that no such distinct area of common law exists.

Published in Privacy

 

This is a copy of my submission in response to the Elections Canada consultation on Political Communications in Federal Elections. The consultation closes on August 21, 2020. Note that this submission has endnotes which are at the end of the document. Where possible these include hyperlinks to the cited sources.

16 August 2020

I appreciate the invitation to respond to Elections Canada’s consultation on the overall regulatory regime that governs political communications in federal elections. I hold the Canada Research Chair in Information Law and Policy at the University of Ottawa, where I am also a law professor. I provide the following comments in my capacity as an individual.

The consultation raises issues of great importance to Canadians. My comments will focus on Discussion Paper 3: The Protection of Electors’ Personal Information in the Federal Electoral Context.[1]

Concerns over how political parties handle personal information have increased steadily over the years. Not surprisingly, this coincides with the rise of big data analytics and artificial intelligence (AI) and the capacity of these technologies to use personal data in new ways, including profiling and manipulation. Discussion Paper 3 homes in on the Cambridge Analytica scandal[2] and its implications for the misuse of personal data for voter manipulation. This egregious case illustrates why, in a big data environment, we need to seriously address how voter personal data is collected, used and disclosed.[3] The potential misuse of data for voter manipulation is an expanding threat.[4] Yet this kind of high-profile voter manipulation scandal is not the only concern that Canadians have with how their personal information is handled by political parties. Additional concerns include lax security;[5] unwanted communications;[6] targeting based on religion, ethnicity or other sensitive grounds;[7] data sharing;[8] lack of transparency;[9] and voter profiling.[10] In addition, there is a troubling lack of transparency, oversight and accountability.[11] All of these are important issues, and they must be addressed through a comprehensive data protection regime.[12]

Public concern and frustration with the state of data protection for Canadians when it comes to political parties has been mounting. There have been reports and studies,[13] op-eds and editorials,[14] privacy commissioner complaints,[15] a competition bureau complaint,[16] and even legal action.[17]

There is a growing gulf between what Canadians expect when it comes to the treatment of their personal data and the obligations of political parties. Canadians now have two decades of experience with the Personal Information Protection and Electronic Documents Act (PIPEDA),[18] which governs the collection, use, and disclosure of personal data in the private sector. Yet PIPEDA does not apply to political parties, and there is a very wide gap between PIPEDA’s data protection norms and the few rules that apply to federal political parties. There is also considerable unevenness in the regulatory context for use of personal data by political parties across the country. For example, B.C.’s Personal Information Protection Act (PIPA)[19] already applies to B.C. political parties, and while there have been some problems with compliance,[20] the democratic process has not been thwarted. A recent interpretation of PIPA by the B.C. Privacy Commissioner also places federal riding offices located in B.C. under its jurisdiction.[21] This means that there are now different levels of data protection for Canadians with respect to their dealings with federal parties depending upon the province in which they live and whether, if they live in B.C., they are interacting with their riding office or with the national party itself. Further, if Quebec’s Bill 64 is enacted, it would largely extend the province’s private sector data protection law to political parties. Ontario, which has just launched a consultation on a new private sector data protection law for that province, is considering extending it to political parties.[22] Internationally, the EU’s General Data Protection Regulation (GDPR)[23] applies to political parties, with some specially tailored exceptions. Frankly put, it is becoming impossible to credibly justify the lack of robust data protection available to Canadians when it comes to how their personal data is handled by political parties. Lax data protection is neither the rule in Canada nor the norm internationally.

There are points at which Discussion Paper 3 is overly defensive about the need for political parties to collect, use and disclose personal information about voters in the course of their legitimate activities. This need is not contested. But for too long it has gone virtually unrestrained and unsupervised. To be clear, data protection is not data prohibition. Data protection laws explicitly acknowledge the need of organizations to collect, use and disclose personal information.[24] Such laws set the rules to ensure that organizations collect, use, and disclose personal data in a manner consistent with the privacy rights of individuals. In addition, they protect against broader societal harms that may flow from unrestrained uses of personal data, including, in the political context, the manipulation of voters and subversion of democracy.

1. Information provided to parties by Elections Canada

Discussion Paper 3 sets out the current rules that protect electors’ personal information. For the most part, they are found in the Canada Elections Act (CEA).[25] In some instances, these rules provide less protection than comparable provincial election laws. For example, security measures, including the use of fictitious information in lists of electors to track unauthorized uses, are in place in some jurisdictions, but not at the federal level. Discussion Paper 3 notes that while such measures are not part of the CEA, best practices are provided for in Elections Canada guidelines.[26] These guidelines are not mandatory and are insufficient to protect electors’ information from deliberate or unintentional misuse.

The CEA also contains new provisions requiring political parties to adopt privacy policies and to publish these online. While such privacy policies offer some improved degree of transparency, they do not provide for adequate enforcement or accountability. Further, they do not meet the threshold, in terms of prescribed protections, of the fair information principles that form the backbone of most data protection laws, including PIPEDA.

There are some matters that should be addressed by specific provisions in the CEA. These relate to information that is shared with political parties under the CEA, such as the list of electors. The CEA should maintain accountability for this information by imposing security obligations on parties or candidates who receive the list of electors. It would be appropriate in those circumstances to have specific data breach notification requirements relating to the list of electors contained in the CEA. However, with respect to the wealth of other information that political parties collect or use, they should have to comply with PIPEDA and be accountable under PIPEDA for data breaches.

2. Fair Information Principles Approach

Discussion Paper 3 takes the position that fair information principles should be applied to political parties, and frames its questions in terms of how this should be accomplished. There are two main options. One is to craft a set of rules specifically for political parties, which might be incorporated into the CEA, with oversight by the Privacy Commissioner, the Chief Electoral Officer, or both. Another is to make political parties subject to PIPEDA, and to add to that law any carefully tailored exceptions necessary in the political context. The latter approach is better for the following reasons:

· The data protection landscape in Canada is already fragmented, with separate laws for federal and provincial public sectors; separate laws for the private sector, including PIPEDA and provincial equivalents in B.C., Alberta and Quebec; and separate laws for personal health information. There is a benefit to simplicity and coherence. PIPEDA can be adapted to the political context. There are many obligations which can and should be the same whether for private sector organizations or political parties. If particular exceptions tailored to the political context are required, these can be added.

· Political parties in BC (including federal riding associations) are already subject to data protection laws. Quebec, in Bill 64, proposes to make political parties subject to their private sector data protection law. The same approach should be followed federally.

· It is expected that PIPEDA will be amended in the relatively short term to bring it into line with the contemporary big data context. Creating separate norms in the CEA for political parties risks establishing two distinct privacy schemes which may not keep up with one another as the data context continues to evolve. It is much simpler to maintain one set of norms than to have two sets of privacy norms that are initially similar but that diverge over time.

 

3. Fair Information Principles: Specific Provisions

Discussion Paper 3 considers certain of the Fair Information Principles and how they apply to political parties. This discussion seems to assume in places that the solution will be to introduce new provisions in the CEA, rather than applying PIPEDA to political parties, subject to certain exceptions. For example, the first question under Accountability asks “Besides publishing their privacy policies, what other requirements could parties be subject to in order to make them accountable for how they collect, use and disclose personal information?”[27] As noted above, my view is that political parties should be subject to PIPEDA. The “other requirements” needed are those found in PIPEDA. There is no need to reinvent the wheel for political parties.

On the issue of data breaches, I note with concern that Discussion Paper 3 takes an overly cautious approach. For example, it states, presumably referring to PIPEDA, that “There are also penalties for organizations that knowingly fail to report a breach, which could be ruinous for a smaller party.”[28] In the first place, these penalties are for knowingly failing to report a breach, not for experiencing a breach. A party that experiences a data breach that creates a real risk of serious harm to an individual (the reporting threshold) and does not report it should not complain of the fines that are imposed for this failure. Secondly, the amounts set out in the legislation are maximum fines, and courts have discretion in imposing them. In any event, a class action lawsuit following a data breach is much more likely to be the ruination of a smaller party; liability for such a data breach could be mitigated by being able to demonstrate not only that the party complied with data protection norms but that it also responded promptly and appropriately when the breach took place. In my view, the data breach notification requirements can and should be applied to political parties.

Discussion Paper 3 also floats the idea of a voluntary code of practice as an alternative to parties being subject to data protection laws. It states: “A voluntary code may be more palatable to political parties than legislated change, while at the same time moving towards increasing electors’ privacy”.[29] It is fair to say that ‘soft’ guidance with no enforcement is always more palatable to those to whom it would apply than real obligations. However, we are long past the time for a gentle transition to a more data protective approach. Political parties have embraced big data and data analytics and now collect, use, and disclose unprecedented amounts of personal information. They need to be subject to the same data protection laws as other actors in this environment. While those laws may need a few carefully tailored exceptions to protect the political process, on the whole, they can and should apply.

It would be wasteful, confusing, and unsatisfactory to create a parallel regime for data protection and political parties in Canada. Given their embrace of the big data environment and their expanding use of personal data, these parties should be held to appropriate and meaningful data protection norms, with oversight by the Privacy Commissioner of Canada. Federal political parties should be subject to PIPEDA with some carefully tailored exceptions.



[1] Elections Canada, Discussion Paper 3: The Protection of Electors’ Personal Information in the Federal Electoral Context, May 2020, online: https://www.elections.ca/content.aspx?section=res&dir=cons/dis/compol/dis3&document=index&lang=e.

[2] See, e.g.: Office of the Privacy Commissioner of Canada, PIPEDA Report of Findings #2019-004: Joint investigation of AggregateIQ Data Services Ltd. by the Privacy Commissioner of Canada and the Information and Privacy Commissioner for British Columbia, November 26 2019, online: https://www.priv.gc.ca/en/opc-actions-and-decisions/investigations/investigations-into-businesses/2019/pipeda-2019-004/.

[3] Cherise Seucharan and Melanie Green, “A B.C. scandal has pulled back the curtain on how your online information is being used”, November 29, 2019, online: https://www.thestar.com/vancouver/2019/11/29/heres-how-companies-and-political-parties-are-getting-their-hands-on-your-data.html.

[4] Brian Beamish, 2018 Annual Report: Privacy and Accountability for a Digital Ontario, Office of the Information and Privacy Commissioner of Ontario, June 27, 2019, at p. 30, online: https://www.ipc.on.ca/wp-content/uploads/2019/06/ar-2018-e.pdf. Office of the Information and Privacy Commissioner of British Columbia, “Investigation Report P19-01: Full Disclosure: Political parties, campaign data, and voter consent”, February 6, 2019, online: https://www.oipc.bc.ca/investigation-reports/2278.

[5] Joan Bryden, “Elections Canada chief warns political parties are vulnerable to cyberattacks”, 4 February 2019, Global News, online: https://globalnews.ca/news/4925322/canada-political-parties-cyberattack-threat/; Office of the Information and Privacy Commissioner of British Columbia, “Investigation Report P19-01: Full Disclosure: Political parties, campaign data, and voter consent”, February 6, 2019, at 6 (noting the number of complaints received relating to lax security practices), and pp. 27-31 (outlining security issues), online: https://www.oipc.bc.ca/investigation-reports/2278.

[6] Office of the Information and Privacy Commissioner of British Columbia, “Investigation Report P19-01: Full Disclosure: Political parties, campaign data, and voter consent”, February 6, 2019, at 22, online: https://www.oipc.bc.ca/investigation-reports/2278. Note that the complaint that led to the ruling that that province’s Personal Information Protection Act applied to federal riding associations in B.C. was based on an unconsented-to use of personal data. See: OIPC BC, Courtenay-Alberni Riding Association of The New Democratic Party of Canada, Order No. P19-02, 28 August 2019, online: https://www.oipc.bc.ca/orders/2331.

[7] See, e.g.: Michael Geist, “Why Political Parties + Mass Data Collection + Religious Targeting + No Privacy Laws = Trouble”, October 11, 2019, online: http://www.michaelgeist.ca/2019/10/why-political-parties-mass-data-collection-religious-targeting-no-privacy-laws-trouble/; Sara Bannerman, Julia Kalinina, and Nicole Goodman, “Political Parties’ Voter Profiling Is a Threat to Democracy”, The Conversation, 27 January 2020, online: https://thetyee.ca/Analysis/2020/01/27/Political-Parties-Profiling-Democracy/.

[8] See: Office of the Information and Privacy Commissioner of British Columbia, “Investigation Report P19-01: Full Disclosure: Political parties, campaign data, and voter consent”, February 6, 2019, at 25, online: https://www.oipc.bc.ca/investigation-reports/2278.

[9] Colin Bennett, “They’re spying on you: how party databases put your privacy at risk”, iPolitics, September 1, 2015, online: https://ipolitics.ca/2015/09/01/theyre-spying-on-you-how-party-databases-put-your-privacy-at-risk/.

[10] Colin J. Bennett, “Canadian political parties are gathering more and more data on voters all the time. It’s time we regulated what data they glean, and what they can do with it”, Policy Options, 1 February 2013, online: https://policyoptions.irpp.org/magazines/aboriginality/bennett/.

[11] See, e.g.: Yvonne Colbert, “What's in your file? Federal political parties don't have to tell you”, CBC, 30 July 2019, online: https://www.cbc.ca/news/canada/nova-scotia/privacy-federal-political-parties-transparency-1.5226118; Katharine Starr, “Privacy at risk from Canadian political parties, says U.K. watchdog”, CBC, 10 November 2018, online: https://www.cbc.ca/news/politics/uk-information-commissioner-canadian-parties-data-privacy-1.4898867.

[12] Federal, Provincial and Territorial Privacy Commissioners of Canada support meaningful privacy obligations for political parties. See: Securing Trust and Privacy in Canada’s Electoral Process: Resolution of the Federal, Provincial and Territorial Information and Privacy Commissioners, Regina, Saskatchewan, September 11-13, 2018, online: https://www.priv.gc.ca/en/about-the-opc/what-we-do/provincial-and-territorial-collaboration/joint-resolutions-with-provinces-and-territories/res_180913/.

[13] See, e.g.: Colin J. Bennett and Robyn M. Bayley, “Canadian Federal Political Parties and Personal Privacy Protection: A Comparative Analysis”, March 2012, online: https://www.priv.gc.ca/en/opc-actions-and-decisions/research/explore-privacy-research/2012/pp_201203/; Colin Bennett, “Data Driven Elections and Political Parties in Canada: Privacy Implications, Privacy Policies and Privacy Obligations”, (April 12, 2018). Canadian Journal of Law and Information Technology, Available at SSRN: https://ssrn.com/abstract=3146964; Colin J. Bennett, “Privacy, Elections and Political Parties: Emerging Issues For Data Protection Authorities”, 2016, online: https://www.colinbennett.ca/wp-content/uploads/2016/03/Privacy-Elections-Political-Parties-Bennett.pdf; House of Commons, Standing Committee on Access to Information, Privacy and Ethics, Democracy Under Threat: Risks and Solutions in the Era of Disinformation and Data Monopoly (December 2018), online: <https://www.ourcommons.ca/Content/Committee/421/ETHI/Reports/RP10242267/ethirp17/ethirp17-e.pdf>, archived: https://perma.cc/RV8T-ZLWW.

[14] See, e.g.: Samantha Bradshaw, “Data-protection laws must be extended to political parties”, Globe and Mail, 22 March 2018, online: https://www.theglobeandmail.com/opinion/article-data-protection-laws-must-be-extended-to-political-parties/; Michael Morden, “Politicians say they care about privacy. So why can political parties ignore privacy law?”, Globe and Mail, 29 May 2019, online: https://www.theglobeandmail.com/opinion/article-politicians-say-they-care-about-privacy-so-why-can-political-parties/; Colin Bennett, “Politicians must defend Canadians’ online privacy from Big Tech – and from politicians themselves”, Globe and Mail, 26 December 2019, online: https://www.theglobeandmail.com/opinion/article-politicians-must-defend-canadians-online-privacy-from-big-tech-and/; Sabrina Wilkinson, “Voter Privacy: What Canada can learn from abroad”, OpenCanada.org, 4 October 2019, online: https://www.opencanada.org/features/voter-privacy-what-canada-can-learn-abroad/; Fraser Duncan, “Political Parties and Voter Data: A Disquieting Gap in Canadian Privacy Legislation”, Saskatchewan Law Review, June 21 2019, online: https://sasklawreview.ca/comment/political-parties-and-voter-data-a-disquieting-gap-in-canadian-privacy-legislation.php; Colin Bennett, “They’re spying on you: how party databases put your privacy at risk”, iPolitics, September 1, 2015, online: https://ipolitics.ca/2015/09/01/theyre-spying-on-you-how-party-databases-put-your-privacy-at-risk/.

[15] See: Office of the Information and Privacy Commissioner of British Columbia, “Investigation Report P19-01: Full Disclosure: Political parties, campaign data, and voter consent”, February 6, 2019, at 25, online: https://www.oipc.bc.ca/investigation-reports/2278; OIPC BC, Courtenay-Alberni Riding Association of The New Democratic Party of Canada, Order No. P19-02, 28 August 2019, online: https://www.oipc.bc.ca/orders/2331.

[16] See: Rachel Aiello, “Major political parties under competition probe over harvesting of Canadians' personal info”, CTV News 15 January 2020, online: https://www.ctvnews.ca/politics/major-political-parties-under-competition-probe-over-harvesting-of-canadians-personal-info-1.4768501.

[17] Rachel Gilmore, “Privacy group going to court over alleged improper use of voters list by Liberals, Tories and NDP”, CTV News, 10 August 2020, online: https://www.ctvnews.ca/politics/privacy-group-going-to-court-over-alleged-improper-use-of-voters-list-by-liberals-tories-and-ndp-1.5058556.

[19] Personal Information Protection Act, SBC 2003, c 63, online: http://canlii.ca/t/52pq9.

[20] Office of the Information and Privacy Commissioner of British Columbia, “Investigation Report P19-01: Full Disclosure: Political parties, campaign data, and voter consent”, February 6, 2019, at 22, online: https://www.oipc.bc.ca/investigation-reports/2278.

[21] OIPC BC, Courtenay-Alberni Riding Association of The New Democratic Party of Canada, Order No. P19-02, 28 August 2019, online: https://www.oipc.bc.ca/orders/2331.

[22] Ministry of Government and Community Services, “Ontario Private Sector Privacy Reform: Improving private sector privacy for Ontarians in a digital age”, 13 August 2020, online: https://www.ontariocanada.com/registry/showAttachment.do?postingId=33967&attachmentId=45105.

[23] Regulation (EU) 2016/679, OJ L 119, 4 May 2016, pp. 1–88; online: https://gdpr-info.eu/.

[24] See, e.g., PIPEDA, s. 3.

[26] Elections Canada, Guidelines for the Use of the List of Electors, online: https://www.elections.ca/content.aspx?section=pol&document=index&dir=ann/loe_2019&lang=e.

[27] Elections Canada, Discussion Paper 3: The Protection of Electors’ Personal Information in the Federal Electoral Context, May 2020, at 11, online: https://www.elections.ca/content.aspx?section=res&dir=cons/dis/compol/dis3&document=index&lang=e.

[28] Ibid at 16.

[29] Ibid at 17.

Published in Privacy

 

The Ontario Government has just launched a public consultation and discussion paper to solicit input on a new private sector data protection law for Ontario.

Currently, the collection, use and disclosure of personal information in Ontario is governed by the Personal Information Protection and Electronic Documents Act (PIPEDA). This is a federal statute overseen by the Privacy Commissioner of Canada. PIPEDA allows individual provinces to pass their own private sector data protection laws so long as they are ‘substantially similar’. To date, Quebec, B.C. and Alberta are the only provinces to have done so.

Critics of this move by Ontario might say that there is no need to add the cost of overseeing a private sector data protection law to the provincial budget when the federal government currently bears this burden. Some businesses might also balk at having to adapt to a new data protection regime. While many of the rules might not be significantly different from those in PIPEDA, there are costs involved simply in reviewing and assessing compliance with any new law. Another argument against a new provincial law might relate to the confusion and uncertainty that could be created around the application of the law, since it would likely only apply to businesses engaged in intra-provincial commercial activities and not to inter-provincial or international activities, which would remain subject to PIPEDA. Although these challenges have been successfully managed in B.C., Alberta and Quebec, there is some merit in having a single, overarching law for the whole of the private sector in Canada.

Nevertheless, there are many reasons to enthusiastically embrace this development in Ontario. First, constitutional issues limit the scope of application of PIPEDA to organizations engaged in the collection, use or disclosure of personal information in the course of commercial activity. This means that those provinces that rely solely on PIPEDA for data protection regulation have important gaps in coverage. PIPEDA does not apply to employees in provincially regulated sectors; non-commercial activities of non-profits and charities are not covered, nor are provincial (or federal, for that matter) political parties. The issue of data protection and political parties has received considerable attention lately. B.C.’s private sector data protection law applies to political parties in B.C., and this has recently been interpreted to include federal riding associations situated in B.C. Bill 64, a bill to amend data protection laws in Quebec, would also extend the application of that province’s private sector data protection law to provincial political parties. If Ontario enacts its own private sector data protection law, it can (and should) extend it to political parties, non-commercial actors or activities, and provide better protection for employee personal data. These are all good things.

A new provincial law will also be designed for a digital and data economy. A major problem with PIPEDA is that it has fallen sadly out of date and is not well adapted to the big data and AI environment. For a province like Ontario that is keen to build public trust in order to develop its information economy, this is a problem. Canadians are increasingly concerned about the protection of their personal data. The COVID-19 crisis appears to have derailed (once again) the introduction of a bill to amend PIPEDA and it is not clear when such a bill will be introduced. Taking action at the provincial level means no longer being entirely at the mercy of the federal agenda.

There is something to be said as well for a law, and a governance body (in this case, the Office of the Ontario Information and Privacy Commissioner), attuned to the particular provincial context while at the same time able to cooperate with the federal Commissioner. This has been the pattern in the other provinces that have their own statutes. In Alberta and B.C. in particular, there has been close collaboration and co-operation between federal and provincial commissioners, including joint investigations into some complaints that challenge the boundaries of application of federal and provincial laws. In addition, Commissioners across the country have increasingly issued joint statements on privacy issues of national importance, including recently in relation to COVID-19 and contact-tracing apps. National co-operation combined with provincial specificity in data protection could offer important opportunities for Ontario.

In light of all this, the consultation process opens an exciting new phase for data protection in Ontario. The task will not simply be to replicate the terms of PIPEDA or even the laws of Alberta and B.C. (all of which can nonetheless provide useful guidance). None of these laws is particularly calibrated to the big data environment (B.C.’s law is currently under review), and there will be policy choices to be made around many of the issues that have emerged in the EU’s General Data Protection Regulation. This consultation is an opportunity to weigh in on crucially important data protection issues for a contemporary digital society, and on a made-in-Ontario statute.

Published in Privacy
Monday, 10 August 2020 08:58

How Will COVID Alert Measure Up?

 

Canada’s new exposure notification app, COVID Alert, has launched in Ontario. This shifts the focus from whether to adopt an app, and of what type, to how we will know whether the app is a success.

COVID Alert is built upon the Google Apple Exposure Notification System (GAEN), which is a completely decentralized model. This means that none of the proximity data collected via the app is shared with public authorities. GAEN apps must be entirely voluntary. Users choose whether to download the app and whether to upload positive test results for COVID-19. If a user is notified that they have been in proximity to someone who has tested positive for COVID-19, the app will advise what steps to take – but it will be up to the user to take those steps. Although there are privacy risks with any app (and here, they would be predominantly ones related to security and the possibility of malicious attacks), this could be the app on most users’ phones that collects the least personal data. COVID Alert has been vetted by the Privacy Commissioner of Canada and by Ontario’s Privacy Commissioner. It will also be reviewed by privacy commissioners in those provinces that choose to deploy it.

All of this is good news. As we start returning to workplaces, bars, restaurants and public transit, our daily lives will involve more and more moments of proximity with strangers. If nearly everyone is using COVID Alert – and if COVID Alert actually works the way it should – then it should help alert us to potential exposure to COVID-19 so that we can take steps to get tested and/or to isolate ourselves from those we might harm.

Although it is likely to be useful, authorities are quick to point out that it is only one tool among many. This is because much is unknown about the actual performance of GAEN exposure notification apps, which have only recently been launched in other countries. The threshold for recording a proximity event is one issue. For COVID Alert, a proximity event is recorded when two app users are within 2 metres of each other for 15 minutes or more. An EU guidance document describes this as “a starting point for the definition of a high-risk exposure contact”, but also indicates that “evaluation and calibration will be key to define the optimal time and distance settings that adequately capture people at risk of infection.” The apps cannot detect whether people are separated by plexiglass or wearing masks or face shields, and may not function as well when phones are in purses or backpacks. These factors may impact the accuracy of the apps. People may receive exposure notifications due to contacts that are very unlikely to result in infection (on opposite sides of plexiglass, for example) but will experience stress and disruption (perhaps having to miss work while waiting for test results) as a result. These inconveniences might be disproportionately experienced by those whose work demands that they interact with the public or ride transit, and there may be problematic sociodemographic impacts as a result. On the other hand, for those who have to be out and about, the app may provide some level of comfort. There is much that we do not yet know, but that we need to learn. Noting some of the uncertainties around these types of apps, the Privacy Commissioner has recommended “that the government closely monitor and evaluate the app’s effectiveness once it is used, and decommission it if effectiveness cannot be demonstrated.”
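The stated 2-metre/15-minute threshold can be sketched roughly as follows. This is an illustrative simplification only: in practice GAEN does not measure distance directly but infers it from Bluetooth signal attenuation, and the names and sampling scheme here are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class ProximitySample:
    minute: int        # minute offset within an encounter between two app users
    distance_m: float  # estimated distance; GAEN would infer this from Bluetooth attenuation

# Hypothetical constants reflecting COVID Alert's stated policy
MAX_DISTANCE_M = 2.0   # "within 2 metres"
MIN_DURATION_MIN = 15  # "for 15 minutes or more"

def is_exposure_event(samples: list) -> bool:
    """Return True if the encounter counts as a proximity event:
    at least MIN_DURATION_MIN minutes spent within MAX_DISTANCE_M."""
    close_minutes = sum(1 for s in samples if s.distance_m <= MAX_DISTANCE_M)
    return close_minutes >= MIN_DURATION_MIN
```

The sketch also makes the calibration problem concrete: the two constants are policy choices, and nothing in the distance estimate tells the app whether a plexiglass barrier or a mask stood between the two phones.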

One way to learn about the app and its impacts is to gather data and develop metrics to assess its performance. The highly decentralized GAEN model makes this more challenging, since no data is shared with governments via the app. The number of downloads can reveal how many people are willing to try the app. But it does not do much more than that. Useful data would include data about how many people who get tested do so because they received an app notification. It would be interesting to be able to correlate this data with positive or negative test results. In other words, what percentage of people who are prompted to get tested by the app actually test positive for COVID-19? It would also be useful to know how many of the people who receive exposure notifications are also separately contacted by contact tracers. Does the app amplify the reach of conventional contact tracing or does it largely duplicate it? Jurisdictions such as Australia, which has a centralized model, are beginning to collect and analyze such data. Alberta’s contact tracing app uses a centralized system and it might be particularly interesting to compare the domestic performance of a centralized app with the decentralized one. And, while the GAEN is fully decentralized, it does allow for additional data to be collected, with user consent, so long as this is separate from the exposure notification system. The Irish app, built on GAEN, has a voluntary user survey which allows consenting users to share data about the performance of the app. As provinces begin to deploy COVID Alert, both they and the federal government should be thinking about what data they need to evaluate this technology, and how they will gather it. According to the Privacy Commissioner’s assessment, the new Advisory Council established to oversee the use of the app will evaluate its effectiveness. Any such evaluation should be shared with the public.
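The evaluation questions raised above can be expressed as two simple metrics: the share of app-notified people who go on to test positive, and the overlap between app notifications and conventional contact tracing. The sketch below is hypothetical; because GAEN shares no data with governments, these aggregate counts would have to come from outside the app (for example, voluntary surveys or testing-site intake questions).

```python
def app_metrics(notified_tested: int, notified_positive: int,
                notified_ids: set, traced_ids: set) -> dict:
    """Compute two hypothetical evaluation metrics for an exposure notification app.

    notified_tested: people who got tested because of an app notification
    notified_positive: how many of those tested positive
    notified_ids: (anonymized) identifiers of app-notified people
    traced_ids: identifiers reached by conventional contact tracers
    """
    # What percentage of people prompted to get tested actually test positive?
    positivity = notified_positive / notified_tested if notified_tested else 0.0
    # Does the app amplify conventional tracing, or largely duplicate it?
    overlap = len(notified_ids & traced_ids) / len(notified_ids) if notified_ids else 0.0
    return {"test_positivity": positivity, "tracing_overlap": overlap}
```

A low overlap figure would suggest the app reaches contacts that manual tracing misses; an overlap near 1.0 would suggest duplication.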

As the app rolls out in Ontario, individuals will be asked to download it, and broad uptake will be important to its success. Using the app may provide individuals with added protection; it also means that they will be contributing to an experiment to assess the utility of this type of technology to assist in pandemic control. COVID Alert aims to help contain a disease which we know can spread wildly and at great personal and societal cost. Carefully calibrated metrics, and transparency about the successes or failures of the app should, and hopefully will, be part of this experiment.

Published in Privacy

 

On July 10, 2020, the Supreme Court of Canada issued a split decision in a constitutional law case with interesting implications for privacy law. The Quebec government had challenged the constitutionality of Canada’s 2017 Genetic Non-Discrimination Act (the Act), arguing that it fell outside federal jurisdiction over criminal law and intruded upon areas of provincial jurisdiction. It had brought its challenge by way of a reference case to the Quebec Court of Appeal. That Court ruled that the law was unconstitutional, and the decision was appealed to the Supreme Court of Canada.

The hearing before the Supreme Court of Canada was unusual. The appeal was brought, not by the federal Attorney-General, but rather by the Canadian Coalition for Genetic Fairness, a group that had been an intervenor in the Court of Appeal proceedings. Notwithstanding the fact that the validity of a federal law was at issue, the federal Attorney-General sided with the Attorney-General of Quebec in arguing that the law was unconstitutional. This strange situation was due to the origins of the law. It was introduced as a Senate bill, championed by Senator James Cowan. The bill was passed by the Senate and came to the House of Commons. The federal government believed that it was unconstitutional and cabinet did not support it. However, given the nature of the subject matter of the bill, the government allowed a free vote in Parliament. The result was that the bill passed by a vote of 222 in favour and 60 against.

A majority of five Supreme Court of Canada judges, in two separate decisions, ruled that the Act was a valid exercise of Canada’s jurisdiction over criminal law under s. 91(27) of the Constitution Act, 1867. Four dissenting judges were of the opinion that the law was, in “pith and substance” a matter of provincial jurisdiction.

The Act does three things, only one of which was challenged before the Court. In the first place (and most controversially) it makes it an offence for anyone to require an individual to undergo a genetic test or to provide the results of an existing genetic test as a condition of receiving goods or services or of entering into a contract. The law also prohibits anyone from collecting the results of a genetic test taken by a person from a third-party source without that person’s written consent. The non-controversial parts of the bill consisted of amendments to the Canadian Human Rights Act to prohibit genetic discrimination, and amendments to the Canada Labour Code to protect employees against forced genetic testing or requirements to disclose test results. It was accepted that the federal government had jurisdiction to amend these statutes in this way. Thus, the only issue before the court was the constitutional basis for the parts of the Act that dealt with the provision of goods and services and the entering into contracts.

It was no secret that a major concern of the proponents of the Act was that individuals might be compelled to reveal their genetic history, and might be adversely impacted when doing so. The chief areas of concern were in relation to insurance and employment. Insurance contracts and employment outside of the federally regulated sectors, are typically matters within provincial jurisdiction, as is contract law. The issue, therefore, was whether this law, which made it an offence to insist upon genetic testing or to access to the results of genetic tests, was a matter of criminal law, or a pretext for intruding upon provincial jurisdiction.

Justice Karakatsanis, writing for three of the five justices in the majority, found that the ‘pith and substance’ of the Act was “to protect individuals’ control over their detailed personal information disclosed by genetic tests, in the broad areas of contracting and the provision of goods and services, in order to address Canadians’ fears that their genetic test results will be used against them and to prevent discrimination based on that information.” (at para 4). She characterized this as falling under Parliament’s criminal law power because “they respond to a threat of harm to several overlapping public interests traditionally protected by the criminal law” (at para 4) that include “autonomy, privacy, equality and public health” (at para 4).

Justice Moldaver, writing for himself and Justice Côté, agreed that the law fell within federal jurisdiction, but differed as to the reasons for this. He characterized the ‘pith and substance’ of the law as “prohibiting conduct that undermines individuals’ control over the intimate information revealed by genetic testing.” (at para 111) He found that the Act was an exercise of the criminal law power because it contained “prohibitions accompanied by penalties [. . .] backed by the criminal law purpose of suppressing a threat to health.” (at para 112)

Justice Kasirer, writing for the dissenting justices, characterized the ‘pith and substance’ of the law as “to regulate contracts and the provision of goods and services, in particular contracts of insurance and employment, by prohibiting some perceived misuses of one category of genetic tests, the whole with the view to promoting the health of Canadians.” (at para 154). As a result, in his view, the matter falls within provincial jurisdiction over property and civil rights under s. 92(13) of the Constitution Act, 1867.

The point of divergence for majority and dissent was with respect to whether the law primarily regulates contracts and the provision of goods and services, or whether it principally imposes penalties for activities that threaten values traditionally protected by the criminal law. The fact that the majority justified the legislation under the federal criminal law power has interesting implications for privacy law in Canada.

First, both sets of reasons for the majority clearly consider that privacy values are appropriate subject matter for criminal legislation. In a way, we knew this already – for example, no one challenges the constitutionality of provisions that criminalize voyeurism. However, voyeurism in the Criminal Code is not just a matter of privacy – there is also an element of sexual exploitation or predation – the control of which is firmly rooted in criminal law. This situation is notably different. What is criminalized (legitimately, from the point of view of the majority) is requiring people to take genetic tests or to disclose the results of such tests, or for someone to seek out this data from a third party in order to use it in relation to contracts, goods or services. This is largely a matter of informational privacy. The difference between the two sets of majority reasons is that two of the five majority justices anchor the informational privacy concerns very specifically in the link between the (mis)use of these tests and the objective of protecting public health. Three of the justices are open to grounding the Act, not just in public health protection, but in the need to protect autonomy, privacy and equality.

On the privacy issues, Justice Karakatsanis begins by noting that “individuals have powerful interests in autonomy and privacy, and in dignity more generally, protected by various Charter guarantees” (at para 82). She also noted that individuals have “a clear and pressing interest in safeguarding information about themselves” (at para 82). According to Justice Karakatsanis, compelling people to undergo genetic testing “poses a clear threat to autonomy and to an individual’s privacy interest in not finding out what their genetic makeup reveals about them and their health prospects.” (at para 85) She notes that some people might not want to know their genetic ‘destiny’. Further, forcing individuals to share this information as a condition of receiving goods or services or entering into a contract compromises “an individual’s control over access to their detailed genetic information” (at para 85).

Justice Karakatsanis also describes genetic information as being at an individual’s “biographical core” of information. This ‘biographical core’ represents the information that is most closely tied to individual identity. She notes that the Act reflects Parliament’s view that “The dignity, autonomy and privacy interests in individuals’ detailed genetic information were understood by Parliament to be unique and strong” (at para 87). She noted as well that genetic testing technology is evolving rapidly, and that the volume of information tests may reveal about individuals “will undoubtedly continue to evolve alongside technological abilities to interpret test results” (at para 88). The sensitivity of the information is matched by the potential for its abuse.

Although Justice Karakatsanis finds that the legislation also serves to protect public health (by removing individual fears of the consequences for them of seeking genetic testing), she rules that it is also within federal jurisdiction because of its “response to the risk of harm that the prohibited conduct and discrimination based on genetic test results pose to autonomy, privacy and equality” (at para 92). For Justice Moldaver, who also supports the constitutionality of the Act, the pith and substance of the legislation lies in Parliament’s goal “to protect health by prohibiting conduct that undermines individuals’ control over the intimate information revealed by genetic testing” (at para 111)[my emphasis]. This is a subtle but important distinction. He grounds constitutionality in the protection of public health; protecting intimate information is simply the means by which the public health goal is achieved. Justice Kasirer, writing for the four dissenting justices, was prepared to recognize the Attorney-General of Canada’s concession “that Parliament could enact legislation targeting a threat to privacy and autonomy that might well constitute a valid criminal law purpose” (at para 251). But recognizing a concession is not necessarily agreeing with it. He notes that in this case, because the pith and substance of the legislation is not to protect privacy or autonomy, but rather to regulate contracts and the provision of goods and services, the matter is moot.

Justice Kasirer, in particular, notes the slippery slope that could result from finding that privacy, dignity and autonomy are freestanding anchors for the federal criminal law power. He notes that “Such a holding would encourage the view that any new technology with implications bearing on public morality might form the basis for the criminal law power, and potentially, bring a wide range of scientific developments within federal jurisdiction on no principled constitutional basis” (at para 253).

Indeed, this is at the heart of what is so interesting from a privacy perspective in this decision. Justice Karakatsanis, writing for herself and two other justices, seems to recognize that the protection of privacy can find an anchor in the criminal law power by virtue of the impact of intrusions of privacy on dignity and autonomy. Justice Moldaver and Justice Côté recognize the informational control dimensions of the genetic testing issue, but anchor the law’s constitutionality squarely in the goal of protecting public health, something long recognized as a matter open to regulation under the criminal law power. Justice Kasirer rejects federal jurisdiction entirely, but is alert to the potential for a focus on privacy, in an era of rapidly emerging technology, to dramatically impact the constitutional balance between federal and provincial governments.

These are interesting times for privacy, digital innovation, and the constitution. It is expected that the federal government will soon introduce a bill to reform the Personal Information Protection and Electronic Documents Act (PIPEDA). The Privacy Commissioner has pressed the government to adopt a ‘privacy as a human right’ approach in this reform process, but the government has seemed hesitant because of concerns that any emphasis on the human rights dimension of privacy might threaten the law’s fragile constitutional footing under the trade and commerce power. The Supreme Court of Canada in the Reference re Genetic Non-Discrimination suggests that such an approach might not be as constitutionally risky as previously thought, although the risks are evidently there.

The regulation of artificial intelligence (AI) technologies will also be a matter of future legislative concern as they continue to rapidly evolve and impact all aspects of our lives. This case therefore may foreshadow debates about where jurisdiction might lie over possible prohibitions on certain uses of AI, on the automation of certain types of decision-making, or other measures to protect privacy, dignity or autonomy in the face of this new technology. Justice Kasirer is clearly concerned that at least three of his colleagues have opened a door for much wider-ranging federal jurisdiction over technologies that can impact privacy, dignity, and autonomy.

Published in Privacy

On May 29, 2020 I was invited to a meeting of the House of Commons INDU Committee which is considering Canada's response to the COVID-19 Pandemic. On May 29, it was focusing its attention on contact tracing apps. A copy of my brief oral presentation is below. The videotape of the hearing and the Q & A with fellow invitee Michael Bryant of the Canadian Civil Liberties Association can be found here.

 

Speaking Notes for Teresa Scassa, Canada Research Chair in Information Law and Policy, University of Ottawa, INDU – Canadian Response to the COVID-19 Pandemic, May 29, 2020

Thank you, Madame Chair and Committee members for the opportunity to address this committee on privacy in Canada’s COVID-19 response.

We are currently in a situation in which Canadians are very vulnerable – economically, socially, and in terms of their physical and mental health. Canadians know that sacrifices are necessary to address this crisis – and have already made sacrifices of different magnitudes. Most Canadians accept that this is necessary to save lives and to begin to return to ‘normal’. They accept that some degree of privacy may need to be sacrificed in some contexts. But there is no binary choice between privacy and no privacy. Instead, there must be a careful balancing of privacy with other public interests.

There are two overarching privacy concerns when it comes to Canada’s response to the pandemic. The first is that there is a risk that poorly thought out collection, use or disclosure of personal information will create privacy and security vulnerabilities with little real benefit, or with benefits disproportionate to risks and harms. The second is that the pandemic may lead to the introduction of data gathering or processing technologies that will create a ‘new normal’ leading to even greater inroads on privacy, dignity and autonomy. Importantly, surveillance often has the most significant adverse impacts on the most vulnerable in our society.

The pandemic context raises a broad range of privacy issues, from government or law enforcement access to location and personal health information, to contact tracing apps and beyond. As we begin the ‘return to normal’, we will also see issues of workplace surveillance, as well as tracking tools and technologies used to help determine who gets into stores, who receives services, or who gets on airplanes. Personal health information, generally considered to be among our most sensitive personal information, may become a currency we are required to use in order to carry out ordinary daily activities.

Since I am limited to only 5 minutes, I would like to tease out 3 main themes.

1) A first theme is trust. “Trust” is referenced in the Digital Charter and is essential when asking Canadians to share personal information with government. But trust is complicated by a pandemic context in which issues evolve rapidly and are often unprecedented. One thing that trust requires is transparency, and governments have struggled with transparency – whether it is with respect to sharing data that models the spread of COVID-19 with the public or (as was the case of Alberta) launching a contact-tracing app without releasing the Privacy Impact Assessment. Transparency is essential to trust.

2) A second theme is necessity and proportionality. The Privacy Commissioner of Canada, along with his provincial and territorial counterparts, supports an approach to privacy based on necessity and proportionality. This is derived from the human rights context. Necessity and proportionality provide a robust analytical framework for balancing privacy rights against other public interests, and should already be part of an amended Privacy Act.

The importance of this approach cannot be overemphasized. We are in a data driven society. It is easy to become enthused about technological solutions, and innovators promise that data analytics, including AI, can solve many of our problems. We need to remember that while technology can provide astonishing benefits, there is already a long history of poorly designed, poorly implemented, and often rushed technological solutions that have created significant risks and harms. Novel technological solutions often fail. This is becoming a reality, for example, with many recently launched national contact tracing apps. Rushed, flawed schemes to harvest personal data – even if for laudable goals – will erode trust at best and cause harm at worst.

This is why clear guidelines – such as those developed by the Commissioners – are crucial. There should be an emphasis on purpose and time-limited solutions that minimize privacy impacts.

3) A third theme is human rights. Privacy is closely tied to human rights, but this relationship is increasingly complex in a data driven society. Privacy laws govern data collection, use and disclosure, and it is increasingly common for data uses to have significant impacts on human rights and civil liberties, including the freedom of association, freedom of speech, and the right to be free from discrimination. Until recently, public conversations about contact tracing have been predominantly about government-adopted apps to deal with public health disease tracking. As businesses reopen and people go back to work, the conversation will shift to contact-tracing and disease monitoring in the private sector, including the possible use of so-called immunity passports. We will see workplace surveillance technologies as well as technologies that might be used to limit who can enter retail stores, who can access services, who can get on airplanes, and so on. While there are obviously serious public health and safety issues here, as well as issues important to economic recovery and the ability of people to return to work, there is also significant potential for harm, abuse, and injustice. Much of this private sector surveillance will be in areas under provincial jurisdiction, but by no means all of it. The federal government must play a leadership role in setting standards and imposing limitations.

I will end my remarks here and look forward to your questions.


Published in Privacy
