Teresa Scassa - Blog

An interesting case from Quebec demonstrates the tension between privacy and transparency when it comes to public registers that include personal information. It also raises issues around ownership and control of data, including the measures used to prevent data scraping. The way the litigation was framed means that not all of these questions are answered in the decision, leaving some lingering public policy questions.

Quebec’s Enterprise Registrar oversees a registry, in the form of a database, of all businesses in Quebec, including corporations, sole proprietorships and partnerships. The Registrar is empowered to do so under the Act respecting the legal publicity of enterprises (ALPE), which also establishes the database. The Registrar is obliged to make this register publicly accessible, including remotely by technological means, and basic use of the database is free of charge.

The applicant in this case is OpenCorporates, a U.K.-based organization dedicated to ensuring total corporate transparency. According to its website, OpenCorporates has created and maintains “the largest open database of companies in the world”. It currently has data on companies located in over 130 jurisdictions. Most of this data is drawn from reliable public registries. In addition to providing a free, searchable public resource, OpenCorporates also sells structured data to financial institutions, government agencies, journalists and other businesses. The money raised from these sales finances its operations.

OpenCorporates gathers its data using a variety of means. In 2012, it began to scrape data from Quebec’s Enterprise Register. Data scraping involves the use of ‘bots’ to visit and automatically harvest data from targeted web pages. It is a common data-harvesting practice, widely used by journalists, civil society actors and researchers, as well as companies large and small. As common as it may be, it is not always welcome, and there has been litigation in Canada and around the world about the legality of data scraping practices, chiefly in contexts where the defendant is attempting to commercialize data scraped from a business rival.

In 2016 the Registrar changed the terms of service for the Enterprise Register. These changes essentially prohibited web scraping activities, as well as the commercialization of data extracted from the site. The new terms also prohibit certain types of information analyses; for example, they bar searches for data according to the name and address of a particular person. All visitors to the site must agree to the Terms of Service. The Registrar also introduced technological measures to make it more difficult for bots to scrape its data.

Opencorporates Ltd. c. Registraire des entreprises du Québec is not a challenge to the Register’s new, restrictive terms and conditions. Instead, because the Registrar also sent OpenCorporates a cease and desist letter demanding that it stop using the data it had collected prior to the change in Terms of Service, OpenCorporates sought a declaration from the Quebec Superior Court that it was entitled to continue to use this earlier data.

The Registrar acknowledged that nothing in the ALPE authorizes it to control uses made of any data obtained from its site. Further, until it posted the new terms and conditions for the site, nothing limited what users could do with the data. The Registrar argued that it had the right to control the pre-2016 data because of the purpose of the Register. It argued that the ALPE established the Register as the sole source of public data on Quebec businesses, and that the database was designed to protect the personal information that it contained (i.e. the names and addresses of directors of corporations). For example, it does not permit extensive searches by name or address. OpenCorporates, by contrast, permits the searching of all of its data, including by name and address.

The court characterized the purpose of the Register as being to protect individuals and corporations that interact with other corporations by assuring them easy access to identity information, including the names of those persons associated with a corporation. An electronic database gives users the ability to make quick searches from a distance. Quebec’s Act to Establish a Legal Framework for Information Technology provides that where a document contains personal information and is made public for particular purposes, any extensive searches of the document must be limited to those purposes. This law places the onus on the person responsible for providing access to the document to put in place appropriate technological protection measures. Under the ALPE, the Registrar can carry out more comprehensive searches of the database on behalf of users, who must make their request to the Registrar. Even then, the ALPE prohibits the Registrar from using the name or address of an individual as a basis for a search. According to the Registrar, a member of the public has the right to know, once they have the name of a company, with whom they are dealing; they do not have the right to determine the number of companies to which a physical person is linked. By contrast, this latter type of search is one that could be carried out using the OpenCorporates database.

The court noted that it was not its role to consider the legality of OpenCorporates’ database, nor to consider the use made by others of that database. It also observed that individuals concerned about potential privacy breaches facilitated by OpenCorporates might have recourse under Quebec privacy law. Justice Rogers’ focus was on the specific question of whether the Registrar could prevent OpenCorporates from using the data it gathered prior to the change of terms of service in 2016. On this point, the judge ruled in favour of OpenCorporates. In her view, OpenCorporates’ gathering of this data was not in breach of any law that the Registrar could rely upon (leaving aside any potential privacy claims by individuals whose data was scraped). Further, she found that nothing in the ALPE gave the Registrar a monopoly on the creation and maintenance of a database of corporate data. She observed that the use made by OpenCorporates of the data was not contrary to the purpose of the ALPE, which was to create greater corporate transparency and to protect those who interacted with corporations. She ruled that nothing in the ALPE obligated the Registrar to eliminate all privacy risks. The names and addresses of those involved with corporations are public information; the goal of the legislation is to facilitate digital access to the data while at the same time placing limits on bulk searches. Nothing in the ALPE prevented another organization from creating its own database of Quebec businesses. Since OpenCorporates did not breach any laws or terms of service in collecting the information between 2012 and 2016, nothing prevented it from continuing to use that information in its own databases. Justice Rogers issued a declaration to the effect that the Registrar was not permitted to prevent OpenCorporates from publishing and distributing the data it collected from the Register prior to 2016.

While this was a victory for OpenCorporates, it did not do much more than ensure its right to continue to use data that will become increasingly dated. There is perhaps some value in the Court’s finding that the existence of a public database does not, on its own, preclude the creation of derivative databases. However, the decision leaves some important questions unanswered. In the first place, it alludes to but offers no opinion on the ability to challenge the inclusion of the data in the OpenCorporates database on privacy grounds. While a breach of privacy argument might be difficult to maintain in the case of public data regarding corporate ownership, it is hard to predict how it might play out in court. This is far less sensitive data than that involved in the scraping of court decisions litigated before the Federal Court in A.T. v. Globe24h.com; there is a public interest in making the specific personal information available in the Registry; and the use made by OpenCorporates is far less exploitative than in Globe24h. Nevertheless, the privacy issues remain a latent difficulty. Overall, the decision tells us little about how to strike an appropriate balance between the values of transparency and privacy. The legislation and the Registrar’s approach are designed to make it difficult to track corporate ownership or involvement across multiple corporations. Information with low privacy value and a strong public dimension is rigorously protected, with transparency weakened as a result. It is worth noting that another lawsuit against the Register may be in the works. It is reported that the CBC is challenging the decision of the Registrar to prohibit searches by names of directors and managers of companies as a breach of the right to freedom of expression.

Because the terms of service were not directly at issue in the case, there is also little to go on with respect to the impact of such terms. To what extent can terms of service limit what can be done with publicly accessible data made available over the Internet? The recent U.S. case of hiQ Labs Inc. v. LinkedIn Corp. raises interesting questions about freedom of expression and the right to harvest publicly accessible data. This and other important issues remain unaddressed in what is ultimately an interesting but unsatisfying court decision.

 

Published in Privacy

The second discussion paper in Ontario’s lightning-quick consultation on a new data strategy for the province was released on September 20, 2019. Comments are due by October 9, 2019. If you blink, you will miss the consultation. But if you read the discussion paper, it will make you blink – in puzzlement. Although it is clear from its title that Ontario wants to “create economic benefits” through data, the discussion paper is coy, relying mainly on broad generalities with occasional hints at what might actually be in the works.

Governments around the world are clearly struggling to position their countries/regions to compete in a burgeoning data economy. Canada is (until the election period cooled things off) in the middle of developing its own digital and data strategy. Ontario launched its data strategy consultation in February 2019. The AI industry (in which Canada and Ontario both aspire to compete) is thirsty for data, and governments are contemplating the use of AI to improve governance and to automate decision-making. It is not surprising, therefore, that this document tackles the important issue of how to support the data economy in Ontario.

The document identifies a number of challenges faced by Ontario. These include skill and knowledge deficits in existing industries and businesses, the high cost of importing new technologies, limited digital infrastructure outside urban core areas, and international competition for highly qualified talent for the data economy. The consultation paper makes clear that the data strategy will need to address technology transfer, training/education, recruitment, and support for small businesses. Beyond this, a key theme of the document is enhancing access to data for businesses.

It is with respect to data that the consultation paper becomes troublingly murky. It begins its consideration of data issues with a discussion of open government data. Ontario has had an open data portal for a number of years and has been steadily developing it. A new law, pushed through in the omnibus budget bill that followed the Ford government’s election, is the first in Canada to entrench open government data in law. The consultation document seems to suggest that the government will put more resources into open data. This is good. However, the extent of the open data ambitions gives pause. The consultation document notes, “it is important for governments to ensure that the right level of detailed data is released while protecting government security and personal privacy.” Keep in mind that up until now, the approach has simply been not to release, as open data, datasets that contain personal information. This includes data sets that could lead to the reidentification of individuals when combined with other available data. The consultation paper states “Ontario’s government holds vast amounts of data that can help businesses develop new products and services that make Ontarian’s lives easier, while ensuring that their privacy is protected.” These references to open data and privacy protection are indications that the government is contemplating that it will make personal data in some form or another available for sharing. Alarmingly, businesses may be invited to drive decision-making around what data should be shared. The document states, “New collaboration with businesses can help us determine which data assets have the greatest potential to drive growth.” An out-of-the-blue example provided in the consultation paper is even more disturbing.
At a point where the document discusses classic categories of important open data such as geospatial reference and weather data, it suddenly states “Given that Ontario has a wealth of data in digital health assets, clinical and administrative health data can also be considered a high-value dataset that may present various opportunities for Ontario.”

If personal data is on the table (and the extent to which this is the case should be a matter of serious public consultation and not lightning-round Q & A), then governance becomes all the more important. The consultation paper acknowledges the importance of governance – of a sort. It suggests new guidelines (the choice of words here is interesting – as guidelines are not laws and are usually non-binding) to help govern how data is shared. The language of standards, guidance and best practices is used. Words such as law, regulation and enforcement are not. While “soft law” instruments can have a role to play in a rapidly changing technological environment, Canadians should be justifiably wary of a self-regulating private sector – particularly where there is so much financially at stake for participating companies. It should also be wary of norms and standards developed by ‘stakeholder’ groups that only marginally represent civil society, consumer and privacy interests.

If there is one thing that governments in Canada should have learned from the Sidewalk Toronto adventure, it is that governments and the private sector require social licence to collect and share a population’s personal data. What this consultation does instead is say to the public, “the data we collect about you will be very valuable to businesses and it is in the broader public interest that we share it with them. Don’t worry, we’re thinking about how to do it right.” That is an illustration of paternalism, not consultation or engagement. It is certainly not how you gain social licence.

The Ontario government’s first Consultation Paper, which I discuss here, was about “promoting trust and confidence”, and it ostensibly dealt with privacy, security and related issues. However, the type of data sharing that is strongly hinted at in the second discussion paper is not discussed in that first paper, and the consultation questions in that document do not address it either.

There is a great deal of non-personal government data that can be valuable for businesses and that might be used to drive innovation. There is already knowledge and experience around open data in Ontario, and building upon this is a fine objective. Sharing of personal and human behavioural data may also be acceptable in some circumstances and under some conditions. There are experiments in Canada and in other countries with frameworks for doing this that are worth studying. But this consultation document seems to reflect a desire to put all government data up for grabs, without social licence, with only the vaguest plans for protection, and with a clear inclination towards norms and standards developed outside the usual democratic processes. Yes, there is a need to move quickly – and to be “agile” in response to technological change. But speed is not the only value. There is a difference between a graceful dive and a resounding belly flop – both are fast, only one is agile.

 


A ruling under B.C.’s Personal Information Protection Act (PIPA) will add new fuel to the fires burning around the issue of whether Canada’s federal political parties should have to comply with data protection laws. In Order P19-02, B.C. Privacy Commissioner Michael McEvoy rejected constitutional challenges and ruled that B.C.’s data protection law applied not just to provincial political parties (something it indisputably does), but also to electoral district associations in B.C. established under the Canada Elections Act. The decision means that the hearing into a complaint against the Courtenay-Alberni Riding Association of the New Democratic Party of Canada will now proceed. The riding association will still have the opportunity to argue, within the factual context of the complaint, that the application of specific provisions of PIPA place unacceptable limits on the right to vote and the freedom of expression under the Canadian Charter of Rights and Freedoms (the Charter).

There has been considerable attention paid to the relatively unregulated information handling practices of Canadian political parties in the last few years. A 2012 report commissioned by the Office of the Privacy Commissioner of Canada laid out the legal landscape. In the fall of 2018, federal, provincial and territorial privacy commissioners issued a joint call for meaningful privacy regulation of political parties in Canada. In late 2018, the House of Commons Standing Committee on Access to Information, Privacy and Ethics issued its report titled Democracy Under Threat: Risks and Solutions in the Era of Disinformation and Data Monopoly in which it recommended, among other things, that Canadian political parties be made subject to the Personal Information Protection and Electronic Documents Act (PIPEDA). Instead, the federal government chose to amend the Canada Elections Act to add some fairly tepid requirements for parties to have and make available privacy policies. Meaningful oversight and enforcement mechanisms are notably absent. In April 2019, the Office of the Privacy Commissioner of Canada issued guidance for political parties on how to protect privacy. On August 7, Open Media conducted a review of the privacy policies of Canada’s federal political parties, measuring them against the guidelines issued by the OPC. The review reveals a fairly dismal level of privacy protection. As noted above, B.C.’s PIPA applies to B.C.’s provincial political parties. A review of those parties’ privacy practices earlier this year resulted in an investigation report that makes interesting reading.

It is within this context that a B.C. couple filed a complaint with the B.C. Office of the Information and Privacy Commissioner after each received an email from the NDP’s Courtenay-Alberni Riding Association inviting them to attend a meet and greet with the federal party’s leader. The couple wrote a letter to the local NDP seeking to know what information the party had on them, from whom it had been sourced, with whom it had been shared, and how the information had been and would be used. When they did not receive a satisfactory response, they filed a complaint with the OIPC. Since the NDP objected to the jurisdiction of the OIPC in the matter, the OIPC issued a notice of hearing to determine the preliminary issue of whether B.C.’s PIPA applied to the Courtenay-Alberni Riding Association (the Organization).

The Organization made three constitutional arguments objecting to the jurisdiction of the OIPC. The first was that PIPA cannot apply to federally registered political entities because s. 41 of the Constitution Act, 1867 gives the federal government sole jurisdiction over the conduct of federal elections. The second was that PIPA cannot apply because other federal laws, including the Canada Elections Act and PIPEDA, are paramount. The third was that, if PIPA were found to apply, to the extent that it did so, it would place unjustified limits on the right to vote and the freedom of expression guaranteed under the Charter. As noted above, on this third issue, the adjudicator ruled that there was an insufficient factual context to make a determination. Because Commissioner McEvoy ultimately decided that PIPA applies, the third question will be considered in the context of the hearing into the actual complaint.

Commissioner McEvoy noted that PIPA applies to every “organization” in BC. “Organization” is defined broadly to include: “a person, an unincorporated association, a trade union, a trust or a not for profit organization.” The Riding Association, as an unincorporated association, falls within this definition. He ruled that it made no difference that the organization was established under the constitution of a federal political party or that it is involved in federal politics. He rejected the Organization’s rather convoluted argument that since PIPEDA also applied to ‘organizations’, it precluded the application of BC’s statute. The Commissioner noted that because there is no commercial activity, PIPEDA did not apply to the collection, use or disclosure of personal information by the organization, and thus did not preclude the application of PIPA.

Commissioner McEvoy rejected the first constitutional argument on the basis that PIPA does not attempt to regulate the conduct of federal elections. PIPA’s purpose relates to “the regulation of the collection, use and disclosure of personal information by organizations.” (at para 45) It has nothing to do with any election-related issues such as the establishment of political parties, voting processes, or campaign financing. PIPA itself falls within provincial jurisdiction over “property and civil rights” in B.C. The Organization argued that by applying to federal riding associations in the province, it attempted to affect matters outside the province, but the adjudicator disagreed. He stated: “Analysis of incidental effects should be kept distinct from assessment of whether a provincial statute is validly enacted under the Constitution Act, 1867” (at para 52). He noted that in any event, incidental effects do not necessarily render a statute unconstitutional.

The Commissioner also rejected the paramountcy argument. The Organization argued that PIPA’s provisions conflicted with the Canada Elections Act, as well as the Telecommunications Act and Canada’s Anti-Spam Legislation (CASL), and frustrated a federal purpose, and therefore could not apply to federal riding associations in B.C. Commissioner McEvoy found that there was no actual conflict between the federal and provincial laws. The Canada Elections Act imposes no substantive obligations around, for example, consent to the collection of personal information. It is not a situation where one statute says consent is not required and another says that it is. The Canada Elections Act is simply more permissive when it comes to personal information. Because the do-not-call list established under the Telecommunications Act does not address email communications, which is the subject matter of the actual complaint, there is no conflict with that law. Similarly, he found no conflict with the CASL. Although the CASL permits political parties or organizations to send emails without consent to solicit donations, the email that was the subject of the complaint before the OIPC did not solicit a donation, but was rather an invitation to an event. As a result, there is no conflict between the laws. Further, case law does not support the view that a conflict is found simply because a provincial law has more restrictive elements than a federal law. The Commissioner stated: “the fact that the Canada Elections Act and the two other federal laws take a permissive approach to use of certain personal information of electors does not of itself establish a conflict with PIPA’s requirements (even if one assumes, for discussion purposes only, that PIPA actually prohibits that which federal law permits.) . . . It is possible to comply with both PIPA and the federal laws [. . .]” (at para 79).

Commissioner McEvoy also rejected the argument that the application of PIPA would frustrate the federal purpose pursued under the Canada Elections Act. He found that the Organization had not adequately established the federal purpose nor had it managed to demonstrate how PIPA frustrated it.

Clearly this particular skirmish is far from complete. It is entirely possible that the Organization will challenge the Commissioner’s decision, and the matter may head to court. Nevertheless, the decision is an important one, as it raises the clear possibility that riding associations of federal political parties in BC might be held to a far stricter standard of data protection than that required of political parties elsewhere in Canada. This will increase the growing pressure on the federal government to take real, concrete steps to ensure that political parties are held to the same standards as private sector organizations when it comes to collecting, using and disclosing personal information. Given the vast amounts of data available, the potential for intrusive and inappropriate uses, the controversies around profiling and targeting, and the growing risks of harm from data breaches, this is an unacceptable legislative gap.

 


On July 31, 2019 the Ontario Government released a discussion paper titled Promoting Trust and Confidence in Ontario’s Data Economy. This is the first in a planned series of discussion papers related to the province’s ongoing Data Strategy consultation. This particular document focuses on the first pillar of the strategy: Promoting Trust and Confidence. The other pillars are: Creating Economic Benefit; and Enabling Better, Smarter Government. The entire consultation process is moving at lightning speed. The government plans to have a final data strategy in place by the end of this calendar year.

My first comment on the document is about timing. A release on July 31, with comments due by September 6, means that it hits both peak vacation season and the mad back-to-school rush. This is not ideal for gathering feedback on such an important set of issues. A further timing issue is the release of this document and the call for comments before the other discussion papers are available. The result is a discussion paper that considers trust and confidence in a policy vacuum, even though it makes general reference to some pretty big planned changes to how the public sector will handle Ontarians’ personal information as well as planned new measures to enable businesses to derive economic benefit from data. It would have been very useful to have detailed information about what the government is thinking of doing on these two fronts before being asked what would ensure ongoing trust and confidence in the collection, use and disclosure of Ontarians’ data. Of course, this assumes that the other two discussion documents will contain these details – they might not.

My second comment is about the generality of this document. This is not a consultation paper that proposes a particular course of action and seeks input or comment. It describes the current data context in broad terms and asks questions that are very general and open-ended. Here are a couple of examples: “How can the province help businesses – particularly small and medium-sized businesses – better protect their consumers’ data and use data-driven practices responsibly?” “How can the province build capacity and promote culture change concerning privacy and data protection throughout the public sector (e.g., through training, myth-busting, new guidance and resources for public agencies)?” It’s not that the questions are bad ones – most of them are important, challenging and worth thinking about. But they are each potentially huge in scope. Keep in mind that the Data Strategy that these questions are meant to inform is to be released before the end of 2019. It is hard to believe that anything much could be done with responses to such broad questions other than to distil general statements in support of a strategy that must already be close to draft stage.

That doesn’t mean that there are not a few interesting nuggets to mine from within the document. Currently, private sector data protection in Ontario is governed by the federal Personal Information Protection and Electronic Documents Act. This is because, unlike Alberta, B.C. and Quebec, Ontario has not enacted a substantially similar private sector data protection law. Is it planning to? It is not clear from this document, but there are hints that it might be. The paper states that it is important to “[c]larify and strengthen Ontario’s jurisdiction and the application of provincial and federal laws over data collected from Ontarians.” (at p. 13) One of the discussion questions is “How can Ontario promote privacy protective practices throughout the private sector, building on the principles underlying the federal government’s private sector privacy legislation (the Personal Information Protection and Electronic Documents Act)?” Keep in mind that a private member’s bill was introduced by a Liberal backbencher just before the last election that set out a private sector data protection law for Ontario. There’s a draft text already out there.

Given that this is a data strategy document for a government that is already planning to make major changes to how public sector data is handled, there are a surprising number of references to the private sector. For example, in the section on threats and risks of data-driven practices, there are three examples of data breaches, theft and misuse – none of which are from Ontario’s public sector. This might support the theory that private sector data protection legislation is in the offing. On the other hand, Ontario has jurisdiction over consumer protection; individuals are repeatedly referred to as “consumers” in the document. It may be that changes are being contemplated to consumer protection legislation, particularly in areas such as behavioural manipulation, and algorithmic bias and discrimination. Another question hints at possible action around online consumer contracts. These would all be interesting developments.

There is a strange tension between public and private sectors in the document. Most examples of problems, breaches, and technological challenges are from the private sector, while the document remains very cagey about the public sector. It is this cageyness about the public sector that is most disappointing. The government has already taken some pretty serious steps on the road to its digital strategy. For example, it is in the process of rolling out much broader sharing of personal information across the public sector through amendments to the Freedom of Information and Protection of Privacy Act passed shortly after the election. These will take effect once data standards are in place (my earlier post on these amendments is here). The same bill enacted the Simpler, Faster, Better, Services Act. This too awaits regulations setting standards before it takes effect (my earlier post on this statute is here). These laws passed under the public radar because they were rushed through in an omnibus budget bill and with little debate. It would be good to have a clear, straightforward document from the government that outlines what it plans to do under both of these new initiatives and what it will mean for Ontarians and their personal data. Details of this kind would be very helpful in allowing Ontarians to make informed comments on trust and confidence. For example, the question “What digital and data-related threats to human rights and civil liberties pose the greatest risk for Ontarians” (p. 14) might receive different answers if readers were prompted to think more specifically about the plans for greater sharing of personal data across government, and a more permissive approach to disclosures for investigatory purposes (see my post on this issue here).

The discussion questions are organized by category. Interestingly, there is a separate category for 'Privacy, Data Protection and Data Governance'. That's fine – but consider that there is a later category titled 'Human Rights and Civil Liberties'. Those of us who think privacy is a human right might find this odd. It is also odd that the human rights/civil liberties discussion is separated from data governance, since the two are surely related. It is perhaps wrong to read too much into this, since the document was no doubt drafted quickly. But thinking about privacy as a human right is important. The document's focus on trust and confidence seems to relegate privacy to a lower status. It states: "A loss of trust reduces people's willingness to share data or give social license for its use. Likewise, diminishing confidence impedes the creative risk-taking at the heart of experimentation, innovation and investment." (at p. 8) In this plan, protection of privacy is about ensuring trust, which will in turn foster a thriving data economy. The fundamental question at the heart of this document is thus not: 'What measures should be taken to ensure that fundamental values are protected and respected in a digital economy and society?' Rather, it is: 'What will it take to make you feel ok about sharing large quantities of personal information with business and government to drive the economy and administrative efficiencies?' This may seem like nitpicking, but keep in mind that the description of the 'Promoting Trust and Confidence' pillar promises "world-leading, best-in-class protections that benefits the public and ensures public trust and confidence in the data economy" (at p. 4). Right now, Europe's GDPR offers the world-leading, best-in-class protections. It does so because it treats privacy as a human right and puts the protection of this and other human rights and civil liberties at the fore. A process that puts feeling ok about sharing lots of data at the forefront won't keep pace.

Published in Privacy
Friday, 19 July 2019 09:15

Open Banking in Canada - A Primer

I wrote a short paper on Open Banking in Canada for a presentation I gave to the Digital Strategy Committee of the Board of Directors of Vancity, a Vancouver-based credit union.  The text of this paper is available by clicking on "read more" and/or downloading the PDF attachment below.

Smart city data governance has become a hot topic in Toronto in light of Sidewalk Labs' proposed smart city development for Toronto's waterfront. In its Master Innovation and Development Plan (MIDP), Sidewalk Labs has outlined a data governance regime for "urban data" that will be collected in the lands set aside for the proposed Sidewalk Toronto smart city development. The data governance scheme sets out to do a number of different things. First, it provides a framework for sharing 'urban data' with all those who have an interest in using this data. This could include governments, the private sector, researchers or civil society. Because the data may have privacy implications, the governance scheme must also protect privacy. Sidewalk Labs is also proposing that the governance body be charged with determining who can collect data within the project space, and with setting any necessary terms and conditions for such collection and for any subsequent use or sharing of the data. The governance body, named the Urban Data Trust (UDT), will have a mandate to act in the public interest, and it is meant to ensure that privacy is respected and that any data collection, use or disclosure – even if the data is non-personal or deidentified – is ethical and serves the public interest. Sidewalk Labs proposes a five-person governance body, with representation from different stakeholder communities, including "a data governance, privacy, or intellectual property expert; a community representative; a public-sector representative; an academic representative; and a Canadian business industry representative" (MIDP, Chapter 5, p. 421).

The merits and/or shortcomings of this proposed governance scheme will no doubt be hotly debated as the public is consulted and as Waterfront Toronto develops its response to the MIDP. One thing is certain – the plan is sure to generate a great deal of discussion. Data governance for data sharing is becoming an increasingly important topic (it is also relevant in the Artificial Intelligence (AI) context) – one where there are many possibilities and proposals and much unexplored territory. Relatively recent publications on data governance for data sharing include reports by Element AI, MaRS, and the Open Data Institute. These reflect both the interest in and the uncertainties around the subject. Yet in spite of the apparent novelty of the subject and the flurry of interest in data trusts, there are already many different existing models of data governance for data sharing. These models may offer lessons that are important in developing data governance for data sharing for both AI and for smart city developments like Sidewalk Toronto.

My co-author Merlynda Vilain and I have just published a paper that explores one such model. In the early 2000s, the Ontario government decided to roll out mandatory smart metering for electrical consumption in the province. Over a period of time, all homes and businesses would be equipped with smart meters, and these meters would collect detailed data in real time about electrical consumption. The proposal raised privacy concerns, particularly because detailed electrical consumption data could reveal intimate details about the activities of people within their own homes. The response to these concerns was to create a data governance framework that would protect customer privacy while still reaping the benefits of the detailed consumption data.

Not surprisingly, as the data economy surged alongside the implementation of smart metering, interest in access to deidentified electrical consumption data grew across different levels of government and within the private sector. The data governance regime therefore had to adapt to a growing demand for access to the data from a broadening range of actors. Protecting privacy became a major concern, and this involved not just applying deidentification techniques, but also setting terms and conditions for reuse of the data.

The Smart Metering Entity (SME), the data governance body established for smart metering data, provides an interesting use case for data governance for data sharing. We carried out our study with this in mind; we were particularly interested in seeing what lessons the SME might offer for data governance in other contexts. We found that the SME made a particularly interesting case study because it involved public sector data, public and private sector stakeholders, and a considerable body of relatively sensitive personal information. It also provides a good example of a model that had to adapt to changes over a relatively short period of time – something that may be essential in a rapidly evolving data economy. There were changes in the value of the data collected, and new demands for access to the data by both public and private sector actors. Because of the new demand and new users, the SME was also pushed to collect additional data attributes to enrich the value of its data for potential users.

The SME model may be particularly useful to think about in the smart cities context. Smart cities also involve both public and private sector actors, they may involve the collection of large volumes of human behavioural data, and this gives rise to a strong public interest in appropriate data governance. Another commonality is that in both the smart metering and smart cities contexts individuals have little choice but to have their data collected. The underlying assumption is that the reuse and repurposing of this data across different contexts serves the public interest in a number of different ways. However, 'public interest' is a slippery fish and is capable of multiple interpretations. With a greatly diminished role for consent, individuals and communities require frameworks that can assist not just in achieving the identified public interests, but in helping them to identify and set those interests in the first place, while at the same time protecting individual and community privacy and ensuring that data is not used in ways that are harmful or exploitative.

Overall, our study gave us much to think about, and its conclusion develops a series of 'lessons' for data governance for data sharing. A few things are worthy of particular note in relation to Sidewalk Labs' proposed Urban Data Trust. First, designing appropriate governance for smart metering data was a significant undertaking that took a considerable amount of time, particularly as demands for data evolved. This was the case even though the SME was dealing only with one type of data (smart metering data) and was not responsible for overseeing new requests to collect new types of data. This is a sobering reminder that designing good data governance – particularly in complex contexts – may take considerable time and resources. The proposed UDT is very complex. It will deal with many different types of data, data collectors, and data users. It is also meant to approve and set terms and conditions for new collection and uses. The feasibility of creating robust governance for such a complex context is therefore an issue – especially within the relatively short timelines for the project. Defining the public interest – which both the SME and the UDT are meant to serve – is also a challenge. In the case of the SME, the democratically elected provincial government determines the public interest at a policy level, and it is implemented through the SME. Even so, there are legitimate concerns about representation and about how the public interest is defined. With the UDT, it is not clear who determines the public interest or how. There will be questions about who oversees appointments to the UDT, and how different stakeholders and their interests are weighted in its composition and in its decision-making.

Our full paper can be found in open access format on the website of the Centre for International Governance Innovation (CIGI): here.



In June 2019, the Standing Senate Committee on Banking, Trade and Commerce (BANC) released its report on open banking following hearings it held in the spring of 2019. The federal government, which has been conducting its own consultation into open banking, has yet to issue a report.

For those who have not been following discussions around this issue, ‘open banking’ refers to a framework that enables consumers to share their personal financial data with financial services providers in a secure manner. The anticipated benefits of open banking include providing consumers of financial services (both individuals and small businesses) with more and better financial planning and payment options, and stimulating innovation in the fintech sector. Open banking is no small undertaking. To work, it will require major financial institutions to adopt standardized formats for data. It will also require the adoption of appropriate security measures. A regulator will have to create a list of approved open banking fintech providers. There will also need to be oversight from competition and privacy commissioners. For consumer privacy to be adequately protected there will have to be an overhaul of Canada’s Personal Information Protection and Electronic Documents Act.

The BANC committee report reviews the testimony it heard and makes a number of recommendations. It begins by noting that approximately 4 million Canadians already make use of fintech apps to obtain financial services not otherwise available. These apps require users to hand over their banking usernames and passwords so that the apps can repeatedly access and screen-scrape their financial data. This is a risky practice, and one that may violate the terms of service for those customer accounts, leaving consumers vulnerable and unprotected. The Senate report notes that the legal and regulatory changes needed to implement open banking in Canada – as well as the necessary work on standards and interoperability – will take time. As a result, the first part of the report makes a number of recommendations to address, in the short term, the protection of Canadians who engage in screen-scraping.
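The contrast between credential-based screen-scraping and a scoped open banking framework can be illustrated with a small sketch. This is a purely hypothetical Python illustration (none of these classes correspond to any real banking API): an app holding full credentials can do anything the account holder can, whereas an open-banking-style access token can be limited to named permissions and revoked by the consumer.

```python
from dataclasses import dataclass, field

@dataclass
class ScreenScrapingAccess:
    """Hypothetical: the app holds the user's full banking credentials."""
    username: str
    password: str  # grants unrestricted access to the whole account

    def can_do(self, action: str) -> bool:
        # With full credentials, nothing distinguishes reading a balance
        # from moving money: the app can do anything the user can.
        return True

@dataclass
class OpenBankingToken:
    """Hypothetical: a scoped, revocable token issued under an open banking framework."""
    scopes: set = field(default_factory=lambda: {"read:transactions"})
    revoked: bool = False

    def can_do(self, action: str) -> bool:
        # Access is limited to explicitly granted scopes and ends on revocation.
        return (not self.revoked) and action in self.scopes

scraper = ScreenScrapingAccess("user", "hunter2")
token = OpenBankingToken()

print(scraper.can_do("transfer:funds"))   # True – no way to limit it
print(token.can_do("read:transactions"))  # True – explicitly granted
print(token.can_do("transfer:funds"))     # False – outside the granted scope
token.revoked = True
print(token.can_do("read:transactions"))  # False – consumer withdrew consent
```

The design point is the one the committee makes: screen-scraping offers no technical means of limiting or withdrawing access short of changing one's password, while a standardized framework can build in scope limits and revocation.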

The BANC committee notes that other countries – including Australia and the UK – are already further ahead than Canada in launching open banking initiatives. It expresses concern that Canada may be falling behind what is an international shift towards open banking, noting that "without swift action, Canada may become an importer of financial technology rather than an exporter" (at p. 14). The report makes a number of recommendations to facilitate the adoption of open banking in Canada, urging a "principles-based, industry-led open banking framework that would be integrated with existing financial sector and privacy legislation" (Recommendation III). The recommendations include work on developing standards, creating a registry of accredited providers of fintech services, legislating limits on the use of standardized and interoperable consumer financial data, creating a framework in which provincially regulated credit unions and caisses populaires can participate, improving broadband access for rural and remote communities, reforming PIPEDA, and creating appropriate regulatory oversight and enforcement mechanisms.

The BANC committee correctly links open banking to a broader data portability right. This portability right, which is present in the EU's General Data Protection Regulation (GDPR), is one of the 10 principles articulated in the federal government's new Digital Charter. The federal government's recent discussion paper on PIPEDA reform also references data portability. Data portability is a mechanism by which individuals are given much greater control over their data – allowing them to 'port' their data from one provider to another. It also has the potential to encourage competition and to stimulate innovation in the tech sector. However, for the BANC committee, consumer control is at the heart of open banking. The Committee clearly sees open banking as something that should benefit consumers. It characterizes open banking as giving consumers more control over their personal financial information, and as something that can provide them with a "more personalized, convenient digital banking experience" (at p. 37).

Indeed, the BANC committee report as a whole places consumer interests at the centre of the move towards open banking. As noted earlier, its first recommendations are oriented towards taking action to protect consumers who are engaging in screen-scraping to obtain the fintech services they want. It is also sharply critical of the federal government for not appointing a consumer advocate to its Advisory Committee on Open Banking, even though the Department of Finance indicates that it has consulted widely to obtain consumer and civil society input. The BANC committee expresses concern that not enough is known about the potential impacts of open banking on consumers, and recommends that more research be carried out on these issues as soon as possible, funded by the federal government.



On May 21, 2019, Canada’s federal government launched its Digital Charter, along with several other supporting documents, including its action plan for the Charter and proposals for modernizing the Personal Information Protection and Electronic Documents Act (PIPEDA). Together, the documents discuss the outcomes of the recent federal digital strategy consultation and chart a path forward for federal policy in this area. The documents reflect areas where the government is already forging ahead, and they touch on a number of issues that have been at the centre of media attention, as well as public and private sector engagement.

As a strategy document (which, launched less than six months before an election, it essentially is), the Digital Charter hits many of the right notes, and its accompanying documentation reveals enough work already underway to give shape to its vision and future directions. Navdeep Bains, the Minister of Innovation, Science and Economic Development, describes the Digital Charter as articulating principles that "are the foundation for a made in Canada digital approach that will guide our policy thinking and actions and will help to build an innovative, people-centred and inclusive digital and data economy."

The Digital Charter features 10 basic principles. Three relate to digital infrastructure: universal access to digital services; safety and security; and open and modern digital government. Another three touch on human rights issues: data and digital for good; strong democracy; and freedom from hate and violent extremism. Two principles address data protection concerns: control and consent; and transparency, portability and interoperability — although the latter principle blends into the marketplace and competition concerns that are also reflected in the principle of ensuring a level playing field. Perhaps the most significant principle in terms of impact is the tenth, an overarching commitment to strong enforcement and real accountability. Weak enforcement has undermined many of our existing laws that apply in the digital context, and without enforcement or accountability, there is little hope for a credible strategy. Taken together, the 10 principles reflect a careful and thorough synthesis of some of the issues confronting Canada’s digital future.

Yet, this digital charter might more accurately be described as a digital chart. In essence, it is an action plan, and while it is both credible and ambitious, it is not a true charter. A charter is a document that creates legal rights and entitlements. The Digital Charter does not. Its principles are framed in terms of open-ended goals: “Canadians will have control over what data they are sharing,” “All Canadians will have equal opportunity to participate in the digital world,” or “Canadians can expect that digital platforms will not foster or disseminate hate, violent extremism or criminal content.” Some of the principles reflect government commitments: “The Government of Canada will ensure the ethical use of data.” But saying that some can “expect” something is different from saying they have a right to it.

The goals and commitments in the Digital Charter are far from concrete. That is fair enough — these are complex issues — but concepts such as universal digital access and PIPEDA reform have been under discussion for a long time now with no real movement. A chart shows us the way, but it does not guarantee we’ll arrive at the destination.

It is interesting to note as well that privacy as a right is not squarely a part of the Digital Charter. Although privacy has (deservedly) been a high-profile issue in the wake of the Cambridge Analytica scandal and the controversies over Sidewalk Labs’ proposed smart city development in Toronto, this Digital Charter does not proclaim a right to privacy. A right to be free from unjustified surveillance (by public or private sector actors) would be a strong statement of principle. An affirmation of the importance of privacy in supporting human autonomy and dignity would also acknowledge the fundamental importance of privacy, particularly as our digital economy plows forward. The Digital Charter does address data protection, stating that Canadians will have control over the data they share and will “know that their privacy is protected.” They will also have “clear and manageable access to their personal data.” While these are important data protection goals, they are process-related commitments and are aimed at fostering trust for the purpose of data sharing.

Indeed, trust is at the core of the government's strategy. Minister Bains makes it clear that, in his view, "innovation is not possible without trust." Further, "trust and privacy are key to ensuring a strong, competitive economy and building a more inclusive, prosperous Canada."

Privacy, however, is the human right; trust is how data protection measures are made palatable to the commercial sector. Trust is about relationships — in this case, between individuals and businesses and, to some extent, between individuals and governments. In these relationships, there is a disparity of power that leaves individuals vulnerable to exploitation and abuse. A trust-oriented framework encourages individuals to interact with businesses and government — to share their data in order to fuel the data economy. This is perhaps the core conundrum in creating digital policy in a rapidly shifting and evolving global digital economy: the perceived tension between protecting human rights and values on the one hand, and fostering a competitive and innovative business sector on the other. In a context of enormous imbalance of power, trust that is not backed up by strong, enforceable measures grounded in human rights principles is a flimsy thing indeed.

And this, in a nutshell, is the central flaw in an otherwise promising Digital Charter. As a road map for future government action, it is ambitious and interesting. It builds on policies and actions that are already underway, and sets a clear direction for tackling the many challenges faced by Canada and Canadians in the digital age. It presents a pre-election digital strategy that is in tune with many of the current concerns of both citizens and businesses. As a charter, however, it falls short of grounding the commitments in basic rights and enshrining values for our digital future. That, perhaps, is a tall order and it may be that a transparent set of principles designed to guide government law and policy making is as much as we can expect at this stage. But calling it a Charter misleads, and creates the impression that we have done the hard work of articulating and framing the core human rights values that should set the foundational rules for the digital society we are building.


On May 3, 2019 I was very pleased to give a keynote talk at the Go Open Data 2019 Conference in Toronto (video recordings of the conference proceedings are now available from the site). The following post includes the gist of my talk, along with hyperlinks to the different sources and examples I referenced. My talk was built around the theme of the conference: Inclusive, Equitable, Ethical, and Impactful.

In my talk this morning I am going to use the conference’s themes of Inclusive, Equitable, Ethical and Impactful to shape my remarks. In particular, I will apply these concepts to data in the smart cities context as this has been garnering so much attention lately. But it is also important to think about these in the artificial intelligence (AI) context which is increasingly becoming part of our everyday interactions with public and private sector actors, and is a part of smart cities as well.

As this is an open data conference, it might be fair to ask what smart cities and AI have to do with open data. In my view, these contexts extend the open data discussion because both depend upon vast quantities of data as inputs. They also complicate it. This is for three broad reasons:

First, the rise of smart cities means that there are expanding categories and quantities of municipal (and provincial) data that could be made available as open data. There are also growing quantities of private sector data gathered in urban contexts in a variety of different ways, over which arguments for sharing could be made. Thus, there is more and more data, and issues of ownership, control and access become complex and often conflictual. Open government data used to be about the operations and activities of government, and there were strong arguments for making it broadly open and accessible. But government data is changing in kind, quality and quantity, particularly in smart city contexts. Open data may therefore be shifting towards a more nuanced approach to data sharing.

Second, smart cities and AI are just two manifestations of the expanding demand for access to data for multiple new uses. There is not just MORE data, there are more applications for that data and more demand from public, private sector and civil society actors for access to it. Yet the opacity of data-hungry analytics and AI contribute to a deepening unease about data sharing.

Third, there is a growing recognition that perhaps data sharing should not be entirely free and open. Open data, under an open licence, with few if any restrictions and with no registration requirement, was a kind of ideal, and it fit with the narrower concept of government data described earlier. But it is one that may not be best suited to our current environment. Not only are there potential use restrictions that we might want to apply to protect privacy or to limit undesirable impacts on individuals or communities, but there might also be arguments for cost recovery as data governance becomes more complex and more expensive. This may particularly be the case if use is predominantly by private sector actors – particularly large foreign companies. The lack of a registration requirement limits our ability to fully understand who is using our data, and it reduces the possibility of holding users to account for misuse. Again, this may be something we want to address.

I mentioned that I would use the themes of this conference as a frame for my comments. Let me start with the first – the idea of inclusiveness.

Inclusive

We hear a lot about inclusiveness in smart cities – and at the same time we hear about privacy. These are complicated and intertwined.

The more we move towards using technology as an interface for public and private sector services, for interaction with government, for public consultations, elections, and so on, the more we need to focus on the problem of the digital divide and what it means to include everyone in the benefits of technology. Narrowing the digital divide will require providing greater access to devices, access to WiFi/broadband services, access to computer and data literacy, and access in terms of inclusiveness of differently-abled individuals. These are all important goals, but their achievement will inevitably have the consequence of facilitating the collection of greater quantities of more detailed personal information about those formerly kept on the other side of the digital divide. The more we use devices, the more data we generate. The same can be said of the use of public WiFi. Moving from analog to digital increases our data exhaust, and we become more susceptible to tracking, monitoring, profiling, and so on. Consider the controversial LinkNYC kiosks in New York. These large sidewalk installations include WiFi access, Android tablets, charging stations, and free nation-wide calling. But they have also raised concerns about enhanced tracking and monitoring. This is in part because the kiosks are also equipped with cameras and a range of sensors.

No matter how inclusiveness is manifested, it comes with greater data collection. The more identifiable data collected, the greater the risks to privacy, dignity, and autonomy. But de-identified data also carries its own risks to groups and communities. While privacy concerns may prompt individuals to share less data and to avoid data capture, the value of inclusiveness may actually require having one’s data be part of any collection. In many ways, smart cities are about collecting vast quantities of data of many different kinds (including human behavioural data) for use in analytics in order to identify problems, understand them, and solve them. If one is invisible in the data, so are one’s particular needs, challenges and circumstances. In cases where decisions are made based on available data, we want that data to be as complete and comprehensive as possible in order to minimize bias and to make better diagnoses and decisions. Even more importantly, we want to be included/represented in the data so that our specificity is able to influence outcomes. Inclusiveness in this sense is being counted, and counting.

Yet this type of inclusion has privacy consequences – for individuals as well as groups. One response to this has been to talk about deidentification. While deidentification may reduce some privacy risks, it does not reduce or eliminate all of them. It also does not prevent harmful or negative uses of the data (and it may evade the accountability provided by data protection laws). Nor does it address the dignity and autonomy issues that come from the sense of being under constant surveillance.
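Why deidentification does not eliminate privacy risk can be shown with a minimal linkage-attack sketch. The records, names, and fields below are entirely invented for illustration: direct identifiers (names) have been stripped from a dataset, but remaining quasi-identifiers (here, a postal code prefix and birth year) can be matched against an auxiliary public source to reidentify a record.

```python
# Hypothetical "deidentified" records: names removed, but quasi-identifiers remain.
deidentified = [
    {"postal": "K1A", "birth_year": 1975, "diagnosis": "diabetes"},
    {"postal": "M5V", "birth_year": 1990, "diagnosis": "asthma"},
]

# A hypothetical auxiliary public source (e.g. a voters' list)
# sharing the same quasi-identifiers.
public_list = [
    {"name": "A. Example", "postal": "K1A", "birth_year": 1975},
    {"name": "B. Sample", "postal": "M5V", "birth_year": 1982},
]

def reidentify(record, aux):
    """Return the names of everyone in `aux` whose quasi-identifiers
    match the given deidentified record."""
    return [p["name"] for p in aux
            if p["postal"] == record["postal"]
            and p["birth_year"] == record["birth_year"]]

matches = reidentify(deidentified[0], public_list)
print(matches)  # ['A. Example'] – a unique match reidentifies the record
```

A unique match re-attaches a name, and the sensitive attribute, to a supposedly anonymous record; this is the kind of risk that grows as more identifiable data is collected about everyone brought across the digital divide.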

Equitable and Ethical

If we think about issues of equity and ethics in the context of the sharing of data it becomes clear that conventional open data models might not be ideal. These models are based on unrestricted data sharing, or data sharing with a bare minimum of restrictions. Equitable and ethical data sharing may require more restrictions to be placed on data sharing – it may require the creation of frameworks for assessing proposed uses to which the data may be put. And it may even require changing how access to data is provided.

In the privacy context we have already seen discussion about reforming the law to move away from a purely consent-based model to one in which there may be “no-go zones” for data use/processing. The idea is that if we can’t really control the collection of the information, we should turn our attention to identifying and banning certain inappropriate uses. Translated into the data sharing context, licence agreements could be used to put limits on what can be done with data that is shared. Some open data licences already explicitly prohibit any attempts to reidentify deidentified data. The Responsible Data Use Assessment process created by Sidewalk Labs for its proposed data governance framework for Toronto’s Quayside development similarly would require an ‘independent’ body to assess whether a proposed use of urban data is acceptable.

The problem, of course, is that licence-based restrictions require oversight and enforcement to have any meaning. I wrote about this a couple of years ago in the context of the use of social media data for analytics services provided to police services across North America. The analytics companies contracted for access to social media data but were prohibited in their terms of use from using this data in the way they ultimately did. The problem was uncovered after considerable effort by the ACLU and the Brennan Center for Justice – it was not discovered by the social media companies who provided access to their data or who set the terms of use. In the recent Report of Findings by the Privacy Commissioner of Canada into Facebook’s role in the Cambridge Analytica scandal, the Commissioner found that although Facebook’s terms of service with developers prohibited the kind of activities engaged in by Dr Kogan who collected the data, they failed in their duty to safeguard personal information, and in particular, ignored red flags that should have told them that there was a problem. Let’s face it; companies selling access to data may have no interest in policing the behaviour of their customers or in terminating their access. An ‘independent’ body set up to perform such functions may lack the resources and capacity to monitor and enforce compliance.

Another issue that exists with ethical approaches is, of course, whose ethics? Taking an ethical approach does not mean being value-neutral and it does not mean that there will not be winners and losers. It is like determining the public interest – an infinitely malleable concept. This is why the composition of decision-making bodies and the location of decision-making power, when it comes to data collection and data sharing, is so important and so challenging.

Impactful

In approaching this last of the conference’s themes – impactful – I think it is useful to talk about solutions. And since I am almost out of time and this is the start of the day’s events, I am going to be very brief as solutions will no doubt be part of the broader discussion today.

The challenges of big data, AI and smart cities have led to a broad range of different proposed data governance solutions. Some of these are partial; for example, deidentification/anonymization or privacy by design approaches address what data is collected and how, but they do not necessarily address uses.

Some are aspirational: for example, ethical approaches to AI such as the Montreal Declaration for a Responsible Development of Artificial Intelligence. Others attempt to embed both privacy and ethics into concrete solutions – for example, the federal Directive on Automated Decision-Making for the public sector, which sets parameters for the adoption, implementation and oversight of AI deployment in government. In addition, a number of models are emerging, including data trusts in all their variety (ODI), and bottom-up solutions such as Civic Data Trusts (see, e.g.: MaRS, Element AI, Sean McDonald), which involve access moderated by an independent (?), representative (?) body, in the public interest (?), according to set principles.

Safe sharing sites are another concept, discussed by Lisa Austin and David Lie of the University of Toronto; they are not necessarily independent of data trusts or civic data trusts. Michel Girard is currently doing very interesting work on the use of data standards (see his recent CIGI paper).

Some solutions may also be rooted in law reform as there are deficiencies in our legal infrastructure when it comes to data governance. One key target of reform is data protection laws, but context-specific laws may also be required.

Many of these solutions are in the debate/discussion/development stage. Without a doubt there is a great deal of work to be done. Let’s start doing it.

Published in Privacy

On April 25 the federal Privacy Commissioner and the Privacy Commissioner of British Columbia released a joint Report of Findings in an investigation into Facebook's handling of personal information in relation to the Cambridge Analytica scandal. Not surprisingly, the report found that Facebook was in breach of a number of different obligations under the Personal Information Protection and Electronic Documents Act (PIPEDA). Somewhat more surprisingly, the Report also finds that the corresponding obligations under BC's Personal Information Protection Act (PIPA) were also breached. The Report criticizes Facebook for being less than fully cooperative in the investigation. It also notes that Facebook has disputed the Commissioners' findings and many of their recommendations. The Report concludes by stating that each Commissioner will "proceed to address the unresolved issues in accordance with our authorities" under their respective statutes. Since the federal Commissioner has no order-making powers, the next step for him will be to apply to the Federal Court for an order to compel changes. This will be a hearing de novo – meaning that the same territory will be covered before the Court, and Facebook will be free to introduce new evidence and argument to support its position. The court will owe no deference to the findings of the Privacy Commissioner. Further, while the Federal Trade Commission in the US contemplates fines to impose on Facebook in relation to its role in this scandal, Canada's Commissioner does not have such a power, nor does the Federal Court. This is the data protection law we have – it is not the one that we need. Just as the Cambridge Analytica scandal drew attention to the dynamics and scale of personal data use and misuse, this investigation and its outcomes highlight the weaknesses of Canada's current federal data protection regime.

As for the BC Commissioner – he does have order making powers under PIPA, and in theory he could order Facebook to change its practices in accordance with the findings in the Report. What the BC Commissioner lacks, however, with all due respect, is jurisdiction, as I will discuss below.

While the substantive issues raised in the complaint are important and interesting ones, this post will focus on slightly less well-travelled territory. (For comment on these other issues see, for example, this op-ed by Michael Geist). My focus is on the issue of jurisdiction. In this case, the two Commissioners make joint findings about the same facts, concluding that both statutes are breached. Although Facebook challenges their jurisdiction, the response, in the case of the BC Commissioner's jurisdiction, is brief and unsatisfactory. In my view, there is no advantage to Canadians in having two different data protection laws apply to the same facts, and there is no benefit in a lack of clarity as to the basis for a Commissioner's jurisdiction.

This investigation was carried out jointly by the federal and BC Privacy Commissioners. There is something of a BC nexus, although this is not mentioned in the findings. One of the companies involved in processing data from Facebook is Aggregate IQ, a BC-based analytics company. There is an ongoing joint investigation between the BC and federal Privacy Commissioners into the actions of Aggregate IQ. However, this particular report of findings is in relation to the activities of Facebook, and not Aggregate IQ. While that other joint investigation will raise similar jurisdictional questions, this one deals with Facebook, a company over whose activities the federal Privacy Commissioner has asserted jurisdiction in the past.

There is precedent for a joint investigation of a privacy complaint. The federal privacy commissioners of Australia and Canada carried out a joint investigation into Ashley Madison. But in that case each Commissioner clearly had jurisdiction under their own legislation. This, I will argue, is not such a case. Within Canada, only one privacy Commissioner will have jurisdiction over a complaint arising from a particular set of facts. In this case, it is the federal Privacy Commissioner.

Unsurprisingly, Facebook raised jurisdictional issues. It challenged the jurisdiction of both commissioners. The challenge to the federal Commissioner’s jurisdiction was appropriately dismissed – there is a sufficient nexus between Facebook and Canada to support the investigation under PIPEDA. However, the challenge to the jurisdiction of the BC Commissioner was more serious. Nevertheless, it was summarily dismissed in the findings.

Uneasiness about the constitutional reach of PIPEDA in a federal state has meant that the law, which relies on the federal trade and commerce power for its constitutional legitimacy, applies only in the context of commercial activity. It applies across Canada, but it carves out space for those provinces that want to enact their own data protection laws to assert jurisdiction over the intra-provincial collection, use and disclosure of personal information. To oust PIPEDA in this sphere, these laws have to be considered "substantially similar" to PIPEDA (s. 26(2)(b)). Three provinces (BC, Alberta and Quebec) have substantially similar private sector data protection laws. Even within those provinces, PIPEDA will apply to the collection, use or disclosure of personal information by federally-regulated businesses (such as banks or airlines). It will also apply to cross-border activities by private sector actors (whether international or inter-provincial). This split in jurisdiction over privacy can be complicated for individuals, who may not know where to direct complaints, although the different commissioners' offices will provide assistance. This does not mean there is no room for collaboration. The federal and provincial Commissioners have taken common positions on many issues in the past. These instances are conveniently listed on the website of Alberta's privacy commissioner.

What has happened in this case is quite different. This is described as a joint investigation between the two Commissioners, and it has resulted in a joint set of recommendations and findings. Both PIPEDA and BC’s PIPA are cited as being applicable laws. In response to the challenge to the BC Privacy Commissioner’s jurisdiction, the Report tersely states that “PIPA (Personal Information Protection Act (British Columbia)) applies to Facebook’s activities occurring within the province of BC”. Yet no information is given as to what specific activities of Facebook were exclusively within the province of BC. No distinction is made at any point in the report between those activities subject to PIPA and those falling under PIPEDA. In this respect, it seems to me that Facebook is entirely correct in challenging the BC Privacy Commissioner’s jurisdiction. Facebook collects, uses and discloses personal information across borders, and its activities with respect to Canadians are almost certainly covered by PIPEDA. If that is the case, then they are not also subject to PIPA. The Exemption Order that finds PIPA BC to be substantially similar to PIPEDA provides:

1. An organization, other than a federal work, undertaking or business, to which the Personal Information Protection Act, S.B.C. 2003, c. 63, of the Province of British Columbia, applies is exempt from the application of Part 1 of the Personal Information Protection and Electronic Documents Act, in respect of the collection, use and disclosure of personal information that occurs within the Province of British Columbia.

Section 3(2) of the Personal Information Protection Act provides:

(2) This Act does not apply to the following:

(c) the collection, use or disclosure of personal information, if the federal Act applies to the collection, use or disclosure of the personal information;

The “federal Act” is defined in s. 1 of PIPA to mean PIPEDA. The scheme is quite simple: if PIPEDA applies then PIPA does not. If the federal Commissioner has jurisdiction over the activities described in the Report, the provincial Commissioner does not. The only way in which the BC Commissioner would have jurisdiction is if there are purely local, provincial activities of Facebook that would not be covered by PIPEDA. Nothing in the Findings suggests that there are. At a minimum, if there are separate spheres of legislative application, these should be made explicit in the Findings.

Jurisdictional issues matter. We already have a complex mosaic of different data protection laws (federal, provincial, public sector, private sector, health sector) in Canada. Individuals must muddle through them to understand their rights and recourses, while organizations and entities must likewise understand which laws apply to which of their activities. Each statute has its own distinct sphere of operation. We do not need the duplication that would result from the adjudication of the same complaint under two (or more) different statutes, or the confusion that might result from different outcomes flowing from different complaint resolution processes. If there are separate sets of facts giving rise to separate breaches under different statutes, this has to be spelled out.

Federal-provincial cooperation on data protection is important; it is also valuable for the different privacy commissioners to reach consensus on certain principles or approaches. But creating overlapping jurisdiction over complaints flies in the face of the law and creates more problems than it solves. We have enough data protection challenges to deal with already.

Published in Privacy