

On May 13, 2024, the Ontario government introduced Bill 194. The bill addresses a catalogue of digital issues for the public sector, including cybersecurity, artificial intelligence governance, the protection of the digital information of children and youth, and data breach notification requirements. Consultation on the Bill closes on June 11, 2024. Below is my submission to the consultation. The legislature has now risen for the summer, so debate on the bill will not move forward until the fall.

 

Submission to the Ministry of Public and Business Service Delivery on the Consultation on proposed legislation: Strengthening Cyber Security and Building Trust in the Public Sector Act, 2024

Teresa Scassa, Canada Research Chair in Information Law and Policy, University of Ottawa

June 4, 2024

I am a law professor at the University of Ottawa, where I hold the Canada Research Chair in Information Law and Policy. I research and write about legal issues relating to artificial intelligence and privacy. My comments on Bill 194 are made on my own behalf.

Bill 194 has two schedules. Schedule 1, the Enhancing Digital Security and Trust Act, 2024, has three parts. The first relates to cybersecurity, the second to the use of AI in the broader public service, and the third to the use of digital technology affecting individuals under 18 years of age in the context of Children’s Aid Societies and School Boards. Schedule 2 contains a series of amendments to the Freedom of Information and Protection of Privacy Act (FIPPA). My comments are addressed to each of the Schedules. Please note that all examples provided as illustrations are my own.

Summary

Overall, I consider this to be a timely Bill that addresses important digital technology issues facing Ontario’s public sector. My main concerns relate to the sections on artificial intelligence (AI) systems and on digital technologies affecting children and youth. I recommend the addition of key principles to the AI portion of the Bill in both a reworked preamble and a purpose section. In the portion dealing with digital technologies and children and youth, I note the overlap created with existing privacy laws, and recommend reworking certain provisions so that they enhance the powers and oversight of the Privacy Commissioner rather than creating a parallel and potentially conflicting regime. I also recommend shifting the authority to prohibit or limit the use of certain technologies in schools to the Minister of Education and considering the role of public engagement in such decision-making. A summary of recommendations is found at the end of this document.

Schedule 1 – Cybersecurity

The first section of the Enhancing Digital Security and Trust Act (EDSTA) creates a framework for cybersecurity obligations that is largely left to be filled by regulations. Those regulations may also provide for the adoption of standards. The Minister will be empowered to issue mandatory Directives to one or more public sector entities. There is little detail provided as to what any specific obligations might be, although section 2(1)(a) refers to a requirement to develop and implement “programs for ensuring cybersecurity” and s. 2(1)(c) anticipates requirements on public sector entities to submit reports to the minister regarding cyber security incidents. Beyond this, details are left to regulations. These details may relate to roles and responsibilities, reporting requirements, education and awareness measures, response and recovery measures, and oversight.

The broad definition of a “public sector entity” to which these obligations apply includes hospitals, school boards, government ministries, and a wide range of agencies, boards and commissions at the provincial and municipal level. This scope is important, given the significance of cybersecurity concerns.

Although there is scant detail in Bill 194 regarding actual cybersecurity requirements, this manner of proceeding seems reasonable given the very dynamic cybersecurity landscape. A combination of regulations and standards will likely provide greater flexibility in a changeable context. Cybersecurity is clearly in the public interest, and ensuring it requires setting rules and requirements with appropriate training and oversight. This portion of Bill 194 would create a framework for doing so. This seems like a reasonable way to address public sector cybersecurity, although, of course, its effectiveness will depend upon the timeliness and the content of any regulations.

Schedule 1 – Use of Artificial Intelligence Systems

Schedule 1 of Bill 194 also contains a series of provisions that address the use of AI systems in the public sector. These will apply to AI systems that meet a definition that maps onto the Organization for Economic Co-operation and Development (OECD) definition. Since this definition is one to which many others are being harmonized (including a proposed amendment to the federal AI and Data Act, and the EU AI Act), this seems appropriate. The Bill goes on to indicate that the use of an AI system in the public sector includes the use of a system that is publicly available, that is developed or procured by the public sector, or that is developed by a third party on behalf of the public sector. This is an important clarification. It means, for example, that the obligations under the Act could apply to the use of general-purpose AI that is embedded within workplace software, as well as purpose-built systems.

Although the AI provisions in Bill 194 will apply to “public sector entities” – defined broadly in the Bill to include hospitals and school boards as well as both provincial and municipal boards, agencies and commissions – the AI provisions will only apply to a public sector entity that is “prescribed for the purposes of this section if they use or intend to use an artificial intelligence system in prescribed circumstances” (s. 5(1)). The regulations might also apply to some systems (e.g., general purpose AI) only when they are being used for a particular purpose (e.g., summarizing or preparing materials used to support decision-making). Thus, while potentially quite broad in scope, the actual impact will depend on which public sector entities – and which circumstances – are prescribed in the regulations.

Section 5(2) of Bill 194 will require a public sector entity to which the legislation applies to provide information to the public about the use of an AI system, but the details of that information are left to regulations. Similarly, there is a requirement in s. 5(3) to develop and implement an accountability framework, but the necessary elements of the framework are left to regulations. Under s. 5(4), a public sector entity to which the Act applies will have to take steps to manage risks in accordance with regulations. It may be that the regulations will be tailored to different types of systems posing different levels of risk – detail that would be overwhelming and inflexible if included in the law itself. However, it is important to underline just how much of the normative weight of this law depends on regulations.

Bill 194 will also make it possible for the government, through regulations, to prohibit certain uses of AI systems (s. 5(6) and s. 7(f) and (g)). Interestingly, what is contemplated is not a ban on particular AI systems (e.g., facial recognition technologies (FRT)); rather, it is a potential ban on particular uses of those technologies (e.g., FRT in public spaces). Since the same technology can have uses that are beneficial in some contexts but rights-infringing in others, this flexibility is important. Further, the ability to ban certain uses of FRT on a province-wide basis, including at the municipal level, allows for consistency across the province when it comes to issues of fundamental rights.

Section 6 of the bill provides for human oversight of AI systems. Such a requirement would exist only when a public sector entity uses an AI system in circumstances set out in the regulations. The obligation will require oversight in accordance with the regulations and may include additional transparency obligations. Essentially, the regulations will be used to customize obligations relating to specific systems or uses of AI for particular purposes.

Like the cybersecurity measures, the AI provisions in Bill 194 leave almost all details to regulations. Although I have indicated that this is an appropriate way to address cybersecurity concerns, it may be less appropriate for AI systems. Cybersecurity is a highly technical area where measures must adapt to a rapidly evolving security landscape. In the cybersecurity context, the public interest is in the protection of personal information and government digital and data infrastructures. Risks are either internal (having to do with properly training and managing personnel) or adversarial (where the need is for good security measures to be in place). The goal is to put in place measures that will ensure that the government’s digital systems are robust and secure. This can be done via regulations and standards.

By contrast, the risks with AI systems will flow from decisions to deploy them, their choice and design, the data used to train the systems, and their ongoing assessment and monitoring. Flaws at any of these stages can lead to errors or poor functioning that can adversely impact a broad range of individuals and organizations who may interact with government via these systems. For example, an AI chatbot that provides information to the public about benefits or services, or an automated decision-making system for applications by individuals or businesses for benefits or services, interacts with and impacts the public in a very direct way. Some flaws may lead to discriminatory outcomes that violate human rights legislation or the Charter. Others may adversely impact privacy. Errors in output can lead to improperly denied (or allocated) benefits or services, or to confusion and frustration. There is therefore a much more direct impact on the public, with effects on both groups and individuals. There are also important issues of transparency and trust. This web of considerations makes it less appropriate to leave the governance of AI systems entirely to regulations. The legislation should, at the very least, set out the principles that will guide and shape those regulations. The Ministry of Public and Business Service Delivery has already put considerable work into developing a Trustworthy AI Framework and a set of (beta) principles. This work could be used to inform guiding principles in the statute.

Currently, the guiding principles for the whole of Bill 194 are found in the preamble. Only one of these directly relates to the AI portion of the bill, and it states that “artificial intelligence systems in the public sector should be used in a responsible, transparent, accountable and secure manner that benefits the people of Ontario while protecting privacy”. Interestingly, this statement only partly aligns with the province’s own beta Principles for Ethical Use of AI. Perhaps most importantly, the second of these principles, “good and fair”, refers to the need to develop systems that respect the “rule of law, human rights, civil liberties, and democratic values”. Bill 194, however, is entirely silent with respect to issues of bias and discrimination, which are widely recognized as profoundly important concerns with AI systems and which have been identified as such by Ontario’s privacy and human rights commissioners. At the very least, the preamble to Bill 194 should address these specific concerns. Privacy is clearly not the only human rights consideration at play when it comes to AI systems. The preamble to the federal government’s Bill C-27, which contains the proposed Artificial Intelligence and Data Act, states “that artificial intelligence systems and other emerging technologies should uphold Canadian norms and values in line with the principles of international human rights law”. The preamble to Bill 194 should similarly address the importance of human rights values in the development and deployment of AI systems for the broader public sector.

In addition, the bill would benefit from a new provision setting out the purpose of the part dealing with public sector AI. Such a clause would shape the interpretation of the scope of delegated regulation-making power and would provide additional support for a principled approach. This is particularly important where legislation only provides the barest outline of a governance framework.

In this regard, this bill is similar to the original version of the federal AI and Data Act, which was roundly criticized for leaving the bulk of its normative content to the regulation-making process. The provincial government’s justification is likely to be similar to that of the federal government – it is necessary to remain “agile”, and not to bake too much detail into the law regarding such a rapidly evolving technology. Nevertheless, it is still possible to establish principle-based parameters for regulation-making. To do so, this bill should more clearly articulate the principles that guide the adoption and use of AI in the broader public service. A purpose provision could read:

The purpose of this Part is to ensure that artificial intelligence systems adopted and used by public sector entities are developed, adopted, operated and maintained in a manner that is transparent and accountable and that respects the privacy and human rights of Ontarians.

Unlike AIDA, the proposed federal statute that would apply to the private sector, Bill 194 is meant to apply to the operations of the broader public service. The flexibility in the framework is a recognition of both the diversity of AI systems, and the diversity of services and activities carried out in this context. It should be noted, however, that this bill does not contemplate any bespoke oversight for public sector AI. There is no provision for a reporting or complaints mechanism for members of the public who have concerns with an AI system. Presumably they will have to complain to the department or agency that operates the AI system. Even then, there is no obvious requirement for the public sector entity to record complaints or to report them for oversight purposes. All of this may be provided for in s. 5(3)’s requirement for an accountability framework, but the details of this have been left to regulation. It is therefore entirely unclear from the text of Bill 194 what recourse – if any – the public will have when they have problematic encounters with AI systems in the broader public service. Section 5(3) could be amended to read:

5(3) A public sector entity to which this section applies, shall, in accordance with the regulations, develop and implement an accountability framework respecting their use of the artificial intelligence system. At a minimum, such a framework will include:

a) The specification of reporting channels for internal or external complaints or concerns about the operation of the artificial intelligence system;

b) Record-keeping requirements for complaints and concerns raised under subparagraph 5(3)(a), as well as for responses thereto.

Again, although a flexible framework for public sector AI governance may be an important goal, key elements of that framework should be articulated in the legislation.

Schedule 1 – Digital Technology Affecting Individuals Under Age 18

The third part of Schedule 1 addresses digital technology affecting individuals under age 18. This part of Bill 194 applies to children’s aid societies and school boards. Section 9 enables the Lieutenant Governor in Council to make regulations regarding “prescribed digital information relating to individuals under age 18 that is collected, used, retained or disclosed in a prescribed manner”. Significantly, “digital information” is not defined in the Bill.

The references to digital information are puzzling, as such information seems to be nothing more than a subset of personal information – which is already governed under both the Municipal Freedom of Information and Protection of Privacy Act (MFIPPA) and FIPPA. Personal information is defined in both these statutes as “recorded information about an identifiable individual”. It is hard to see how “digital information relating to individuals under age 18” is not also personal information (a term which has received an expansive interpretation). If it is meant to be broader, it is not clear how. Further, the activities to which this part of Bill 194 will apply are the “collection, use, retention or disclosure” of such information. These are activities already governed by MFIPPA and FIPPA – which apply to school boards and children’s aid societies respectively. What Bill 194 seems to add is a requirement (in s. 9(b)) to submit reports to the Minister regarding the collection, use, retention and disclosure of such information, as well as the power to make regulations under s. 9(c) prohibiting the collection, use, retention or disclosure of prescribed digital information in prescribed circumstances, for prescribed purposes, or subject to certain conditions. Nonetheless, the overlap with FIPPA and MFIPPA is potentially substantial – so much so that s. 14 provides that in case of conflict between this Act and any other, the other Act would prevail. What this seems to mean is that FIPPA and MFIPPA will trump the provisions of Bill 194 in case of conflict. Where there is no conflict, the bill seems to create an unnecessary parallel system for governing the personal information of children.

The need for more to be done to protect the personal information of children and youth in the public school system is clear. In fact, this is a strategic priority of the current Information and Privacy Commissioner (IPC), whose office has recently released a Digital Charter for public schools setting out voluntary commitments that would improve children’s privacy. The IPC is already engaged in this area. Not only does the IPC have the necessary expertise in privacy law, it is also able to provide guidance, accountability and independent oversight. In any event, since the IPC will still have oversight over the privacy practices of children’s aid societies and school boards notwithstanding Bill 194, the new system will mean that these entities will have to comply with regulations set by the Minister on the one hand, and the provisions of FIPPA and MFIPPA on the other. Because conflicts between the two regimes will be resolved in favour of the privacy legislation, it is even conceivable that the regulations could set requirements or standards that are lower than what FIPPA or MFIPPA require – creating an unnecessarily confusing and misleading system.

Another odd feature of the scheme is that Bill 194 will require “reports to be submitted to the Minister or a specified individual in respect of the collection, use, retention and disclosure” of digital information relating to children or youth (s. 9(b)). It is possible that the regulations will specify that reports are to be submitted to the Privacy Commissioner. If so, it is once again difficult to see why a parallel regime is being created. If not, the Commissioner will be continuing her oversight of privacy in schools and children’s aid societies without access to all of the relevant data that might be available.

It seems as if Bill 194 contemplates two separate sets of measures. One addresses the proper governance of the digital personal information of children and youth in schools and children’s aid societies. This is a matter for the Privacy Commissioner, who should be given any additional powers she requires to fulfil the government’s objectives. Sections 9 and 10 of Bill 194 could be incorporated into FIPPA and MFIPPA, with modifications to require reporting to the Privacy Commissioner. This would automatically bring oversight and review under the authority of the Privacy Commissioner. The second objective of the bill seems to be to provide the government with the opportunity to issue directives regarding the use of certain technologies in the classroom or by school boards. This is not unreasonable, but it is something that should be under the authority of the Minister of Education (not the Minister of Public and Business Service Delivery). It is also something that might benefit from a more open and consultative process. I would recommend that the framework be reworked accordingly.

Schedule 2 – FIPPA Amendments

Schedule 2 consists of amendments to the Freedom of Information and Protection of Privacy Act. These are important amendments that will introduce data breach notification and reporting requirements for public sector entities in Ontario that are governed by FIPPA (although, interestingly, not those covered by MFIPPA). For example, a new s. 34(2)(c.1) will require the head of an institution to include in their annual report to the Commissioner “the number of thefts, losses or unauthorized uses or disclosures of personal information recorded under subsection 40.1”. The new subsection 40.1(8) will require the head of an institution to keep a record of any such data breach. Where a data breach reaches the threshold of creating a “real risk that a significant harm to an individual would result” (or where any other circumstances prescribed in regulations exist), a separate report shall be made to the Commissioner under s. 40.1(1). This report must be made “as soon as feasible” after it has been determined that the breach has taken place (s. 40.1(2)). New regulations will specify the form and contents of the report. There is a separate requirement for the head of the institution to notify individuals affected by any breach that reaches the threshold of a real risk of significant harm (s. 40.1(3)). The notification to the individual will have to contain, along with any prescribed information, a statement that the individual is entitled to file a complaint with the Commissioner with respect to the breach, and the individual will have one year to do so (ss. 40.1(4) and (5)). The amendments also identify the factors relevant in determining if there is a real risk of significant harm (s. 40.1(7)).

The proposed amendments also provide for a review by the Commissioner of the information practices of an institution where a complaint has been filed under s. 40.1(4), or where the Commissioner “has other reason to believe that the requirements of this Part are not being complied with” (s. 49.0.1). The Commissioner can decide not to review an institution’s practices in circumstances set out in s. 49.0.1(3). Where the Commissioner determines that there has been a contravention of the statutory obligations, she has order-making powers (s. 49.0.1(7)).

Overall, this is a solid and comprehensive scheme for addressing data breaches in the public sector (although it does not extend to those institutions covered by MFIPPA). In addition to the data breach reporting requirements, the proposed amendments will provide for whistleblower protections. They will also specifically enable the Privacy Commissioner to consult with other privacy commissioners (new s. 59(2)), and to coordinate activities, enter into agreements, and provide for the handling “of any complaint in which they are mutually interested” (s. 59(3)). These are important amendments given that data breaches may cross provincial lines, and Canada’s privacy commissioners have developed strong collaborative relationships to facilitate cooperation and coordination on joint investigations. These provisions make clear that such cooperation is legally sanctioned, which may avoid costly and time-consuming court challenges to the commissioners’ authority to engage in this way.

The amendments also broaden s. 61(1)(a) of FIPPA, which currently makes it an offence to wilfully disclose personal information in contravention of the Act. If passed, it will be an offence to wilfully collect, use or disclose personal information in the same circumstances.

Collectively the proposed FIPPA amendments are timely and important.

Summary of Recommendations:

On artificial intelligence in the broader public sector:

1. Amend the Preamble to Bill 194 to address the importance of human rights values in the development and deployment of AI systems for the broader public sector.

2. Add a purpose section to the AI portion of Bill 194 that reads:

The purpose of this Part is to ensure that artificial intelligence systems adopted and used by public sector entities are developed, adopted, operated and maintained in a manner that is transparent and accountable and that respects the privacy and human rights of Ontarians.

3. Amend s. 5(3) to read:

5(3) A public sector entity to which this section applies, shall, in accordance with the regulations, develop and implement an accountability framework respecting their use of the artificial intelligence system. At a minimum, such a framework will include:

a) The specification of reporting channels for internal or external complaints or concerns about the operation of the artificial intelligence system;

b) Record-keeping requirements for complaints and concerns raised under subparagraph 5(3)(a), as well as for responses thereto.

On Digital Technology Affecting Individuals Under Age 18:

1. Incorporate the contents of ss. 9 and 10 into FIPPA and MFIPPA, with the necessary modification to require reporting to the Privacy Commissioner.

2. Give the authority to issue directives regarding the use of certain technologies in the classroom or by school boards to the Minister of Education and ensure that an open and consultative public engagement process is included.
