Fifty shades of compliance: Automating privacy impact assessments

This article was originally published by The Lawyer’s Daily (www.thelawyersdaily.ca), part of LexisNexis Canada Inc.
Co-written with Sharon Bauer

What we consider to be private is subjective and contextual. Our views on privacy vary with the nuances of the culture, values and technology within which our personal data is handled. This is why privacy does not always fit within the black and white letter of the law; rather, it comes in myriad shades of grey. Adding to this complexity is our fluctuating level of trust in the companies that collect, use and disclose our personal information.


This variability is the very reason companies must continuously assess their data handling practices against the contexts in which the data is used. The uptake in privacy assessments has led to an increase in automated privacy assessment tools, which are meant to gauge a company's privacy compliance posture. While these tools can be efficient and cost effective, they may not always provide an accurate reading, as they do not consider the many extraneous factors that determine a company's privacy posture.


To deal with the complexity of privacy, most privacy regulations are principle-based by design. For example, Canada's Personal Information Protection and Electronic Documents Act (PIPEDA) is built around the 10 privacy principles of the CSA Model Code. These principles provide the foundation for how privacy practices should be formulated, without prescribing specific measures. The European Union's General Data Protection Regulation (GDPR) takes a similar approach: while it is more prescriptive than Canada's legislation and mandates certain processes and procedures, it too allows for context in assessing privacy practices. This flexibility makes it possible to capture novel situations and adapt to ever-changing societal norms and emerging technologies.


The context of privacy


Privacy has become a hot topic in recent years, moving to the forefront of mainstream attention. As more companies collect, use and disclose personal data, instances of data breaches and their significance are rising. Companies have recognized the need to assess their privacy posture to ensure compliance with privacy regulations and to build trusting relationships with their customers through responsible information handling practices. As the need to assess privacy has risen, so too have the options for doing so, such as automated privacy assessment tools.

Automated privacy assessment tools are on the rise and have been advertised as more "efficient" and "inexpensive" than manual privacy assessments. These tools use templated methodologies based on privacy regulations to assess privacy risks in a standardized fashion. While the design of these new privacy assessment tools is welcome, companies need to be cautious, as some of these tools lack a contextual approach to privacy. In other words, they assess a company against the black and white letter of the law, regardless of whether privacy considerations fall in a grey area.


This contrasts with how privacy experts evaluate privacy and its associated risks. When assessing a company's privacy posture, privacy experts take into account the personal data being collected and the purposes for which it is used. Although privacy experts review a company's privacy compliance activities against privacy regulations, they also weigh the collection, use and disclosure of personal data against societal expectations and values, which generally change faster than the law can keep up. The same can be said of emerging technologies, such as Internet of Things (IoT)-enabled devices or artificial intelligence (AI) services, both of which have numerous untested privacy implications.

As a result, privacy experts routinely incorporate the prevailing trends in ethical and moral views when evaluating innovative technologies or company plans against the relevant flexible regulatory backdrop. For many automated assessments, this is unfortunately not yet feasible.


Automating assessments


Many automated assessment tools not only miss the mark on incorporating social context when assessing a company's privacy risks, but also fail to factor in the company's business strategies, goals and vision. These considerations should play a role in how a company's privacy practices are shaped. Without this contextual visibility, an automated assessment tool may not accurately prioritize privacy risks for the company. This can leave the company vulnerable to a privacy breach, which could cost millions of dollars to remediate. On the other hand, an automated assessment may push a company to revamp its privacy program or implement a complex system when it is not necessary to do so. Recommending that a company implement unnecessary and costly systems defeats the purpose of relying on an efficient and cost-effective automated solution in the first place.

As an example, an automated assessment tool may detect that a company's information system has access to individuals' financial information through an API or other integration provided by a financial service provider. Since financial information is considered sensitive personal information under Canadian privacy regulations, the automated assessment may recommend that the company implement expensive safeguards, such as endpoint protection, identity access management or security information and event management (SIEM) solutions, to protect the financial information. This may occur even though the company never uses or plans to access the financial information, something an expert would notice. Depending on the size of the company, implementing these unnecessary safeguards could be costly and could potentially stifle its growth.
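The mismatch described above can be illustrated with a toy sketch: a rule-based check that flags any system able to reach sensitive data, versus a context-aware check that asks whether the sensitive data is actually used. Every system name, data category and inventory field below is a hypothetical assumption for illustration, not the logic of any real assessment tool.

```python
# Illustrative sketch only: all names, categories and fields are hypothetical.
SENSITIVE_CATEGORIES = {"financial", "health", "biometric"}

def naive_assessment(data_inventory):
    """Flag every system that CAN reach sensitive data, regardless of use."""
    return [s["name"] for s in data_inventory
            if SENSITIVE_CATEGORIES & set(s["accessible_categories"])]

def contextual_assessment(data_inventory):
    """Flag only systems that actually collect or use sensitive data."""
    return [s["name"] for s in data_inventory
            if SENSITIVE_CATEGORIES & set(s["categories_in_use"])]

inventory = [
    {
        "name": "billing-api",
        # A financial-provider integration exposes financial data,
        # but the company only ever reads contact details.
        "accessible_categories": ["financial", "contact"],
        "categories_in_use": ["contact"],
    },
    {
        "name": "claims-portal",
        "accessible_categories": ["health"],
        "categories_in_use": ["health"],
    },
]

print(naive_assessment(inventory))       # flags both systems
print(contextual_assessment(inventory))  # flags only the claims-portal
```

The naive check would recommend costly safeguards for the billing system even though the financial data is never used; the context-aware check, like a privacy expert, flags only the system whose actual practices carry the risk.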


Conclusion


As privacy concerns become more pervasive across society, cost-effective automated tools that review privacy risks can be a great resource, making privacy assessments accessible to companies that want to be responsible with their users' privacy. There is also little doubt that, with the growing use of AI, automated privacy assessment tools will become more comprehensive and more accurate in depicting a company's true risk exposure. In the meantime, however, companies that rely on automated privacy risk assessment tools should consider supplementing them with the guidance of a privacy expert, who can provide context and colour beyond the black and white letter of the law.


Photo credit: Adi Goldstein / UNSPLASH.COM