Insights from a Privacy Jedi

Michelle Dennedy
Author: ISACA Now
Date Published: 28 March 2023

Editor’s note: ISACA recently welcomed privacy expert Michelle Dennedy for an Ask Me Anything (AMA) session held 6-10 March on ISACA’s Engage platform. Dennedy is CEO of PrivacyCode, a partner at Privatus Consulting and a member of ISACA’s Digital Trust Advisory Council. She is the co-author of The Privacy Engineer’s Manifesto and The Privacy Engineer’s Companion. The AMA session sparked a lively community discussion of topics including the challenges and successes of privacy programs, predictions surrounding AI and how privacy affects digital trust. See highlights from the thread below; for additional insights and conversations, the complete thread can be found here. To participate in the next AMA session, “I’m the Head of Governance and Trust at the World Economic Forum: Ask Me Anything!” with Daniel Dobrygowski, join the conversation on Engage 24-28 April 2023.

Michelle Dennedy is no stranger to privacy. With a background in intellectual property law, Dennedy broke into privacy as in-house IP counsel to the CMO of Sun Microsystems. She has witnessed first-hand how this practice area has evolved over time, and she said that, from an implementation perspective, the largest challenge is twofold: 1) policy, business and technical teams often speak different languages, and 2) privacy is often considered a cost center rather than a core operating exigency.

“Jargon serves as cultural binding within our tribes to test who is ‘in the know’ or ‘gets it,’ whether we are conscious of this bias or not,” Dennedy writes. “But deeper than being mindful of spelling out acronyms or explaining Latin phrasing or balance sheet requirements, we often give and receive requirements without the rigor needed to ensure execution throughout the soft systems that are required for success.”

Because privacy is treated as an insurance-like task at the C-suite level, the people, processes and nascent technologies navigating a highly complex and dynamic risk landscape continue to lack proper funding. Dennedy believes that, with communities like the group on Engage, privacy professionals can do better through improved systems thinking and innovation.

“If you can frame initiatives in tandem with business objectives over time, privacy and governance programs have a much greater chance at long-term success and growth,” writes Dennedy.

When asked about best practices, Dennedy said she has frequently observed that the success of a privacy program is personality-driven. Legal teams that publish policies but otherwise stay out of a particular business’s systems make it nearly impossible to translate legal requirements into the reality of operational work. Insecure CISOs, or trust officers with combined duties, may lean on their knowledge of information security and fail to understand the critical requirements for authorized data sharing, or to recognize when statistical models of anonymity are better deployed than individually identified data, defaulting instead to a heavy emphasis on data shredding and deletion.
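To make “statistical models of anonymity” concrete, below is a minimal sketch of one common such model, k-anonymity: a dataset is k-anonymous if every combination of quasi-identifier values appears in at least k records. The column names, sample records and the k threshold are invented for illustration and are not drawn from Dennedy’s remarks.

```python
from collections import Counter

def is_k_anonymous(records, quasi_identifiers, k):
    """Return True if every combination of quasi-identifier values
    appears in at least k records (the definition of k-anonymity)."""
    groups = Counter(
        tuple(record[qi] for qi in quasi_identifiers) for record in records
    )
    return all(count >= k for count in groups.values())

# Hypothetical data: zip and birth_year act as quasi-identifiers.
records = [
    {"zip": "94103", "birth_year": 1980, "diagnosis": "A"},
    {"zip": "94103", "birth_year": 1980, "diagnosis": "B"},
    {"zip": "94105", "birth_year": 1975, "diagnosis": "C"},
]

print(is_k_anonymous(records, ["zip", "birth_year"], k=2))
# False: the (94105, 1975) group contains only one record
```

The point of the model is that anonymity is a property of the dataset as a whole, not of any single record, which is why shredding and deletion are not the only levers available to a privacy team.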

Dennedy writes that the most prominent emerging trend is data governance and data quality teams that understand the business they operate within, tied to privacy and security teams backed by engaged, active legal support. A framework fits into this picture, and works to best advantage, when it aligns with the top-line data strategy.

As the conversation turned to the privacy implications of AI, Dennedy suggested that privacy engineering should contemplate both the original state of data and its future recombinant states, and that privacy tools should reflect the risks that arise from this reality. Where encryption or “anonymizing” data fails, she argues that “ethics engineering” should be applied: an additional layer meant to plan for unforeseen or inhumane practices, audit for evidence of their occurrence and respond with appropriate corrective action. Where technology lacks precision, supplementary steps must be taken to ensure proper protection.
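As a hedged illustration of why recombinant states matter, the sketch below shows a classic linkage attack: two releases that each look harmless on their own are joined on shared quasi-identifiers to re-identify a person. All names, fields and records here are hypothetical.

```python
# Two releases that are each "anonymized" on their own can still
# re-identify people once they are recombined on shared fields.

medical = [  # de-identified release: diagnoses, no names
    {"zip": "94103", "birth_year": 1980, "diagnosis": "A"},
]
voter_roll = [  # public record: names, no medical data
    {"name": "Alice", "zip": "94103", "birth_year": 1980},
]

# Recombine on the shared quasi-identifiers (zip, birth_year).
for m in medical:
    for v in voter_roll:
        if (m["zip"], m["birth_year"]) == (v["zip"], v["birth_year"]):
            print(f'{v["name"]} links to diagnosis {m["diagnosis"]}')
```

This is the sense in which “anonymizing things” can fail even when each release is handled correctly in isolation, and why Dennedy argues for planning and auditing beyond the original state of the data.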

This conversation ties directly to the connection between privacy and digital trust. Trust naturally follows where PII is processed according to moral, ethical, legal, sustainable and governable principles. As Dennedy writes, “The more clear our practices, communication and behaviors [become] over time with respect to personal data, the more a person has a positive assumption that we are more likely to behave appropriately with respect to data.”