Privacy in Practice
The Future of Mental Privacy with Kristen Mathews
March 11, 2025
We’ve all had that moment—you think about something, and moments later, something related to it appears on your phone, smartwatch, or tablet. In this world of hyper-personalization, do our very thoughts become data opportunities? And can this shift be regulated? In the latest episode of Privacy in Practice, hosts Kellie du Preez and Danie Strachan sit down with Kristen Mathews, Partner at Cooley's Cyber Data Privacy Practice Group, to explore the emerging field of mental privacy and neurotech regulation. From wearable devices capable of detecting brain activity to the complexities of regulating neural data, this eye-opening conversation unpacks the unique privacy challenges—and opportunities—presented by brain-monitoring technologies. Whether you're interested in the future of data protection or want to understand how businesses can responsibly navigate this evolving space, this episode offers practical insights into the intersection of AI, neurotech, and privacy law.
How long can we hold the line when it comes to new technology accessing our thoughts? In the latest episode of Privacy in Practice, hosts Kellie du Preez and Danie Strachan welcome Kristen Mathews, Partner at Cooley's Cyber Data Privacy Practice Group, to explore the evolving landscape of mental privacy—its challenges, opportunities, and the critical questions shaping its future.

Kristen Mathews is a partner in Cooley's Cyber Data Privacy Practice Group and a pioneering voice in the emerging field of mental privacy and neurotech regulation. With a career spanning multiple decades in privacy law, she has been at the forefront of numerous privacy developments, including early data breach responses, online behavioral advertising implementation, and biometric data protection. Her current focus on mental privacy and neurotech regulation demonstrates her commitment to addressing tomorrow's privacy challenges today. Her practical approach to balancing innovation with privacy protection makes her a valuable guide for privacy professionals navigating emerging technologies and their associated regulatory challenges.


Episode Highlights:

[00:03:18] The Data Exchange Framework for Privacy Communication
Privacy professionals should reframe data collection as a "data exchange" between businesses and consumers, where both parties receive clear value. This framework helps organizations clearly communicate what data they need to provide their service versus additional data they collect for other purposes. Companies should explicitly demonstrate the benefits users receive in exchange for their data, making the value proposition transparent. The approach requires privacy teams to work closely with product and marketing teams to articulate the exchange in user-friendly terms. This helps build trust and reduces the risk of users feeling "cheated" when they later discover unexpected data uses.

[00:07:14] Effective Privacy Notice Design: Beyond the Legal Document
Kristen emphasizes that privacy notices should be integrated into the user interface at the exact moment users need the information, not just buried in legal documents. Privacy professionals should ensure notices match the voice and tone of the service, using the same language style that resonates with users. The information should be presented concisely and prominently, avoiding overwhelming users with legal jargon. This approach helps build trust and transparency while reducing the likelihood of litigation and complaints. For maximum effectiveness, privacy teams should coordinate with UI/UX designers to create notices that appear at key decision points in the user journey.

[00:28:29] Protecting Neural Data: A Layered Security Approach
For organizations working with neural data, Kristen recommends layering multiple protections on top of standard privacy measures. Privacy teams should consider storing neural data locally on the device rather than in the cloud, applying strong encryption so the data can be decrypted only on the individual's own device, and carefully evaluating whether de-identification methods actually hold up as technology advances. Organizations also need to future-proof their protections, anticipating how new capabilities might weaken today's safeguards. This approach helps shield sensitive neural data from breaches, unauthorized access, and potential subpoenas while preserving functionality for legitimate uses.
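For readers who want a concrete picture of the local-storage-plus-encryption pattern described above, here is a minimal sketch, assuming a Python-based companion app and the widely used `cryptography` package. The file names, data format, and helper functions are hypothetical and for illustration only; a production device would keep the key in the operating system's secure keystore rather than in a file.

```python
# Minimal sketch (not a definitive implementation) of device-local storage
# for neural data. Assumptions: a Python companion app, the `cryptography`
# package, and a simple JSON format for raw samples. In a real product the
# key would live in the OS keystore / secure enclave, not a plain file.
import json
from pathlib import Path

from cryptography.fernet import Fernet

KEY_PATH = Path("device_key.bin")        # hypothetical: per-device key, never synced
DATA_PATH = Path("neural_sessions.enc")  # hypothetical: encrypted local store


def load_or_create_device_key() -> bytes:
    """The key is generated on the device and never leaves it."""
    if KEY_PATH.exists():
        return KEY_PATH.read_bytes()
    key = Fernet.generate_key()
    KEY_PATH.write_bytes(key)
    return key


def store_session_locally(samples: list[float]) -> None:
    """Encrypt a recording session and append it to local storage only.

    Raw neural data is never written in plaintext and never sent to a
    server; only the device holding the key can decrypt it.
    """
    f = Fernet(load_or_create_device_key())
    token = f.encrypt(json.dumps(samples).encode("utf-8"))
    with DATA_PATH.open("ab") as fh:
        fh.write(token + b"\n")


def read_sessions_locally() -> list[list[float]]:
    """Decrypt sessions on-device for legitimate, user-facing features."""
    f = Fernet(load_or_create_device_key())
    if not DATA_PATH.exists():
        return []
    return [
        json.loads(f.decrypt(line).decode("utf-8"))
        for line in DATA_PATH.read_bytes().splitlines()
        if line
    ]
```

Because no central server ever holds the raw data or the key in this pattern, a breach of the vendor's cloud or a subpoena served on the company exposes, at most, ciphertext the company itself cannot read.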

[00:31:10] Proactive Self-Regulation in Emerging Technologies
Privacy professionals working with emerging technologies should consider implementing self-regulation before legislation mandates specific requirements. Drawing from the successful example of the ad tech industry, companies should develop privacy protection frameworks that align with their business models while protecting individual rights. Early self-regulation can help shape future legislation in practical ways that work for both businesses and consumers. This approach requires privacy teams to collaborate across industries to establish standards that address key concerns while maintaining innovation. Organizations that take the lead in self-regulation often have more influence over eventual regulatory requirements.


Episode Resources:

Connect with us at [email protected]

This podcast is brought to you by VeraSafe.