
Supporting Innovation: UK Government and ICO Launch Privacy Technology Cost-Benefit Awareness Tool

07 Nov 2024


In a significant step to support the adoption of Privacy Enhancing Technologies (PETs), the Responsible Technology Adoption Unit (RTA) of the Department for Science, Innovation and Technology (DSIT) and the Information Commissioner's Office (ICO) have today released a cost-benefit awareness tool. The tool is intended to highlight essential considerations for organisations exploring PETs, whether evaluating technical options or building a business case for adoption.

The guidance is a welcome step from the government and the ICO, demonstrating their commitment to fostering innovation while protecting data privacy.

What are PETs?

PETs are tools designed to protect sensitive information during data analysis, processing, and sharing. Given that data serves as the fundamental feedstock for training and developing artificial intelligence systems, PETs can play a crucial role in unlocking valuable datasets while maintaining essential privacy safeguards. PETs include methods like homomorphic encryption, synthetic data, trusted execution environments, and federated learning - technologies that could support businesses in handling sensitive data. The toolkit provides organisations with practical insights into these technologies.
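To make the federated learning entry on that list concrete, the minimal sketch below shows a toy "federated averaging" round in Python: each organisation trains on its own data and shares only model parameters, never raw records. The model, data and function names are purely illustrative assumptions rather than anything drawn from the toolkit, and production privacy-preserving federated learning systems layer on further safeguards such as secure aggregation and differential privacy.

```python
# Illustrative sketch only: a toy "federated averaging" round showing the core
# idea behind federated learning - participants share model updates, never raw data.
# The names (local_step, federated_round) and the simple linear model are hypothetical.

from typing import List, Tuple

Dataset = List[Tuple[float, float]]  # (x, y) pairs held privately by one organisation


def local_step(weight: float, data: Dataset, lr: float = 0.01) -> float:
    """One gradient step on a simple model y = w * x, using only local data."""
    grad = sum(2 * (weight * x - y) * x for x, y in data) / len(data)
    return weight - lr * grad


def federated_round(global_weight: float, silos: List[Dataset]) -> float:
    """Each data holder trains locally; only the updated weights are averaged."""
    local_weights = [local_step(global_weight, data) for data in silos]
    return sum(local_weights) / len(local_weights)


if __name__ == "__main__":
    # Two organisations with private datasets drawn from roughly y = 3x.
    silo_a = [(1.0, 3.1), (2.0, 5.9)]
    silo_b = [(1.5, 4.4), (3.0, 9.2)]

    w = 0.0
    for _ in range(200):
        w = federated_round(w, [silo_a, silo_b])
    print(f"Learned weight without pooling raw data: {w:.2f}")  # approaches ~3
```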

Toolkit highlights

Of particular significance is the toolkit's pragmatic approach to cost-benefit analysis. It acknowledges the transformative potential of PETs - including enhanced data collaboration opportunities and stronger regulatory compliance - while being candid about implementation challenges such as computational requirements and the need for technical expertise.

The guidance is especially timely as organisations across all sectors grapple with increasingly complex data protection requirements and cyber security threats. The tool's focus on privacy-preserving federated learning (PPFL) as a use case demonstrates how organisations can, in practice, harness advanced analytics while maintaining robust privacy protections - a balance central to fostering innovation while protecting data privacy. The tool provides clear, practical guidance on evaluating and implementing PETs, which will be crucial for maintaining the UK's competitive edge in the digital economy.

The toolkit emphasises the importance of carefully evaluating specific needs and capabilities before adoption, providing guidance that can support strategic planning for data protection and privacy enhancement initiatives. With insights into long-term benefits, including network effects and future regulatory alignment, it serves as a valuable resource for organisations at any stage of their privacy technology journey.

Looking ahead, this initiative could set the stage for wider adoption of PETs across the UK business landscape, potentially establishing new standards for privacy-preserving data collaboration and analysis.

While PETs are not a universal solution to all data protection challenges, they represent a powerful set of tools that, when implemented strategically, can unlock significant value while maintaining robust privacy safeguards. As data increasingly becomes the vital feedstock powering AI, organisations that strategically adopt PETs - aligned with their specific context, capabilities, and use cases - can be well positioned to innovate while protecting sensitive information.

For comprehensive guidance on legal compliance and implementation of best practices, we encourage organisations to consult the ICO's detailed PETs guidance documentation.

For more information about techUK's Digital Ethics and AI Safety work please contact [email protected], and for further details on our AI and Data Policy work please contact [email protected].

If you found this insight interesting, we encourage you to join us for further discussion on digital ethics at this year's Eighth Annual Digital Ethics Summit on 4 December by registering here.

Audre Verseckaite

Senior Policy Manager, Data & AI, techUK

Audre joined techUK in July 2023 as a Policy Manager for Data. Previously, she was a Policy Advisor in the Civil Service, where she worked on the Digital Markets, Competition and Consumers Bill at the Department for Science, Innovation and Technology, and at HM Treasury on designing COVID-19 support schemes and delivering the Financial Services and Markets Bill. Before that, Audre worked at a public relations consultancy, advising public and private sector clients on their communications, public relations, and government affairs strategy.

Prior to this, Audre completed an MSc in Public Policy at the Korea Development Institute and a Bachelor's in International Relations and History from SOAS, University of London. Outside of work, she enjoys spending time outdoors, learning about new cultures through travel and food, and going on adventures.

Email: [email protected] Website: www.techUK.org LinkedIn: https://www.linkedin.com/in/audre-v-81b2b0a2/


Tess Buckley

Programme Manager - Digital Ethics and AI Safety, techUK

Tess is the Programme Manager for Digital Ethics and AI Safety at techUK.

Prior to techUK, Tess worked as an AI Ethics Analyst, a role that revolved around the first dataset on Corporate Digital Responsibility (CDR) and, later, the development of a large language model focused on answering ESG questions for Chief Sustainability Officers. Alongside other responsibilities, she distributed the CDR dataset to investors who wanted to better understand the digital risks in their portfolios, drew narratives and patterns from the data, and collaborated with leading institutes to support academics in AI ethics. She has authored articles for outlets such as ESG Investor, the Montreal AI Ethics Institute, The FinTech Times, and Finance Digest, covering topics like CDR, AI ethics, and tech governance and leveraging company insights to contribute valuable industry perspectives. Tess is Vice Chair of the YNG Technology Group at YPO, an AI Literacy Advisor at Humans for AI, a Trustworthy AI Researcher at Z-Inspection Trustworthy AI Labs and an Ambassador for AboutFace.

Tess holds an MA in Philosophy and AI from Northeastern University London, where she specialised in biotechnologies and ableism, following a BA from McGill University, where she joint-majored in International Development and Philosophy, minoring in communications. Tess's primary research interests include AI literacy, AI music systems, the impact of AI on disability rights and the portrayal of AI in media (narratives). In particular, Tess seeks to operationalise AI ethics and use philosophical principles to make emerging technologies explainable and ethical.

Outside of work Tess enjoys kickboxing, ballet, crochet and jazz music.

Email: [email protected] Website: tessbuckley.me LinkedIn: https://www.linkedin.com/in/tesssbuckley/


