Photo by MART PRODUCTION from Pexels

The future of healthcare is being transformed by digital health startups. If you are reading this, you are probably at one of the more than 8,000 healthcare startups around the world building groundbreaking products that improve quality, access, and outcomes in healthcare.

Innovations in healthcare hold a lot of promise. The industry, however, is difficult to disrupt: it involves a complex web of stakeholders (service providers, clinicians, payers, patients, researchers, etc.), a long history of fragmentation and data silos, and heavy regulation. Getting the tech and product right is important, but figuring out the path to commercialization, finding product-market fit, navigating regulatory approvals, and hiring an amazing team are all equally important.

Many growing digital health businesses lack the most important – and newly emerging – part of their leadership teams: an expert in privacy. The goal here is not to hire a subject-matter expert in law or security (although you may also need them). It’s more like hiring a privacy product manager.

Who is a privacy product manager?

The role of a privacy product manager is to oversee how privacy is (or isn’t) observed within an organization. The role is similar to that of a traditional product manager, and to newer specialized roles such as “data product manager” and “AI product manager” that are now common in Silicon Valley.

Privacy is treated like a product, not like a risk factor or a compliance burden.

Privacy product managers generally combine a background in privacy law and regulation, product management, and technology, which equips them to act as the privacy experts within an organization.

It is the privacy PM’s responsibility to balance product strategy, governance, and implementation for anything privacy- and ethics-related, and to facilitate communication among all relevant stakeholders – executives, engineers, analysts, product teams, marketing teams, regulators, external partners, and customers/users.

Just as PMs traditionally represent the voice of the customer during product development, privacy PMs represent the customer’s trust expectations: their job is to ensure values such as privacy and ethics are implemented at a practical level within your organization and embodied in your product. They are also the privacy evangelist for your company. The value they add extends far beyond legal and regulatory advice.

Here are five reasons why digital health products must be privacy-first

Privacy is an issue that transcends compliance – it affects the entire software development lifecycle, engineering, product design, data flows, analytics, marketing, and technology architecture. Privacy must be considered in every aspect of your product’s design and commercialization, particularly in digital health. Here’s why:

1. Medical data is a high-stakes game

We are living in an “age of data privacy”. The world’s biggest brands now treat privacy and data ethics as a unique selling proposition rather than a risk to manage. Nowhere is this more relevant than in digital health, where companies trade in some of the most personal data there is.

Digital health is a high-stakes game due to the sensitive nature of clinical data. Whether it is unethical or unauthorized data sharing, prediction errors in medical AI models, or data breaches, the consequences can be dire. Unlike other tech companies, digital health companies cannot afford to move fast and break things.

The Cambridge Analytica scandal – though it played out in social media rather than healthtech – is a perfect example of how clumsy slip-ups with personal data can cause direct harm to people (as well as lead to hefty fines and liability claims).

2. Digital health products will require a high level of trust

After a decade of monetizing personal data and dismissing privacy, Google, Apple, and Facebook have poured huge amounts of money and resources into shiny new privacy agendas.

When it comes to data control, transparency, and ethics, consumers’ expectations have changed significantly. Future titans will be those companies that lead in privacy, data ethics, security, and reliability.

3. Privacy will affect regulatory approvals and healthcare procurement

Australia’s Therapeutic Goods Administration (TGA) released its revised cybersecurity guidance for pre- and post-market medical devices in April 2021. It advised healthcare providers to ask for details about a developer’s privacy and security practices, including how data from the device is logged.
As part of its proposed Software Precertification Program and its SaMD Action Plan, the FDA anticipates that its evaluation of software developers’ safety and quality standards will include security, privacy, and data protection systems, processes, and controls.

Meanwhile, a recent FTC statement aimed at developers of digital health apps, connected devices, and other health products requires them to follow the Health Breach Notification Rule, regardless of whether they’re covered by HIPAA’s privacy and security standards.

4. Digital health requires privacy compliance

In recent years, privacy laws have been passed in more than 60 countries, driven by the desire to protect consumers’ data in the digital age and inspired by the EU’s General Data Protection Regulation (GDPR) and the hefty fines it entails.

Under GDPR, anyone collecting, storing, or using health data (very broadly defined) has a number of obligations, including the requirement to implement privacy by design from the very beginning.
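To make “privacy by design” concrete, here is a minimal sketch in Python of two techniques a privacy-minded team might apply at the data-ingestion layer: data minimization (store only the fields a feature actually needs) and pseudonymization (replace direct identifiers with keyed tokens). The field names and key handling are hypothetical, and this is an illustration rather than a compliance-grade implementation.

```python
import hmac
import hashlib

# Fields our hypothetical feature actually needs -- everything else is
# dropped at ingestion (data minimization).
ALLOWED_FIELDS = {"age_band", "diagnosis_code", "visit_date"}

# In practice this key would live in a secrets manager, never in code.
PSEUDONYM_KEY = b"replace-with-managed-secret"

def pseudonymize(patient_id: str) -> str:
    """Replace a direct identifier with a keyed, non-reversible token.

    Using HMAC (rather than a bare hash) means the mapping cannot be
    rebuilt by anyone who doesn't hold the key.
    """
    return hmac.new(PSEUDONYM_KEY, patient_id.encode(), hashlib.sha256).hexdigest()

def minimize_and_pseudonymize(record: dict) -> dict:
    """Apply privacy-by-design transforms before a record is stored."""
    clean = {k: v for k, v in record.items() if k in ALLOWED_FIELDS}
    clean["subject_token"] = pseudonymize(record["patient_id"])
    return clean

raw = {
    "patient_id": "MRN-0042",
    "name": "Jane Doe",          # direct identifier: dropped
    "age_band": "40-49",
    "diagnosis_code": "E11.9",
    "visit_date": "2021-06-01",
}
print(minimize_and_pseudonymize(raw))
```

The point of pushing these transforms to the ingestion layer is that downstream teams (analytics, ML, marketing) never see raw identifiers in the first place, which is what “by design” means in practice.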

Australia, for example, is enacting wide-ranging privacy law reforms that will align its outdated privacy laws with GDPR and impose much higher penalties for breaches – changes that will directly impact the digital health industry. Add to that a complex web of new laws, regulations, and standards governing how AI companies use data, and the impact on the global medtech sector will be huge.

5. The need to obtain quality data cannot be overstated

The quality of medical AI products is largely determined by the quality of the data that fuels them. Handling patient data requires close collaboration with hospitals, labs, and clinicians, all of whom are bound by strict legal, regulatory, and ethical obligations. The best way to foster partnerships with healthcare stakeholders, whether for data access or for product evaluation and validation (such as the recent collaboration between a startup and 20 hospitals across 5 continents), is to embed privacy deeply into your product or platform’s architecture (not just its security).

Your product will likely touch sensitive data many times over, whether for process and service delivery automation, diagnosis and treatment, clinical trials, disease management, or virtual care delivery. Besides wanting your product to work reliably, your business partners will also expect it to be private, ethical, and secure.
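As one illustration of what embedding privacy into the architecture can look like, the sketch below (Python, with an invented schema) applies two simple de-identification steps often used before clinical data is shared with partners: generalizing a quasi-identifier (exact age into an age band) and shifting dates by a per-patient random offset, so intervals between events remain usable for modeling while exact dates are obscured. Real de-identification pipelines are far more involved and need review against the applicable legal standard.

```python
import random
from datetime import date, timedelta

def age_band(age: int, width: int = 10) -> str:
    """Generalize an exact age into a coarse band (a quasi-identifier fix)."""
    low = (age // width) * width
    return f"{low}-{low + width - 1}"

def date_shifter(seed: str, max_days: int = 180):
    """Return a function that shifts dates by a per-patient random offset.

    The same patient always gets the same offset, so intervals between
    that patient's events are preserved for downstream models.
    """
    offset = timedelta(days=random.Random(seed).randint(-max_days, max_days))
    return lambda d: d + offset

shift = date_shifter(seed="MRN-0042")
record = {
    "age": 47,
    "admitted": date(2021, 3, 1),
    "discharged": date(2021, 3, 5),
}
deidentified = {
    "age_band": age_band(record["age"]),
    "admitted": shift(record["admitted"]),
    "discharged": shift(record["discharged"]),
}
print(deidentified)  # the 4-day stay length survives; exact dates do not
```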

How can a privacy expert help your digital health startup?

Bulletproof privacy practices are not just about satisfying regulators and checking compliance boxes – they also make good business sense. A privacy expert can help you:

Integrate privacy & ethics into the product development and design lifecycle – ensure you’re adhering to privacy-by-design principles and that potential biases are properly controlled, both during model training, testing, and validation and, once the product goes live, through continuous monitoring.
Operationalize privacy – develop clear standard procedures, policies, and systems to keep your data collection, development, and design processes as privacy-friendly as possible; handle user requests to delete or transfer personal data (see the sketch after this list); and prepare your infrastructure and environments for future demands.
Drive growth through transparency and ethics – this is already a competitive advantage and builds trust in your brand. With medical AI, both clinicians and patients will want to understand how a model reached its prediction or why certain treatments or actions are recommended, and the answer affects market penetration, adoption, reputation, industry collaborations, and data access.
Support fundraising and investment – VC due diligence takes data privacy into consideration, especially in the medical technology sector. Neglect it and you could struggle to raise capital or enter new markets. When it comes to private equity investments in medtech (more mature companies), PE firms will look closely at how you acquire and repackage proprietary data as a key indicator of scalability. Companies are often hindered from achieving significant growth by a lack of customer participation, data privacy regulations, patient consent requirements, intellectual property laws, or data quality issues.
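To give a flavor of what operationalizing those user requests can mean in code, here is a minimal, hypothetical sketch of the two requests mentioned above: export (data portability) and deletion (right to erasure). The storage and names are invented for illustration; a production version would also cover backups, downstream processors, and audit logging.

```python
import json

# Stand-in for real data stores; in production these would be databases,
# object storage, analytics warehouses, and third-party processors.
USER_RECORDS: dict[str, dict] = {
    "user-123": {"email": "jane@example.com", "readings": [98.6, 99.1]},
}

def export_user_data(user_id: str) -> str:
    """Data portability: hand the user their data in a machine-readable form."""
    record = USER_RECORDS.get(user_id)
    if record is None:
        raise KeyError(f"no data held for {user_id}")
    return json.dumps({"user_id": user_id, **record}, indent=2)

def delete_user_data(user_id: str) -> bool:
    """Right to erasure: remove the user's data, reporting whether any existed.

    A real pipeline would also fan out to every downstream system that
    ever received this user's data.
    """
    return USER_RECORDS.pop(user_id, None) is not None

print(export_user_data("user-123"))
print(delete_user_data("user-123"))  # True
print(delete_user_data("user-123"))  # False -- already erased
```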

Three tips for hiring privacy experts in startup leadership roles

Find someone with:

Breadth of expertise – with privacy touching so many aspects of your business, and the need to work with a multidisciplinary team, your privacy expert must understand the relevant laws and regulations across the globe, along with data and product principles, technology, commercial frameworks, and the healthcare industry landscape.
Startup adaptability – anyone with startup experience (or who has watched HBO’s Silicon Valley) knows that building the plane mid-flight is a typical feature of startup life. Solid corporate backgrounds are useful, but so are the ability to adapt, resourcefulness, and comfort with the unstable, fast-paced nature of a fast-growing business.
A diverse background – it’s well known that diversity and inclusivity are important. When you develop AI products, though, diversity also helps ensure that bias is controlled and your product is built with a range of perspectives. This is especially important if the product is aimed at or intended for a specific user or patient group (e.g. women).
