Using artificial intelligence in clinical trials can be beneficial, but protecting patient privacy is critical.
In today’s world, technology is on everyone’s mind. From measuring the number of steps taken in a day to tracking the quality and number of hours slept, our data is continuously managed and processed by various applications and tools. From an individual perspective, these technological innovations are a one-stop shop for tracking health vitals; however, for a seasoned security professional, they are a path to significant risk regarding the confidentiality, integrity, and availability of personal data. Additionally, with the growing need for cloud storage and the adoption of advanced analytics, such as machine learning and artificial intelligence (AI), these risks are exacerbated and could result in unfair or unauthorized use, access, sharing, or sale of personal data.
Although risks exist, applying technology and advanced analytics within life sciences is an essential part of enhancing the industry’s efficiency and effectiveness. This article highlights critical steps organizations can take to protect individual patient privacy without stifling technological advancements in clinical trial-related activities. It also identifies how to simultaneously maximize the impact of AI and appropriately protect the privacy of clinical trial patients.
COVID-19 and clinical trials
The COVID-19 pandemic has affected all our lives in unanticipated ways.1 Since the pandemic’s beginning, more than 5,000 clinical trials have been launched to test lifesaving COVID-19 treatments and vaccines, such as remdesivir and mRNA vaccines like BNT162b2.2 Outside of COVID-19-related trials, more than 37,000 clinical trials were registered in 2021.3 Traditionally, clinical trials have posed challenges that affect the efficiency of research to develop supporting evidence of medicinal products for patients. Certain factors related to trial inefficiencies and opportunities for improvement continue to be highly relevant today.4
Some of these inefficiencies relate to the identification and recruitment of participants, along with the related clinical data acquisition, which ultimately contribute to inflated costs and trial delays.5 These inefficiencies suggest that new drug development continues to be slower than desired, arduous, and expensive.6 However, some clinical trial sponsors are accelerating their efforts to move toward more patient-centered experiences by adopting digital technology and AI, hoping to address participant-related inefficiencies.7, 8
Regardless of the chosen approach, clinical trial sponsors and clinical investigators must think through how to maximize patient-level clinical data while addressing the privacy-related risks that individual patients face. Protecting the privacy of clinical trial participants remains a primary responsibility.9
Clinical trial evolution
The first double-blind controlled clinical trial was conducted in 1943 and involved more than 1,000 participants across Great Britain.10 Each patient had to go to great lengths to participate (for example, travel long distances), all observations were recorded by hand, and it took 18 months to generate the results.11
Fast forward to today, and what would surprise many is that current clinical trials do not look much different. For example:
- Most trials are designed to require direct patient intervention and as such are conducted on-site at physical locations.
- Many documented processes are printed on paper.
- Most researchers need a lot of time to complete analyses and record results.
- Most patients lack visibility into their trial data as a result of these circumstances.
Clinical trials are moving toward more digitalization, automation, and decentralization. Constant evolution in the clinical trial arena continues to focus on solving efficiency and efficacy challenges. This evolution is positively affecting speed to clinical milestones and related decisions using sometimes novel technologies that enhance patient recruitment and data collection and analytics. However, the use of technology invariably requires an additional focus on protecting patient privacy.
As opportunities arise around technologies that are new to clinical trials, it is natural to turn attention to privacy and the related obligations clinical sponsors and investigators have to protect the privacy of individual data subjects. Traditionally, participants have been informed about the data collection plan and related details via the informed consent document, supporting the ethical principle of respect for individuals. Practically speaking, this mechanism is challenging at best because it involves communicating about an individual’s privacy rights, especially as legislative requirements continue to change around the world.
Informed consent language is prepared by clinical researchers to notify potential and enrolled clinical trial subjects about the clinical study, helping people choose whether to enroll in the study or to continue participating if the contents of the informed consent have changed. The language is designed to provide participants with information about the planned study’s risks and benefits, and the document is reviewed by an institutional or ethical review board.12 Because privacy measures are part of the informed consent document, it is natural to extend this review to the specific privacy-focused content contained within the document.
Although the characteristics of a well-designed, well-written consent form are well known to the research community, the consensus remains that consent forms can be challenging for nonmedical or nonscientific individuals to read. Such forms can be too complex to fully comprehend, and many are simply too long; as a result, the documents fail to facilitate truly informed consent by study participants.13
Impact of big tech and AI
Technology companies big and small continue to look for ways to use the same tools that individual data subjects commonly use. Mobile devices, for example, are a potential platform for data collection capabilities that could be used to streamline the clinical trial process. Apple has created an ecosystem with the iPhone® and Apple Watch® mobile and wearable devices, enabling real-time health information collection. Additionally, Google is creating a research ecosystem through its Google Health Studies application and developing products through a subsidiary, Verily Life Sciences. These digital health technologies enable user-friendly measurements and provide the clinical research community with new tools. However, for the clinical trial process to make gains in the use of these tools, participants must be willing to share sensitive data digitally.
In combination with wearable technology, AI techniques can create value by efficiently monitoring patients, in real time, during a clinical trial. As a side benefit, it also might be possible to enhance compliance and the reliability of assessment endpoints, and even possible to use AI to simplify informed consent language to help increase comprehension.
Patient privacy risks within AI
Although AI helps create more efficient processes and procedures for both end users and clinicians, privacy risks arise when processing a specific data subject's data elements, including, but not limited to, personal characteristics, facial recognition, and other biometric data records. Per standard procedures and privacy-related compliance requirements, data elements should be categorized and assigned to one or more data categories (such as patient, consumer, customer, employee, or clinician) and one or more records of processing activity.
In addition, it is essential that each record of the processing activity is assessed, documented, and recorded in a privacy management platform such as DataGrail, OneTrust, Osano, Securiti, Transcend, TrustArc, and others. Using a platform is imperative for efficiently and effectively capturing specific information, such as asset inventory and processing activity details, along with legal entities and vendors involved in the process. Recording this information is necessary to be able to generate a catalog of personal data used and comply with regulatory standards, such as Article 30 of the General Data Protection Regulation. Without such documentation, privacy programs are subject to risks related to confidentiality, integrity, and availability, along with the potential for regulatory fines and other penalties.
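The cataloging step described above can be sketched in code. This is a minimal, hypothetical illustration of structuring records of processing activity and flattening them into a catalog of personal data in use; the field names and sample values are assumptions for illustration, not a legal template or the schema of any specific privacy management platform.

```python
from dataclasses import dataclass, field, asdict

# Hypothetical Article 30-style record of processing activity (ROPA);
# fields are illustrative, not an authoritative GDPR template.
@dataclass
class ProcessingRecord:
    activity: str            # e.g., "patient recruitment"
    data_categories: list    # e.g., ["patient", "clinician"]
    data_elements: list      # e.g., ["name", "biometric data"]
    legal_entities: list     # sponsors and vendors involved
    assets: list = field(default_factory=list)  # systems holding the data

def build_catalog(records):
    """Flatten records into a catalog of personal data in use."""
    return [asdict(r) for r in records]

records = [
    ProcessingRecord(
        activity="patient recruitment",
        data_categories=["patient"],
        data_elements=["name", "contact details", "medical history"],
        legal_entities=["trial sponsor", "recruitment vendor"],
        assets=["recruitment CRM"],
    )
]
catalog = build_catalog(records)
```

In practice, a privacy management platform would hold these records and generate the regulator-facing report; the sketch only shows the kind of structured information each record needs to capture.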
Additionally, companies that conduct clinical trials using digitalized technologies and advanced analytics face new challenges regarding computing data, managing vulnerabilities, ensuring encryption in transit and at rest, and providing user-side assistance when setting up patient devices for data input. These challenges increase the risk that companies lack the proper documentation for privacy program policies, processes, and risk and data governance.
AI creates broader visibility and opportunities to contribute to the development of medicinal products. Therefore, the responsibility to build and enable AI-powered applications and software solutions for clinical trials is shared among various business units, such as patient support, legal, compliance, public relations, and marketing. Building efficacious and ethical AI requires that organizations adopt ethical practices, including methodical communication regarding the company's approach, needs, use, and ongoing training. If these elements are in place, everyone has a role in supporting privacy programs, especially during clinical trials that use AI, which has altered the traditional approach to clinical trials.14
Solutions and recommendations
Privacy implications affect every organization differently, but privacy and data protection-related challenges in clinical trials are among the most complex issues an organization will tackle. Following are questions organizations can ask to build a more effective program:
- Has an individual been identified as a program leader across the organization?
Organizations need a dedicated individual to manage the privacy program, data inventory, and privacy risk impact assessments. Additionally, each business area requires a committed owner for each record of processing activity. The owners should be employees who possess knowledge about the data assets used (such as software and applications), the data elements collected, who has access to the data, and whether the data is transferred out of the country of collection in compliance with regulation, including the requirements following the Schrems II decision.15
- Has bias been addressed?
The most common challenge when using AI during clinical trials is race and gender bias. When these biases arise, organizations must focus on data management and work closely with engineering teams to ensure the correct data is being processed and used to train predictive models. Organizations should have a dedicated group trained in good data quality and integrity practices that monitors logs to identify and correct errors.
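One simple form of the monitoring described above is checking whether any demographic group is underrepresented in training data before a model is fit. The sketch below is a toy illustration under assumed group labels and an arbitrary 20 percent threshold; it is not a validated fairness metric, and real bias auditing involves far more than representation counts.

```python
from collections import Counter

def group_shares(labels):
    """Return each group's share of the training sample."""
    counts = Counter(labels)
    total = sum(counts.values())
    return {group: n / total for group, n in counts.items()}

def flag_imbalance(labels, min_share=0.2):
    """Flag any group whose share falls below the chosen threshold.

    The 0.2 threshold is an assumption for illustration only.
    """
    return [g for g, s in group_shares(labels).items() if s < min_share]

# Invented sample: 1 of 10 records is labeled "female".
sample = ["female"] + ["male"] * 9
flagged = flag_imbalance(sample)
```

A check like this could run as part of the data quality group's monitoring, with flagged groups routed back to data management before model training proceeds.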
- Has a risk appetite been established?
Organizations running large-scale clinical trials and processing vast amounts of data with AI should identify how much privacy and data protection-related risk they are willing to assume, mindful of regulatory obligations and the essential need to maintain trust with all stakeholders involved in the clinical trial process. Organizations should document their processing activities, risk appetite statement, and risk capacity by estimating the losses they could and are willing to tolerate. Because clinical trials are increasingly powered by AI, it is also necessary to understand the tools, the intentions behind activities and models, and the ability to monitor for biases.
- Are robust data governance and accountability standards in place?
Organizations can help bridge the gap between engineering teams, patients, and AI technology by implementing strong data governance and accountability standards. For example, engineering teams responsible for overseeing AI technology can help identify and mitigate risks using periodic, structured privacy risk assessments. Doing so allows business process owners to flag risks and potential data misuse early and invites input from critical stakeholders, such as managers, AI engineers, quality specialists, and governance specialists. These stakeholders should review the AI algorithm and examine the assets and third parties used for data collection.
- Are employees trained and educated on privacy procedures?
Regardless of an organization's current privacy and data protection program, opportunities to increase effectiveness still exist. Privacy consulting specialists can help organizations put their privacy programs on the right path forward. Some of the services they can offer include:
- Clinical and regulatory privacy program assessments
- Clinical trial data mapping inventory and risk scoring of processing activities
- Privacy-enhancing technology selection, design, and implementation support
- Periodic monitoring of technology and privacy program effectiveness, including:
- Analyzing and assessing a current privacy program
- Designing and implementing a privacy program, using a privacy program platform
- Monitoring and periodically assessing the privacy program
- Resolution of data subject needs beyond those addressed by the clinical trial investigators and their staff, including:
- Creating and implementing a data subject access request module
- Designing opt-in and opt-out consent forms for a website or application
- Implementing a consent management module to collect and document patient consent at any given time
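The consent management module mentioned in the last bullet can be pictured as an append-only log of timestamped opt-in and opt-out decisions, where the most recent decision governs. This is a minimal sketch; the class, method names, and patient identifiers are hypothetical and not tied to any specific consent platform.

```python
from datetime import datetime, timezone

class ConsentLog:
    """Append-only record of patient consent decisions (illustrative)."""

    def __init__(self):
        self._events = []  # audit trail; entries are never mutated

    def record(self, patient_id, purpose, opted_in):
        """Append a timestamped opt-in or opt-out decision."""
        self._events.append({
            "patient_id": patient_id,
            "purpose": purpose,
            "opted_in": opted_in,
            "timestamp": datetime.now(timezone.utc).isoformat(),
        })

    def current_status(self, patient_id, purpose):
        """Latest recorded decision wins; None if never recorded."""
        for event in reversed(self._events):
            if (event["patient_id"] == patient_id
                    and event["purpose"] == purpose):
                return event["opted_in"]
        return None

log = ConsentLog()
log.record("P-001", "wearable data collection", True)
log.record("P-001", "wearable data collection", False)  # later opt-out
```

Keeping every decision, rather than overwriting the latest one, preserves the audit trail needed to document patient consent at any given point in time.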
Apple, Apple Watch, and iPhone are trademarks of Apple Inc., registered in the U.S. and other countries and regions.