From The Editor | August 16, 2024

Early Planning Trumps eCOA Bells & Whistles


By Dan Schell, Chief Editor, Clinical Leader

Julie Dietrich

If there's one statement that best summarizes my eCOA discussion with Julie Dietrich, it’s “I can't overemphasize the criticality of early strategic planning and good design of the system itself.” Dietrich is no stranger to incorporating eCOA into trial designs; she has 25 years of experience in the biopharmaceutical industry and first used the technology back in 2002 with some pain trials for cancer patients. Since then, she’s helped incorporate eCOA into multiple trials across multiple therapeutic areas (TAs) at the four sponsor companies where she has held senior ClinOps and clinical development roles.

During our conversation, she gave me examples of trials where the eCOA implementation went smoothly and one where there were some real hiccups — but we'll get to those a little bit later. First, let’s talk about all that planning she’s advocating.

MULTIPLE PLANS = DETAILED DOCUMENTATION

Dietrich is great at listing the key questions you should be asking when considering an eCOA, and when you should be asking them. Take the monitoring plan, for example.

  • Are you training patients how to use the technology appropriately?
  • Are patients using it? Do you have good adherence?
  • What type of data should the CRA be looking at in the eCOA platform?
  • What actions should be taken by site staff based on certain reported data?

As in my recent conversation with Robert Goldman, Dietrich talked about the importance of defining what a healthcare provider (e.g., the PI) will be required to do when reviewing eCOA data. Should they consider not only the eCOA scores but also the change in scores over time? And what about CRAs? Should they be reviewing the actual data or just compliance metrics?

“In your study protocol, you must describe how the data will be collected,” Dietrich says. “You don't have to go into a lot of detail about the specific device that will be used or if it's provisioned or the patient’s own device. But you do need to generally describe if the eCOA will be paper-based or electronic.”

Other forms of documentation, such as the global product strategic plan, evidence generation plan, or global development plan, also should discuss how you are going to use the data. Say you’re seeking a label claim. Is there already a qualified instrument? Do you want to modify an existing instrument or generate a new one? Either option will require you to identify and define all the psychometric performance elements of that instrument, which can be costly and take time you likely don't have to spare. “This is why you want to start early with your eCOA planning,” Dietrich says. “You might need to make modifications, and you don't want to wait until your pivotal trial to test everything.”

A FOCUS GROUP COULD GUIDE YOUR TECH

During our interview, Dietrich talked a lot about the importance of pilot testing and using focus groups, and how the FDA encourages both in its COA-related guidance documents, which will eventually replace the 2009 guidance, “Patient-Reported Outcome Measures: Use in Medical Product Development to Support Labeling Claims.” I took a look at the second guidance document in that four-part series, Patient-Focused Drug Development: Methods to Identify What Is Important to Patients (Feb. 2022). I found this section especially interesting, not only because it gives an example of pilot testing, but also because it gets into the minutiae of the type of training the agency expects when implementing eCOA systems.

Interview conduct
  • Pilot testing length of interview and appropriateness of length for administration method.
  • Selecting and training interviewers to perform interviews (e.g., considering expertise in performing interviews and other factors based on the characteristics of the disease or condition and target population under study).
      – The emotional burden for the respondent (potential for heightened emotions, including anxieties and discomfort among patients and caregivers), as well as the emotional burden for the interviewer (potential for emotional distress associated with hearing about difficult patient and caregiver experiences), may affect responses.

Pilot testing really came in handy for Dietrich during a trial that she was involved with for a disorder that caused patients to commonly experience a certain symptom. The plan was for patients to use a handheld device to record the severity of that symptom daily for one year. This was a global study in approximately 10 countries. “There was a lot of skepticism that patients would be willing to enter this information every day,” Dietrich says. “At the earliest stage of the trial’s planning, we were thinking about the types of devices we wanted to use given that patients had to interact with them each day. We had to consider what percentage of our patient population even had smart devices or if it would be better to provision devices. We had to think about screen size and dimensions. You really have to ask yourself all these types of technology-related questions in parallel with your data questions, because there's pros and cons to each option, and every TA may need a different option.”

They decided to conduct a mini focus group where they explained the devices and technology, but also emphasized to the patients why it was important to enter the data in this manner and frequency. This helped formalize the trial design, and in the end, they were able to achieve over 80% compliance, “which was really good,” Dietrich says.

EARLY PROBLEMS

Back in the early 2000s, Dietrich says eCOA and its associated technologies were garnering a lot of attention in the industry, but there were some growing pains, especially related to the technology. “I remember there were some sites that were not setting up their devices properly,” Dietrich says. It was a multi-pronged problem. When the devices were shipped to a site, they needed to be provisioned for patient use. That included ensuring each one had the latest software downloaded. The problem was they never knew when the software was going to get an update pushed out. “Back then, I think some sites weren’t tech savvy enough to provision in the field and remember to update the software,” Dietrich says. “It led to data transmission problems. Of course, you could always transfer the data manually when a patient brought the device into a site, but that wasn’t convenient for the patient and didn’t give you the immediacy of data transfer for which the system was designed. Most of these types of issues have been resolved, though, with modern eCOA applications.”