Some Truths You Can't Avoid About eCOA
By Dan Schell, Chief Editor, Clinical Leader
I clearly struck a nerve when I asked Robert Goldman about challenges associated with eCOA. About 18 minutes later he said, “I’ll stop there to let you take all of that in.” And trust me, there was a lot to take in.
Goldman, who recently accepted the position of head of clinical operations at biotech Contraline, has worked for small biotechs and large CROs over the past 12 years. I first met him last December when he was a panelist on my Clinical Leader Live titled “Why Isn’t More Tech Being Used In Clinical Trials,” and he’s a frequent guest on Dan Sfera’s The Clinical Trials Guru podcast.
To say that Goldman was passionate about this subject would be an understatement. I should also say that despite some of the issues he has with eCOA and ePRO, he’s not opposed to using these technologies; he recognizes their positive potential. It’s just that he’s experienced situations where their use has led to more problems than solutions. Below, I’m going to try to summarize some of the key points he made.
THE SOURCE IS THE SOURCE
According to Goldman, a lot of people tend to forget that the data from ePROs and eCOAs is considered source. “Once an answer is put into the ePRO, that data cannot be changed — period. It's what the patient reported. Even if the patient calls and says, ‘I made a mistake,’ — tough bananas, you can’t change it.” Because eCOA data accumulates quickly — some data points can be collected multiple times per day — a problem with the tech or the process can leave you facing the difficult decision of scrapping data and starting over. Goldman lived through exactly that scenario years ago, when a design flaw in the ePRO tech for a trial he was working on was found to be causing participants to inadvertently enter quantities far higher than a safe dose of a medication. For example, if a patient wanted to enter that they took 2 tablets in a day, the system reported they took 20. If they took 3, it reported 30, and so on. While this example could be viewed as a one-off tech issue that likely would not happen again, it leads to another of Goldman’s messages.
YOUR MONITORING PLAN COULD BE A PROBLEM
“If the CRA is not going into the portal to review ePRO data, or the site is not going in to review the data during an onsite or telephone visit, source is not being reviewed for accuracy, consistency, and safety. Those are important aspects of what and why we monitor to begin with,” Goldman says. He adds that many of today’s clinical trial technologies can monitor efficacy, safety, and data compliance signals in real time, but you still need to train your staff on where this information resides and how to act on it. Does the data the patient reports during a site visit square with the ePRO and eCOA data? Is the PI reviewing the portal’s data and comparing it to what the patient is telling them (e.g., in a pain study) during a site visit? And if there's a discrepancy between those two data points, is the PI taking the extra time to talk it through with the patient and find out why?
“With eCOAs, you also have a problem when a patient is seen by multiple physicians over a series of site visits,” Goldman says. “They all could be making different assessments. The eCOA should require each user to sign in, and if it’s not the right provider for a patient, they can’t complete the visit. Yes, that’s inconvenient for the patient, but you could argue the patient shouldn't have been scheduled unless their provider was going to be there.”
IT’S NOT TECHNOLOGY, IT’S BETTER TRAINING
We all know this one truth about technology implementations: All the bells and whistles will never save you that promised time and money if the users of that application aren't trained well or using it correctly. “If your system can send emails, texts, or phone calls to signal you something is wrong or needs attention with a trial or patient, but your staff is ignoring those alerts, then what good is the tech?” Goldman asks. “It’s not more automation, it’s not more AI; it’s better training your people on how to use and react to tech, and it’s better operational execution of the whole process. Unfortunately, this is not a new problem.”
Honestly, I’m only scratching the surface of my conversation with Goldman. He told me stories of ePROs that had coercive questions. He talked about his early days as a CRA, visiting PIs and telling them that their opinions and verbal interactions with a patient didn’t trump what the patient had been reporting via the ePRO. He recalled seeing patients sitting in their cars in a site’s parking lot, filling out their paper PROs for the month. And again, he’s not opposed to these technologies. In fact, he is designing a study right now that will include an ePRO. “The ePRO can actually give you better contemporaneous resolution of outstanding data that you otherwise would not connect in the context of an absence of a telephone visit or an on-site visit,” he concludes.