The move from paper to Electronic Data Capture (EDC) in the clinical trial world changed the way we look at quality measurements for clinical data management (CDM) activities. In the paper world there was a clear understanding that the quality of the clinical data collected was simply the quality of the transcription work teams performed when transferring data from paper to a database. The Quality Control (QC) of paper versus database had a set sampling standard of √N+1 or 20 subjects, whichever was smaller, together with 100% QC of critical variables. An acceptable error rate of 0.5% was widely agreed across the industry. These thresholds were no longer necessary once EDC enabled sites to enter the data directly and transcription was no longer needed. However, data management teams remain involved in many efforts to prepare data for appropriate analysis and submission.
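The paper-era sampling rule and error threshold described above can be sketched in a few lines of Python. This is an illustrative sketch only; the function names are assumptions, and real QC plans defined the details in study-specific SOPs.

```python
import math

# Paper-era QC rule: sample sqrt(N) + 1 subjects, or 20 subjects,
# whichever is smaller, and compare the observed transcription error
# rate against the 0.5% industry threshold.
QC_ERROR_THRESHOLD = 0.005  # 0.5% acceptable error rate

def qc_sample_size(n_subjects: int) -> int:
    """Subjects to sample: sqrt(N) + 1 or 20, whichever is smaller."""
    return min(math.ceil(math.sqrt(n_subjects)) + 1, 20)

def passes_qc(errors_found: int, fields_checked: int) -> bool:
    """True when the observed error rate is within the 0.5% threshold."""
    return (errors_found / fields_checked) <= QC_ERROR_THRESHOLD

# A 400-subject study: sqrt(400) + 1 = 21, capped at 20 subjects.
print(qc_sample_size(400))   # 20
# A 100-subject study: sqrt(100) + 1 = 11 subjects.
print(qc_sample_size(100))   # 11
# 4 errors in 1000 fields checked is a 0.4% rate, within threshold.
print(passes_qc(4, 1000))    # True
```

The cap at 20 subjects is what made the rule practical for large studies: beyond roughly 400 subjects, the sample no longer grew with the study.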
The quality of the efforts that go into developing data collection tools and cleaning the data can directly impact the quality of the data collected. It is therefore important for organizations to manage the quality of the workstreams their teams are involved in, especially as increasing streams of data are being collected from sources such as eSource, ePRO/eCOA, EMR/EHR, wearables, mHealth, and AI-based adherence tracking tools. The traditional notion of an error rate is no longer an ideal way to manage quality expectations; instead, quality must be nurtured as a habit or culture within the teams handling data. Teams must also take a qualitative approach to measuring quality rather than relying on a quantitative sample QC of the effort. Below are four areas of treatment which should help to instill a quality culture:
Effective Review of Data Collection Tool (DCT) Design Specifications
Clinical trials are, at heart, an expensive method of collecting data. If we do not design the tool to collect the data properly, we create a gap that cannot be filled; the gaps pile up with fixes, and teams end up taking on additional effort to ensure data quality. Specifications are routinely reviewed, but how effectively are we assessing the appropriateness of the design from the site's point of view for EDC, and from the patient's point of view for ePRO? With the advent of policies such as the 21st Century Cures Act in the US, patient engagement is highly regarded because it helps data quality. We should therefore be looking at more patient-centric data collection specifications that motivate sites and patients to provide accurate answers to the questions asked in the respective Case Report Forms (CRFs). For example, a patient suffering from muscular dystrophy may be more interested in assessing how well they can do their daily chores or play with their grandchildren than in reporting a six-minute walk test every day.
Reducing manual intervention in data collection is considered to be the future, and solutions that enable EHR/EMR integration play an important role. Using medical-grade devices to collect data directly from patients via wearables and mHealth tools helps calibrated data flow into integrated EDC databases with minimal or no intervention. AI-based tools can collect medication adherence data without human involvement. In addition, integrating eCOAs, central lab APIs, medical coding, imaging, and safety data workflows with EDCs supports centralized data collection with minimal manual intervention in data transfer from varied sources. Using EDC solutions with associated tools such as eConsent, eCOA/ePRO, imaging, and a safety gateway within the same architecture also saves time and effort in setting up and monitoring integrations. Overall, ensuring that the end-to-end data flow has minimal manual intervention creates opportunities for better quality data.
Automating the steps that convert collected data to standards enhances quality as well as efficiency. The process runs from developing CDISC-compliant eCRFs to implementing standard mapping algorithms earlier in the project lifecycle than usual, so that SDTM requirements during the conduct of the study are addressed seamlessly and with improved quality. This streamlines the downstream statistical programming requirements and makes them more efficient, accurate, and consistent across multiple data releases within the same study, or across a program or portfolio of studies.
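As a minimal sketch of the kind of standard mapping step described above, the snippet below converts a raw EDC vital-signs record into core SDTM VS-domain variables. The raw field names and the `map_vitals_to_vs` function are illustrative assumptions; real study mappings are defined in study-specific specifications and are far more extensive.

```python
# Hypothetical raw EDC record for one systolic blood pressure reading.
RAW_RECORD = {
    "study": "ABC-101",       # illustrative study identifier
    "subject": "001-0042",    # illustrative site-subject number
    "visit": "WEEK 4",
    "sysbp": 128,
    "sysbp_unit": "mmHg",
}

def map_vitals_to_vs(raw: dict) -> dict:
    """Map a raw systolic BP record to core SDTM VS-domain variables."""
    return {
        "STUDYID": raw["study"],
        "DOMAIN": "VS",
        "USUBJID": f'{raw["study"]}-{raw["subject"]}',
        "VSTESTCD": "SYSBP",                   # CDISC controlled terminology
        "VSTEST": "Systolic Blood Pressure",
        "VSORRES": str(raw["sysbp"]),          # result as originally collected
        "VSORRESU": raw["sysbp_unit"],
        "VISIT": raw["visit"],
    }

print(map_vitals_to_vs(RAW_RECORD)["USUBJID"])  # ABC-101-001-0042
```

Implementing such mappings early in the study lifecycle means the same transformation logic can be reused, and validated once, for every interim and final data release.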
Training & Knowledge Sharing
We all know that less human intervention can bring more quality, as it reduces the chance of errors; however, planning the automation and integration to support the goals that have been set is equally important. When setting up these systems, organizations must ensure the people involved have a broader and deeper understanding of the end-to-end process flow. Generic and study-level trainings have become just an onboarding routine. Developing comprehensive understanding through effective training is key to making teams deliver 'first time quality'. Training should focus on effective study set-up, conceived from a blend of technical and clinical knowledge. An effective strategy for measuring the success of training and on-the-job mentoring could take us a long way towards ensuring the quality of data collection. Organizations should also encourage knowledge-sharing platforms within their infrastructure, enabling teams to create various communities of learning.
Quanticate’s Clinical Data Management team are dedicated to ensuring high-quality clinical data and have a wealth of experience in data capture, processing and collection tools. Our team offer flexible and customized solutions across various EDC platforms. If you would like more information on how we can assist your clinical trial, submit an RFI.