Author: Muhammad Bilal, Vice President of Data Sciences at Medrio
In our fast-paced world, there is a lot of talk about accelerating clinical trials and dealing with missed timelines, delayed milestones, and flawed data – concerns shared by all study stakeholders. Many recurring and often costly issues could be avoided by paying more attention upfront to effective database design.
In many ways, the quality of the database design affects not only study timelines and data quality but also the outcome, and possibly even the ultimate relevance, of a clinical trial. While database design is often pushed aside in the rush to move a trial along as quickly as possible, it remains a critical building block that deserves attention from the start.
Naturally, sponsors want to start clinical studies quickly, schedule the first subject visit as soon as possible, and have the database ready at the earliest date. A database that is up and running early helps ensure the study runs smoothly from the outset and starts on the desired date. Effective database design is paramount here and contributes dramatically to saving sponsors’ time and financial resources.
Clinical trial data management quality
To a large extent, accelerated trial timelines stand or fall with the quality of clinical trial data management. It is essential to remember that, done well, clinical trial data management is much more than simply collecting data electronically, cleaning it to match the source document, and sending it off to the next department.
Outstanding clinical trial data management means keeping track of why things are being done the way they are. It means taking a step back to look at the larger picture and understanding the integral part effective database design plays in that larger process.
Starting on the right foot
Effective clinical trial data management starts with effective database design. Before the design process begins, a thorough understanding of the study protocol is paramount: the entire study team must have their questions answered and understand the purpose of the study. The team must also consult with all other departments to ensure everyone is on the same page and the system captures the necessary data.
The database should be designed to make the job easier for the next contributor. Who will work with the database next? How will the database design influence other contributors? Issues and delays arise if departments work in isolation, failing to anticipate the reality of the next steps.
“Easy does it”
When it comes to a database used during a clinical trial, site staff and data entry staff should find it easy to collect and enter data. The database should be as user-friendly as possible – for example, with all related data on the same page. Built-in edit checks should catch potential data entry errors at the point of entry, accelerating trial timelines and improving data quality.
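To make the idea of a built-in edit check concrete, here is a minimal sketch in Python. The field name, units, and range are hypothetical illustrations, not values from any particular EDC system; real systems configure such checks declaratively rather than in code.

```python
# Hypothetical edit check: flag a vital-sign entry outside a plausible
# range at the moment of data entry, before it reaches the database.

def check_systolic_bp(value_mmhg):
    """Return a query message if the value looks implausible, else None."""
    if not 60 <= value_mmhg <= 250:
        return (f"Query: systolic BP {value_mmhg} mmHg is outside the "
                f"expected range of 60-250 mmHg; please verify the source.")
    return None  # value accepted, no query raised

print(check_systolic_bp(120))  # plausible value, no query
print(check_systolic_bp(700))  # likely a typo or unit error, query raised
```

Firing the check at entry time, rather than during later data review, is what turns it into a timeline saver: the site resolves the discrepancy while the source document is still at hand.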
New technologies for increased accuracy and speed
Over the last few years, eSource has moved into the spotlight and is fast becoming one of the most valuable tools for data collection and for accelerating trial timelines, because it skips the data transcription step entirely. It is a significant development: it improves data quality and accuracy and speeds up data collection, while in-app logic checks reduce errors. Keep in mind, though, that eSource is only as good as the underlying database design. If the design behind the scenes is flawed, even the most up-to-date eSource technology will produce poor results.
Keeping an eye on the goal
Data collection and entry are followed by data review, which should again start by checking whether the study purpose is being achieved. What are the data trends? What is the meaning behind the data collected, and what is its quality? To prevent downstream delays, any noted irregularities should be addressed as early as possible – at the database design level – before significant amounts of data are collected or entered incorrectly. Query management should not wait until, say, 25–30% of the data has been collected; it should start immediately, in close collaboration with sites.
Focus on SDTM
Another significant contributor is the SDTM programmer, who prepares clinical trial data for FDA submission. An effective database design should produce SDTM-friendly raw data: under FDA standards, all raw data, no matter which Electronic Data Capture (EDC) system was used, must be mapped to the Study Data Tabulation Model (SDTM).
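A simple sketch of what that mapping involves: renaming raw EDC fields to their SDTM counterparts. The raw field names below are hypothetical; the target names (USUBJID, SEX, BRTHDTC) are standard variables from the SDTM Demographics (DM) domain.

```python
# Illustrative mapping of hypothetical raw EDC demographics fields to
# SDTM DM-domain variable names. A raw layout that already mirrors SDTM
# makes this step nearly mechanical; a poor layout makes it costly.

RAW_TO_SDTM_DM = {
    "subject_id": "USUBJID",   # unique subject identifier
    "birth_date": "BRTHDTC",   # date of birth, ISO 8601
    "sex": "SEX",              # sex of the subject
}

def map_to_dm(raw_record):
    """Return the record keyed by SDTM DM variable names."""
    return {RAW_TO_SDTM_DM[k]: v
            for k, v in raw_record.items()
            if k in RAW_TO_SDTM_DM}

print(map_to_dm({"subject_id": "STUDY1-001",
                 "sex": "F",
                 "birth_date": "1980-05-14"}))
```

The closer the raw field names and structures track SDTM from the start, the less hand-crafted mapping logic the SDTM programmer has to write and validate later.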
Problems with database design should not be ignored and simply passed on to the SDTM programmer. All data should be captured in a way that ultimately saves the SDTM programmer time. Once again, accelerated trial timelines start with effective database design. One example is setting up the case report form in grid format. This small change, made during the database design process, allows data to be exported in normalized form, adding substantial value to the outcome.
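To illustrate why a grid-format case report form helps, here is a hedged sketch. The field names and lab tests are invented for illustration: a CRF laid out as a grid, with one row per test, flattens naturally into normalized, one-record-per-measurement data, which is the shape SDTM findings domains such as LB expect.

```python
# Hypothetical grid-format CRF page: one grid row per lab test.
grid_crf = {
    "subject": "001",
    "visit": "Week 4",
    "labs": [
        {"test": "ALT", "result": 32, "unit": "U/L"},
        {"test": "AST", "result": 28, "unit": "U/L"},
    ],
}

def normalize(crf):
    """Flatten the grid into one record per measurement."""
    return [
        {"subject": crf["subject"], "visit": crf["visit"], **row}
        for row in crf["labs"]
    ]

for record in normalize(grid_crf):
    print(record)
```

Had the form instead been built with one wide row of fields like ALT_result, AST_result, and so on, the programmer would have to un-pivot every such page by hand; the grid layout makes the normalized export fall out of the design for free.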
Trial Speed and Data Quality: It All Starts with Effective Database Design
Accelerated trial timelines rely in large part on effective database design. The clinical trial data management group and EDC programmers should take pride in the fact that effective database design is one of the core building blocks of a successful clinical study.
A CRO’s goal of promptly handing over a quality clinical study to the sponsor – from data capture in the clinic to delivery of the final data package – rests heavily on the Data Management team’s ability to achieve effective database design.