Into The Niche: Data Management And Its Importance In Clinical Trials

Clinical trials have become quite prominent lately. Thanks to Covid-19, the public has become much more aware of pharmaceutical research and the work that must be done before a medication or a vaccine can actually reach the market and bring small or large miracles to the patients who so desperately need them.

Data Management plays a significant role within those trials, and outcomes are measured by data analysis. Accurate, integrated, and reliable high-quality data is therefore the basis on which it is decided whether a compound is safe and effective enough to become something you can purchase in the pharmacy or supermarket, be given to you during your next hospitalization, or even be injected into your body as a vaccine.

In-Depth Details about Clinical Data Management:

Clinical Data Management starts early in a trial. Medical experts are responsible for writing a study protocol… you can think of this as the specification book on which a programmer would commence his or her work once received from the client. Of course, a study protocol is a bit trickier and more exacting, as it guides the entire conduct of a trial and contains a great deal of medical information, lab parameters, and other explanations of examinations, results, safety measures, and so on. It is a mighty document of several hundred pages, and it helps to have a bit of a medical or biological background. Once this protocol is final, meaning approved by several medical advisory boards, ethics committees, and regulatory authorities, the Clinical Data Managers start to translate the requirements of the trial into a so-called Case Report Form (CRF). This was done on paper back in the day; nowadays, it is compiled into an electronic CRF. This eCRF builds the user interface, and the trial data is entered into it directly at the sites. In recent years, many standard programs have come to market that support this data capture and build a database from the programmed CRFs. Medidata Rave and Oracle Clinical (or InForm) are just a few of the standard solutions used most widely throughout the industry.

The eCRF is built from the very beginning with a lot of checks and security warnings to protect the subjects during a trial as much as possible. Queries are programmed to pop up in case data points do not match predefined ranges, data points are missing, or data points do not correspond to a predefined condition derived from other data points entered. Those queries need to be answered by the study personnel on site; the answers are then checked by the Data Management team and either closed or reissued if an answer does not seem logically sufficient. Of course, a stable audit trail must also be implemented.
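The kinds of checks described above can be sketched in a few lines. This is a minimal illustration, not the logic of any real eCRF system; the field names and ranges are invented for the example.

```python
# Minimal sketch of eCRF edit checks: each check inspects an entered
# record and produces a query text for the site to answer. Field names
# and plausible ranges are illustrative, not from any real protocol.

def run_edit_checks(record):
    """Return a list of query texts for one eCRF record (a dict)."""
    queries = []

    # Range check: flag values outside a predefined plausible range.
    sbp = record.get("systolic_bp")
    if sbp is not None and not (60 <= sbp <= 250):
        queries.append(f"Systolic BP {sbp} mmHg outside expected range 60-250.")

    # Missing-data check: required fields must be entered.
    if record.get("visit_date") is None:
        queries.append("Visit date is missing.")

    # Cross-field (conditional) check: if an adverse event is reported,
    # its start date must also be present.
    if record.get("ae_reported") and record.get("ae_start_date") is None:
        queries.append("Adverse event reported but start date is missing.")

    return queries

record = {"systolic_bp": 300, "visit_date": None,
          "ae_reported": True, "ae_start_date": None}
for q in run_edit_checks(record):
    print(q)
```

In a real system these rules are configured in the EDC platform rather than hand-coded, but the structure is the same: range checks, missing-data checks, and conditional cross-field checks, each yielding a query that enters the cleaning workflow.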

Clinical Data Management Guidelines:

CDM must follow guidelines and standards, and it is strongly regulated. The “bible” of Clinical Data Management is the International Council for Harmonisation's Good Clinical Practice (ICH-GCP) guideline, the international standard on ethics and the protection of the rights of those who participate in clinical trials. Additionally, the Society for Clinical Data Management (SCDM) has published the Good Clinical Data Management Practices (GCDMP), a document that provides best practices and minimum standards to be implemented.

Another important consortium significantly supporting the standardization of clinical trial data is the Clinical Data Interchange Standards Consortium (CDISC). CDISC standards have been developed to support the exchange, submission, and archival of clinical study data and the corresponding metadata of the data points. The two most important standards to be named here are the Study Data Tabulation Model (SDTM), and the Clinical Data Acquisition Standards Harmonization (CDASH).
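To make the relationship between the two standards concrete, here is a hedged sketch of what mapping collected CDASH-style fields into an SDTM Demographics (DM) record might look like. The mapping table is deliberately tiny and simplified; real SDTM mapping follows the SDTM Implementation Guide in full detail.

```python
# Simplified illustration: CDASH names what is collected on the CRF;
# SDTM names how it is tabulated for submission. This toy mapping
# covers only a handful of DM variables.

CDASH_TO_SDTM_DM = {
    "SUBJID": "SUBJID",    # subject identifier within the study
    "BRTHDAT": "BRTHDTC",  # birth date -> ISO 8601 character date
    "SEX": "SEX",
    "COUNTRY": "COUNTRY",
}

def map_to_dm(collected, studyid):
    """Build a minimal SDTM DM record from collected CDASH-style fields."""
    dm = {"STUDYID": studyid, "DOMAIN": "DM"}
    for cdash_name, sdtm_name in CDASH_TO_SDTM_DM.items():
        if cdash_name in collected:
            dm[sdtm_name] = collected[cdash_name]
    # USUBJID uniquely identifies the subject across the whole submission.
    dm["USUBJID"] = f"{studyid}-{collected['SUBJID']}"
    return dm

row = map_to_dm({"SUBJID": "0101", "BRTHDAT": "1980-04-02", "SEX": "F"},
                studyid="ABC-123")
print(row["USUBJID"])  # "ABC-123-0101"
```

The point of the standards is exactly this predictability: a reviewer at a regulatory authority knows in advance which variables a DM dataset will contain and what they mean, regardless of which sponsor submitted it.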

Clinical Data Management that is compliant with all the guidelines and standards ensures the data security of a trial, including data confidentiality, availability, and integrity. This is fundamental for generating reliable and high-quality data in order to reduce the time from research to market, and to secure the safety of the trial subjects or volunteers.

The data entered in the eCRF is no longer the only important data source for a trial. External data, and even real-world data, such as data sent from Holter ECG machines in hospitals, external laboratories, smart watches, or other devices, also needs to be integrated into the study data and must meet the same standards and guidelines. The integration of this external data, along with the corresponding programming and reconciliation, is also done by Clinical Data Management.
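Reconciliation of such external data can be sketched as a comparison of the vendor's transfer against the visits recorded in the eCRF, with every mismatch reported for follow-up. The field names here are invented for illustration.

```python
# Sketch of external-data reconciliation: compare a lab vendor's
# transfer file against the visits recorded in the eCRF and collect
# discrepancy messages for the Data Management team to resolve.

def reconcile_labs(ecrf_visits, lab_transfer):
    """ecrf_visits: {(subject, visit): visit_date}
       lab_transfer: list of dicts with subject, visit, collection_date.
       Returns a list of discrepancy messages."""
    issues = []
    seen = set()
    for rec in lab_transfer:
        key = (rec["subject"], rec["visit"])
        seen.add(key)
        if key not in ecrf_visits:
            issues.append(f"Lab result for {key} has no matching eCRF visit.")
        elif ecrf_visits[key] != rec["collection_date"]:
            issues.append(f"Date mismatch for {key}: eCRF {ecrf_visits[key]} "
                          f"vs lab {rec['collection_date']}.")
    # Visits documented in the eCRF but absent from the transfer.
    for key in ecrf_visits.keys() - seen:
        issues.append(f"eCRF visit {key} has no lab result in the transfer.")
    return issues

ecrf = {("0101", "V1"): "2024-03-01", ("0101", "V2"): "2024-04-01"}
labs = [{"subject": "0101", "visit": "V1", "collection_date": "2024-03-02"},
        {"subject": "0102", "visit": "V1", "collection_date": "2024-03-01"}]
for issue in reconcile_labs(ecrf, labs):
    print(issue)
```

Each discrepancy then becomes a query or a correction request to either the site or the vendor, until both sources agree.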

So, with all these different tasks and systems, Clinical Data Management works from the setup of a trial to ensure reliable and clean data, so that at the end of a trial the completeness, consistency, and plausibility of the study data are ensured and the data can be handed over to the biometrics team for proper analysis.

Once a study is closed and the data is declared clean and transferred for analysis, it is again the Data Manager who needs to close the database, revoke access, and ensure that the data is unchangeable from that point on.

To summarize, here is a list of typical Data Management tasks performed in clinical research during a trial:

• Design and development of case report forms (electronic or paper-based)

• Development of the study database

• Programming of plausibility and consistency checks

• Creation of the data management plan and the data entry manual

• User training and support (including user administration)

• Randomization (in cooperation with biometrics)

• Handling of protocol deviations 

• Validation of data entry and query management (data cleaning)

• Creating and programming study-specific status reports and overviews (Clinical Analytics)

• Medical Coding of Adverse Events, Diagnoses, and Drugs with the latest coding dictionaries

• Data preparation for statistical evaluation (such as DSMB, statistical analyses)

• Performing a database-lock, mapping the data to the SDTM standard, transferring the data, and preparing the final archiving

• Development of SOPs and guidelines

All of this must be delivered to a high standard and in a very time-sensitive manner, because this is all about medicine, and patients cannot wait.
