At the July 25th PhUSE Webinar Wednesday, Ethan Chen (Director, Division of Data Management Services and Solutions, Office of Business Informatics, FDA CDER) gave us a glimpse of the FDA's view on Technical Rejection Criteria for Study Data. Those who were at the US Connect conference earlier in the year may have seen this material already.
He shared a lot of useful information, but in particular he focused on some of the initial checks, which are shown below:
The Demographics dataset (DM for SDTM submissions), define.xml, the Subject-Level Analysis Dataset (ADSL for ADaM submissions) and the Trial Summary (TS) dataset must all be present. "Easy!" you might think, and yet the error rate for High errors 1736 and 1734 noted above is 32% across all regulatory submission types!
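The presence check itself is simple to automate before a submission ever leaves your hands. Below is a minimal sketch, assuming a flat submission folder and the conventional lowercase transport filenames (dm.xpt, adsl.xpt, ts.xpt, define.xml) — the folder layout and exact filenames are assumptions for illustration, not an FDA specification.

```python
from pathlib import Path

# Required submission artefacts discussed above; names are an illustrative
# assumption (lowercase SAS transport files plus define.xml).
REQUIRED_FILES = ["dm.xpt", "adsl.xpt", "ts.xpt", "define.xml"]

def missing_required_files(submission_dir):
    """Return the required submission files not found in submission_dir."""
    present = {p.name.lower() for p in Path(submission_dir).iterdir()}
    return [f for f in REQUIRED_FILES if f not in present]
```

Running this as a pre-submission gate catches the 1734/1736-style omissions long before any reviewer sees the package.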
He then talked through the criteria in more detail, and some checks may be more complicated than you might first think. Some highlights are below:
The items noted above are not difficult to correct, and ultimately they allow the FDA to work out which standards applied to your study at the time of regulatory submission. Simple issues like these can delay submissions, delaying approval and therefore potentially delaying life-changing medications reaching their target patients. He did add that the FDA has not (yet) rejected any submission containing these errors, but they do plan to use the technical rejection criteria to identify applications that are not fulfilling these requirements.
So what can we do?
ALWAYS keep in mind the names of the files we create for a regulatory submission. Keep filenames lowercase (where possible, unless specifically requested otherwise) and ensure they follow the correct format/convention - filenames are an important part of QC, particularly for the define.xml package and its associated files.
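A filename convention check is also easy to script. The sketch below flags files whose names are not lowercase or that stray outside a conservative letters/digits/hyphen/dot character set; the exact allowed pattern is an assumption for illustration, not an agency rule, so adjust it to whatever convention your submission actually requires.

```python
import re
from pathlib import Path

# Illustrative convention: lowercase letters/digits, optional hyphens,
# then one or more dot-separated lowercase extensions (e.g. "define.xml").
# This pattern is an assumption, not an FDA or eCTD specification.
VALID_NAME = re.compile(r"^[a-z0-9][a-z0-9\-]*\.[a-z0-9.]+$")

def nonconforming_filenames(folder):
    """Return filenames in folder that do not match the naming convention."""
    return sorted(p.name for p in Path(folder).iterdir()
                  if p.is_file() and not VALID_NAME.match(p.name))
```

Folding a check like this into your QC checklist makes filename problems a programmatic finding rather than something a reviewer has to spot by eye.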
For define.xml (and stf.xml), and even the STUDYID columns in your datasets, ensure your study metadata are correct - they should be fully checked before submission. Just because some study-level information wasn't shown by default in older stylesheet views of the define.xml doesn't mean that information is any less important than the rest of the contents.
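One concrete metadata check is confirming that every dataset carries a single, consistent STUDYID matching the study identifier declared in your define.xml. A minimal sketch follows; the row-dict data structure is a hypothetical stand-in for however you read your datasets, and `expected_studyid` is assumed to come from your define.xml metadata.

```python
# Minimal sketch: confirm one consistent STUDYID across all datasets.
# The datasets argument is a hypothetical structure for illustration:
# a mapping of dataset name -> list of row dicts containing "STUDYID".
def check_studyid_consistency(expected_studyid, datasets):
    """Return (dataset_name, unexpected_values) pairs for any mismatches."""
    problems = []
    for name, rows in datasets.items():
        unexpected = {r.get("STUDYID") for r in rows} - {expected_studyid}
        if unexpected:
            problems.append((name, sorted(unexpected)))
    return problems
```

An empty result means every record agrees with the declared study identifier; anything else points you straight at the offending dataset.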
For the SDTM TS domain - do not underestimate the content. It can often be a time-consuming dataset to research and get right, but getting it right is a requirement, not just a "nice to have", so the time spent on it is justified.
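To make this concrete, one of the substantive TS checks in the technical rejection criteria is the Study Start Date, recorded in TS as the parameter TSPARMCD = "SSTDTC" with a non-missing TSVAL. The sketch below tests just that one parameter; the row-dict structure is an assumption for illustration, and a real TS review covers far more parameters than this.

```python
# Sketch of a single TS content check: is a non-missing Study Start Date
# (TSPARMCD = "SSTDTC") present? ts_rows is a hypothetical list of row
# dicts with "TSPARMCD" and "TSVAL" keys, one per TS record.
def has_study_start_date(ts_rows):
    """Return True if TS contains SSTDTC with a non-blank TSVAL."""
    return any(
        r.get("TSPARMCD") == "SSTDTC" and str(r.get("TSVAL") or "").strip()
        for r in ts_rows
    )
```

Checks like this are deliberately narrow: each TS parameter has its own content rules, so the research effort the paragraph above describes cannot be replaced by automation, only supported by it.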
To avoid validation errors, always understand the requirements, guidance and recommendations for submitting study data in the Study Data Technical Conformance Guide. Note that this guide does change over time, so we should always strive to check for and use the latest version where possible.
Quanticate's statistical programming team can support you with CDISC Mapping, CDISC SDTM in Integrated Summaries and also SDTM conversions and domains as part of a regulatory submission package. Submit a Request for Information and a member of Quanticate's Business Development team will be in touch with you shortly.