At the July 25th PhUSE Webinar Wednesday, Ethan Chen (Director, Division of Data Management Services and Solutions, Office of Business Informatics, FDA CDER) gave us a glimpse of the FDA's view on Technical Rejection Criteria for Study Data. Those who were at the US Connect Conference earlier in the year may have seen this material already.
There was a lot of useful information, but in particular he focused on some of the initial checks, which are shown below:
The Demographics dataset (DM for SDTM submissions), define.xml, the Subject-Level Analysis Dataset (ADSL for ADaM submissions) and the Trial Summary dataset (TS) must all be present. "Easy!" you might think, and yet the error rate for High-severity errors 1736 and 1734 is 32% across all submission types!
Almost a third! How can this be?
Well, he talked through the criteria in more detail, and some checks may be more complicated than you first think. Some highlights are below:
- The SDTM TS dataset (ts.xpt) must be submitted regardless of the standard used, and it must carry exactly that name: "ts_xpt.xpt" is not considered valid. The same applies to "dm.xpt" (SDTM) and "adsl.xpt" (ADaM).
- The STUDYID variable in the TS dataset must equal the <study-id> tag in the STF file (stf.xml); if it does not, the dataset is not considered a valid study TS dataset.
- Within the SDTM TS dataset, if the Study Start Date (TSPARMCD="SSTDTC") is missing, or its associated TSVAL is not a valid ISO 8601 date, then the TS dataset also fails this criterion.
- If a file called "define.xml" does not exist as part of the submission, the study fails the rejection criteria; deviations from this name are considered invalid (e.g. "def-o4-123.xml" is NOT valid according to the FDA technical rejection criteria). Submitting a define.pdf file instead of a define.xml is equally unacceptable.
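The presence, naming and date checks above lend themselves to a quick pre-submission script. Below is a minimal sketch in Python; the required-file set and the ISO 8601 pattern are deliberate simplifications of the FDA's actual rules (the real conformance checks are far more extensive), so treat this as an illustration rather than a substitute for a validation tool.

```python
import re

# Files the technical rejection criteria expect, with exact lowercase names
# (illustrative set: ts.xpt and define.xml apply to all submissions,
# dm.xpt to SDTM, adsl.xpt to ADaM).
REQUIRED_FILES = {"ts.xpt", "define.xml", "dm.xpt", "adsl.xpt"}

# Simplified ISO 8601 date pattern for TSVAL when TSPARMCD = "SSTDTC":
# a complete date (YYYY-MM-DD) or a partial date (YYYY or YYYY-MM).
ISO8601_DATE = re.compile(r"^\d{4}(-\d{2}(-\d{2})?)?$")

def missing_required_files(submitted_names):
    """Return the required files absent from the submission, sorted."""
    # Names must match exactly, so "ts_xpt.xpt" or "Define.XML" do not count.
    return sorted(REQUIRED_FILES - set(submitted_names))

def valid_sstdtc(tsval):
    """Check that the study start date value is a non-missing ISO 8601 date."""
    return bool(tsval) and ISO8601_DATE.match(tsval) is not None
```

For example, `missing_required_files({"ts_xpt.xpt", "define.xml", "dm.xpt", "adsl.xpt"})` reports `["ts.xpt"]`, flagging the misnamed TS dataset, and `valid_sstdtc("01APR2023")` returns False because a SAS-style date is not ISO 8601.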
The items noted above are not difficult to correct, and ultimately they allow the FDA to work out which standards applied to your study. Simple issues like these can delay submissions, delaying approval and therefore potentially delaying life-changing medications reaching their target patients. He did add that the FDA has not (yet) rejected any submission containing these errors, but they do plan to use the technical rejection criteria to identify applications that are not fulfilling these requirements.
So what can we do?
ALWAYS keep in mind the names of the files we create for submission. Keep filenames lowercase (where possible, unless specifically requested otherwise) and ensure they follow the correct format and convention; filenames are an important part of QC, particularly for the define.xml package and its associated files.
For define.xml (and stf.xml), and even the STUDYID columns in your datasets, ensure your study metadata are correct; they should be fully checked before submission. Just because some study-level information wasn't shown by default in older stylesheet views of the define.xml doesn't mean it is any less important than the rest of the contents.
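The STUDYID-versus-stf.xml consistency check can likewise be automated. Below is a minimal sketch assuming a simplified stf.xml structure; the real STF schema is more elaborate (nested elements, possibly namespaces), so the element path here is illustrative only.

```python
import xml.etree.ElementTree as ET

def stf_study_id(stf_xml_text):
    """Extract the <study-id> value from STF XML text (simplified structure)."""
    root = ET.fromstring(stf_xml_text)
    # Search anywhere in the document; real STF files nest this element
    # more deeply and may qualify it with a namespace.
    node = root.find(".//study-id")
    return node.text.strip() if node is not None and node.text else None

def studyid_matches_stf(ts_studyid, stf_xml_text):
    """True when TS.STUDYID equals the <study-id> tag in stf.xml."""
    return ts_studyid is not None and ts_studyid == stf_study_id(stf_xml_text)
```

In practice the TS.STUDYID value would be read from ts.xpt itself, for example with `pandas.read_sas("ts.xpt", format="xport")`, and compared against the submission's actual STF file.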
For the SDTM TS domain, do not underestimate the content. It can often be a time-consuming dataset to research and get right, but the effort is a requirement, not just a "nice to have", so the time is justified.
To avoid validation errors, always understand the requirements and recommendations for submitting study data in the Study Data Technical Conformance Guide, noting that it changes over time. We should always strive to check for and use the latest version.
Quanticate's statistical programming team can support you with CDISC Mapping, CDISC SDTM in Integrated Summaries and also SDTM conversions and domains. Submit a Request for Information and a member of Quanticate's Business Development team will be in touch with you shortly.
Related Blog Posts:
- The INTO Statement in PROC SQL to Create Macro Variables
- Using CDISC SDTM to Improve Cost and Quality in Integrated Summaries
- Creating Custom or Non-Standard CDISC SDTM Domains