LISTSERV at the University of Georgia
Date:   Wed, 3 Sep 2008 15:12:04 GMT
Reply-To:   Lou <lpogoda@VERIZON.NET>
Sender:   "SAS(r) Discussion" <SAS-L@LISTSERV.UGA.EDU>
From:   Lou <lpogoda@VERIZON.NET>
Subject:   Re: Tables, Listings, Graphs

"Adam" <alamanna@ROCHESTER.RR.COM> wrote in message
> Folks, I am trying to find some sort of benchmarks for how long it should
> take to make tables, listings, and graphs for a pharmaceutical study. I have
> a person who thinks the time to make these TLFs should decrease as more and
> more studies are programmed and completed. While I agree that certain
> efficiencies should come about after a few of these studies are completed
> (and programs can be reused on future studies), I believe there still is
> some time that it takes to make the TLFs due to column width adjusting,
> pagination, specs that change, unique tables, etc.
>
> Anyone that can shed some light on this, it would be much appreciated.

I asked this same question 2-3 years ago when I was trying to "negotiate" standard times for these outputs with the programmers in my group. I received a lot of replies, most saying that standard times weren't possible, none giving any estimates.

I worked for one CRO where the standard for a simple table (one without p-values, confidence limits, etc.) was 4 hours for an experienced programmer, 6 for a newbie. For more complicated tables (ones with p-values, etc.) the standard was 6 hours for an experienced programmer, 8 for the newbie. Graphs weren't an issue in those days - in the 5 or so years I worked there, we did maybe a dozen graphs. Listings tended to be regurgitations of CRF data, usually by treatment group. The standard was 3 days to do all the listings for a study.

For what it's worth, three years or so ago the company I was working at was negotiating a contract with a major pharmaceutical company. They wanted us to use their table/listing generating system - I forget the name of the product, but they had paid big bucks for it, and spent a fair amount of time training programmers in its use. During the course of preparing the bid, I talked with the programming director about the system and its efficiencies. I was told that since adopting the system, programming productivity had gone from an average of two tables a day to three.

All these times assumed the derived/analysis datasets were already in place, and they don't include time for unanticipated changes that turn out to be necessary as programming proceeds. QC time was roughly equal to programming time, and the programming estimates include correcting errors discovered during QC.

If/when CDISC gains traction in the industry, these times may decrease substantially. If every AE dataset, for instance, has the same structure, theoretically you could plug it into a standard AE table program and generate a table in a matter of minutes. I think that ideal will be some time in coming - so far, each company seems to have its own version of SDTM, some of which are less than 50% compliant with the published standard.
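To illustrate the idea of a "standard AE table program": if every study's AE dataset carries the same SDTM-style variables (USUBJID for the subject identifier, AEDECOD for the preferred term), one generic routine could produce the subject counts behind a typical AE summary table. This is a minimal sketch in Python rather than SAS, and the function name, toy records, and treatment-arm lookup are invented for illustration; only the SDTM variable names come from the standard.

```python
from collections import defaultdict

def ae_table(ae_records, arm_by_subject):
    """Count distinct subjects reporting each preferred term, per treatment arm.

    ae_records: iterable of dicts with SDTM-style keys USUBJID and AEDECOD.
    arm_by_subject: mapping from USUBJID to treatment arm (hypothetical input).
    """
    subjects = defaultdict(set)  # (AEDECOD, arm) -> set of subject IDs
    for rec in ae_records:
        arm = arm_by_subject[rec["USUBJID"]]
        subjects[(rec["AEDECOD"], arm)].add(rec["USUBJID"])
    # Counting distinct subjects, not events, matches the usual AE table convention
    return {key: len(ids) for key, ids in subjects.items()}

# Toy data: two subjects, two arms; subject 001 reports the same event twice
ae = [
    {"USUBJID": "001", "AEDECOD": "HEADACHE"},
    {"USUBJID": "001", "AEDECOD": "HEADACHE"},
    {"USUBJID": "002", "AEDECOD": "NAUSEA"},
]
arms = {"001": "DRUG", "002": "PLACEBO"}

print(ae_table(ae, arms))
# {('HEADACHE', 'DRUG'): 1, ('NAUSEA', 'PLACEBO'): 1}
```

The whole point of the standard is that this routine never changes from study to study; only the input datasets do. In practice each company's deviations from SDTM are exactly what breaks this reuse.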
