Date: Mon, 12 Feb 1996 07:32:00 EST
Reply-To: "Shipley, Paul" <PShipley@VITGSSW.TELECOM.COM.AU>
Sender: "SAS(r) Discussion" <SAS-L@UGA.CC.UGA.EDU>
From: "Shipley, Paul" <PShipley@VITGSSW.TELECOM.COM.AU>
Subject: Re: Efficient processing of external Datasets
I have used SAS with MVS to operate on sequential datasets for many years
and found it no worse than any other product (and much better than some!).
While there are several factors that can affect performance, probably the
most significant when operating on sequential files is the block size of the
dataset. If you have a small block size (say 6000 in this case) you can
expect your job to execute significantly slower than if you had half-track
blocking.
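For example (a sketch only; the dataset name is hypothetical), half-track
blocking on a 3390 device allows up to 27998 bytes per block, so for a
600-byte LRECL you could code 46 records per block (BLKSIZE=27600) on the
DD statement:

```jcl
//*  Half-track blocking for a 600-byte fixed-length record on 3390:
//*  27998 / 600 = 46 records per block, 46 * 600 = 27600
//RAWIN    DD DSN=MY.SEQ.DATASET,DISP=SHR,
//            DCB=(RECFM=FB,LRECL=600,BLKSIZE=27600)
```
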
Some other things to watch are:
As you are not saving a SAS dataset, ensure that you are using DATA _NULL_
to avoid creating a temporary work dataset.
Any variable that is not being reformatted or mathematically processed
should be read/written as character to avoid the overhead of conversion.
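A minimal sketch of both points (the DD names and record layout are
hypothetical): the ID field is passed through untouched, so it is read and
written with $CHAR to avoid numeric conversion, while the packed-decimal
amount must be read numerically because it is being reformatted.

```sas
DATA _NULL_;                      /* no output SAS dataset is created    */
  INFILE RAWIN;                   /* raw sequential input, DD name RAWIN */
  FILE   RAWOUT;                  /* raw output, DD name RAWOUT          */
  INPUT  @1  ID     $CHAR10.      /* unchanged field: keep as character  */
         @11 AMOUNT PD4.2;        /* packed decimal: must be numeric     */
  PUT    @1  ID     $CHAR10.
         @11 AMOUNT 9.2;          /* reformatted on output               */
RUN;
```
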
Please contact me directly if you require further information.
Telstra Corp, Australia
To: Multiple recipients of list SAS-L
Subject: Efficient processing of external Datasets
Date: Sunday, 11 February 1996 8:39AM
I need to read a large file (~150 variables, >600 LRECL, >1 million rows)
containing different data types - packed/binary/float/character etc.
(The input file is NOT a SAS dataset.)
I need to convert it to a CSV file, comma separated and with quotes around
character variables.
We are using the QUOTE function to quote the character variables and
inserting commas between the variables with a PUT using +(-1) ','.
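A stripped-down sketch of the pattern described (the variable names and
record layout are hypothetical; +(-1) backs the column pointer up one
position so the comma abuts the preceding value):

```sas
DATA _NULL_;
  INFILE RAWIN;                   /* raw sequential input, DD name RAWIN */
  FILE   CSVOUT;                  /* CSV output, DD name CSVOUT          */
  INPUT  @1  NAME $CHAR20.
         @21 BAL  PD5.2;
  QNAME = QUOTE(TRIM(NAME));      /* wrap character value in quotes      */
  PUT QNAME +(-1) ',' BAL;        /* comma immediately follows QNAME     */
RUN;
```
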
This simple program is turning out to be a CPU hog. The program runs on
the MVS mainframe and is a real bottleneck. The program contains only a
single INPUT statement, a series of statements to QUOTE character variables,
followed by a single PUT statement.
Is there any way I can speed up processing? Or is SAS generally not
good with external files?