Date: Tue, 10 Jul 2007 16:16:19 -0400
Reply-To: Sigurd Hermansen <HERMANS1@WESTAT.COM>
Sender: "SAS(r) Discussion" <SAS-L@LISTSERV.UGA.EDU>
From: Sigurd Hermansen <HERMANS1@WESTAT.COM>
Subject: Re: proc freq running out of space
Content-Type: text/plain; charset="us-ascii"
If by 'space' you mean memory, PROC FREQ typically tries to allocate
enough memory to hold the data it is summarizing. I advise using an SQL
query to group and count instances of the six codes, and then using PROC
FREQ with a WEIGHT statement to compute the frequency tables.
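A minimal sketch of that two-step approach, assuming the large table is
called BIG and one of the six code variables is named CODE1 (both names
are placeholders for your own):

```sas
/* Step 1: collapse the 19 million rows to one row per distinct
   value of CODE1, carrying the count of occurrences. */
proc sql;
   create table counts as
   select code1, count(*) as n
   from big
   group by code1;
quit;

/* Step 2: PROC FREQ reads the tiny summary table and weights
   each row by its count, so memory use stays small. */
proc freq data=counts;
   tables code1;
   weight n;
run;
```

Repeat the query (or extend the GROUP BY) for the other five code
variables.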
From: email@example.com [mailto:firstname.lastname@example.org]
On Behalf Of z
Sent: Tuesday, July 10, 2007 4:05 PM
Subject: proc freq running out of space
Hi and thanks in advance.
That said, I am running PROC FREQ, purely out of curiosity, on a
pretty large dataset (19 million records, 16 fields, 6 of which I am
running the PROC FREQ on). As far as I can tell, each of these 6
fields is just a bunch of codes, i.e. there are only a few possible
values, 5 or 6. But I am filling up the SAS work space. So, my question
is, does a PROC FREQ on a large number of obs with only a few values
fill up the work space? Because, frankly, it seems to me that all the
code should have to do is read down the file and keep tabs on how many
of each value it has encountered, which would yield a tiny file: 2
fields wide and as many obs as there are distinct values, i.e. 6 at
most. Or is this telling me that one of the fields has a large number
of distinct values that I don't know about? That would make intuitive
sense, but it doesn't match what I think the reality of the situation
is.
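For what it's worth, a quick check along these lines (the field names
here are made up) would tell me whether one field has far more distinct
values than I expect:

```sas
/* Count distinct values in each of the six code fields; a
   surprisingly large count points to the culprit. */
proc sql;
   select count(distinct code1) as n1,
          count(distinct code2) as n2,
          count(distinct code3) as n3,
          count(distinct code4) as n4,
          count(distinct code5) as n5,
          count(distinct code6) as n6
   from big;
quit;
```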