Date: Thu, 22 Jan 2009 21:54:31 -0500
Reply-To: Sigurd Hermansen <HERMANS1@WESTAT.COM>
Sender: "SAS(r) Discussion" <SAS-L@LISTSERV.UGA.EDU>
From: Sigurd Hermansen <HERMANS1@WESTAT.COM>
Subject: Re: Merging many-to-many, efficiently
Content-Type: text/plain; charset="us-ascii"
If you are trying to join a period attribute from a table into a survey dataset, a SQL join seems more than adequate and very efficient. This query amounts to nothing more than a table look-up:
create table altSurveyData as
select t1.*, t2.period
from surveyData as t1 left join calendar as t2
on t1.periodtype = t2.cperiodtype;
On my machine it runs much more efficiently than the double-set Data step that you have developed. Perhaps appropriate use of SAS SQL will motivate some of the other programmers in your shop to learn to declare solutions and let the SAS System figure out how to implement them.
An INNER JOIN would be more efficient, but it would eliminate any tuples in surveyData that don't have corresponding periods in the calendar. The LEFT JOIN keeps those rows (with a missing period), so in that sense it serves as a Q/C check.
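For comparison, here is the INNER JOIN form plus a count of the rows it would drop (a sketch; the variable names periodtype, cperiodtype, and period are assumed from the DATA step code later in the thread):

```sas
proc sql;
  /* inner join: keeps only survey rows with a matching calendar period */
  create table altSurveyDataInner as
  select t1.*, t2.period
  from surveyData as t1 inner join calendar as t2
    on t1.periodtype = t2.cperiodtype;

  /* Q/C: how many survey rows have no calendar match? */
  select count(*) as unmatched
  from surveyData as t1
  where not exists
    (select 1 from calendar as t2
     where t2.cperiodtype = t1.periodtype);
quit;
```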
From: SAS(r) Discussion [mailto:SAS-L@LISTSERV.UGA.EDU] On Behalf Of Joe Matise
Sent: Thursday, January 22, 2009 7:47 PM
Subject: Merging many-to-many, efficiently
I have a dataset with question responses and a period value corresponding to the month the respondent was surveyed. I want to produce a dataset with multiple rows per respondent, one for each reporting period that respondent qualifies for - month, fiscal quarter, fiscal year, etc. - so I can do a proc means using period as a class variable. I have that information in a second dataset.
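Once the expanded dataset exists, the summary step is a single PROC MEANS along these lines (a sketch; "expanded" and "response" are hypothetical names standing in for the result dataset and one of the actual question variables):

```sas
proc means data=expanded n mean;
  class period;     /* month, fiscal quarter, fiscal year, ... */
  var response;     /* hypothetical question variable */
run;
```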
This dataset will grow relatively large (50,000 respondents per month, up to 18 months reported on at any given time, so 900,000 potential respondents, each qualifying for up to 5 reporting periods - up to 4.5m rows in the resulting dataset). Other ways of organizing the data might be more efficient, but for the moment let us say this is how I will do it.
What would be the most efficient way of merging my two datasets together (or otherwise assigning period type)? I've thought of four ways so far, but my tests of those methods have been inconclusive.
Method 1: A merge. I can't actually get this to behave as I want it to (hence I can't really test it), and I don't know that it will work: MERGE does not seem to want to output each record more than once.
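For reference, the MERGE attempt would look something like the sketch below (dataset and variable names assumed). With repeats of BY values in both datasets, the DATA step pairs observations one-to-one within each BY group rather than forming all combinations - which is why MERGE will not output a survey record once per matching calendar row. SAS flags the situation with a NOTE about repeats of BY values in the log.

```sas
proc sort data=surveyData; by periodtype; run;
proc sort data=calendar(rename=(cperiodtype=periodtype)) out=cal2;
  by periodtype;
run;

data merged;
  merge surveyData(in=insurvey) cal2;
  by periodtype;
  if insurvey;    /* keep all survey rows, matched or not */
run;
```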
Method 2: A dual SET statement, where I set the master dataset, then do N=1 by 1 until N=nobs using POINT=N and an IF statement to limit the new dataset to only those observations where the period is the period of the current record. ~20 seconds for 50,000 respondents (one period only), so ~6 minutes for 900k respondents, I imagine. I'm also experimenting with KEY=, but that doesn't seem to help (yet).
Method 3: A macro-based solution that stores a bunch of IF statements, one for each period, each performing up to five OUTPUTs (as appropriate). Seems slow, as you'd have 18 IF statements evaluated per row of data - but then again, method 2 processes a much larger dataset.
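A sketch of what that approach would look like (the period codes are purely illustrative):

```sas
data expanded;
  set surveyData;
  length period $8;
  if periodtype=200901 then do;
    period='2009M01'; output;   /* month */
    period='2009Q1';  output;   /* fiscal quarter */
    period='FY2009';  output;   /* fiscal year */
  end;
  /* ...one such block per periodtype, 18 in all,
     possibly generated by a macro %DO loop... */
run;
```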
Method 4: Proc SQL join (see below)
Sample data follows. I'm looking for the most efficient solution that is not overly complicated (one or two data steps, preferably). I'm avoiding PROC SQL for the moment because I'm the only one in the office who really understands it (so nobody else would be able to follow my code), but if it turns out to be better it can of course be used. I included method 2 below, as that's the one I've worked out so far that seems practicable.
data surveyData;
  do periodtype=200901 to 200906;
    do i = 1 to 50000; output; end;
  end;
run;

* calendar lookup table (rows elided): input cperiodtype period $;

data expanded;
  set surveyData;
  do N=1 by 1 until (N=nobsvar);
    set calendar point=N nobs=nobsvar;
    if periodtype=cperiodtype then output;
  end;
run;
That takes about 6 seconds for 1/3 of my potential rows, but I have ~150 questions per row, so it takes rather longer, of course. I might convert the calendar dataset to horizontal, not vertical (so only one row per periodtype, with multiple period variables), but I'm not sure that is any more efficient (as I have a lot more statements to process then per row).
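The horizontal version of the calendar could be built with PROC TRANSPOSE (a sketch, assuming the calendar holds cperiodtype and period as above):

```sas
proc sort data=calendar; by cperiodtype; run;

/* one row per periodtype, with columns period1-periodN */
proc transpose data=calendar out=calendarWide(drop=_name_) prefix=period;
  by cperiodtype;
  var period;
run;
```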
Thanks in advance!