Date: Mon, 4 Dec 2006 14:27:25 -0500
Reply-To: "Fehd, Ronald J. (CDC/CCHIS/NCPHI)" <rjf2@CDC.GOV>
Sender: "SAS(r) Discussion" <SAS-L@LISTSERV.UGA.EDU>
From: "Fehd, Ronald J. (CDC/CCHIS/NCPHI)" <rjf2@CDC.GOV>
Subject: Re: SQL and Dictionary Tables: slow with many librefs?
Content-Type: text/plain; charset="us-ascii"
> From: Robert Bardos
> my recent proposal in another forum of using PROC SQL and the
> dictionary tables to answer a simple question related to
> dataset variables was questioned by a knowledgeable forum
> member saying he had experienced performance problems with
> this technique when a large number of librefs was involved.
> My query contained a WHERE clause specifying both a
> libname and two memnames (datasets). None of them were
> specified using wildcards. So even with a huge number of
> librefs, datasets and variables I fail to see how a very
> specific dictionary query would be significantly slower than
> a combination of similarly specific "proc contents out="
> executions followed by an SQL query over the resulting smaller datasets.
> What do you think?
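For the record, the kind of lookup under discussion looks roughly
like this (a minimal sketch; the libref and table names are made-up
examples, not from the original query):

```sas
/* hedged sketch of a specific dictionary-table query;
   MYLIB, TABLE1, TABLE2 are hypothetical names        */
proc sql;
  select libname, memname, name, type, length
    from dictionary.columns
    where libname = 'MYLIB'    /* dictionary values are stored uppercase */
      and memname in ('TABLE1', 'TABLE2');
quit;
```

Note that SAS must gather metadata for the allocated librefs to
service dictionary queries, which is where the libref count can bite.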
The more librefs+tables
-- it's not just librefs, but total MemNames --
you have, the slower you go.
I had an application working fine in unit testing;
when I started the integration testing
(running all modules sequentially)
it just kept getting slower and slower.
reason: I was writing new tables to the same libref
from which I was then using SQL on the dictionary tables
to pick out the names of the tables to process in the next step.
I disabled the sql list processing
and used <horrors!> global macro variable(s)
to pass the data set list.
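A sketch of that workaround, with hypothetical names (TableList,
AddTable, and the data sets are all made up for illustration):
instead of re-querying dictionary.tables after each step, append
each newly written member name to a global macro variable at the
moment you create the table.

```sas
/* hedged sketch: accumulate the data set list at write time
   rather than querying the dictionary tables afterward;
   TableList and AddTable are hypothetical names            */
%global TableList;

%macro AddTable(data);
  %let TableList = &TableList &data;   /* append to the running list */
%mend AddTable;

data Mylib.Step1;
  set Work.Raw;
run;
%AddTable(Mylib.Step1)

%put NOTE: TableList=&TableList;
```

The trade-off is global state <horrors!> in exchange for skipping
the metadata walk entirely.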
moral: Not every good trick works everywhere!
Ron Fehd the SQL into:macro maven CDC Atlanta GA USA RJF2 at cdc dot
Repetition reduction enhances elegance!
Repetition reduction furthers finesse!