Date: Fri, 14 May 2004 08:58:01 -0400
Reply-To: "Richard A. DeVenezia" <radevenz@IX.NETCOM.COM>
Sender: "SAS(r) Discussion" <SAS-L@LISTSERV.UGA.EDU>
From: "Richard A. DeVenezia" <radevenz@IX.NETCOM.COM>
Subject: Re: Updating a wide dataset
Wing-Sze Tsui wrote:
> Thanks Richard.
> The tables are a mix of numeric and character variables. And I'm
> currently using a Data Step with the Modify statement and the KEY=
> option, with an index on the master dataset. I've tried a data step
> merge and the performance is not good either. And also Proc SQL joins
> and updates, but the performance is even worse. And I cannot just use
> Proc Append, as the transaction dataset has not only new data that
> needs to be appended to the master dataset, but also some updates to
> the existing master dataset.
> I can't think of any other way... please help!
> Oh, Thanks again for the reminder. I'll unsubscribe and resubscribe
> Thanks a lot!
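For anyone following along, the MODIFY/KEY= approach described above is usually written as an "upsert": each transaction row is looked up in the master via the index, replaced if found, and appended otherwise. A minimal sketch (dataset names MASTER, TRANS and key variable ID are placeholders, not from the original post):

```sas
data master;
  set trans;                          /* read one transaction row      */
  modify master key=id;               /* indexed lookup in the master  */
  if _iorc_ = %sysrc(_sok) then
    replace;                          /* key found: update in place    */
  else do;
    _error_ = 0;                      /* clear the not-found condition */
    output;                           /* key not found: append new row */
  end;
run;
```

Note that REPLACE only rewrites the variables in place, so the master is not recopied, which is why MODIFY is usually the right starting point for very wide tables.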
Well, you certainly are in a tough spot: resource bound, and the data
anything but cooperative.
A search of support.sas.com might turn up ideas for some intermediate relief.
If I recall, there was a note stating that, in some cases, throughput can be
improved if you drop the indexes prior to modifying the table and rebuild
them afterwards. Something about the time required to update the indexes
after each row is modified, added or deleted. The important phrase being "in
some cases."
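The drop-and-rebuild pattern from that note would look something like this with PROC DATASETS (library name MYLIB, dataset MASTER and index variable ID are illustrative assumptions). It only pays off when the bulk step touches enough rows that per-row index maintenance costs more than one rebuild, and of course it rules out the KEY= lookup while the index is gone:

```sas
proc datasets library=mylib nolist;
  modify master;
    index delete id;          /* drop the index before the bulk change */
quit;

/* ... run the bulk update here, e.g. a sorted MERGE or SQL UPDATE ... */

proc datasets library=mylib nolist;
  modify master;
    index create id;          /* rebuild the index in one pass */
quit;
```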
Richard A. DeVenezia