Two possibilities come to mind.  First, might this file have a huge collection of documents in it?  Documents carry over through merges, so they can accumulate over time.  You can use Utilities > Data File Comments to see what is there, and the DROP DOCUMENTS command to get rid of them.
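In syntax, that check might look something like the following (the output file path is illustrative; substitute your own):

```
* List any documents stored in the active file.
DISPLAY DOCUMENTS.

* Remove the documents, then save a trimmed copy of the file.
DROP DOCUMENTS.
SAVE OUTFILE='C:\data\survey_trimmed.sav'.
```

If the file shrinks substantially after this, accumulated documents were the culprit.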

Another possibility is that you have a huge collection of value labels, which take up space even if they are not being used.
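To check for that, you can inspect the dictionary without even opening the data; something like the sketch below may help (the file path and variable name are placeholders):

```
* Report the dictionary, including all value labels, for a file on disk.
SYSFILE INFO FILE='C:\data\survey.sav'.

* An empty VALUE LABELS list clears existing labels on a variable.
* Replace q1 with the variable (or variable list) carrying the bloat.
VALUE LABELS q1.
```

If the SYSFILE INFO output shows thousands of label entries per variable, clearing or pruning them should bring the file back down to a sensible size.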

Jon Peck

From:        "Kawashima-Ginsberg, Kei" <Kei.Kawashima_Ginsberg@TUFTS.EDU>
Date:        10/07/2010 09:01 AM
Subject:        [SPSSX-L] Extremely long processing time for PASW file
Sent by:        "SPSSX(r) Discussion" <SPSSX-L@LISTSERV.UGA.EDU>

Dear listserv members,
I’m working on a series of datasets, each with only about 500 variables and 1,000 cases.  The files are all much larger than that content would suggest (36 MB to 40 MB), and when I try to process them (get the file, merge the files, run restructuring), they are so slow that my computer can’t finish, and the work seems to consume all of my memory (3 GB).  I don’t think it’s a file-size problem, because I routinely handle much larger datasets (from a different study) and that usually takes only seconds.  Could someone help me understand what’s wrong with these files, and whether there are ways to make them workable?
Kei Kawashima-Ginsberg, Ph.D.
Lead Researcher
The Center for Information and Research on Civic Learning and Engagement
Jonathan M. Tisch College of Citizenship and Public Service
Tufts University