Regarding Hector's and Melvin's posts... I agree with Hector. When we ran with just 32MB RAM, the case counter would zip through 20 cases at a time at a fairly fast pace, but you could still see it advancing in increments of 20. After upgrading one of the computers to 96MB RAM, we have noticed an increase to around 200 cases at a time. Running both computers, identical in every respect except RAM, with the same syntax, the 96MB machine speeds through the runs faster than the 32MB machine.
We have hard drives on order with enough capacity to easily hold the temp files that our syntax produces.
To everyone who has been helpful in responding: thank you.
From: Hector E. Maletta [SMTP:firstname.lastname@example.org]
Sent: Wednesday, May 21, 1997 3:54 AM
To: Multiple recipients of list SPSSX-L
Subject: Re: large data files
I am not quite sure about the ideas proposed by Melvin in the above
posting. In fact, in my own experience RAM helps, and helps a lot. It
probably has to do with caching the hard disk, thus saving on access
time to the next batch of cases.
Besides, cases are NOT read 'one at a time'. As your case counter shows
at the bottom of your SPSS screen, cases are loaded in batches of many.
In all versions of SPSS/PC it was batches of 8 cases, but SPSS/Win reads larger
batches. They are processed (i.e. computed in statistical procedures)
one at a time, but they are read from the disk in larger chunks.
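The distinction above (read from disk in chunks, process one case at a time) can be sketched in a few lines. This is not SPSS internals, just a minimal illustration under assumed details: a hypothetical fixed-width record layout, and a batch size of 8 to mirror the SPSS/PC behaviour mentioned above.

```python
# Sketch (NOT SPSS code): fetch many fixed-width records per disk
# access, then hand them to the caller one batch at a time.
import io

RECORD_SIZE = 16   # bytes per case (hypothetical layout)
BATCH_SIZE = 8     # cases per disk read, as in SPSS/PC

def read_in_batches(f, record_size=RECORD_SIZE, batch_size=BATCH_SIZE):
    """Yield lists of raw records, one I/O call per batch."""
    while True:
        chunk = f.read(record_size * batch_size)  # single disk access
        if not chunk:
            break
        # Split the chunk into individual cases for one-at-a-time processing
        yield [chunk[i:i + record_size]
               for i in range(0, len(chunk), record_size)]

# Demo with an in-memory "file" of 20 cases
data = b"".join(i.to_bytes(1, "big") * RECORD_SIZE for i in range(20))
batches = list(read_in_batches(io.BytesIO(data)))
print([len(b) for b in batches])  # 20 cases arrive as batches of 8, 8, 4
```

With 20 cases and a batch size of 8, only three reads hit the "disk" instead of twenty, which is why larger batches (and a larger disk cache) cut access time.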
When memory is larger than 16 MB, you can enlarge your disk cache, and
besides SPSS will enlarge the workspace automatically, thus allowing
more to happen in memory without resorting to the disk, or resorting to
it less often.
However, I'm certainly out of my depth here, and would very much like
having some guy (or gal) from SPSS technical support shedding some light
on this abstruse matter involving hardware and SPSS's speed.
Universidad del Salvador
Buenos Aires, Argentina