Date: Tue, 15 Apr 2003 09:50:56 -0700
Reply-To: Arya Bhatta <firstname.lastname@example.org>
Sender: "SPSSX(r) Discussion" <SPSSX-L@LISTSERV.UGA.EDU>
From: Arya Bhatta <email@example.com>
Subject: Re: Problems with reading big SPSS files
Content-Type: text/plain; charset=us-ascii
It might sound strange, but it does happen if two programs (say SPSS and Visual Basic) are clashing with each other. I am not sure if that is the case, but you might like to have a look.
David
Nels Tomlinson <firstname.lastname@example.org> wrote:
We commonly work with 600,000 to 700,000 cases and 50 to 100 variables, using
SPSS 11.5.1 on Win2K machines with about 0.5 GB of RAM. My machine has
about 15GB of free disk space, and the temporary files are on the local hard
drive. We do some data manipulation, and some data editing.
We recently had several weeks' work get lost, and never did figure out what
happened. SPSS crashes occasionally (before we upgraded to 11.5.1, it
crashed often), but usually it seems to be related to other programs, like
the Windows screensaver we are required to use. Sometimes it just crashes
without any apparent explanation.
After saying all that, we haven't seen any skewed columns, et cetera. I'm
afraid that none of this is very helpful. I think it merely confirms that
SPSS is still flaky, after all these years.
Before I started this job, I was accustomed to relying on free, libre
software for work. I did most of my work with emacs, R and LaTeX, on a Unix
system. Nothing ever crashed, and I never lost any work. Having to use
commercial software has been a rude shock.
From: SPSSX(r) Discussion [mailto:SPSSX-L@LISTSERV.UGA.EDU] On Behalf Of
Sent: Tuesday, April 15, 2003 7:48 AM
Subject: SV: Problems with reading big SPSS files
I wonder if this is not some limitation in SPSS. I've run some 100,000 cases
and the data editor starts to behave very strangely, with skewed columns etc.,
although syntax commands and the running of procedures seem to be unaffected.
Could there be some upper limit on how many cases can be displayed in the data
editor? I've read something somewhere about this. Maybe with a million cases
the data editor will shut down, but syntax and procedures will work as usual.
National Institute of Public Health
From: Thomas.Miyoshi@UCHSC.edu [mailto:Thomas.Miyoshi@UCHSC.edu]
Sent: 15 April 2003 17:36
Subject: Re: Problems with reading big SPSS files
I'm sure that the people at SPSS can explain it better, but I've always been
told that you must have a lot of disk space if you are processing a big data
file. I think they recommend that you have at least 2 to 3 times as much disk
space for temporary files as the size of the file you are processing.
So if you have a file that is 1 GB, then the location of your temporary files
must have 2 to 3 GB free.
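As a quick sanity check before a long job, here is a minimal sketch in modern
Python (not anything built into SPSS; the file name, temp directory, and
safety factor are illustrative assumptions) that compares free space in the
temporary directory against a multiple of the data file's size:

    import os
    import shutil

    def temp_space_ok(data_file, temp_dir, factor=3):
        # Free bytes in the temp directory should cover `factor`
        # times the size of the data file being processed.
        needed = factor * os.path.getsize(data_file)
        free = shutil.disk_usage(temp_dir).free
        return free >= needed

    # A 1 GB file would need roughly 2 to 3 GB of free temp space.
    print(temp_space_ok("survey.sav", r"C:\Temp"))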
From: Nan Li [mailto:email@example.com]
Sent: Tue 4/15/2003 7:25 AM
Subject: Problems with reading big SPSS files
When I read very big SPSS files, usually over one million records, the
window is shut down automatically. I have SPSS 11.0 and 11.5 on my machine;
both of them have the same problem. It's OK to read small files. Does anyone
know what's wrong with it? I have enough space on my C drive. Any
suggestions would be appreciated.
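One workaround for files this size is to avoid loading everything at once.
As a rough illustration, here is a minimal sketch using the open-source
pyreadstat Python library (a much later tool than the SPSS versions discussed
in this thread; the file name and chunk size are made up) that streams a
large .sav file in chunks rather than reading it whole:

    import pyreadstat

    # Stream the file in 100,000-record chunks instead of loading
    # it all into memory at once. "big.sav" is illustrative.
    chunks = pyreadstat.read_file_in_chunks(
        pyreadstat.read_sav, "big.sav", chunksize=100_000)

    total = 0
    for chunk, meta in chunks:
        total += len(chunk)  # process each chunk of cases here
    print(total, "records read")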