Date:         Sun, 13 Feb 2005 08:18:38 -0600
Reply-To:     Ben Chapman <bpc0005@unt.edu>
Sender:       "SPSSX(r) Discussion" <SPSSX-L@LISTSERV.UGA.EDU>
From:         Ben Chapman <bpc0005@unt.edu>
Subject:      Attenuation correction for regression coefficients
In-Reply-To:  <20050213050117.00F7E74426@mailhost.unt.edu>
Content-Type: text/plain; charset=ISO-8859-1

Greetings All,

I searched the archives and found a thread debating the merits of disattenuating bivariate correlations for unreliability in measures. I'm interested in disattenuating regression coefficients for unreliability in the predictors, per the usual formulae (for instance, Cohen, Cohen, West & Aiken, 2003, pp. 56-57), and am wondering whether syntax can be written for the task, such that the SPSS output would yield Bs, betas, and Rs reflecting the correction for attenuation due to unreliability. Frankly, I'm not sure it's possible, but I wanted to check with the list before giving up hope.

As a pertinent footnote about measurement error, the research problem is not quite amenable to structural equation modeling, or at least not within my SEM capacity. Any thoughts greatly appreciated.
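For concreteness, here is a minimal sketch of the sort of thing I have in mind, with made-up numbers: say a criterion y and predictors x1 and x2 show observed correlations of .40 (y,x1), .35 (y,x2), and .30 (x1,x2), and the predictor reliabilities are .80 and .70. Correcting the predictors only, each r with the criterion is divided by the square root of that predictor's reliability, and the predictor-predictor r by the square root of the product of the two reliabilities, giving roughly .45, .42, and .40. The corrected matrix can then be handed to REGRESSION through matrix input:

MATRIX DATA VARIABLES=ROWTYPE_ y x1 x2.
BEGIN DATA
MEAN      0    0    0
STDDEV    1    1    1
N       200  200  200
CORR    1
CORR    .45  1
CORR    .42  .40  1
END DATA.

* Supply the real means and SDs above to get raw-score Bs;
* with means of 0 and SDs of 1 the Bs equal the betas.
REGRESSION MATRIX=IN(*)
  /VARIABLES=y x1 x2
  /DEPENDENT=y
  /METHOD=ENTER x1 x2.

I realize the standard errors and significance tests from such a run treat the corrected correlations as if they had been observed directly, so they would have to be taken with a grain of salt.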

Best, Ben

--
Benjamin Chapman, M.S.
Doctoral Psychology Intern
University of Missouri Kansas City
Counseling, Health, and Testing Center
4825 Troost Building
5100 Rockhill Road
Kansas City, MO 64110

Quoting Automatic digest processor <LISTSERV@LISTSERV.UGA.EDU>:

> There are 4 messages totalling 240 lines in this issue.
>
> Topics of the day:
>
>   1. multiple rows per case unique info per field (variable)
>   2. Comments on SPSS documentation (2)
>   3. SQL to SPSS.
>
> ----------------------------------------------------------------------
>
> Date: Fri, 11 Feb 2005 23:33:24 -0800
> From: Dominic Lusinchi <dominic@farwestresearch.com>
> Subject: multiple rows per case unique info per field (variable)
>
> Dear Colleagues,
>
> Here is my issue:
>
> I have many (most) cases in the (SPSS) data file with multiple records. For example, information on John Doe occupies 4 records (rows): his blood pressure is recorded in row 2 (so rows 1, 3, and 4 in that field/column/variable have no information), his age in row 4 (so rows 1, 2, and 3 in that field/column/variable have no information), and so on. What I would like to do is put all the information for John Doe in one record (row), so instead of having 4 rows for John Doe I want only one.
>
> Any suggestions would be appreciated. Thank you in advance.
>
> Dominic
> *********************************************************
> Dominic Lusinchi
> FAR WEST RESEARCH
> Demography-Survey Research-Applied Statistics
> San Francisco, California
>
> Voice: 415-664-3032
> Fax: 415-664-4459
> E-mail: dominic@farwestresearch.com
> Web: http://www.farwestresearch.com
> *********************************************************
>
> ------------------------------
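Assuming the file carries an identifier (say, id) marking which rows belong to the same person, and that each variable really is filled in on only one of those rows, one way to collapse to one row per person is AGGREGATE with a function that ignores missing values; a rough sketch with invented variable names:

* Sketch only: id, bp, and age are invented names.
* MAX simply picks up the one non-missing value in each person's rows,
* because the aggregate functions ignore missing data.
AGGREGATE OUTFILE=*
  /BREAK=id
  /bp=MAX(bp)
  /age=MAX(age).

MAX is used here only as a "grab the non-missing value" device (MIN would do equally well), so it is safe as long as no person ever has two conflicting values for the same variable.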
> Date: Sat, 12 Feb 2005 11:18:54 -0800
> From: Dale Glaser <glaserconsult@sbcglobal.net>
> Subject: Comments on SPSS documentation
>
> In the years of using SPSS since mainframe days, and the few years I have been on this listserv, I have never used this as a forum for gripes, but I'd like to register a minor one and see if I'm in the minority!! This past week I had the opportunity to use PLUM for proportional odds modeling (a.k.a. ordinal regression). When I accessed the manuals to aid in interpreting the Threshold and Location parameters, I found them woefully remiss in shedding any insight. Though I had a suspicion that the Threshold parameters are calculated similarly to the way the same parameters are calculated via PRELIS for deriving the polychoric correlation matrix, I wanted unequivocal confirmation of such, which the manuals do not provide (though I can always access the statistical algorithms if need be). That being said, doesn't it seem that in the "good old days" the SPSS manuals actually were fairly illuminating about the mathematical/statistical details in the output? I actually found them, alongside other texts, to be very educational. Nowadays, the manuals are just on the CD, and even then they are nothing more than details on point-and-click. Reading Long's "Regression Models for Categorical and Limited Dependent Variables" last night made very clear the interpretation of both the proportional odds model and the multinomial logistic model, but it would be nice to have that level of information in the SPSS manuals.
>
> Just a comment/frustration that I wonder if other SPSS users also have... thank you... Dale
>
> Glaser Consulting
> Dale Glaser, Ph.D.
> Consultant/Lecturer (SDSU/USD/AIU)
> 4003 Goldfinch St, Suite G
> San Diego, CA 92103
> phone: 619-220-0602
> fax: 619-220-0412
> email: glaserconsult@sbcglobal.net
> website: www.glaserconsult.com
>
> ------------------------------
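For readers who have not run PLUM, a minimal sketch of the command (with invented variable names) may help place the two sets of parameters: the Threshold rows in the output are the cutpoints (intercepts) of the cumulative logits, one fewer than the number of response categories, and the Location rows are the regression coefficients of the predictors.

* Invented example: satisfaction is the ordinal response,
* region a factor, age and income covariates.
PLUM satisfaction BY region WITH age income
  /LINK=LOGIT
  /PRINT=FIT PARAMETER SUMMARY TPARALLEL.

PLUM writes the model as threshold minus the linear predictor, so a positive Location coefficient means higher values of that predictor go with higher response categories.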
> Date: Sat, 12 Feb 2005 19:43:08 -0000
> From: John McConnell <john@applied-insights.co.uk>
> Subject: Re: SQL to SPSS.
>
> Hi Boris
>
> I've done this quite a lot over time from SQL Server, Oracle and some databases that don't even exist any more - whatever happened to Ingres? ;-). Here are some top-of-mind thoughts. If you'd like to discuss more specifics just drop me a note.
>
> The fundamental choice is whether you push (from SQL Server) or pull (from SPSS). In the end I think the main decision criteria are:
>
> 1) Are you more familiar with SQL Server or SPSS? E.g. you can join tables in either.
> 2) Is this extraction process a one-off, or will you want to re-run it? E.g. is this for an ongoing reporting process?
>
> - On the pull side, probably the most seamless way is to use the Open Database Wizard in SPSS, found on the File menu. Behind the scenes this uses ODBC, and you will need either the Microsoft or SPSS ODBC drivers installed. The wizard gives you more control in mapping the relational-style metadata to the SPSS dictionary; e.g. it can automatically create Value Labels, which tend to be found in look-up tables on the database side. You can also save what you set up in the wizard as SPSS syntax and/or an SPSS Query file (.spq). You can run the former as a batch job for more seamless automation, or use the .spq to re-populate the wizard; though this is less automated, it does mean you can visually change the query inside the wizard.
> - If your analysis is one-off/ad hoc you can still use the wizard, but I tend to export the data from SQL Server using Export (DTS) as comma-separated with the field names saved as a header row. You can then read these into SPSS using the File > Read Text Data menu choice.
> - As you probably know, SQL tables in a well-designed database are often structurally similar to what you want to analyse in an SPSS data set, e.g. you might have a customer table or an orders table. However, you may often want to join data from several tables. You can do this as you export through DTS, or visually through the wizard, or you can save the separate tables as individual SPSS data (.sav) files which you can merge later using the menu option Data > Merge Files > Add Variables (like a SQL join) or Data > Merge Files > Add Cases (like a SQL union).
>
> Hth
>
> john
>
> John McConnell
> applied insights
> Beaumont House, Kensington Village
> Avonmore Road
> London
>
> -----Original Message-----
> From: SPSSX(r) Discussion [mailto:SPSSX-L@LISTSERV.UGA.EDU] On Behalf Of Boris Ratchev
> Sent: 10 February 2005 20:12
> To: SPSSX-L@LISTSERV.UGA.EDU
> Subject: SQL to SPSS.
>
> Hi everyone,
>
> Does anyone have experience exporting SQL tables to SPSS? Any seamless ways to do that?
>
> Thanks,
>
> Boris Rachev
> Caliber Associates, Inc.
>
> __________________________________
> Do you Yahoo!?
> The all-new My Yahoo! - What will yours do?
> http://my.yahoo.com
>
> ------------------------------
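Depending on the SPSS version, the syntax the Database Wizard pastes looks roughly like the GET DATA job below (the connection string, table, and file names here are invented for illustration), and MATCH FILES / ADD FILES are the syntax equivalents of the Add Variables / Add Cases merges John mentions:

* Pull a table over ODBC; DSN, login, and SQL are placeholders.
GET DATA /TYPE=ODBC
  /CONNECT='DSN=SalesDB;UID=analyst;PWD=secret'
  /SQL='SELECT custid, region, revenue FROM customers'.
SAVE OUTFILE='customers.sav'.

* Keyed merge of two saved tables, like a SQL join
* (both files must already be sorted by custid).
MATCH FILES /FILE='customers.sav'
  /TABLE='orders_summary.sav'
  /BY custid.
EXECUTE.

* Stack two tables holding the same variables, like a SQL union.
ADD FILES /FILE='sales_2003.sav' /FILE='sales_2004.sav'.
EXECUTE.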
> Date: Sat, 12 Feb 2005 14:50:18 -0500
> From: John Painter <painter@email.unc.edu>
> Subject: Re: Comments on SPSS documentation
>
> Hi Dale and List,
>
> I agree with you, Dale (although I cannot speak to the PLUM documentation). SPSS documentation has drifted away from an educational + technical approach to an almost entirely (and limited) technical "here are the command options" format. Caveat: overall, I am very satisfied with SPSS.
>
> When I started using SPSS and SAS (about 15 years ago) the SPSS manuals were a breath of fresh air compared to the SAS manuals. However, it now seems that the SPSS manuals of today are much more like the SAS command manuals of yesteryear. The big difference now is that the SAS books-by-users (many of which are published by SAS: see <http://support.sas.com/publishing/index.html>) are satisfying the educational + technical niche very well. A quick Google search for SPSS books yields one site with 46 titles, many of which cover the same topic for different versions and/or detail relatively basic instruction.
>
> Long story short: SPSS would be well served if they were to develop a publishing program similar to SAS's, where users are encouraged (and rewarded) to write about various SPSS procedures and tools, and applications thereof.
>
> Sincerely,
> John Painter
>
> John S. Painter, Ph.D.
> Jordan Institute for Families
> School of Social Work
> University of North Carolina at Chapel Hill
>
> p: 919 962-6517
> e: painter@email.unc.edu
> w: www.unc.edu/~painter
>
> ------------------------------
>
> End of SPSSX-L Digest - 11 Feb 2005 to 12 Feb 2005 (#2005-43)
> *************************************************************

