```
Date: Sat, 3 Feb 2007 07:39:50 -0500
Reply-To: Art@DrKendall.org
Sender: "SPSSX(r) Discussion"
From: Art Kendall
Subject: Re: Collinearity
Comments: To: Statisticsdoc
In-Reply-To:
Content-Type: text/plain; charset=UTF-8; format=flowed

It has worked for me since SPSS included RELIABILITY in the mid-70s.

Art

Statisticsdoc wrote:
> Art-
>
> I would not call this quick and dirty - more like quick and very neat!
>
> Thanks,
>
> Steve
>
> Art said:
>
> A very quick and dirty way to locate which variables are involved in the
> problem is to pretend that all of the x variables are items in a scale and
> run RELIABILITY. This procedure shows you the SMC - squared multiple
> correlation - of each variable with the other variables. It also shows you
> the corrected item-total correlation, i.e. the correlation of each item
> with the sum of the other items. Items that have SMCs (R**2s) of 1.00 are
> perfectly redundant. The column of SMCs shows the fit of all possible
> regressions of each variable in the set on all of the other variables in
> the set, so the SMCs tell you the degree to which each variable is
> collinear (redundant) with the rest of the set.
>
> For personalized and professional consultation in statistics and research
> design, visit www.statisticsdoc.com
>
> -----Original Message-----
> From: SPSSX(r) Discussion [mailto:SPSSX-L@LISTSERV.UGA.EDU] On Behalf Of
> Art Kendall
> Sent: Friday, February 02, 2007 11:07 AM
> To: SPSSX-L@LISTSERV.UGA.EDU
> Subject: Re: Collinearity
>
> Which variable(s) to drop from the set will depend on the substantive
> nature of your analysis.
>
> Art Kendall
> Social Research Consultants
>
> Peck, Jon wrote:
>
>> I'd like to register an objection to the idea of "testing for
>> collinearity".
>> One can measure the degree of collinearity in various ways, and one can
>> look at its effect - joint confidence intervals that show the degree of
>> dependence among the estimates - but there can be no definitive rule
>> about when there is too much, short of perfect collinearity. And
>> software will take care of that case for you, in ways varying between
>> helpful and rude. Collinearity is a matter of degree, not a yes-or-no
>> outcome.
>>
>> As long as you don't have experimental data designed to be orthogonal,
>> you are going to have collinearity to some degree, and the more there is,
>> the more unstable the estimates will be; but any cutoff short of perfect
>> collinearity is arbitrary.
>>
>> One useful reality check, collinearity or not, is this. Consider the
>> accuracy of your variables - say you believe the values are correct to
>> three or four significant figures. Then add random noise to the
>> variables, small enough that the perturbed values still round to the
>> original values at that degree of accuracy. Rerun your estimates and see
>> how much you care about the differences in the results.
>>
>> My two cents.
>>
>> Jon Peck
>>
>> -----Original Message-----
>> From: SPSSX(r) Discussion [mailto:SPSSX-L@LISTSERV.UGA.EDU] On Behalf Of
>> Statisticsdoc
>> Sent: Thursday, February 01, 2007 8:57 PM
>> To: SPSSX-L@LISTSERV.UGA.EDU
>> Subject: Re: [SPSSX-L] Collinearity
>>
>> Stephen Brand
>> www.statisticsdoc.com
>>
>> Albert-jan,
>>
>> A great deal of good advice has been given on this topic, particularly
>> Anita's suggestion to utilize CATREG.
>> Just to add a couple of small items to the pool, I would suggest the
>> following:
>>
>> (1) Perfect collinearity exists when one independent variable can be
>> predicted exactly by a linear combination of the other independent
>> variables. So, in addition to looking at the bivariate correlations
>> between the predictors, examine the multiple regression of each
>> predictor on the other predictors (e.g., to what extent can X1 be
>> predicted by a weighted combination of X2 and X3?).
>>
>> (2) If you have a large sample, you might want to consider splitting it
>> randomly into halves and conducting the logistic regression analysis in
>> both halves, or cross-validating the regression weights from one half in
>> the other half. This approach will give some indication of how robust
>> the parameter estimates are.
>>
>> HTH,
>>
>> Stephen Brand
>>
>> For personalized and professional consultation in statistics and
>> research design, visit www.statisticsdoc.com
>>
>> -----Original Message-----
>> From: SPSSX(r) Discussion [mailto:SPSSX-L@LISTSERV.UGA.EDU] On Behalf Of
>> Albert-jan Roskam
>> Sent: Thursday, February 01, 2007 6:39 AM
>> To: SPSSX-L@LISTSERV.UGA.EDU
>> Subject: Collinearity
>>
>> Dear list,
>>
>> I would like to test for collinearity between three ordinal variables.
>> The variables have different numbers of values, but are coded in a
>> similar way, i.e. category 1 is the lowest category for all three vars.
>>
>> I calculated Spearman's rho correlations for these variables. The
>> correlation coefficient never exceeds .53, well below the generally used
>> rule of thumb that it should not exceed .85. (By the way, does anybody
>> have a good reference for this rule?)
>>
>> Can I now safely assume that my variables are not collinear when I use
>> them simultaneously as independent predictors in a logistic regression
>> analysis?
>>
>> Thank you for your replies!
>>
>> Albert-Jan
```
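Art's RELIABILITY trick works because the squared multiple correlation (SMC) of an item is simply the R**2 from regressing that variable on all of the other variables in the set. For readers outside SPSS, here is a minimal sketch of the same diagnostic in Python with numpy; the data and variable names are invented for illustration:

```python
import numpy as np

def smc(X):
    """Squared multiple correlation of each column of X with the others.

    For each column j, regress X[:, j] on the remaining columns (plus an
    intercept) and return the R**2.  An SMC of 1.00 means the variable is
    perfectly redundant with the rest of the set.
    """
    n, p = X.shape
    out = np.empty(p)
    for j in range(p):
        y = X[:, j]
        others = np.delete(X, j, axis=1)
        A = np.column_stack([np.ones(n), others])
        beta, *_ = np.linalg.lstsq(A, y, rcond=None)
        resid = y - A @ beta
        out[j] = 1.0 - (resid @ resid) / ((y - y.mean()) ** 2).sum()
    return out

# Simulated example: x3 is nearly a linear combination of x1 and x2,
# so all three SMCs come out close to 1.
rng = np.random.default_rng(0)
x1 = rng.normal(size=200)
x2 = rng.normal(size=200)
x3 = x1 + x2 + 0.1 * rng.normal(size=200)
print(np.round(smc(np.column_stack([x1, x2, x3])), 3))
```

This is the same quantity that appears in the SMC column of the RELIABILITY item statistics, just computed directly.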

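Jon Peck's reality check - perturb the data below its stated measurement precision, refit, and see whether the changes matter to you - can be sketched as follows. Ordinary least squares stands in here for whatever model is actually being fit, and the nearly collinear data are simulated:

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative data: x2 is nearly collinear with x1, and the values are
# treated as accurate to about three significant figures.
n = 300
x1 = rng.normal(10.0, 1.0, n)
x2 = x1 + rng.normal(0.0, 0.05, n)
y = 1.0 + 2.0 * x1 - 1.0 * x2 + rng.normal(0.0, 1.0, n)

def fit(x1, x2, y):
    """OLS with an intercept (a stand-in for your actual model)."""
    A = np.column_stack([np.ones_like(x1), x1, x2])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    return beta

base = fit(x1, x2, y)

# Perturb the predictors by noise small enough to round away at that
# precision, then refit and compare the estimates.
scale = 1e-3 * np.abs(x1).mean()
pert = fit(x1 + rng.normal(0, scale, n), x2 + rng.normal(0, scale, n), y)

print("baseline :", base)
print("perturbed:", pert)
```

With strong collinearity, the individual coefficients can move visibly under perturbation even while the fitted values barely change - which is exactly the instability the check is meant to expose.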
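Stephen Brand's split-half suggestion can also be sketched in code. The logistic fit below uses a plain Newton-Raphson loop rather than any particular package, and the data are simulated; with real data you would compare the two halves' estimates, or cross-validate one half's weights in the other:

```python
import numpy as np

rng = np.random.default_rng(2)

def fit_logistic(X, y, iters=25):
    """Logistic regression by Newton-Raphson, intercept included."""
    A = np.column_stack([np.ones(len(X)), X])
    beta = np.zeros(A.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-(A @ beta)))
        W = p * (1.0 - p)                      # IRLS weights
        grad = A.T @ (y - p)
        H = A.T @ (A * W[:, None])             # Fisher information
        beta += np.linalg.solve(H, grad)
    return beta

# Simulated data: three ordinal-ish predictors (codes 1..5) and a
# binary outcome generated from a known logistic model.
n = 1000
X = rng.integers(1, 6, size=(n, 3)).astype(float)
logit = -2.0 + 0.6 * X[:, 0] + 0.3 * X[:, 1] - 0.2 * X[:, 2]
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(float)

# Random split-half: fit in each half and compare the estimates.
idx = rng.permutation(n)
a, b = idx[: n // 2], idx[n // 2 :]
beta_a = fit_logistic(X[a], y[a])
beta_b = fit_logistic(X[b], y[b])
print(np.round(beta_a, 2))
print(np.round(beta_b, 2))
```

If the two halves produce wildly different weights, that is a warning about instability - whether its source is collinearity, a small sample, or both.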