Date: Thu, 6 Jul 2006 20:24:46 +0100
Reply-To: Kathryn Gardner <firstname.lastname@example.org>
Sender: "SPSSX(r) Discussion" <SPSSX-L@LISTSERV.UGA.EDU>
From: Kathryn Gardner <email@example.com>
Subject: Re: Suppressor variables in moderated multiple regression
Content-Type: text/plain; charset="iso-8859-1"
Thanks a lot for your e-mail and advice.
I have considered making composites of variables rather than deleting variables, but it's often difficult to know which technique is most appropriate. With interaction/product terms, I wasn't actually sure whether these would simply be summed or whether the two interaction terms would have to be combined into one variable using some other technique. I was also unsure about summing scales that have negative relationships with the DV with those that have positive relationships, i.e., is this OK, or are there any issues I need to be aware of? Finally, I found that when I did sum two interaction terms they no longer predicted the criterion variable in my regression models, so I decided to delete one variable instead.
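On the question above about summing scales that relate to the DV in opposite directions: the usual practice is to reverse-key the negatively related scales before summing, so that all parts of the composite point the same way. A minimal numpy sketch (the function name, standardization step, and data below are mine, not from the thread):

```python
import numpy as np

def composite(scales, keys):
    """Additive composite of standardized scale scores.

    scales: list of 1-D arrays, one per scale
    keys:   +1 for scales already scored in the composite's direction,
            -1 for scales running the opposite way (these are
            reverse-keyed by flipping the sign of their z-scores)
    """
    parts = []
    for s, k in zip(scales, keys):
        z = (s - s.mean()) / s.std()  # put scales on a common metric
        parts.append(k * z)
    return np.sum(parts, axis=0)
```

Standardizing first is one common choice when the scales have different metrics; summing raw scores is also defensible when they share a response format.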
> Date: Wed, 5 Jul 2006 12:07:25 -0400
> From: firstname.lastname@example.org
> Subject: Re: Suppressor variables in moderated multiple regression
> To: SPSSX-L@LISTSERV.UGA.EDU
>
> Stephen Brand
> www.statisticsdoc.com
>
> Kathryn,
>
> Keith's advice here is very good. If you have pairs of predictors with
> high condition indices / low eigenvalues, one thing that I might
> consider in your research is combining both of the items in the
> problem pair into an additive composite (if that makes sense in the
> context of your research).
>
> HTH,
>
> Stephen Brand
>
> ---- Keith McCormick <email@example.com> wrote:
> > Hi Kathryn,
> >
> > Sorry, I have not been on the list for the last several days.
> >
> > The condition index will be high for the same rows where the
> > eigenvalues are low. You could focus on either one. In those same
> > rows, where the variance proportions are high, those are the
> > variable or variables that are potential problems.
> >
> > For instance, I just ran MPG as the dependent with engine
> > displacement, horsepower, vehicle weight, and time to accelerate,
> > using the CARS.sav data set in your installation directory. Two rows
> > have low eigenvalues. One indicates a possible problem pair, engine
> > displacement and time to accelerate; the other shows engine
> > displacement and weight as a possible problem pair. The former has a
> > condition index of 27, the latter 36. I would hate to think anyone
> > would treat these differently because they fall on either side of
> > 30. There are no hard and fast rules. In this case, it is pretty
> > clear that they are versions of the same thing, so I might drop
> > some. That would not always be the best route.
> >
> > Searching for a good phrase to "cut and paste" for condition index,
> > I took the following from the Results Coach: "Condition indices are
> > the square roots of the ratios of the largest eigenvalue to each
> > successive eigenvalue." The help is brief, and sometimes basic, but
> > it is a resource worth remembering. Just put any pivot table in
> > editing mode and right-click on it to get the Results Coach.
> >
> > Hope that helps despite the delay,
> >
> > Best, Keith
> > keithmccormick.com
> >
> > On 6/29/06, Kathryn Gardner <firstname.lastname@example.org> wrote:
> > > Hi Keith,
> > >
> > > Thanks for the further advice. I've been looking at collinearity
> > > diagnostics and I was wondering how the condition index is used?
> > > You said "Look for rows with a low eigenvalue (near zero), and
> > > then across the row for the columns that have high proportions
> > > (near one). That will tell you which variables are collinear with
> > > each other." But I was wondering whether I need to look at the
> > > condition index as well? I read that a condition index over 30
> > > suggests serious collinearity problems and an index over 15
> > > indicates possible collinearity problems. If a factor (component)
> > > has a high condition index, one looks in the variance proportions
> > > column to see if two or more variables have a variance proportion
> > > of .50 or higher. But this text makes no reference to the
> > > eigenvalues. Can you shed any light on how eigenvalues, condition
> > > indices and variance proportions are used together, or how they
> > > relate?
> > >
> > > Thanks
> > > Kathryn
> > >
> > > > Date: Tue, 27 Jun 2006 11:04:36 -0400
> > > > From: email@example.com
> > > > To: firstname.lastname@example.org
> > > > Subject: Re: Re: Suppressor variables in moderated multiple regression
> > > > CC: SPSSX-L@listserv.uga.edu
> > > >
> > > > Hi Kathryn,
> > > >
> > > > I am glad you made progress. VIF and tolerance are often enough
> > > > to do the trick, but as Stephen mentioned there is also the
> > > > Variance Proportions table in the Collinearity Diagnostics.
> > > >
> > > > Look for a row with a low eigenvalue (near zero), and then
> > > > across the row for the columns that have high proportions (near
> > > > one). That will tell you which variables are collinear with each
> > > > other.
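For readers wondering how the three quantities fit together: the eigenvalues, condition indices, and variance proportions in SPSS's Collinearity Diagnostics table all come from one decomposition of the scaled cross-product matrix (the Belsley-Kuh-Welsch diagnostics). A minimal numpy sketch, assuming that procedure; the function name and test data are mine, and SPSS's own table is the reference output:

```python
import numpy as np

def collinearity_diagnostics(X):
    """Belsley-Kuh-Welsch diagnostics, as in SPSS REGRESSION
    /STATISTICS COLLIN.

    X: (n, p) design matrix WITHOUT the intercept column.
    Returns eigenvalues, condition indices, and variance proportions
    (props[j, k] = share of variable j's coefficient variance tied to
    component k; rows 0.. correspond to constant, x1, x2, ...).
    """
    n = X.shape[0]
    Z = np.column_stack([np.ones(n), X])          # include the constant
    Z = Z / np.linalg.norm(Z, axis=0)             # scale columns to unit length
    _, d, Vt = np.linalg.svd(Z, full_matrices=False)
    eigvals = d ** 2                              # eigenvalues of Z'Z
    cond_idx = np.sqrt(eigvals.max() / eigvals)   # sqrt(lambda_max / lambda_k)
    phi = (Vt.T ** 2) / eigvals                   # per-component shares
    props = phi / phi.sum(axis=1, keepdims=True)  # variance proportions
    return eigvals, cond_idx, props
```

The diagnostic logic in the thread then reads directly off these arrays: a row (component) with a low eigenvalue has a high condition index, and the variables with large proportions in that row form the problem set.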
> > > > Deleting variables is not the only choice; you could combine
> > > > them in some way (add them or average them when it makes sense)
> > > > or create factors using factor analysis.
> > > >
> > > > Good luck, Keith
> > > >
> > > > On 6/27/06, Kathryn Gardner <email@example.com> wrote:
> > > > > Hi Keith,
> > > > >
> > > > > Thanks a lot for your help. I have centered the variables that
> > > > > are to be used to create product terms and have also examined
> > > > > the collinearity diagnostics. I ended up deleting one variable
> > > > > with a VIF of 11.90 and I now have more acceptable VIF levels.
> > > > > I still have many variables which have non-significant
> > > > > zero-order correlations but significant B coefficients in the
> > > > > regression model, though, so perhaps they are suppressors.
> > > > > Actually, some of my VIFs are still over 5, so you may be
> > > > > right that they are suppressors.
> > > > >
> > > > > Thank you for your advice regarding my second question, which
> > > > > was also helpful.
> > > > > Best wishes,
> > > > > Kathryn
> > > > >
> > > > > > Date: Mon, 26 Jun 2006 13:01:57 -0400
> > > > > > From: firstname.lastname@example.org
> > > > > > Subject: Re: Suppressor variables in moderated multiple regression
> > > > > > To: SPSSX-L@LISTSERV.UGA.EDU
> > > > > >
> > > > > > Hi All,
> > > > > >
> > > > > > I think there is evidence of suppression here, but there are
> > > > > > a number of things I would check that you don't mention. I
> > > > > > don't know which you have tried, so I will list them in
> > > > > > response to 'a'.
> > > > > >
> > > > > > If you have not centered the variables, you might want to do
> > > > > > that. That is, you subtract the average of a variable from
> > > > > > itself so that zero is the average. This is important when
> > > > > > creating the interactions and polynomials.
> > > > > >
> > > > > > Request the collinearity diagnostics. Small tolerance values
> > > > > > (below .1) would indicate a problem and add to the evidence
> > > > > > that suppression is present. Since you ran 3 steps it would
> > > > > > be interesting to see when (if) the tolerance radically
> > > > > > lowers.
> > > > > >
> > > > > > VIF would also be a sign.
> > > > > > If the Variance Inflation Factor becomes large (5+ or so),
> > > > > > you might have suppression. If you have not centered and the
> > > > > > VIF jumps on step three, then I would center and run it
> > > > > > again. It might help a lot.
> > > > > >
> > > > > > In answer to 'b', I don't see any harm in looking at the
> > > > > > standardized beta on the interactions to check for one
> > > > > > detail. See if the value falls outside its normal range -
> > > > > > that is, it shouldn't be above 1 or below -1. If it is, that
> > > > > > would also indicate suppression.
> > > > > >
> > > > > > HTH. Good luck.
> > > > > >
> > > > > > Keith
> > > > > > keithmccormick.com
> > > > > >
> > > > > > On 6/26/06, Kathryn Gardner <email@example.com> wrote:
> > > > > > > Dear List,
> > > > > > >
> > > > > > > I've conducted moderated multiple regression analysis with
> > > > > > > the main effects on steps 1 and 2 and product (interaction)
> > > > > > > terms on step 3. After recently reading & learning about
> > > > > > > suppressor variables, I examined the zero-order
> > > > > > > correlations between all predictors (including interaction
> > > > > > > terms) and the criterion variable & compared them to the
> > > > > > > regression model beta coefficients for inconsistencies in
> > > > > > > sizes and signs. I found that some bivariate correlations
> > > > > > > between predictors & the criterion are non-significant,
> > > > > > > but they are significant predictors in the regression
> > > > > > > analysis. I have read that this may be a sign of classical
> > > > > > > suppression & I was wondering if anyone could advise on:
> > > > > > >
> > > > > > > a) whether this is a sign of suppression, & even if it is,
> > > > > > > what else could these results suggest other than
> > > > > > > suppression?
> > > > > > > b) the literature on suppressor variables suggests looking
> > > > > > > for inconsistencies in signs and sizes between the
> > > > > > > bivariate correlations and standardized regression
> > > > > > > coefficients (beta). However, I have read that only the
> > > > > > > unstandardized coefficients (B) should be interpreted when
> > > > > > > interpreting interaction effects.
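Keith's tolerance and VIF rules of thumb (tolerance below .1, VIF of 5+) can be checked outside SPSS as well; VIF_j = 1 / (1 - R_j^2), where R_j^2 comes from regressing predictor j on the others, which is equivalently the diagonal of the inverse correlation matrix. A small numpy sketch, with function name and cutoffs taken as Keith states them:

```python
import numpy as np

def vif_tolerance(X):
    """Tolerance and VIF for each predictor (columns of X).

    Uses the identity VIF_j = [R^{-1}]_{jj}, where R is the
    correlation matrix of the predictors; tolerance = 1 / VIF.
    """
    R = np.corrcoef(X, rowvar=False)
    vif = np.diag(np.linalg.inv(R))
    tol = 1.0 / vif
    return tol, vif
```

SPSS prints these in the Coefficients table when /STATISTICS TOL is requested; this sketch is only for seeing where the numbers come from.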
> > > > > > > Is it therefore OK to examine inconsistencies between
> > > > > > > bivariate correlations and unstandardized coefficients,
> > > > > > > and are there any issues I need to be aware of?
> > > > > > >
> > > > > > > Many thanks.
> > > > > > >
> > > > > > > Kathryn
>
> --
> For personalized and experienced consulting in statistics and
> research design, visit www.statisticsdoc.com
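Keith's centering advice in the thread above can be illustrated numerically: when predictors are uncentered, a product term is strongly correlated with its own components, which inflates VIF on the step where the interaction enters; mean-centering first removes most of that overlap without changing the interaction's meaning. A small simulated sketch (the variable names and distributions are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(42)
x = rng.normal(loc=5.0, scale=1.0, size=2000)  # uncentered predictor
z = rng.normal(loc=3.0, scale=1.0, size=2000)  # uncentered moderator

def corr(a, b):
    return np.corrcoef(a, b)[0, 1]

# Raw product term: substantially correlated with its component
raw = corr(x, x * z)

# Center first, then form the product: the overlap largely vanishes
xc, zc = x - x.mean(), z - z.mean()
cent = corr(xc, xc * zc)
```

This is why the VIF "jump on step three" that Keith mentions often disappears after centering: the collinearity was between the product term and its components, not among the substantive predictors.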