Date: Wed, 10 Aug 2011 15:50:50 -0700
Reply-To: Bruce Weaver <firstname.lastname@example.org>
Sender: "SPSSX(r) Discussion" <SPSSX-L@LISTSERV.UGA.EDU>
From: Bruce Weaver <email@example.com>
Subject: Re: problem with logistic regression - linearity to the logit
Content-Type: text/plain; charset=us-ascii
There is no need for every variable in the model to be statistically
significant. In fact, the Manuscript Checklist at the link below says that
a model with no non-significant explanatory variables is suspicious.
Here's the relevant section:
*Lack of insignificant variables in the final model*
Unless the sample size is huge, this is usually the result of the authors
using a stepwise variable selection or some other approach for filtering out
"insignificant" variables. Hence the presence of a table of variables in
which every variable is significant is usually the sign of a serious problem.

Authors frequently use strategies involving removing insignificant terms
from the model without making an attempt to derive valid confidence
intervals or P-values that account for uncertainty in which terms were
selected (using for example the bootstrap or penalized maximum likelihood
estimation). A paper in J Clin Epi March 2009 cited Ockham's razor as a
principle to be followed when building a model, not realizing that parsimony
resulting from using the data at hand to make modeling decisions only
seems to result in parsimony. Removing insignificant terms causes bias,
inaccurate (too narrow) confidence intervals, and failure to preserve type I
error in the resulting model's P-values, which are calculated as though the
model was completely pre-specified.
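To see why a table in which everything is significant is suspicious, here is a minimal simulation sketch in Python (not SPSS syntax; ordinary linear regression is used only to keep the arithmetic simple, but the same screening logic applies to logistic models). Pure-noise predictors screened against a pure-noise outcome still pass the 5% filter at roughly the expected rate, so a table filtered that way will contain significant-looking noise:

```python
import math
import random
import statistics

random.seed(1)

def t_stat(x, y):
    """t statistic for the slope in a simple linear regression of y on x."""
    n = len(x)
    mx, my = statistics.mean(x), statistics.mean(y)
    sxx = sum((xi - mx) ** 2 for xi in x)
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sxx
    sse = sum((yi - my - b * (xi - mx)) ** 2 for xi, yi in zip(x, y))
    return b / math.sqrt(sse / (n - 2) / sxx)

def n_survivors(n=50, p=20, crit=2.01):
    """Screen p pure-noise predictors against a pure-noise outcome and
    count how many pass |t| > crit (about the 5% level for n = 50)."""
    y = [random.gauss(0, 1) for _ in range(n)]
    return sum(
        1
        for _ in range(p)
        if abs(t_stat([random.gauss(0, 1) for _ in range(n)], y)) > crit
    )

# Repeat the screening exercise many times: on average about 5% of the
# 20 noise predictors -- roughly 1 per run -- look "significant", and a
# filtered table would report only those, every one of them spurious.
runs = [n_survivors() for _ in range(200)]
print(statistics.mean(runs))
```

The sample sizes and number of predictors here are arbitrary choices for illustration; the point is only that filtering guarantees an all-significant table regardless of whether anything real is present.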
> Hello, I'm at a similar place in my research using logistic regression. I
> have four parameters that fail the linearity test, and was able to use the
> methods in Menard (1995) to discover possible parametric forms. One is a
> cubic, and the other three are quadratic; each provides a significant
> change in -2LL, so I believe all of the forms are correct.
> I have one small issue: when I refit the logistic regression with the
> transformed parameters, one of the quadratic parameters was no longer
> significant. I can just remove the squared term to get acceptable
> results, but it doesn't seem right--the other two quadratic parameters are
> significant and each variable has the same physical meaning. Any
> suggestions? Could this be a problem with collinearity?
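For what it's worth, the change-in--2LL comparison described in the question is a likelihood-ratio test, which can be sketched in Python (the -2LL values below are hypothetical, and only df = 1 and df = 2 are handled, since adding one or two polynomial terms is the case at hand):

```python
import math

def chi2_sf(x, df):
    """Chi-square survival function; closed forms exist for df = 1 and
    df = 2, which covers testing one or two added polynomial terms."""
    if df == 1:
        return math.erfc(math.sqrt(x / 2.0))
    if df == 2:
        return math.exp(-x / 2.0)
    raise ValueError("only df = 1 or 2 handled in this sketch")

def lr_test(neg2ll_reduced, neg2ll_full, df):
    """Likelihood-ratio test for nested models: the drop in -2LL when
    the extra term(s) are added is chi-square with df = number added."""
    chi2 = neg2ll_reduced - neg2ll_full
    if chi2 < 0:
        raise ValueError("full model must fit at least as well as reduced")
    return chi2, chi2_sf(chi2, df)

# Hypothetical -2LL values: model without vs. with the squared term.
chi2, p = lr_test(210.4, 204.1, df=1)
print(round(chi2, 1), round(p, 4))
```

On the collinearity point: a variable and its square are often highly correlated, and centering the variable before squaring (using x - mean(x) and its square) typically reduces that correlation without changing the fitted curve, so it is worth trying before dropping the squared term.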
"When all else fails, RTFM."
NOTE: My Hotmail account is not monitored regularly.
To send me an e-mail, please use the address shown above.
View this message in context: http://spssx-discussion.1045642.n5.nabble.com/problem-with-logistic-regression-linearity-to-the-logit-tp3336036p4687660.html
Sent from the SPSSX Discussion mailing list archive at Nabble.com.