Date:         Mon, 27 Jun 2011 15:48:49 -0700
Reply-To:     David Marso <david.marso@gmail.com>
Sender:       "SPSSX(r) Discussion" <SPSSX-L@LISTSERV.UGA.EDU>
From:         David Marso <david.marso@gmail.com>
Subject:      Re: cross validation using SPSS
In-Reply-To:  <BANLkTinFXULWi3JF9-B+tF2KZH8M-jZ-yQ@mail.gmail.com>
Content-Type: text/plain; charset=us-ascii

OK! First of all, it would be nice if you were to provide a reference to these quantities or a formula. Sure, I can Google, but AFAIC it is a pain and you really should save us the extra research effort! I made the effort to Google "prediction sum of squares" and located something which might be useful:
http://webscripts.softpedia.com/script/Scientific-Engineering-Ruby/Statistics-and-Probability/press-35784.html

Given the definition, one might be inclined to run a bajillion different regressions, leaving one case out for each regression, and then calculate the residual for the omitted case from the regression weights estimated on the remaining cases. OTOH, this is sheer folly, as there is a much nicer way to achieve the same thing. My initial idea was to create a MATRIX program to calculate the 'hat' matrix and then go to town with that. My second idea was to see what SPSS will give you in terms of useful stuff in the SAVE subcommand. Rather than spoil all the fun, I leave you with the following. Run it as is and look at the data file after all three regressions... Hmmmmm.
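For reference, the shortcut being hinted at is the standard leave-one-out identity (my paraphrase, not from the linked page): with e_i the ordinary residual of case i and h_ii its leverage, both taken from the single regression fit on all n cases,

PRESS = \sum_{i=1}^{n} ( e_i / (1 - h_{ii}) )^2 ,

i.e. the sum of squared deleted residuals, so no case-by-case refitting is actually required.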

data list free / a b c y .
begin data
1 6 3 1 6 3 6 1 5 3 6 5 3 6 1 5
6 3 1 5 6 3 1 5 6 7 3 5 1 2 6 7
3 5 1 7 6 3 7 6 1 3 5 6 7 1 3 6
7 1 5 3 6 7 1 5 3 6 7 1 7 6
end data.
compute id=$casenum.
reg / var a b c y / select id NE 1 / dep y / method enter a b c
  / SAVE DRESID (h1) RESID (e1).
reg / var a b c y / select id NE 2 / dep y / method enter a b c
  / SAVE DRESID (h2) RESID (e2).
reg / var a b c y / dep y / method enter a b c
  / SAVE DRESID (h_all) .
*Note this is merely a pointer in the (hopefully right) direction.
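A hedged follow-up sketch (not part of the original syntax above): if the saved h_all really does hold the deleted (leave-one-out) residuals from the full-sample regression, PRESS falls out in two more lines.

* Sketch: assuming h_all contains the full-sample deleted residuals,
  PRESS is simply the sum of their squares.
compute dr_sq = h_all**2.
descriptives variables = dr_sq / statistics = sum.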

Regarding MSPR (mean squared prediction error): I think you will need to provide an explicit, publicly available citation or formula. I found a few references but did not feel like attempting to make sense of them in the context of linear regression. OTOH, I did see a reference to Mallows' Cp as a scaled version of MSPR. HTH, David
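P.S. A rough, hedged sketch of what I would try if MSPR is the usual hold-out quantity (the mean of the squared prediction errors for the validation cases, predicting from a model fit on the estimation cases only). The train flag and all variable names here are made up for illustration:

* Sketch: train = 1 marks the estimation sample, train = 0 the hold-out sample.
reg / var a b c y / select train EQ 1 / dep y / method enter a b c
  / SAVE PRED (yhat_cv) RESID (e_cv).
* With SELECT, predicted values and residuals are also produced for the
  unselected (hold-out) cases.
compute sqerr = e_cv**2.
temporary.
select if (train EQ 0).
descriptives variables = sqerr / statistics = mean.
* Under the assumed definition above, that mean is the MSPR.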

Mehrshad Koleini wrote:
> Dear all,
>
> Hi. During the cross-validation procedure for building a regression model,
> I need to obtain PRESSp (prediction sum of squares) and MSPR (mean squared
> prediction error). Does anybody know how I can calculate these using the
> SPSS 17.0 Professor Package, or should I use other software?
>
> Kind regards,
> Mehrshad

--
View this message in context: http://spssx-discussion.1045642.n5.nabble.com/cross-validation-using-SPSS-tp4528990p4530101.html
Sent from the SPSSX Discussion mailing list archive at Nabble.com.


