LISTSERV at the University of Georgia
Date:   Thu, 2 Jul 1998 16:40:54 -0500
Reply-To:   "Nichols, David" <nichols@SPSS.COM>
Sender:   "SPSSX(r) Discussion" <SPSSX-L@UGA.CC.UGA.EDU>
From:   "Nichols, David" <nichols@SPSS.COM>
Subject:   Re: prblm w. mkappasc macro?
Comments:   To: mark swiencicki <MAS93012@UCONNVM.UCONN.EDU>

As Rich Ulrich suggested, I'll be glad to clarify the calculations if given the data. I've tested the macro numerous times and I'm fairly certain the calculations are accurate. I don't particularly recommend that measure, though, as it's a measure of consistency rather than absolute agreement.

David Nichols
Principal Support Statistician and Manager of Statistical Support
SPSS Inc.
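
A toy numeric illustration (not from this thread, and not specific to kappa) of the general distinction Nichols draws between consistency and absolute agreement: two raters whose scores track each other perfectly but differ by a constant offset are perfectly consistent, yet they never agree exactly. A minimal Python sketch:

import statistics

# Hypothetical ratings: rater B always scores one point higher than rater A.
rater_a = [1, 2, 3, 4, 5]
rater_b = [2, 3, 4, 5, 6]

# Absolute agreement: proportion of identical ratings -- 0.0 here.
exact = sum(a == b for a, b in zip(rater_a, rater_b)) / len(rater_a)

# Pearson correlation as a crude consistency index -- 1.0 here
# (statistics.correlation requires Python 3.10+).
consistency = statistics.correlation(rater_a, rater_b)

print(exact, consistency)  # 0.0 1.0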

----------
From: mark swiencicki [SMTP:MAS93012@UCONNVM.UCONN.EDU]
Sent: Thursday, June 18, 1998 11:19 AM
To: SPSSX-L@UGA.CC.UGA.EDU
Subject: prblm w. mkappasc macro?

Has anyone else had a problem running the MKAPPASC macro for determining overall kappa scores for multiple judges? When I ran a job with 6 coders who have 91% agreement spread over 3 values, it gave me a kappa of .47 when it should have been about .88. Could it be overlooking the cases with full agreement? I downloaded the file from the SPSS web site, saved it as an SPSS text file, and then used the proper commands in the syntax window:

INCLUDE MKAPPASC.SPS.
MKAPPASC VARS=varlist (with my 6 variables named here).

P.S. I used consecutively numbered integers from 1 up, so that can't be the problem.

Signed,
Help me in Connecticut!
Mark Swiencicki - Univ. of Conn. Sociology
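
One way to cross-check the macro's output outside SPSS is to compute a multi-rater kappa by hand. The sketch below assumes the statistic in question is Fleiss' (1971) kappa for multiple raters (an assumption; the thread does not show which formula MKAPPASC uses) and uses made-up data shaped the way Mark describes: one row per case, one column per coder, with the category code each coder assigned.

from collections import Counter

def fleiss_kappa(ratings):
    """ratings: list of cases, each a list of category codes, one per rater."""
    n_raters = len(ratings[0])
    n_cases = len(ratings)
    categories = sorted({c for row in ratings for c in row})

    # n_ij: number of raters assigning case i to category j
    counts = [[Counter(row)[cat] for cat in categories] for row in ratings]

    # Marginal proportion of all assignments falling in each category
    p = [sum(row[j] for row in counts) / (n_cases * n_raters)
         for j in range(len(categories))]

    # Observed agreement per case, then averaged over cases
    P_i = [(sum(c * c for c in row) - n_raters) / (n_raters * (n_raters - 1))
           for row in counts]
    P_bar = sum(P_i) / n_cases

    # Chance agreement and the kappa coefficient
    P_e = sum(pj * pj for pj in p)
    return (P_bar - P_e) / (1 - P_e)

# Hypothetical data: 4 cases rated by 6 coders on 3 categories,
# with full agreement on all but one case.
data = [
    [1, 1, 1, 1, 1, 1],
    [2, 2, 2, 2, 2, 2],
    [3, 3, 3, 3, 3, 2],
    [1, 1, 1, 1, 1, 1],
]
print(round(fleiss_kappa(data), 3))  # about .87 for this made-up data

If a hand calculation like this and the macro still disagree on the real data, sending the data set along (as Nichols offers above) is the quickest way to find out which result is right.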

