Wrong results from ANOVA post hoc


by rhlamber » Thu Aug 15, 2019 12:24 am

Hi there.

I would like to start by congratulating the jamovi team on their excellent software.

Working with my data in jamovi, I found some odd p values in the post hoc tests for the interaction of groups.

To compare, I ran the same analysis in SPSS and GraphPad Prism. The overall ANOVA results are fine; however, as I suspected, the post hoc comparisons from jamovi appear to be wrong (SPSS and Prism agree with each other on the p value). I tried all the available tests (Tukey, Scheffé, Bonferroni and Holm), and jamovi returned different values for all of them.

Am I doing something wrong?

Please see attached the tables returned from SPSS and jamovi.

Looking forward to the team's reply.

Thanks in advance.
Attachments: post hoc jamovi.jpg, post hoc spss.jpg, ANOVA.jpg

by jonathon » Thu Aug 15, 2019 6:13 am


i think this might be a case where the results are *different* rather than wrong. there's a strong case that spss does these in a sub-optimal way.

the post-hoc tests in jamovi are based on estimated marginal means; you can read more about them here:

https://cran.r-project.org/web/packages ... asics.html
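to see why comparisons based on estimated marginal means can legitimately differ from comparisons of raw group means, here is a minimal sketch with made-up numbers (the cell means and counts below are purely hypothetical, not from the poster's data). in an unbalanced design, the raw mean of a factor level weights each cell by its sample size, while the estimated marginal mean averages the cell means equally:

```python
# Hypothetical cell means and (unequal) cell sizes for a two-factor design:
# factor A (a1, a2) crossed with factor B (b1, b2).
cells = {
    ("a1", "b1"): (10.0, 8),   # (cell mean, cell n)
    ("a1", "b2"): (20.0, 2),
    ("a2", "b1"): (12.0, 5),
    ("a2", "b2"): (22.0, 5),
}

def raw_mean(level_a):
    """Observed mean of all observations at this level of A:
    cells with more observations pull the mean toward themselves."""
    pairs = [(m, n) for (a, _), (m, n) in cells.items() if a == level_a]
    return sum(m * n for m, n in pairs) / sum(n for _, n in pairs)

def emm(level_a):
    """Estimated marginal mean: the unweighted average of the cell
    means, as if the design were balanced."""
    means = [m for (a, _), (m, _) in cells.items() if a == level_a]
    return sum(means) / len(means)

print(raw_mean("a1"), emm("a1"))  # 12.0 vs 15.0 -> unbalanced: they differ
print(raw_mean("a2"), emm("a2"))  # 17.0 vs 17.0 -> balanced: they agree
```

since the two kinds of means differ whenever cell sizes are unequal, post-hoc tests built on them (jamovi/emmeans) will give different p values than tools comparing the raw means; neither is "wrong".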



by rhlamber » Thu Aug 15, 2019 9:26 pm

Hi Jonathon.

Thank you very much for your reply.
You are totally right. The results are different, not wrong... I believe my limited English did not help me there.

I will read a bit more about estimated marginal means.

Best regards.
