Determining the findings of your research study involves more than simply reporting initial results. Instead, it is important to critically examine study results and check for statistical pitfalls so that you develop accurate findings on which to base reliable conclusions.
Critically examine results
No matter what your results are, ask yourself some critical questions:
- Did the content of your measure reflect the variables you intended to measure? Did you omit important variables or include unnecessary ones?
- Was the outcome measure you used specific and precise enough to detect the effects of the instructional intervention? Make sure you have avoided ceiling and floor effects, which occur when respondents hit the upper or lower limit of a measure, masking real differences.
- For a field or controlled experiment, were there significant differences between groups on your outcome measure before you started your intervention?
- For a single group experiment, were the testing conditions for the pre- and post-test equivalent?
- Were conditions for groups roughly the same (e.g., equivalent classrooms, instruction, and assistance outside of class)?
- Did anything happen other than your instructional intervention that would have affected study results?
- Was there any difference in motivation between the pre- and post-measures or between/among groups?
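For a two-group design, one quick way to check for pre-existing differences is to compare pre-test scores before the intervention begins. The sketch below uses Welch's t statistic with made-up scores and only Python's standard library; it is illustrative, not a full analysis (with real data you would also compute a p-value from the t distribution, e.g. with `scipy.stats.ttest_ind`):

```python
from statistics import mean, variance

def welch_t(a, b):
    """Welch's t statistic for two independent samples."""
    va, vb = variance(a), variance(b)          # sample variances (n - 1 denominator)
    se = (va / len(a) + vb / len(b)) ** 0.5    # standard error of the mean difference
    return (mean(a) - mean(b)) / se

# Hypothetical pre-test scores (illustrative data only)
control   = [62, 58, 71, 65, 60, 67, 63, 59]
treatment = [64, 61, 70, 66, 58, 68, 62, 60]

t = welch_t(treatment, control)
# A |t| well below ~2 suggests no meaningful pre-test difference between groups.
print(round(t, 2))  # → 0.23
```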
Check for statistical pitfalls
- While your findings should be statistically significant, statistical significance does not mean they are important or valuable; it only indicates that the difference you found is unlikely to be due to chance.
- Was regression to the mean a problem? If you assign project participants to groups, such as a low-achieving and a high-achieving group, solely on the basis of pre-test scores, the low-achieving group may improve when retested simply because they had an uncharacteristically bad performance (i.e., bad luck) on the first test.
- Could there be any errors due to sample size? If you have fewer than 25 cases per group, you may lack adequate statistical power to detect differences between groups. On the other hand, if you have very large groups, almost any difference, even a trivial one, will be statistically significant, which could lead you to unwarranted conclusions. For this reason, you should report effect sizes, which allow readers to judge how meaningful the differences between/among groups are.
- Are you making multiple comparisons between variables? Each additional comparison between groups increases the probability of finding a spurious relationship purely by chance. Reduce this risk by using a more stringent significance level, adjusting for the number of comparisons made (e.g., a Bonferroni correction), or using multiple-comparison procedures that account for this issue.
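One simple adjustment for multiple comparisons is the Bonferroni correction, which divides the significance level by the number of comparisons. A minimal sketch with made-up p-values:

```python
# Bonferroni correction: a minimal sketch with hypothetical p-values.
alpha = 0.05
p_values = [0.004, 0.030, 0.041, 0.620]  # one p-value per comparison

m = len(p_values)
adjusted_alpha = alpha / m               # stricter per-comparison threshold

significant = [p for p in p_values if p < adjusted_alpha]
print(adjusted_alpha)  # 0.0125
print(significant)     # only 0.004 survives the correction
```

Note that without the correction, three of the four comparisons would look significant at the 0.05 level; the correction guards against exactly the chance findings described above.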
Other statistical pitfalls
- Evaluate your results based on how well they answer your research questions or confirm your hypotheses.
- Statistically significant and/or practically significant findings should form the basis of your main conclusions. Emphasize your strongest findings.
- Consider all possible explanations for results before concluding an intervention definitely worked or did not work.
- Verify (triangulate) findings from your experiment with results from other data sources such as interviews or surveys that can provide additional insight. Finding similar results using different methods strengthens conclusions. On the other hand, differing results call for further analysis.
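As noted under statistical pitfalls, reporting effect sizes lets readers judge how meaningful a difference is, independent of sample size. A minimal sketch of Cohen's d, a common effect-size measure for two-group comparisons, using hypothetical post-test scores:

```python
from statistics import mean, stdev

def cohens_d(a, b):
    """Cohen's d: mean difference divided by the pooled standard deviation."""
    na, nb = len(a), len(b)
    pooled_var = ((na - 1) * stdev(a) ** 2 + (nb - 1) * stdev(b) ** 2) / (na + nb - 2)
    return (mean(a) - mean(b)) / pooled_var ** 0.5

# Hypothetical post-test scores (illustrative data only)
post_treatment = [78, 84, 90, 73, 88, 81, 85, 79]
post_control   = [72, 75, 80, 70, 77, 74, 78, 71]

d = cohens_d(post_treatment, post_control)
# Rough benchmarks (Cohen): 0.2 small, 0.5 medium, 0.8 large.
print(round(d, 2))  # → 1.63
```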