Publisher: Public Library of Science
Journal: PLoS ONE
Languages: English
Types: Article
Subjects: Economics, Computer Applications, Web-Based Applications, Social and Behavioral Sciences, Psychology, Science, Mathematics, Workshops, Cognitive Psychology, Science Education, Statistics, Statistical Methods, Medicine, Learning, Q, R, Money Supply and Banking, Pedagogy, Computer Science, Teaching Methods, Research Article
Background - The literature is not univocal about the effects of Peer Review (PR) within the context of constructivist learning. Due to the predominant focus on using PR as an assessment tool, rather than as a constructivist learning activity, and because most studies implicitly assume that the benefits of PR are limited to the reviewee, little is known about the effects on students who are required to review their peers. Much of the theoretical debate in the literature focuses on explaining how and why constructivist learning is beneficial. At the same time, these discussions are marked by an underlying presupposition of a causal relationship between reviewing and deep learning.

Objectives - The purpose of the study is to investigate whether the writing of PR feedback causes students to benefit in terms of: perceived utility of statistics, actual use of statistics, better understanding of statistical concepts and associated methods, changed attitudes towards market risks, and outcomes of decisions that were made.

Methods - We conducted a randomized experiment, assigning students randomly to PR or non-PR treatments, and used two cohorts with different time spans. The paper discusses the experimental design and all the software components that we used to support the learning process: Reproducible Computing technology, which allows students to reproduce or re-use statistical results from peers; Collaborative PR; and an AI-enhanced Stock Market Engine.

Results - The results establish that the writing of PR feedback messages causes students to experience benefits in terms of Behavior, Non-Rote Learning, and Attitudes, provided the sequence of PR activities is maintained for a sufficiently long period.
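The randomized design described above — students split at random into PR and non-PR treatment groups within each cohort — can be sketched in a few lines. This is a minimal illustration, not the authors' actual procedure; the function name, the 50/50 split, and the cohort sizes are assumptions for the example.

```python
import random

def assign_treatments(student_ids, seed=42):
    """Randomly split one cohort into PR and non-PR groups.

    Hypothetical helper illustrating randomized assignment to the two
    treatments; the even split and fixed seed are assumptions.
    """
    rng = random.Random(seed)          # seeded for a reproducible assignment
    shuffled = list(student_ids)
    rng.shuffle(shuffled)              # random permutation of the cohort
    half = len(shuffled) // 2
    return {"PR": sorted(shuffled[:half]),
            "non-PR": sorted(shuffled[half:])}

# Two cohorts (the study used cohorts with different time spans);
# the sizes here are illustrative only.
cohort_a = list(range(1, 41))
cohort_b = list(range(41, 101))
groups_a = assign_treatments(cohort_a)
groups_b = assign_treatments(cohort_b)
```

Seeding the generator makes the assignment reproducible, which is in the spirit of the Reproducible Computing approach the paper emphasizes.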
