Sadoski, Mark; Sanders, Charles W. (2009)
Publisher: Co-Action Publishing
Journal: Medical Education Online
Language: English
Type: Article
Classified by OpenAIRE into ACM Ref: ComputingMilieux_COMPUTERSANDEDUCATION
Student course evaluations were analyzed for common themes across five different basic science, clinical, and innovative courses from the first and third years of medical school. Each course had both unique and common numerically scaled items, including an overall quality rating item. A principal components analysis was conducted for each course to determine the items that loaded most heavily on the same component as the overall quality item. Across courses and years, the items that consistently loaded on the same component as the overall quality item were (1) administrative aspects including course organization, (2) clearly communicated goals and objectives, and (3) instructional staff responsiveness. These results concur with recent medical education literature in this area. Faculty interested in increasing student ratings of the overall quality of their courses might best attend primarily to carefully organizing course goals and objectives and clearly communicating them. The limitations of these conclusions are discussed.
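The loading analysis described in the abstract can be sketched in a few lines. This is a minimal illustration, not the study's actual procedure: the item names, the simulated ratings, and the 0.5 loading cutoff are all assumptions made for the example. It extracts the first principal component of the item correlation matrix and reports which items load heavily on the same component as the overall quality item.

```python
import numpy as np

# Simulate correlated course-evaluation ratings (all values hypothetical).
# A single latent "course organization" factor drives three of the items;
# one item is unrelated noise.
rng = np.random.default_rng(0)
n_students = 200
latent = rng.normal(size=n_students)
noise = lambda: rng.normal(scale=0.5, size=n_students)
items = {
    "overall_quality":     latent + noise(),
    "course_organization": latent + noise(),
    "clear_objectives":    latent + noise(),
    "exam_difficulty":     rng.normal(size=n_students),  # unrelated item
}
names = list(items)
X = np.column_stack([items[k] for k in names])

# PCA via eigendecomposition of the item correlation matrix.
R = np.corrcoef(X, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(R)
order = np.argsort(eigvals)[::-1]          # sort components by variance
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Loadings on the first component (eigenvector scaled by sqrt(eigenvalue)).
loadings = eigvecs[:, 0] * np.sqrt(eigvals[0])

# Items that load heavily (|loading| > 0.5, an illustrative threshold)
# on the same component as the overall quality item.
heavy = [n for n, l in zip(names, loadings) if abs(l) > 0.5]
print(heavy)
```

With this simulated data, the three correlated items load together on the first component while the unrelated item does not, mirroring the kind of pattern the study reports across courses.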
