White, Gareth R (2014)
Languages: English
Types: Doctoral thesis
Subjects: GV1469.15
This thesis presents the playthrough evaluation framework, a novel framework for the reliable usability evaluation of first-person shooter console video games. The framework includes playthrough evaluation, a structured usability evaluation method adapted from heuristic evaluation.

Usability evaluation can help guide developers by pointing out design issues that cause users problems. However, usability evaluation methods suffer from the evaluator effect, where separate evaluations of the same data do not produce reliably consistent results. This can result in a number of undesirable consequences, affecting issues such as:

• Unreliable evaluation: Without reliable results, evaluation reports risk giving incorrect or misleading advice.
• Weak methodological validation: Typically, new methods (e.g., new heuristics) are validated against user tests. However, without a reliable means of describing observations, attempts to validate novel methods against user test data will also be affected by weak reliability.

The playthrough evaluation framework addresses these points through a series of studies presenting the need for, and showing the development of, the framework, including the following stages:

1. Explication of poor reliability in heuristic evaluation.
2. Development and validation of a reliable user test coding scheme.
3. Derivation of a novel usability evaluation method, playthrough evaluation.
4. Testing the method and quantifying the results.

Evaluations were conducted with 22 participants, on 3 first-person shooter action console video games, using two methodologies: heuristic evaluation and the novel playthrough evaluation developed in this thesis. Both methods proved effective, with playthrough evaluation providing more detailed analysis but requiring more time to conduct.
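The coding-scheme reliability the abstract refers to is conventionally quantified with a chance-corrected agreement statistic such as Cohen's kappa. As a minimal sketch of the idea only — the category labels and ratings below are hypothetical illustration data, not the thesis's actual coding scheme:

```python
from collections import Counter

def cohens_kappa(codes_a, codes_b):
    """Cohen's kappa for two raters' nominal codes over the same items."""
    assert len(codes_a) == len(codes_b) and codes_a
    n = len(codes_a)
    # Observed agreement: fraction of items both raters coded identically.
    p_o = sum(a == b for a, b in zip(codes_a, codes_b)) / n
    # Expected chance agreement, from each rater's marginal code frequencies.
    freq_a, freq_b = Counter(codes_a), Counter(codes_b)
    p_e = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / (n * n)
    # Kappa: agreement beyond chance, scaled by the maximum possible beyond-chance agreement.
    return (p_o - p_e) / (1 - p_e)

# Two evaluators independently coding the same ten observed events (hypothetical data).
rater1 = ["slip", "slip", "goal", "goal", "slip", "goal", "slip", "goal", "slip", "goal"]
rater2 = ["slip", "goal", "goal", "goal", "slip", "goal", "slip", "slip", "slip", "goal"]
print(f"{cohens_kappa(rater1, rater2):.2f}")  # prints 0.60
```

A kappa of 1 means perfect agreement and 0 means agreement no better than chance; a reliable coding scheme is one on which independent evaluators reach high kappa, which is exactly what the evaluator effect undermines for unstructured evaluation reports.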

    • Cohen, Jacob (1960). “A Coefficient of Agreement for Nominal Scales”. In: Educational and Psychological Measurement 20.1, pp. 37-46. DOI: 10.1177/001316446002000104. eprint: http: //epm.sagepub.com/content/20/1/37.full.pdf+html. URL: http://epm.sagepub.com/ content/20/1/37.short (cited on p. 223).
    • Cole, Frank L. (1988). “Content Analysis: Process and Application”. In: Clinical Nurse Specialist 2.1. URL: http://journals.lww.com/cns-journal/Fulltext/1988/00210/Content_Analysis_ _Process_and_Application.25.aspx (cited on p. 106).
    • Connell, Iain W. and N. V. Hammond (Aug. 1999). “Comparing usability evaluation principles with heuristics: problem instances vs. problem types”. In: Proceedings of INTERACT'99 - Human Computer Interaction, pp. 621-629 (cited on pp. 67, 150).
    • Desurvire, H, M Caplan, and J Toth (Jan. 2004). “Using heuristics to evaluate the playability of games”. In: Conference on Human Factors in Computing Systems. URL: http://portal.acm. org/citation.cfm?id=986102 (cited on pp. 25, 66, 68, 71, 85).
    • Desurvire, Heather and Bernard Chen (2008). 48 Differences Between Good and Bad Video Games: Game Playability Principles (PLAY) For Designing Highly Ranked Video Games. URL: http://www.behavioristics.com/downloads/PLAYPrinciples-HDesurvire.pdf (cited on p. 85).
    • Desurvire, Heather and Charlotte Wiberg (2008). “Master of the game: assessing approachability in future game design”. In: Proc. CHI extended abstracts. Florence, Italy: ACM, pp. 3177-3182. ISBN: 978-1-60558-012-X. DOI: http://doi.acm.org/10.1145/1358628.1358827 (cited on p. 85).
    • - (2009). “Game Usability Heuristics (PLAY) for Evaluating and Designing Better Games: The Next Iteration”. In: Proc. OCSC (Part of HCI International). San Diego, CA: Springer-Verlag, pp. 557-566. ISBN: 978-3-642-02773-4. DOI: http : / / dx . doi . org / 10 . 1007 / 978 - 3 - 642 - 02774-1_60 (cited on pp. 10, 24, 85-86, 103, 119, 121).
    • - (2010). “User Experience Design for Inexperienced Gamers: GAP - Game Approachability Guidelines”. In: Evaluating User Experience in Games - Concepts and Methods. Ed. by Regina Bernhaupt. Springer-Verlag. Chap. 8 (cited on pp. 68, 71, 85-86, 119).
    • Doubleday, Ann, Michele Ryan, Mark Springett, and Alistair Sutcliffe (1997). “A comparison of usability techniques for evaluating design”. In: Proceedings of the 2nd conference on Designing interactive systems: processes, practices, methods, and techniques. DIS '97. Amsterdam, The Netherlands: ACM, pp. 101-110. ISBN: 0-89791-863-0. DOI: http://doi.acm.org/10.1145/ 263552.263583. URL: http://doi.acm.org/10.1145/263552.263583 (cited on pp. 42, 70, 73-74, 103, 114).
    • Fabricatore, Carlo (1999). “Playability In Action Videgames: A Theoretical Design Reference”. Ph.D. thesis. Pontificia Universidad Catolica De Chile Escuela De Ingenieria, Santiago de Chile: Departamento de Ciencias de la Computación (cited on pp. 29, 92).
    • Fabricatore, Carlo, Miguel Nussbaum, and Ricardo Rosas (Feb. 2002). “Playability in Action Videogames: A Qualitative Design Model”. In: Human-Comp. Interaction 17.4, pp. 311-368. DOI: 10.1207/S15327051HCI1704_1 (cited on pp. 29, 87).
    • Falstein, Noah and Hal Barwood (Mar. 2006). The 400 Project. URL: http://www.theinspiracy. com/400_project.htm (cited on p. 87).
    • Febretti, Alessandro and Franca Garzotto (2009). “Usability, playability, and long-term engagement in computer games”. In: CHI EA '09: Proceedings of the 27th international conference extended abstracts on Human factors in computing systems. Boston, MA, USA: ACM, pp. 4063- 4068. ISBN: 978-1-60558-24. DOI: http : / / doi . acm . org / 10 . 1145 / 1520340 . 1520618 (cited on p. 25).
    • Federoff, M (Jan. 2002). “Heuristics and usability guidelines for the creation and evaluation of fun in video games”. MA thesis. Department of Telecommunications, Indiana University. URL: http://citeseerx.ist.psu.edu/viewdoc/download?doi= type=pdf (cited on pp. 24, 85-86, 119).
    • Fleiss, Joseph L. (1971). “Measuring nominal scale agreement among many raters”. In: Psychological Bulletin 76.5, pp. 378-382. URL: http://search.proquest.com/docview/614289059? accountid=14182 (cited on p. 59).
    • Følstad, Asbjørn, Effie Law, and Kasper Hornbaek (2012). “Analysis in Practical Usability Evaluation: A Survey Study”. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. CHI '12. Austin, Texas, USA: ACM, pp. 2127-2136. ISBN: 978-1-4503-1015-4. DOI: 10.1145/2207676.2208365. URL: http://doi.acm.org/10.1145/2207676.2208365 (cited on pp. 151-152).
    • Frasca, Gonzalo (July 2001). What is ludology? A provisory definition. Tech. rep. Ludology.org (cited on p. 226).
    • Freelon, Deen (Sept. 2008). ReCal for Ordinal, Interval, and Ratio Data (OIR). URL: http://dfreelon. org/utils/recalfront/recal-oir/ (cited on p. 87).
    • Frøkjaer, Erik and Kasper Hornbaek (Jan. 2008). “Metaphors of human thinking for usability inspection and design”. In: ACM Trans. Comput.-Hum. Interact. 14.4, 20:1-20:33. ISSN: 1073- 0516. DOI: 10 . 1145 / 1314683 . 1314688. URL: http : / / doi . acm . org / 10 . 1145 / 1314683 . 1314688 (cited on p. 59).
    • Gerhardt-Powals, Jill (Apr. 1996). “Cognitive engineering principles for enhancing human-computer performance”. In: Int. J. Hum.-Comput. Interact. 8.2, pp. 189-211. ISSN: 1044-7318. DOI: 10. 1080/10447319609526147. URL: http://dx.doi.org/10.1080/10447319609526147 (cited on p. 72).
    • Giddings, Seth (2006). “Walkthrough: videogames and technocultural form”. Ph.D. thesis. The University of the West of England, Bristol: School of Cultural Studies (cited on p. 24).
    • Gram, C. and G. Cockton (1996). Design Principles for Interactive Software. Ifip International Federation for Information Processing. Chapman & Hall. ISBN: 9780412724701. URL: http: //books.google.co.uk/books?id=A7a-RVFhvFkC (cited on p. 40).
    • Gray, Wayne D. and Marilyn C. Salzman (1998). “Damaged merchandise? a review of experiments that compare usability evaluation methods”. In: Hum.-Comput. Interact. 13.3, pp. 203-261. ISSN: 0737-0024. DOI: http://dx.doi.org/10.1207/s15327051hci1303_2 (cited on pp. 19, 41-42, 47-48, 51-53, 61, 228).
    • Grudin, Jonathan (Oct. 1989). “The case against user interface consistency”. In: Commun. ACM 32 (10), pp. 1164-1173. ISSN: 0001-0782. DOI: http://doi.acm.org/10.1145/67933.67934. URL: http://doi.acm.org/10.1145/67933.67934 (cited on pp. 73, 103).
    • Ham, Dong-Han (2008). “A New Framework for Characterizing and Categorizing Usability Problems”. In: EKC2008 Proceedings of the EU-Korea Conference on Science and Technology. Ed. by Seung-Deog Yoo. Vol. 124. Springer Proceedings in Physics. Springer Berlin Heidelberg, pp. 345-353. ISBN: 978-3-540-85. URL: http://dx.doi.org/10.1007/978- 3- 540- 85190-5_36 (cited on p. 19).
    • Hartson, H. Rex, Terence S. Andre, and Robert C. Williges (2001). “Criteria For Evaluating Usability Evaluation Methods”. In: International Journal of Human-Computer Interaction 13.4, pp. 373- 410. URL: http://www.informaworld.com/10.1207/S15327590IJHC1304_03 (cited on pp. 40, 60).
    • Hartson, H. Rex, Terence S. Andre, Robert C. Williges, and Linda van Rens (1999). “The User Action Framework: A Theory-Based Foundation for Inspection and Classification of Usability Problems”. In: Proceedings of HCI International (the 8th International Conference on HumanComputer Interaction) on Human-Computer Interaction: Ergonomics and User Interfaces-Volume I. Hillsdale, NJ, USA: L. Erlbaum Associates Inc., pp. 1058-1062. ISBN: 0-8058-3391-9 (cited on pp. 116, 228).
    • Hartson, Rex (Sept. 2003). “Cognitive, physical, sensory, and functional affordances in interaction design”. In: Behaviour and Information Technology 22, 315-338(24) (cited on pp. 32, 223).
    • Hassenzahl, Marc, Markus Schöbel, and Tibor Trautmann (2008). “How motivational orientation influences the evaluation and choice of hedonic and pragmatic interactive products: The role of regulatory focus”. In: Interact. Comput. 20.4-5, pp. 473-479. ISSN: 0953-5438. DOI: http://dx.doi.org/10.1016/j.intcom.2008.05.001 (cited on p. 23).
    • Hassenzahl, Marc and Noam Tractinsky (2006). “User experience - a research agenda”. In: Behaviour & Information Technology 25.2, pp. 91-97. DOI: 10 . 1080 / 01449290500330331. eprint: http : / / www . tandfonline . com / doi / pdf / 10 . 1080 / 01449290500330331. URL: http://www.tandfonline.com/doi/abs/10.1080/01449290500330331 (cited on p. 228).
    • Hayes, Andrew F. and Klaus Krippendorff (2007). “Answering the call for a standard reliability measure for coding data”. In: Communication Methods and Measures 1.1, pp. 77-89 (cited on pp. 87, 225).
    • Hertzum, Morten and Niels Ebbe Jacobsen (1999). “The evaluator effect during first-time use of the cognitive walkthrough technique”. In: in Proc. 8th Intl. Conf. Human-Computer Interaction, (HCI International ´99). Erlbaum, pp. 1063-1067 (cited on p. 150).
    • - (2001). “The evaluator effect: a chilling fact about usability evaluation methods”. In: Int. Journal of Human-Computer Interaction 13.4, pp. 421-443 (cited on pp. 19, 47-48, 58-60, 128, 150, 223-224, 230).
    • - (2003). “The Evaluator Effect: A Chilling Fact About Usability Evaluation Methods”. In: International Journal of Human-Computer Interaction 15.1, pp. 183-204. URL: http : / / www . informaworld.com/10.1207/S15327590IJHC1501_14 (cited on pp. 59, 128).
    • Herzberg, F. (1973). Work and the nature of man. Mentor book,m. New American Library. URL: http://books.google.co.uk/books?id=ClFhAAAAIAAJ (cited on p. 23).
    • Hix, D. and H.R. Hartson (1993). “Developing user interfaces: ensuring usability through product & process”. In: Wiley professional computing. J. Wiley. Chap. 10: Formative Evaluation, pp. 283-340 (cited on p. 40).
    • Hollnagel, Erik (1993a). “Human reliability analysis - Context and control”. In: (cited on pp. 33, 102, 155).
    • - (July 1993b). “The phenotype of erroneous actions”. In: Int. J. Man-Mach. Stud. 39.1, pp. 1-32. ISSN: 0020-7373. DOI: 10.1006/imms.1993.1051. URL: http://dx.doi.org/10.1006/imms. 1993.1051 (cited on pp. 33, 102, 155).
    • Hornbaek, Kasper and Erik Frøkjaer (2008). “Comparison of techniques for matching of usability problem descriptions”. In: Interact. Comput. 20.6, pp. 505-514. ISSN: 0953-5438. DOI: http: //dx.doi.org/10.1016/j.intcom.2008.08.005 (cited on p. 59).
    • Hornbaek, Kasper and Erik Frøkjaer (2008). “A Study of the Evaluator Effect in Usability Testing”. In: Human-Computer Interaction 23.3, pp. 251-277. URL: http://www.informaworld.com/ 10.1080/07370020802278205 (cited on pp. 1, 16, 50, 55, 59, 67, 128, 224, 226).
    • Howarth, Jonathan Randall (Apr. 2007). “Supporting Novice Usability Practitioners with Usability Engineering Tools”. Ph.D. thesis. Blacksburg, Virginia (cited on p. 37).
    • Hsieh, Hsiu-Fang and Sarah E. Shannon (2005). “Three Approaches to Qualitative Content Analysis”. In: Qualitative Health Research 15.9, pp. 1277-1288. DOI: 10.1177/1049732305276687. eprint: http : / / qhr . sagepub . com / content / 15 / 9 / 1277 . full . pdf + html. URL: http : //qhr.sagepub.com/content/15/9/1277.abstract (cited on p. 106).
    • Huizinga, Johan (1955). Homo Ludens: A study of the play element in culture. Beacon Press, Boston (cited on p. 4).
    • Hutchins, Edwin L., James D. Hollan, and Donald A. Norman (Dec. 1985). “Direct manipulation interfaces”. In: Hum.-Comput. Interact. 1.4, pp. 311-338. ISSN: 0737-0024. DOI: 10 . 1207 / s15327051hci0104_2. URL: http://dx.doi.org/10.1207/s15327051hci0104_2 (cited on p. 36).
    • Hvannberg, Ebba Thora, Effie Lai-Chong Law, and Marta Kristín Lárusdóttir (2007). “Heuristic evaluation: Comparing ways of finding and reporting usability problems”. In: Interact. Comput. 19.2, pp. 225-240. ISSN: 0953-5438. DOI: http://dx.doi.org/10.1016/j.intcom.2006. 10.001 (cited on p. 59).
    • Ijsselsteijn, W, Y de Kort, K Poels, and A Jurgelionis (Jan. 2007). “Characterising and Measuring User Experiences in Digital Games”. In: International Conference on Advances in Computer Entertainment. URL: http : / / www . yvonnedekort . nl / pdfs / ACE % 2525202007 % 252520workshop%252520submission%252520TUe%252520final.pdf (cited on pp. 4, 31, 38).
    • ISO (1998). ISO/IEC 9241-11 Ergonomic Requirements for Office Work with Visual Display Terminals (VDTS). Part 11: Guidance on Usability. Tech. rep. ISO/IEC 9241-11. Geneva: International Organization for Standardization (cited on pp. 13, 18-19).
    • - (June 2001). ISO/IEC 9126-1:2001 Software engineering - Product quality - Part 1: Quality model. Tech. rep. 9126-1:2001. Geneva: International Organization for Standardization (cited on p. 19).
    • - (July 2010). ISO 9241-210 Ergonomics of human-system interaction. Part 210: Human-centred design for interactive systems. Tech. rep. 9241-210. Geneva: International Organization for Standardization (cited on p. 228).
    • ISO/IEC (Aug. 2005). ISO/IEC 25000:2005 Software Engineering - Software product Quality Requirements and Evaluation (SQuaRE) - Guide to SQuaRE. Tech. rep. 25000:2005. Geneva: International Organization for Standardization / International Electro technical Commission (cited on p. 19).
    • Jacobsen, Niels Ebbe, Morten Hertzum, and Bonnie E. John (1998a). “The Evaluator Effect in Usability Studies: Problem Detection and Severity Judgments”. In: Human Factors and Ergonomics Society Annual Meeting Proceedings 42.19, 1336-1340(5). URL: http://www.ingentaconnect. com/content/hfes/hfproc/1998/00000042/00000019/art00002 (cited on p. 1).
    • - (1998b). “The evaluator effect in usability tests”. In: CHI '98: CHI 98 conference summary on Human factors in computing systems. Los Angeles, California, United States: ACM, pp. 255- 256. ISBN: 1-58113-028-7. DOI: http://doi.acm.org/10.1145/286498.286737 (cited on pp. 1, 35, 52).
    • Järvinen, Aki (Feb. 2008). “Games without Frontiers Theories and Methods for Game Studies and Design”. Ph.D. thesis. University of Tampere, Finland: Media Culture, pp. 1-416 (cited on pp. 23, 44).
    • Järvinen, Aki, Satu Heliö, and Frans Mäyrä (2002). Communication and Community in Digital Entertainment Services - Prestudy Research Report. Tech. rep. Finland: University of Tampere, Hypermedia Laboratory. URL: tampub.uta.fi/tup/951-44-5432-4.pdf (cited on pp. 4, 27, 224, 226).
    • Johnson, Daniel and John Gardner (2010). “Personality, Motivation and Video Games”. In: Proc. OzCHI (cited on p. 23).
    • Jørgensen, A (Jan. 2004). “Marrying HCI/Usability and computer games: a preliminary look”. In: Proceedings of the third Nordic conference on Human-computer …. URL: http://portal.acm. org/citation.cfm?id=1028014.1028078 (cited on p. 22).
    • Juul, Jesper and Marleigh Norton (2009). “Easy to use and incredibly difficult: on the mythical border between interface and gameplay”. In: FDG '09: Proceedings of the 4th International Conference on Foundations of Digital Games. Orlando, Florida: ACM, pp. 107-112. ISBN: 978-1- 60558-437-9. DOI: http://doi.acm.org/10.1145/1536513.1536539 (cited on p. 22).
    • Kaiser, Henry F. (1960). “The Application of Electronic Computers to Factor Analysis”. In: Educational and Psychological Measurement 20.1, pp. 141-151. DOI: 10.1177/001316446002000116. eprint: http : / / epm . sagepub . com / content / 20 / 1 / 141 . full . pdf + html. URL: http : //epm.sagepub.com/content/20/1/141.short (cited on pp. 89-90).
    • Keenan, Susan L., H. Rex Hartson, Dennis G. Kafura, and Robert S. Schulman (1999). “The Usability Problem Taxonomy: A Framework for Classification and Analysis”. In: Empirical Softw. Engg. 4.1, pp. 71-104. ISSN: 1382-3256. DOI: http://dx.doi.org/10.1023/A:1009855231530 (cited on pp. 116, 228).
    • Kellar, Melanie, Carolyn Watters, and Jack Duffy (June 2005). “Motivational Factors in Game Play in Two User Groups”. In: Changing Views: Worlds in Play: Proceedings of the 2005 Digital Games Research Association Conference. Ed. by de Castell Suzanne and Jenson Jennifer. Vancouver: University of Vancouver, p. 6. URL: http://www.digra.org/dl/display_html?chid=06278. 15575.pdf (cited on p. 23).
    • Koivisto, Elina M.I. and Hannu Korhonen (2006). Mobile Game Playability Heuristics. Tech. rep. Nokia Research Center (cited on p. 10).
    • Komulainen, Jeppe, Jari Takatalo, Miikka Lehtonen, and Göte Nyman (2008). “Psychologically structured approach to user experience in games”. In: NordiCHI '08: Proceedings of the 5th Nordic conference on Human-computer interaction. Lund, Sweden: ACM, pp. 487-490. ISBN: 978-1-59593-704-9. DOI: http://doi.acm.org/10.1145/1463160.1463226 (cited on p. 23).
    • Korhonen, Hannu (2010). “Comparison of playtesting and expert review methods in mobile game evaluation”. In: Proceedings of the 3rd International Conference on Fun and Games. Fun and Games '10. Leuven, Belgium: ACM, pp. 18-27. ISBN: 978-1-60558-907. DOI: http://doi. acm.org/10.1145/1823818.1823820. URL: http://doi.acm.org/10.1145/1823818.1823820 (cited on pp. 68, 71).
    • Korhonen, Hannu, Janne Paavilainen, and Hannamari Saarenpää (2009). “Expert Review Method in Game Evaluations - Comparison of Two Playability Heuristic Sets”. In: Proc. Mindtrek (cited on pp. 66, 68, 73, 85-86, 92, 103, 119).
    • Kotval, Xerxes P., Cheryl L. Coyle, Paulo A. Santos, Heather Vaughn, and Rebecca Iden (2007). “Heuristic evaluations at bell labs: analyses of evaluator overlap and group session”. In: CHI '07 extended abstracts on Human factors in computing systems. CHI '07. San Jose, CA, USA: ACM, pp. 1729-1734. ISBN: 978-1-59593-64. DOI: http://doi.acm.org/10.1145/1240866. 1240891. URL: http://doi.acm.org/10.1145/1240866.1240891 (cited on p. 67).
    • Krippendorff, Klaus. (2004). Content analysis: an introduction to its methodology, 2nd ed. Thousand Oaks, Calif: Sage. ISBN: 9780761915447; 9780761915454; 0761915443; 0761915451. URL: http://prism.talis.com/sussex-ac/items/851313 (cited on pp. 106, 225).
    • Laitinen, Sauli (Aug. 2008). “Usability and Playability Expert Evaluation”. In: Game Usability: Advancing the Player Experience. Ed. by Katherine Isbister and Noah Schaffer. Morgan Kaufmann. Chap. 7, pp. 91-111 (cited on pp. 26, 28).
    • Landis, J R and G G Koch (Mar. 1977). “The measurement of observer agreement for categorical data”. In: Biometrics 33.1, pp. 159-174. ISSN: 0006-341X (Print); 0006-341X (Linking) (cited on pp. 59, 223).
    • Lanzilotti, R., C. Ardito, M. F. Costabile, and A. De Angeli (Jan. 2011). “Do patterns help novice evaluators? A comparative study”. In: Int. J. Hum.-Comput. Stud. 69 (1-2), pp. 52-69. ISSN: 1071-5819. DOI: http://dx.doi.org/10.1016/j.ijhcs.2010.07.005. URL: http://dx.doi.org/ 10.1016/j.ijhcs.2010.07.005 (cited on p. 59).
    • Lavery, Darryn, Gilbert Cockton, and Malcolm P. Atkinson (July 1997). “Comparison of evaluation methods using structured usability problem reports”. In: Behaviour and Information Technology 16.21, pp. 246-266 (cited on pp. 31, 33-34, 97-98, 116, 223).
    • Law, Effie Lai-Chong and Ebba Thora Hvannberg (2004a). “Analysis of combinatorial user effect in international usability tests”. In: Proceedings of the SIGCHI conference on Human factors in computing systems. CHI '04. Vienna, Austria: ACM, pp. 9-16. ISBN: 1-58113-702-8. DOI: 10.1145/985692.985694. URL: http://doi.acm.org/10.1145/985692.985694 (cited on pp. 47, 59, 228).
    • - (2004b). “Analysis of strategies for improving and estimating the effectiveness of heuristic evaluation”. In: NordiCHI '04: Proceedings of the third Nordic conference on Human-computer interaction. Tampere, Finland: ACM, pp. 241-250. ISBN: 1-58113-857-1. DOI: http://doi.acm. org/10.1145/1028014.1028051 (cited on pp. 55, 60, 67, 72, 144, 230).
    • Law, Effie Lai-Chong and Ebba Thora Hvannberg (2008). “Consolidating usability problems with novice evaluators”. In: Proceedings of the 5th Nordic conference on Human-computer interaction: building bridges. NordiCHI '08. Lund, Sweden: ACM, pp. 495-498. ISBN: 978-1-59593- 704-9. DOI: http://doi.acm.org/10.1145/1463160.1463228. URL: http://doi.acm.org/10. 1145/1463160.1463228 (cited on pp. 16, 67).
    • Law, Effie Lai-chong and Ebba Thora Hvannberg (Sept. 2008). “Problems of Consolidating Usability Problems”. In: Proc. I-USED'. http://www.scientificcommons.org/48840793. URL: http: //citeseerx.ist.psu.edu/viewdoc/summary?doi= (cited on p. 67).
    • Law, Effie Lai-Chong, Dominique Scapin, Dominique Scapin, Gilbert Cockton, Mark Springett, Christian Stary, and Marco Winckler (ed.s) (2009). “Maturation of Usability Evaluation Methods: Retrospect and Prospect (Final Reports of COST294-MAUSE Working Groups)”. In: COST294-MAUSE Closing Conference Proceedings. COST. IRIT Press, Toulouse, France, p. 188 (cited on p. 41).
    • Law, Lai-Chong and Ebba Thora Hvannberg (2002). “Complementarity and convergence of heuristic evaluation and usability test: a case study of universal brokerage platform”. In: Proceedings of the second Nordic conference on Human-computer interaction. NordiCHI '02. Aarhus, Denmark: ACM, pp. 71-80. ISBN: 1-58113-616-1. DOI: http://doi.acm.org/10.1145/ 572020.572030. URL: http://doi.acm.org/10.1145/572020.572030 (cited on pp. 42, 47).
    • Lazzaro, Nicole (Aug. 2008). “The Four Fun Keys”. In: Game Usability: Advancing the Player Experience. Ed. by Katherine Isbister and Noah Schaffer. Morgan Kaufmann. Chap. 20, pp. 317- 343 (cited on p. 29).
    • Lee, J. and C. Y. Im (2009). “A Study on User Centered Game Evaluation Guideline Based on the MIPA Framework”. In: Lecture Notes in Computer Science. Ed. by M. Kurosu. Vol. 5619. Berlin: Springer-Verlag, pp. 84-93 (cited on p. 24).
    • Lewis, Clayton, Peter G. Polson, Cathleen Wharton, and John Rieman (1990). “Testing a walkthrough methodology for theory-based design of walk-up-and-use interfaces”. In: Proceedings of the SIGCHI conference on Human factors in computing systems: Empowering people. CHI '90. Seattle, Washington, United States: ACM, pp. 235-242. ISBN: 0-201-50932-6. DOI: http: //doi.acm.org/10.1145/97243.97279. URL: http://doi.acm.org/10.1145/97243.97279 (cited on p. 223).
    • Mahajan, Reenal (June 2003). “A Usability Problem Diagnosis Tool: Development and Formative Evaluation”. MA thesis. Blacksburg, Virginia, USA: Virginia Polytechnic Institute and State University (cited on pp. 116, 228).
    • Malone, Thomas W. (1980). “What makes things fun to learn? heuristics for designing instructional computer games”. In: Proc. 3rd ACM SIGSMALL symposium and the first SIGPC symposium on Small systems. Palo Alto, California, United States: ACM, pp. 162-169. ISBN: 0- 89791-024-9. DOI: http : / / doi . acm . org / 10 . 1145 / 800088 . 802839 (cited on pp. 23-24, 85).
    • - (1982). “Heuristics for designing enjoyable user interfaces: Lessons from computer games”. In: Proc. Conference on Human factors in computing systems. Gaithersburg, Maryland, United States: ACM, pp. 63-68. DOI: http : / / doi . acm . org / 10 . 1145 / 800049 . 801756 (cited on pp. 85, 114).
    • Malone, Thomas W. and M.R. Lepper (1987). “Making learning fun: a taxonomy of intrinsic motivation for learning”. In: Aptitude, learning and interaction III. Conative and affective process analysis. Ed. by R.E. Snow and M.J. Farr. Aptitude, learning, and instruction. Hillsdale, NJ, USA: L. Erlbaum. Chap. 10, pp. 223-253 (cited on p. 23).
    • Matera, Maristella (1999). “SUE: A Systematic Methodology for Evaluating Hypermedia Usability”. Ph.D. thesis. Dipartimento di Elettronica e Informazione, Politecnico Di Milano (cited on p. 124).
    • Matera, M., M.F. Costabile, F. Garzotto, and P. Paolini (Jan. 2002). “SUE inspection: an effective method for systematic usability evaluation of hypermedia”. In: Systems, Man and Cybernetics, Part A: Systems and Humans, IEEE Transactions on 32.1, pp. 93 -103. ISSN: 1083-4427. DOI: 10.1109/3468.995532 (cited on pp. 42, 54, 73, 105).
    • McAllister, Graham and Gareth R. White (2010). “Video Game Development and User Experience”. In: Evaluating User Experience in Games - Concepts and Methods. Ed. by Regina Bernhaupt. Springer-Verlag. Chap. 7 (cited on pp. 45, 79).
    • Mentis, Helena and Geri Gay (2003). “User recalled occurrences of usability errors: implications on the user experience”. In: CHI '03: CHI '03 extended abstracts on Human factors in computing systems. Ft. Lauderdale, Florida, USA: ACM, pp. 736-737. ISBN: 1-58113-637-4. DOI: http : //doi.acm.org/10.1145/765891.765959 (cited on pp. 116, 228).
    • Meurs, Richard van (Aug. 2007). “How to Play the Game: a study on MUD player types and their real life personality traits”. MA thesis. Tilburg University (cited on p. 23).
    • Molich, Rolf, Meghan R. Ede, Klaus Kaasgaard, and Barbara Karyukin (2004). “Comparative usability evaluation”. In: Behav. Inf. Technol. 23.1, pp. 65-74. ISSN: 0144-929X. DOI: http : //dx.doi.org/10.1080/0144929032000173951 (cited on p. 1).
    • Monk, Andrew, Marc Hassenzahl, Mark Blythe, and Darren Reed (2002). “Funology: designing enjoyment”. In: CHI '02: CHI '02 extended abstracts on Human factors in computing systems. Minneapolis, Minnesota, USA: ACM, pp. 924-925. ISBN: 1-58113-454-1. DOI: http://doi.acm. org/10.1145/506443.506661 (cited on p. 23).
    • Nacke, Lennart (2009). “From playability to a hierarchical game usability model”. In: FuturePlay '09: Proceedings of the 2009 Conference on Future Play on @ GDC Canada. Vancouver, British Columbia, Canada: ACM, pp. 11-12. ISBN: 978-1-60558-685-4. DOI: http://doi.acm.org/10. 1145/1639601.1639609 (cited on p. 27).
    • Nielsen, Jakob (1992). “Finding usability problems through heuristic evaluation”. In: CHI '92: Proceedings of the SIGCHI conference on Human factors in computing systems. Monterey, California, United States: ACM, pp. 373-380. ISBN: 0-89791-513-5. DOI: http://doi.acm.org/ 10.1145/142750.142834 (cited on p. 87).
    • - (1994a). “Enhancing the explanatory power of usability heuristics”. In: CHI '94: Proceedings of the SIGCHI conference on Human factors in computing systems. Boston, Massachusetts, United States: ACM, pp. 152-158. ISBN: 0-89791-650-6. DOI: http://doi.acm.org/10.1145/ 191666.191729 (cited on pp. 10, 64, 84-86, 88, 93, 114, 148).
    • - (1994b). “Estimating the number of subjects needed for a thinking aloud test”. In: International Journal of Human-Computer Studies 41.3, pp. 385 -397. ISSN: 1071-5819. DOI: 10 . 1006 / ijhc . 1994 . 1065. URL: http : / / www . sciencedirect . com / science / article / pii / S1071581984710652 (cited on pp. 32, 62, 129).
    • - (1994c). “Usability inspection methods”. In: Conference companion on Human factors in computing systems. CHI '94. Boston, Massachusetts, United States: ACM, pp. 413-414. ISBN: 0-89791-651-4. DOI: http://doi.acm.org/10.1145/259963.260531. URL: http://doi.acm. org/10.1145/259963.260531 (cited on pp. 41, 187).
    • Nielsen, Jakob and Rolf Molich (1990). “Heuristic evaluation of user interfaces”. In: Proc. SIGCHI. Seattle, Washington, United States: ACM, pp. 249-256. ISBN: 0-201-50932-6. DOI: http : //doi.acm.org/10.1145/97243.97281 (cited on pp. 62, 150).
    • Nielsen Norman Group (2011). User Experience - Our Definition. URL: http://www.nngroup. com/about/userexperience.html (cited on p. 228).
    • Norman, Donald A. (1986). “Cognitive Engineering”. In: User centered system design;new perspectives on human-computer interaction. Ed. by Donald A. Norman and Stephen W. Draper. Hillsdale, NJ, USA: Lawrence Erlbaum Associates Inc. Chap. 3, pp. 31-61 (cited on pp. 36-37, 45, 116, 155, 228).
    • Omar, Hasiah Mohamed and Azizah Jaafar (2009). “Conceptual Framework for a Heuristics Based Methodology for Interface Evaluation of Educational Games”. In: Computer Technology and Development, International Conference on 1, pp. 594-598. DOI: http : / / doi . ieeecomputersociety.org/10.1109/ICCTD.2009.249 (cited on p. 25).
    • Pagulayan, Randy J., Kevin Keeker, Dennis Wixon, Ramon L. Romero, and Thomas Fuller (2003). “User-centered design in games”. In: The human-computer interaction handbook: fundamentals, evolving technologies and emerging applications. Ed. by Julie A. Jacko and Andrew Sears. Hillsdale, NJ, USA: L. Erlbaum Associates Inc. Chap. User-centered design in games, pp. 883- 906. ISBN: 0-8058-3838-4. URL: http://portal.acm.org/citation.cfm?id=772072.772128 (cited on p. 36).
    • Pausch, R, R Gold, T Skelly, and D Thiel (Jan. 1994). “What HCI designers can learn from video game designers”. In: Conference on Human Factors in Computing Systems. URL: http : / / portal.acm.org/citation.cfm?id=260220 (cited on p. 23).
    • Pinelle, David, Nelson Wong, and Tadeusz Stach (2008a). “Heuristic evaluation for games: usability principles for video game design”. In: Proc. SIGCHI. Florence, Italy: ACM, pp. 1453-1462. ISBN: 978-1-60558-01. DOI: http://doi.acm.org/10.1145/1357054.1357282 (cited on pp. 66, 68, 71, 85-86, 103, 119).
    • - (2008b). “Using genres to customize usability evaluations of video games”. In: Future Play '08: Proceedings of the 2008 Conference on Future Play. Toronto, Ontario, Canada: ACM, pp. 129-136. ISBN: 978-1-60558-218-4. DOI: http://doi.acm.org/10.1145/1496984.1497006 (cited on p. 92).
    • Polson, Peter G. and Clayton H. Lewis (June 1990). “Theory-based design for easily learned interfaces”. In: Hum.-Comput. Interact. 5 (2), pp. 191-220. ISSN: 0737-0024. DOI: 10.1207/s15327051hci0502&3_3. URL: http://dx.doi.org/10.1207/s15327051hci0502&3_3 (cited on p. 116).
    • Polson, Peter G., Clayton Lewis, John Rieman, and Cathleen Wharton (May 1992). “Cognitive walkthroughs: a method for theory-based evaluation of user interfaces”. In: Int. J. Man-Mach. Stud. 36 (5), pp. 741-773. ISSN: 0020-7373. DOI: 10.1016/0020-7373(92)90039-N. URL: http://dl.acm.org/citation.cfm?id=141744.141755 (cited on pp. 103, 116).
    • Rasmussen, Jens (1982). “Human errors. A taxonomy for describing human malfunction in industrial installations”. In: Journal of Occupational Accidents 4.2-4, pp. 311-333. ISSN: 0376-6349. DOI: 10.1016/0376-6349(82)90041-4. URL: http://www.sciencedirect.com/science/article/pii/0376634982900414 (cited on p. 36).
    • Rebellion Developments (Feb. 2010). Aliens vs. Predator (cited on p. 84).
    • Rigby, Scott and Richard Ryan (Jan. 2007a). “Rethinking Carrots: A New Method For Measuring What Players Find Most Rewarding and Motivating About Your Game”. In: Gamasutra (cited on p. 23).
    • - (Sept. 2007b). “The Player Experience of Need Satisfaction (PENS): An applied model and methodology for understanding key components of the player experience” (cited on p. 23).
    • Hunicke, Robin, Marc LeBlanc, and Robert Zubek (2004). “MDA: A Formal Approach to Game Design and Game Research”. In: Proc. Challenges in Game AI Workshop, Nineteenth National Conference on Artificial Intelligence (AAAI '04). San Jose, California: AAAI Press. URL: http://cs.northwestern.edu/~hunicke/pubs/MDA.pdf (cited on p. 24).
    • Rohn, Janice A., Jared Spool, Mayuresh Ektare, Sanjay Koyani, Michael Muller, and Janice (Ginny) Redish (2002). “Usability in practice: alternatives to formative evaluations-evolution and revolution”. In: CHI '02 extended abstracts on Human factors in computing systems. CHI EA '02. Minneapolis, Minnesota, USA: ACM, pp. 891-897. ISBN: 1-58113-454-1. DOI: http://doi.acm.org/10.1145/506443.506648. URL: http://doi.acm.org/10.1145/506443.506648 (cited on p. 72).
    • Rosson, Mary Beth and John M. Carroll (2003). “Scenario-based design”. In: The human-computer interaction handbook. Ed. by Julie A. Jacko and Andrew Sears. Hillsdale, NJ, USA: L. Erlbaum Associates Inc., pp. 1032-1050. ISBN: 0-8058-3838-4. URL: http://dl.acm.org/citation.cfm?id=772072.772137 (cited on pp. 118, 227).
    • Rosson, M.B. and J.M. Carroll (2002). Usability Engineering: Scenario-Based Development of Human-Computer Interaction. Morgan Kaufmann Series in Interactive Technologies. Academic Press. ISBN: 9781558607125. URL: http://books.google.co.uk/books?id=sRPg0IYhYFYC (cited on p. 40).
    • Ryan, Richard, C. Scott Rigby, and Andrew Przybylski (2006). “The motivational pull of video games: A self-determination theory approach”. In: Motivation and Emotion 30.4 (Dec. 2006), pp. 344-360. ISSN: 0146-7239 (Print), 1573-6644 (Online). DOI: 10.1007/s11031-006-9051-8. URL: http://www.springerlink.com/index/H8U63440VL4Q6534.pdf (cited on p. 23).
    • Sauro, Jeff and Erika Kindlund (2005). “A method to standardize usability metrics into a single score”. In: Proceedings of the SIGCHI conference on Human factors in computing systems. CHI '05. Portland, Oregon, USA: ACM, pp. 401-409. ISBN: 1-58113-998-5. DOI: http://doi.acm.org/10.1145/1054972.1055028. URL: http://doi.acm.org/10.1145/1054972.1055028 (cited on p. 57).
    • Schaffer, Noah (Apr. 2007). Heuristics for Usability in Games. Tech. rep. PlayerFriendly.com (cited on p. 85).
    • - (Apr. 2009). “Verifying An Integrated Model Of Usability In Games”. Ph.D. thesis. Rensselaer Polytechnic Institute, Troy, New York: Dept. of Language, Literature, et al. (cited on pp. 23, 29).
    • Schmettow, Martin and Sabine Niebuhr (2007). “A pattern-based usability inspection method: first empirical performance measures and future issues”. In: BCS-HCI '07: Proceedings of the 21st British HCI Group Annual Conference on People and Computers. University of Lancaster, United Kingdom: British Computer Society, pp. 99-102. ISBN: 978-1-902505-95-4 (cited on p. 96).
    • Schuurman, Dimitri, Katrien De Moor, Lieven De Marez, and Jan Van Looy (2008). “Fanboys, competers, escapists and time-killers: a typology based on gamers' motivations for playing video games”. In: DIMEA '08: Proceedings of the 3rd international conference on Digital Interactive Media in Entertainment and Arts. Athens, Greece: ACM, pp. 46-50. ISBN: 978-1-60558-248-1. DOI: http://doi.acm.org/10.1145/1413634.1413647 (cited on p. 23).
    • Scriven, Michael (1967). “The Methodology of Evaluation”. In: Perspectives of curriculum evaluation. Ed. by Ralph W Tyler, Robert M Gagné, and Michael Scriven. Chicago: Rand McNally. Chap. The Methodology of Evaluation (cited on pp. 39-40, 45-46, 53, 61-62, 67, 74, 81).
    • Sears, Andrew (1997). “Heuristic Walkthroughs: Finding the Problems Without the Noise”. In: International Journal of Human-Computer Interaction 9.3, pp. 213-234. URL: http://www.informaworld.com/10.1207/s15327590ijhc0903_2 (cited on pp. 57, 60, 230).
    • Shackel, Brian (2009). “Usability - Context, framework, definition, design and evaluation”. In: Interacting with Computers 21.5-6, pp. 339-346. ISSN: 0953-5438. DOI: 10.1016/j.intcom.2009.04.007. URL: http://www.sciencedirect.com/science/article/B6V0D-4W99VWW-1/2/f8f10dd5b43df2bd66d26adfed437440 (cited on p. 20).
    • Sherry, John and Kristen Lucas (May 2003). “Video Game Uses and Gratifications as Predictors of Use and Game Preference”. In: Proceedings of International Communication Association 2003. Marriott Hotel, San Diego, CA (cited on p. 23).
    • Sim, Gavin R. (2009). “Evidence Based Design of Heuristics: Usability and Computer Assisted Assessment”. Ph.D. thesis. Preston: School of Computing, Engineering and Physical Sciences, University of Central Lancashire (cited on pp. 51, 67, 70, 72).
    • Sim, Gavin and Janet C. Read (2010). “The Damage Index: an aggregation tool for usability problem prioritisation”. In: Proceedings of the 24th BCS Interaction Specialist Group Conference. BCS '10. Dundee, United Kingdom: British Computer Society, pp. 54-61. ISBN: 978-1-78017- 130-2. URL: http://dl.acm.org/citation.cfm?id=2146303.2146311 (cited on p. 67).
    • Somervell, Jacob (June 2004). “Developing heuristic evaluation methods for large screen information exhibits based on critical parameters”. Ph.D. thesis. Virginia Polytechnic Institute and State University (cited on p. 118).
    • Springett, Mark (1998). “Linking surface error characteristics to root problems in user-based evaluation studies”. In: Proceedings of the working conference on Advanced visual interfaces. AVI '98. L'Aquila, Italy: ACM, pp. 102-113. DOI: 10.1145/948496.948512. URL: http://doi.acm.org/10.1145/948496.948512 (cited on pp. 29, 116).
    • Sridharan, Sriram (Dec. 2001). “Usability and Reliability of the User Action Framework: A Theoretical Foundation for Usability Engineering Activities”. MA thesis. Department of Industrial and Systems Engineering (cited on pp. 116, 228).
    • Stevens, J.P. (2002). Applied Multivariate Statistics for the Social Sciences, Fifth Edition. Applied Multivariate STATS. Taylor & Francis. ISBN: 9780805837766. URL: http://books.google.co.uk/books?id=mK0MtyWa7-QC (cited on p. 90).
    • Virginia Tech HCI (Mar. 2012). The Concept of Affordance in Usability. URL: http://research.cs.vt.edu/usability/projects/uaf%20and%20tools/affordance.htm (cited on p. 223).
    • Theofanos, Mary and Whitney Quesenbery (Aug. 2005). “Reporting on Formative Testing: A UPA 2005 Workshop Report”. In: Journal of Usability Studies 1, pp. 27-45 (cited on p. 40).
    • Thompson, Jennifer A. (Dec. 1999). “Investigating the Effectiveness of Applying the Critical Incident Technique to Remote Usability Evaluation”. MA thesis. Virginia Polytechnic Institute and State University: Industrial and Systems Engineering (cited on p. 37).
    • Tychsen, Anders, Michael Hitchens, and Thea Brolund (2008). “Motivations for play in computer role-playing games”. In: Future Play '08: Proceedings of the 2008 Conference on Future Play. Toronto, Ontario, Canada: ACM, pp. 57-64. ISBN: 978-1-60558-218-4. DOI: http://doi.acm.org/10.1145/1496984.1496995 (cited on p. 23).
    • Vermeeren, Arnold Petrus Otto Sabina, Ilse E.H. van Kesteren, and Mathilde M. Bekker (2003). “Managing the Evaluator Effect in User Testing”. In: Proc. HCI Interact. Ed. by M. Rauterberg et al. IOS Press, pp. 674-654 (cited on pp. 3, 54, 59, 224).
    • Vermeeren, Arnold P.O.S., Jelle Attema, Evren Akar, Huib de Ridder, Andrea J. von Doorn, Cigdem Erbug, Ali E. Berkman, and Martin C. Maguire (2008). “Usability Problem Reports for Comparative Studies: Consistency and Inspectability”. In: Human-Computer Interaction 23.4, pp. 329-380. DOI: 10.1080/07370020802536396. eprint: http://www.tandfonline.com/doi/pdf/10.1080/07370020802536396. URL: http://www.tandfonline.com/doi/abs/10.1080/07370020802536396 (cited on pp. 55, 59).
    • Vermeeren, Arnold P. O. S., Karin den Bouwmeester, Jans Aasman, and Huib de Ridder (2002). “DEVAN: a tool for detailed video analysis of user test data”. In: Behaviour & Information Technology 21.6, pp. 403-423. URL: http://www.informaworld.com/10.1080/0144929021000051714 (cited on pp. 33, 54, 224).
    • Vorderer, Peter, Tilo Hartmann, and Christoph Klimmt (2003). “Explaining the enjoyment of playing video games: the role of competition”. In: ICEC '03: Proceedings of the second international conference on Entertainment computing. Pittsburgh, Pennsylvania: Carnegie Mellon University, pp. 1-9 (cited on p. 23).
    • Ward, Liam (Aug. 2010). A Phenomenology Investigation Into The Motivations of Video Game Use. URL: http://socyberty.com/philosophy/a-phenomenology-investigation-into-the-motivations-of-video-game-use/ (cited on p. 23).
    • Welie, Martijn Van, Gerrit C. Van Der Veer, and Anton Eliëns (1999). “Breaking down Usability”. In: Proceedings of Interact '99. Press, pp. 613-620 (cited on p. 32).
    • Whitton, Nicola (2007). “Motivation and computer game based learning”. In: Proceedings of ASCILITE 2007 (cited on p. 23).
    • Winter, Sebastian, Stefan Wagner, and Florian Deissenboeck (2008). “A Comprehensive Model of Usability”. In: Engineering Interactive Systems. Ed. by Jan Gulliksen, Morton Harning, Philippe Palanque, Gerrit van der Veer, and Janet Wesson. Vol. 4940. Lecture Notes in Computer Science. Springer Berlin / Heidelberg, pp. 106-122 (cited on p. 116).
    • Wixon, Dennis (July 2003). “Evaluating usability methods: why the current literature fails the practitioner”. In: interactions 10 (4), pp. 28-34. ISSN: 1072-5520. DOI: http://doi.acm.org/10.1145/838830.838870. URL: http://doi.acm.org/10.1145/838830.838870 (cited on p. 41).
    • Woolrych, Alan, Gilbert Cockton, and Mark Hindmarch (2004). “Falsification Testing For Usability Inspection Method Assessment”. In: Proceedings of HCI 2004 Conference on People and Computers XVIII. Ed. by S. Fincher, P. Markopoulos, D. Moore, and R. Ruddle. BCS. Bath (cited on p. 49).
    • Woolrych, Alan, Kasper Hornbaek, Erik Frøkjaer, and Gilbert Cockton (2011). “Ingredients and Meals Rather Than Recipes: A Proposal for Research That Does Not Treat Usability Evaluation Methods as Indivisible Wholes”. In: International Journal of Human-Computer Interaction 27.10, pp. 940-970. DOI: 10.1080/10447318.2011.555314. eprint: http://dx.doi.org/10.1080/10447318.2011.555314. URL: http://dx.doi.org/10.1080/10447318.2011.555314 (cited on p. 152).
    • Yee, Nick (2006a). “Motivations for play in online games”. In: Cyberpsychology & behavior 9.6, pp. 772-775 (cited on p. 23).
    • - (2006b). “The demographics, motivations, and derived experiences of users of massively multi-user online graphical environments”. In: Presence: Teleoper. Virtual Environ. 15.3, pp. 309-329. ISSN: 1054-7460. DOI: http://dx.doi.org/10.1162/pres.15.3.309 (cited on p. 23).
    • Yue, Wong Seng and Nor Azan Mat Zin (2009). “Usability evaluation for history educational games”. In: ICIS '09: Proceedings of the 2nd International Conference on Interaction Sciences. Seoul, Korea: ACM, pp. 1019-1025. ISBN: 978-1-60558-710-3. DOI: http://doi.acm.org/10. 1145/1655925.1656110 (cited on p. 27).
    • Zammitto, Veronica Lorena (2010). “Gamers' Personality And Their Gaming Preferences”. MA thesis. School of Interactive Arts and Technology, Simon Fraser University (cited on p. 23).
    • Zapf, Dieter, Felix C. Brodbeck, Michael Frese, Helmut Peters, and Jochen Prümper (1992). “Errors in working with office computers: A first validation of a taxonomy for observed errors in a field setting”. In: International Journal of Human-Computer Interaction 4.4, pp. 311-339. URL: http://www.informaworld.com/10.1080/10447319209526046 (cited on p. 36).
    • user action framework: a framework for analysing usability issues, particularly those of a cognitive nature (Andre, 2000; Andre et al., 2000, 2001, 2003; Capra, 2001; Catanzaro, 2005; Hartson et al., 1999; Keenan et al., 1999; Mahajan, 2003; Mentis and Gay, 2003; Sridharan, 2001). Includes a hierarchical taxonomy of interaction breakdowns that may cause problems. Based on Norman's Theory of Action (Norman, 1986). 37, 38, 53, 54, 116-118, 120, 121, 144, 154, 155, 226; c.f. player action framework.