Sherief, N.; Jiang, N.; Hosseini, M.; Phalp, K. T.; Ali, R. (2014)
Languages: English
Crowdsourcing is an emerging online paradigm for problem solving that involves a large number of people, often recruited on a voluntary basis and given some tangible or intangible incentive as a reward. It harnesses the power of the crowd to minimize costs and to solve problems which inherently require a large, decentralized and diverse crowd. In this paper, we advocate the potential of crowdsourcing for software evaluation. This applies especially to complex and highly variable software systems, which operate in diverse, even unpredictable, contexts. Through their iterative feedback, the crowd can enrich the developers' knowledge about software evaluation and keep it up to date. Although this seems promising, crowdsourcing evaluation introduces a new range of challenges, mainly concerning how to organize the crowd and how to provide the right platforms to obtain and process their input. We focus on the activity of obtaining evaluation feedback from the crowd and conduct two focus groups to understand the various aspects of such an activity. Finally, we report a set of challenges to address in order to realize correct and efficient crowdsourcing mechanisms for software evaluation.
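The paper itself does not prescribe an implementation, but the feedback-acquisition activity it studies can be illustrated with a small sketch. The following hypothetical Python model is an assumption for illustration only: the names FeedbackItem and FeedbackStore, the 1-5 rating scale, and the per-feature aggregation are not from the paper. It shows one plausible way a platform might collect iterative crowd feedback tied to a usage context and summarize it for developers.

    # Illustrative sketch only; the paper does not define this data model.
    from dataclasses import dataclass, field
    from datetime import datetime
    from collections import defaultdict
    from statistics import mean

    @dataclass
    class FeedbackItem:
        """One piece of crowd feedback on a software feature (hypothetical schema)."""
        user_id: str
        feature: str          # e.g. "search", "checkout"
        rating: int           # assumed scale: 1 (poor) .. 5 (excellent)
        comment: str = ""
        context: str = ""     # usage context, e.g. "mobile/offline"
        timestamp: datetime = field(default_factory=datetime.now)

    class FeedbackStore:
        """Collects iterative feedback and summarizes it per feature."""

        def __init__(self) -> None:
            self._items: list[FeedbackItem] = []

        def submit(self, item: FeedbackItem) -> None:
            # Each submission is kept, so the picture stays current as
            # the crowd sends feedback over time.
            self._items.append(item)

        def summarize(self) -> dict[str, float]:
            """Average rating per feature across the whole crowd."""
            by_feature: dict[str, list[int]] = defaultdict(list)
            for item in self._items:
                by_feature[item.feature].append(item.rating)
            return {feature: mean(ratings) for feature, ratings in by_feature.items()}

    store = FeedbackStore()
    store.submit(FeedbackItem("u1", "search", 4, "fast results", "desktop"))
    store.submit(FeedbackItem("u2", "search", 2, "slow on mobile", "mobile/offline"))
    print(store.summarize())  # {'search': 3.0}

A real platform would of course need the organizational machinery the paper identifies as open challenges, such as recruiting and incentivizing the crowd and filtering low-quality input; the sketch only covers the data-collection step.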