Azzopardi, Leif (2014)
Languages: English
Types: Other
Understanding how people interact when searching is central to the study of Interactive Information Retrieval (IIR). Most prior work has been conceptual, observational, or empirical. While this has led to numerous insights and findings regarding the interaction between users and systems, theory has lagged behind. In this paper, we extend the recently proposed search economic theory to make the model more realistic. We then derive eight interaction-based hypotheses regarding search behaviour. To validate the model, we examine whether the search behaviour of thirty-six participants in a lab-based study is consistent with the theory. Our analysis shows that the observed search behaviours are in line with the predicted behaviours, and that credible explanations can be provided for them. This work offers a concise and compact representation of search behaviour, providing a strong theoretical basis for future IIR research.
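
The abstract does not spell out the extended model itself, but the search economic theory it builds on frames searching as a production process: gain is produced from queries issued and documents assessed, each action has a cost, and a rational searcher trades the two off to reach a goal cheaply. The sketch below is a minimal illustration of that idea under stated assumptions, not the paper's model: the Cobb-Douglas-style gain function, the linear cost function, the parameter values, and the grid search are all assumptions made here for illustration.

    # Hypothetical sketch (not the paper's model): gain follows a Cobb-Douglas-style
    # production function of queries issued (Q) and documents assessed per query (A),
    # and cost is linear in both actions. All names and parameter values are assumed.

    def gain(Q, A, k=1.0, alpha=0.6, beta=0.4):
        """Cumulative gain from issuing Q queries and assessing A documents per query."""
        return k * (Q ** alpha) * (A ** beta)

    def cost(Q, A, c_q=10.0, c_a=2.0):
        """Total cost in seconds: c_q per query issued plus c_a per document assessed."""
        return Q * c_q + Q * A * c_a

    def cheapest_strategy(target_gain, max_q=50, max_a=50):
        """Grid-search the (queries, assessments-per-query) mix that reaches
        target_gain at the lowest total cost."""
        best = None
        for Q in range(1, max_q + 1):
            for A in range(1, max_a + 1):
                if gain(Q, A) >= target_gain:
                    c = cost(Q, A)
                    if best is None or c < best[0]:
                        best = (c, Q, A)
        return best

    if __name__ == "__main__":
        for g in (3.0, 5.0, 8.0):
            c, Q, A = cheapest_strategy(g)
            print(f"gain >= {g}: {Q} queries, {A} docs/query, cost {c:.0f}s")

In this toy model, the cheapest strategy assesses more documents per query as the per-query cost rises relative to the per-document cost; hypotheses of that form, relating costs to observable interaction, are the kind the abstract refers to.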
