Ioana Alexandra HORODNIC (2014)
Types: Article
Subjects: ranking, measurement methods, academic performance, bibliometrics, peer review
Evaluation of scientific research is crucial; yet, although numerous studies have been conducted in this area, academic productivity and performance are not easy to measure. The most influential perspective on measuring productivity and performance comes from economics, where productivity is the ratio of outputs to inputs for a particular product. This article addresses the most important elements to be considered in the measurement of scientific research: the types of indicators (qualitative versus quantitative, and simple versus composite), the level of measurement (micro versus macro indicators), and the limits of those indicators. The paper also presents evidence on the academic performance of Romanian university professors in economics and business administration from North-East Romania.
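The two measurement ideas in the abstract can be illustrated with a minimal sketch: the economic view of productivity as an output/input ratio, and the h-index as an example of a simple quantitative bibliometric indicator. The function names and sample data below are hypothetical, for illustration only:

```python
def h_index(citations):
    """h-index: the largest h such that the researcher has h papers
    with at least h citations each (a simple quantitative indicator)."""
    ranked = sorted(citations, reverse=True)
    # With citations ranked in descending order, the condition c >= rank
    # holds for exactly the first h papers.
    return sum(1 for rank, c in enumerate(ranked, start=1) if c >= rank)

def productivity(outputs, inputs):
    """Economic view: productivity as the ratio of outputs to inputs
    (e.g. publications produced per unit of research input)."""
    return outputs / inputs

# Hypothetical citation counts for one researcher's papers:
print(h_index([10, 8, 5, 4, 3]))  # -> 4 (four papers with >= 4 citations each)
print(productivity(12, 4))        # -> 3.0 (e.g. 12 papers per 4 staff-years)
```

Note that such simple indicators carry the limits the article discusses: the h-index, for instance, ignores both the total citation count of highly cited papers and differences between fields.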
