Two circuits for assessing the performance of scientific organizations in Russia: current state and development prospects in light of international experience

Abstract

In Russia, two circuits for assessing the scientific performance of organizations have effectively taken shape: an expert circuit, which underlies the categorization of scientific organizations, and a quantitative one, which, following the methodology for the comprehensive calculation of publication productivity, makes it possible to set and monitor the fulfillment of the state assignment for fundamental scientific research. The article traces the history of the formation of these circuits, their advantages and disadvantages, and the prerequisites for their further development. The British REF (Research Excellence Framework) expert review system and the Norwegian quantitative assessment model are examined as model examples for improving both circuits. Distinctive features of REF include the differentiated assessment of the outputs, impact and environment of research units, organized through expert panels and sub-panels. Distinctive features of the Norwegian model include an expert approach to forming a national white list of scientific journals, conferences and publishers, a methodology for calculating the publication indicator, and an open database of the publications of Norwegian scientists from which this indicator is derived. The authors argue that these two examples can be regarded as parametric assessment models whose adaptation, with national specifics taken into account, would make it possible to update and improve both circuits for assessing scientific organizations in Russia. The article also emphasizes the importance of regular retrospective reflection on the experience, procedures and results of assessing organizations, which is necessary for the systematic development of this system at subsequent stages.
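The Norwegian publication indicator mentioned in the abstract is, in public descriptions of the model, computed from two ingredients: a weight determined by the publication's type and by the level (1 or 2) assigned to its channel in the national register, and the institution's share of the author affiliations, which since the 2015 revision of the model enters as a square root. The sketch below is a minimal illustration under those assumptions; the weights are the commonly cited values, and the function and names are ours rather than any official implementation.

```python
# Minimal sketch of Norwegian Publication Indicator points (illustrative only).
# Assumed weights (commonly cited): journal article 1/3, book chapter 0.7/1,
# monograph 5/8 for channel levels 1/2. Since the 2015 revision, an
# institution's credit is the weight times the square root of its share
# of the publication's author affiliations.
from math import sqrt

# (publication type, channel level) -> points for the whole publication
WEIGHTS = {
    ("journal_article", 1): 1.0,
    ("journal_article", 2): 3.0,
    ("book_chapter", 1): 0.7,
    ("book_chapter", 2): 1.0,
    ("monograph", 1): 5.0,
    ("monograph", 2): 8.0,
}

def publication_points(pub_type: str, level: int,
                       institution_authors: int, total_authors: int) -> float:
    """Points credited to one institution for one publication."""
    share = institution_authors / total_authors
    return WEIGHTS[(pub_type, level)] * sqrt(share)

# Example: a level-2 journal article with 1 of its 4 authors affiliated
# with the institution yields 3 * sqrt(0.25) = 1.5 points.
print(publication_points("journal_article", 2, 1, 4))  # 1.5
```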

About the authors

D. V. Kosyakov

Russian Research Institute of Economics, Politics and Law in the Scientific and Technical Sphere

Author for correspondence.
Email: kosyakov@sciencepulse.ru

Deputy Head of the Laboratory of Scientometrics and Scientific Communications, RIEPP

Russian Federation, Moscow

I. V. Selivanova

Russian Research Institute of Economics, Politics and Law in the Scientific and Technical Sphere

Email: i-seli@yandex.ru

Candidate of Technical Sciences, Researcher, RIEPP

Russian Federation, Moscow

A. E. Guskov

Russian Research Institute of Economics, Politics and Law in the Scientific and Technical Sphere

Email: guskov.andrey@gmail.com

Candidate of Technical Sciences, Head of the Laboratory of Scientometrics and Scientific Communications, RIEPP

Russian Federation, Moscow

Supplementary files

1. JATS XML
2. Fig. 1. Main panels and expert sub-panels (units of assessment, UoA) in REF 2021
3. Fig. 2. Distribution of outputs by type in REF 2021
4. Fig. 3. Distribution of outputs by type in REF 2021, by expert panel
5. Fig. 4. Number of outputs of less common types in REF 2021
6. Fig. 5. Distribution of impact case studies by type (area) and expert panel in REF 2021
7. Fig. 6. Example of one university's assessment results in two units of assessment (UoA) in REF 2021

Copyright (c) 2024 Russian Academy of Sciences
