ST 64/12: Indicator criticism from HiOA
Pro-rector Frode Eika Sandnes has criticised the international university rankings on the HiOA rectorate blog.
Photo: Beckman Institute, Caltech, Pasadena
I commented as follows:
It is great that you are putting the spotlight on the international university rankings.
You place great weight on their methodological weaknesses – and there I can agree with you. At the same time, I think you underestimate their actual importance for students, teachers, researchers, employers and funding authorities around the world. The entire education market is highly opaque. For people who have to make a decision, it is easier to rely on a visible indicator – even a biased one – than to assess a hundred different options case by case. HiOA does exactly the same thing when we use the number of publication points per academic position as a yardstick for our R&D effort.
I support your goal that institutions should be able to compare themselves with others on the basis of «relevant and meaningful criteria». European cooperation on alternative measurement methods – where we can choose for ourselves what to emphasise – sounds sensible. This is especially important for a future professionally oriented university which, to a far greater degree than the classical research universities, must maintain the link between education, research, innovation and dissemination.
In 2011, HiOA aimed for 0.6 publication points per academic FTE. We reached only 0.4 – roughly the same as in 2009. See
Does this mean the college must now invest even more in publishing? Or should we instead analyse the weaknesses of the indicator itself – along the lines of your critique of Times Higher Education? After all, it says nothing about how relevant and useful the research is – only that it has been published.
As one expert in the field (Drummond Rennie) puts it:
«There seems to be no study too fragmented, no hypothesis too trivial, no literature too biased or too egotistical, no design too warped, no methodology too bungled, no presentation of results too inaccurate, too obscure, and too contradictory, no analysis too self-serving, no argument too circular, no conclusions too trifling or too unjustified, and no grammar and syntax too offensive for a paper to end up in print.»
We must make sure that we measure what we actually want to achieve – and not what is easiest and cheapest to register.
- Meaningless rankings
- World University Rankings 2012-13: Asia’s high-flyers challenge Western supremacy
The rise of Confucian Asia
Asian universities, especially in the Republic of Korea, Singapore, Taiwan and China, performed particularly well in the rankings, published on 3 October, rising by an average of almost 12 places.
- Although the UK remains the best represented country behind the US in the top 200, falls in both countries contrast with a surge in performance in Asia and Australia.
- These regions saw some of the largest improvements, with the Nanyang Technological University in Singapore leaping 83 places to 86th, thanks mainly to a significant increase in research income.
- Yang Wei, president of Zhejiang University and chairman of the Chinese C9 League of universities, called the shift the result of a decade of steady progress across Asia that will continue to pay off.
A time lag on the results from the most recent initiatives, such as China’s strategy to internationalise higher education and a government commitment to spend 4 per cent of gross domestic product on education, also means Chinese institutions are likely to continue to improve their positions, he added. «I think this progression will extend at least for five or ten years.»
The secret of Caltech
Keeping things small may be the secret to the success of the California Institute of Technology, which has taken the top spot in the Times Higher Education World University Rankings for the second year in a row.
- The private institution in Pasadena, California, has just 900 undergraduates and 1,200 postgraduate students. It focuses solely on science and engineering, and the social sciences and humanities connected to science.
- Dr Chameau said features of Caltech’s culture, such as the opportunity for students to do research as early as their first year and support for interdisciplinary research, were factors behind its success rather than one specific strategy.
- «We don’t try to do everything, but what we do, we do well.»
Top 101 by culture
- Anglo-Saxon 68
- Germanic 16 (Netherlands, Germany, Switzerland, Sweden)
- Confucian 11 (Korea, Singapore, Japan, Hong Kong, China)
- Romanic 6 (France, Belgium)
Top 101 by countries
- United States 47
- United Kingdom 10
- Netherlands 7
- Australia 6
- Canada 5
- Germany 4
- France 4
- Korea, Republic of 3
- Switzerland 3
- Sweden 2
- Singapore 2
- Japan 2
- Hong Kong 2
- China 2
- Belgium 2
The last few years have seen a new addition to the rankings scene: so-called ‘world’ rankings that purport to be lists of the top universities or programs in the world. The best-known examples are the THE Supplement, Quacquarelli Symonds World University Rankings (QS), and the Academic Ranking of World Universities produced by the Institute of Higher Education at Shanghai Jiao Tong University.
Early data suggest that these global ranking systems are helping to better inform prospective students in many countries about the available choices, including where they can obtain a high quality education for a reasonable price.
- Reservations remain, however, about the potential impact on access given the strong likelihood that such rankings will encourage higher education stratification within and across countries.
- In fact, some countries are explicitly moving in that direction because they see rankings and stratification as the means to create ‘world class’ universities and thus meet the challenges of increasing global competition.
- Since at least some of the adverse impacts of rankings are related to student selectivity indicators, one issue for rankers is whether to replace measures that reward schools for recruiting already academically high-achieving students with those that recognize schools for their success in educating students.
- It is important for students to recognize that rankings reflect only one aspect of an institution’s profile and are not necessarily predictive of the quality of the education that they will receive, or of the opportunities that will be available to them after graduation.
Both producers and consumers may benefit from consulting The Berlin Principles on Ranking of Higher Education Institutions. These principles, which offer standards of quality and good practice in the development and use of rankings, are available at http://www.ihep.org/.
The Berlin principles
From the introduction
Rankings and league tables of higher education institutions (HEIs) and programs are a global phenomenon.
They serve many purposes:
- they respond to demands from consumers for easily interpretable information on the standing of higher education institutions;
- they stimulate competition among them;
- they provide some of the rationale for allocation of funds;
- and they help differentiate among different types of institutions and different programs and disciplines.
In addition, when correctly understood and interpreted, they contribute to the definition of “quality” of higher education institutions within a particular country, complementing the rigorous work conducted in the context of quality assessment and review performed by public and independent accrediting agencies.
- This is why rankings of HEIs have become part of the framework of national accountability and quality assurance processes, and why more nations are likely to see the development of rankings in the future.
- Given this trend, it is important that those producing rankings and league tables hold themselves accountable for quality in their own data collection, methodology, and dissemination.