robin.roche@utbm.fr). E. Kyriakides is with the University of Cyprus, 1678 Nicosia, Cyprus (e-mail:
and also with Cadi Ayyad University, Marrakesh 40000, Morocco (e-mail: abdellatif.miraoui@utbm.fr). Digital Object Identifier 10.1109/TIA.2012.2199730 in the distribution milieu, since the latter has traditionally been considered a user end point of service,
JULY/AUGUST 2012 Marcelo Godoy Simões (S'89 SM'98) received the B.S. and M.S. degrees from the University of São Paulo, São Paulo, Brazil, in 1985 and 1990, respectively, the Ph.D. degree from The University of Tennessee, Knoxville, in 1995, and the D.Sc. degree (Livre-Docência) from the University of São Paulo in 1998. He is currently an Associate Professor at the Colorado School of Mines, Golden,
He is currently an Assistant Professor in the Department of Electrical and Computer Engineering, University of Cyprus, Nicosia, Cyprus
He received the B.E. degree in electrical engineering from Hanoi University of Technology, Hanoi, in 2002, the M.Eng. degree in electrical engineering from the Asian Institute of Technology, Pathum Thani, Thailand, in 2004,
Paulo F. Ribeiro (M'79 SM'88 F'03) received the Ph.D. degree from the University of Manchester, Manchester, U.K. He is currently an Associate Professor of electrical engineering at the Technische Universiteit
He received the M.Sc. degree from Haute Alsace University, Mulhouse, France, in 1988 and the Ph.D. and Habilitation degrees from the University of Franche-Comté, Besançon, France, in 1992 and 1999, respectively.
He is currently the President of Cadi Ayyad University, Marrakesh, Morocco. Since 2000, he has been a Full Professor of electrical engineering (electrical machines and energy) at the Université de Technologie de Belfort-Montbéliard, Belfort, France, where he was the Vice-President of Research Affairs from 2008 to 2011 and the Director of the Electrical Engineering Department from 2001 to 2009,
He is Doctor Honoris Causa of the Technical University of Cluj-Napoca, Cluj-Napoca, Romania.
He was also distinguished as an Honorary Professor by the Transylvania University of Brasov, Brasov, Romania
A guide to eco-innovation for SMEs and business coaches. Editors: Michal Miedzinski, Technopolis Group Belgium; Martin Charter, The Centre for Sustainable Design, University for the Creative Arts; Meghan O'Brien, Wuppertal Institute. Authors: Technopolis Group Belgium: Michal Miedzinski, Asel Doranova, Johanna Castel, Laura Roman; The Centre for Sustainable Design, University
towards smart freight models: TRI-VIZOR, a spin-off from the University of Antwerp in Belgium, developed an original business model for freight transport based on horizontal cooperation.
you may choose to train a designer, contract an external consultancy or partner with an appropriate university or technical school.
University) and The Centre for Sustainable Design, 2002. cfsd.org.uk/smart-know-net/smart-know-net.pdf The EU Eco-label helps identify products
why it matters and how it can be accelerated, University of Oxford, Skoll Centre for Social Entrepreneurship. Murray, R., Caulier-Grice, J., Mulgan, G. (2010).
University. The Young Foundation (2010). The Young Foundation and the Web: Digital Social Innovation, working paper
MASSIMILIANO GRANIERI Assistant Professor at the University of Foggia Law School ANDREA RENDA Research Fellow, CEPS TABLE OF CONTENTS Foreword...
To be sure, Europe already features world-leading industries and a few high-ranked universities. In the past few years, the budget for R&D has been increased
and IP management in universities and public research organisations. The 'Third Mission' launched by the Lisbon Agenda for universities requires specialised human resources that universities should be able to train
A NEW APPROACH TO INNOVATION POLICY IN THE EUROPEAN UNION EU institutions should devote efforts to improving the 'professionalisation' of the management of publicly funded universities and research institutions.
which led to 215 responses from universities and research institutions, companies, governments, non-governmental organisations and individuals;
alliances with local companies and universities; mergers and acquisitions of local firms; and increasing research intensity of foreign production facilities.
For instance, see R. Polk Wagner, Understanding Patent Quality Mechanism, Public Law and Legal Theory, University of Pennsylvania Law School, Research Paper No. 09-22, subsequently published as 157 U. Penn.
such as universities and public research organisations. 3.3.4 Create a unified patent litigation system with an acceptable level of centralisation Costs
-Public-private technology transfer, meaning the transfer of research results generated by universities and public research institutions.
and available empirical data showing that European universities are good at science and technology,
and Japanese universities.31 The situation is serious when considering the amount of funding that Europe is pouring into the R&D efforts of research and technology organisations (RTOs) and universities through framework programmes and other funds.
One major goal of any policy on innovation should be to pay more attention to the return on investment for public money devoted to research.
since funded institutions (universities, PROs, SMEs) must be ready to harvest results and turn them into economic development.
the Commission is using soft-law instruments to suggest good practices that universities should follow in technology transfer. 31 This was also a point made by President Barroso in his speech at the European Innovation Summit, European Parliament, 13 October 2009, Brussels
. 32 Commission Recommendation on the management of intellectual property in knowledge transfer activities and Code of Practice for universities and other public research organisations,
incentives can be created for universities to perform better and reach that critical mass necessary for any successful strategy of technology transfer.
Enabling small and medium-sized enterprises to achieve greater benefit from standards and from involvement in standardisation, Rotterdam School of Management, Erasmus University, at http://www.ecap-sme.org/documenti/primapagina
Enabling small and medium-sized enterprises to achieve greater benefit from standards and from involvement in standardisation, Rotterdam School of Management, Erasmus University, 2009 (http://www.ecap-sme.org/documenti
action=display&doc id=5714&userservice id=1&request.id=0 European Commission (2008), Recommendation on the management of intellectual property in knowledge transfer activities and Code of Practice for universities and other public
Wagner, R. (2009), Understanding Patent Quality Mechanism, Public Law and Legal Theory, University of Pennsylvania Law School, Research Paper No. 09-22, subsequently published as 157 U. Penn.
Andrea Renda, Senior Research Fellow, CEPS, andrea.renda@ceps.eu; Massimiliano Granieri, Professor, University of Foggia, mgranieri@luiss.it. Mr. Hasan Alkas
Ph.D. Daniela MITRAN, Nicolae Titulescu University, Athenaeum University; Ph.D. Student Adrian NICOLAU, Bucharest, SC Avangarde Technologies Consulting. Abstract:
Assessing Europe's University-Based Research Expert Group on Assessment of University-Based Research EUR 24187 EN European Research Area Science & society EUROPEAN COMMISSION Research
Knowledge-Based Economy Unit C.4 - Universities and Researchers Contact: Adeline Kroll European Commission Office SDME 9/17 B-1049 Brussels Tel. (32-2) 29-85812 Fax (32-2) 29-64287 E-mail:
http://ec.europa.eu/research/research-eu EUROPEAN COMMISSION Assessing Europe's University-Based Research Expert Group on Assessment of University-Based Research RTD.
and more coherent methodology to assess the research produced by European universities? This is the question experts were asked to answer, following a 2006 Commission Communication on the modernisation of universities1,
which suggested that universities should become more specialised and concentrate on working to their specific strengths.
University rankings are increasingly popular. Today, 33 countries have some form of ranking system operated by government and accreditation agencies, higher education, research and commercial organisations,
Universities use them to define performance targets and implement marketing activities, while academics use rankings to support their own professional reputation
'then the productivity, quality and status of research produced by universities is a vital indicator.
or on the fees associated with the university of their choice. A ranking system of this kind does exist for students,
Ranking universities as entire institutions may not be the most appropriate way to identify where the best research is done
A university may be renowned for one or two departments, but may not be excellent in all disciplines it offers.
and disseminated should allow for a better assessment of university-based research. I believe that the coexistence of different models to assess university-based research is not only inevitable
We need to design flexible and multidimensional methodologies that will adapt to the diverse and complex nature of research, disciplines and of our universities.
and monitor the quality of research at universities. I wish to end with a simple quote from someone who understood better than anyone else the value of freedom, creativity and knowledge:
when we rank our universities, these most important of our knowledge powerhouses.

Commissioner Janez Potočnik

1 Overview...17
2.1 University-based Research in the Knowledge Economy...17
2.2 The European Policy Context...20
2.4 Remit of Expert Group on Assessment of University-based Research...21
2.5 Format of the Report...35
4 Measuring University-Based Research...36
4.1 Indicators...36
4.2 Indicators and Disciplinary Practice...58
7 Appendix I. Activities and Membership of Expert Group on Assessment of University-Based Research...88
10.3 FINLAND (AALTO UNIVERSITY)...91
10.4 FINLAND (HELSINKI UNIVERSITY)...93
10.5 FRANCE...96
10.6 GERMANY - FORSCHUNGSRATING (CONDUCTED BY WISSENSCHAFTSRAT)...99
10.7 GERMANY - CHE UNIVERSITY RANKING, CHE RESEARCH RANKING...102
10.8 GERMANY - INITIATIVE FOR EXCELLENCE...105
10.9 HUNGARY...134
10.19 GLOBAL - THE-QS WORLD UNIVERSITIES RANKING...137
10.20 GLOBAL - PERFORMANCE RANKING OF SCIENTIFIC PAPERS FOR RESEARCH UNIVERSITIES...140
10.21 GLOBAL - THE LEIDEN RANKING...143
11 Appendix V. Bibliography...145
EU Publications...145
Other Publications...147

1 Overview

1.1 Executive Summary

HEIGHTENED IMPORTANCE OF UNIVERSITY-BASED RESEARCH AND OF ASSESSMENT OF UNIVERSITY-BASED RESEARCH The political context Assessment of university-based research1 (AUBR) has become a major issue for a wide range of stakeholders at all levels.
One of the main reasons is that research performance is regarded widely as being a major factor in economic performance.
and innovation, universities are considered key to the success of the Lisbon Strategy with its move towards a global and knowledge-based economy.
The economic dimension of (university-based) research in terms of expected economic and societal benefit and increased expenditure goes a long way to explain the heightened concern for quality and excellence in research
The following quote is from the Commission's Communication 'Delivering on the Modernisation Agenda for Universities':
Universities should be funded more for what they do than for what they are, by focusing funding on relevant outputs rather than inputs,
Global rankings The growing concern for the quality and assessment of university-based research partly explains the increasing importance attached to university rankings, especially global rankings.
rankings compare universities on the basis of a range of indicators; different systems favour different indicators,
The total score for each university is aggregated into a single figure, and universities are ranked accordingly.
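The aggregation mechanism described above can be sketched in a few lines. The indicator names, weights and scores below are invented for illustration and do not correspond to any actual ranking system:

```python
# Illustrative sketch of how a global ranking collapses several
# indicator scores into one aggregate figure per university.
# Indicator names, weights and scores are hypothetical.

def aggregate(scores, weights):
    """Weighted sum of normalised (0-100) indicator scores."""
    return sum(weights[k] * scores[k] for k in weights)

WEIGHTS = {"publications": 0.4, "citations": 0.4, "reputation": 0.2}

universities = {
    "University A": {"publications": 80, "citations": 90, "reputation": 60},
    "University B": {"publications": 95, "citations": 70, "reputation": 85},
}

# Rank by aggregate score, highest first -- this single figure is
# exactly what hides intra-university diversity.
ranking = sorted(universities,
                 key=lambda u: aggregate(universities[u], WEIGHTS),
                 reverse=True)
```

Because every department's strengths and weaknesses are folded into one number, two very differently profiled institutions can end up adjacent in such a ranking, which is the core of the criticism summarised below.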
Rankings enjoy a high level of acceptance among stakeholders and the wider public because of their simplicity and consumer-type information.
In particular, doubt has been cast on the possibility of comparing whole universities (in other words, diverse and complex organisations) on the basis of aggregated scores.
1 In this report, the term 'university' refers to all higher education institutions (HEIs), irrespective of the name and status in national law.
THE RAISON D'ÊTRE OF THE ASSESSMENT OF UNIVERSITY-BASED RESEARCH EXPERT GROUP In this context, the Commission's Directorate-General for Research decided to convene an expert group on assessment of university-based research.
there is the generally recognised need for performance assessment of university-based research, especially publicly funded research;
which, if any, seem to do justice to the diversity of research disciplines and fields, of research outputs, of university profiles,
Assessment of University-based Research Expert Group: remit and composition The Expert Group on Assessment of University-based Research was established in July 2008 to identify the parameters to be observed in research assessment as well as analyse major assessment
and ranking systems with a view to proposing a more valid comprehensive methodological approach. The overall objective was to promote
and contribute to the development of multidimensional methodologies designed to facilitate the assessment of university-based research.
and university senior management. Academically speaking, the experts represented a variety of disciplines, including arts and design, humanities, socioeconomic sciences,
This makes comparability across universities and countries difficult. 2) Basic principles to be observed in the assessment of university-based research.
Guided by the conviction that the purpose and objectives of a given assessment exercise should be seen in context,
In particular, assessment of university-based research should Cover all disciplines and, crucially, trans-, multi- and interdisciplinary work,
and resources of Europe's universities and higher education systems; Take into consideration, as appropriate, research tailored to specific local,
ii) Case studies Activity To obtain a clearer idea of existing methodologies for assessing university-based research
and rating universities/units within universities, case studies of pertinent exercises were prepared. Members of the Expert Group reviewed practices in their home countries and universities.
Outcomes These case studies represent different approaches and objectives. They furnish evidence that universities and national organisations regard assessment of university-based research as important for improving research performance and quality,
for strategic planning and for international benchmarking. They also reveal the common view that global rankings are not the perfect answer to their requirements.
The case studies highlight a number of key aspects of assessment which confirm or complement the insights gained
or be it that research quality is made the focus of attention to the detriment of other university functions. OVERARCHING OUTCOME:
FAIRNESS AND FEASIBILITY IN ASSESSMENT OF UNIVERSITY-BASED RESEARCH: A NEW APPROACH On the basis of the principles
General recommendations (1) Assessment of university-based research should be designed in relation to purpose and articulated objectives,
2) Assessment of university-based research should combine quantitative indicator-based data with qualitative information, for example information based on expert peer assessment or validation,
3) Assessment of university-based research should be undertaken at the level of 'knowledge clusters', the precise scale and nature
fields of science within universities or inter-institutional networks. Knowledge clusters should allow for aggregation to institutional level.
and personalised tool-kit can be developed readily to meet different policy and university needs. External validation of provisional outcomes In April 2009, the Group organised, together with the European Commission, a workshop,
RELEVANCE OF THE FINAL REPORT TO POLICY DEVELOPMENT AND THE DESIGN OF FUTURE RESEARCH ASSESSMENT EXERCISES The report is designed as a guide for Users of the outcomes of assessments of university-based research
Specialists engaged in assessment of university-based research, presenting them with a number of basic principles that need to inform assessment of university-based research,
and providing them with the outlines of a matrix for user- and purpose-driven multidimensional research assessment. 1.2 The Way Forward Recommendations The AUBR EG believes that, generally speaking,
assessment of university-based research is being hampered by a lack of reliable, comparable, and comprehensive data.
In view of this, the AUBR EG recommends that the European Commission Take the lead in establishing a European Observatory for Assessment of University-based Research to identify and prioritise data requirements of a European Research Assessment Framework
and disseminate guidelines for use by universities, national agencies, government, and other stakeholders, based on the principles outlined in this report;
Develop a financial model to cover the full cost of university-based research including the cost of assessment,
The AUBR EG notes that global university rankings have become a popular means of gauging university-based research.
The EG cautions against rankings or similar assessment systems which seek to compare whole universities on the basis of an aggregated score and
European and international context for the establishment of the Expert Group on the Assessment of University-based Research (AUBR).
and a summary of its activities and findings is presented. 2.1 University-based Research in the Knowledge Economy Around the world,
Because university-based research is the primary arena for the production of new knowledge, higher education is an important focal point for European union and national government policy-making.
European universities are at the forefront of Europe's drive to create a knowledge-based society
universities are undergoing profound change. Competition is intensifying between universities nationally and internationally, students are becoming more conscious of the value of their education and its impact on their career opportunities,
and governments and other stakeholders are asking questions and requiring evidence of value-for-money. Attention is shifting to mechanisms to assess and benchmark the quality and performance of university teaching and learning,
and of research performance. In recent years, there has been a steady growth in methods to evaluate
and assess the activity and outcomes of higher education, with particular emphasis on the assessment of university-based research (AUBR).
In response to the results of the Shanghai Jiao Tong Academic Ranking of World Universities or the Times-QS World University Rankings, concern has been expressed that too few European universities are ranked among the world's top 50 or 100 universities.
In order to capture the full richness of university-based research a multidimensional approach, combining qualitative and quantitative methodologies,
A New Start for the Lisbon Strategy (European Commission 2005, p. 9) says that European universities 'must be able to compete with the best in the world through the completion of the European Higher Education Area'.
In its Communication 'Delivering on the Modernisation Agenda for Universities: Education, Research and Innovation', the Commission (2006, p. 7) argued that 'Universities should be funded more for what they do than for what they are, by focusing funding on relevant outputs rather than inputs...'. In its resolution 'Modernising Universities for Europe's Competitiveness in a Global Knowledge Economy', the Council (2007, p. 3) expressed the view that the 'challenges posed by globalisation require that the European Higher Education Area and that Europe's universities aim to become worldwide competitive players'. Both documents stress the relevance of university-based research to attaining the Lisbon goals.
Universities should communicate the relevance of their research to society and their stakeholders, and respond to calls for greater transparency, accountability and comparability.
Competitive funding should be based on institutional evaluation systems and on diversified performance indicators with clearly defined targets and indicators supported by international benchmarking for both inputs and economic and societal outputs (European Commission 2006,
and support the diversity of European universities, which differ in their history and degree of involvement in research (some focus more on research than others),
the nature of their research activities (scientific, technical/applied research), their links to potential users of the results of their research (other universities, SMES and large enterprises), the geographical scope of their research partners,
The aim is to position European universities and research to generate increased investment, attract researchers from inside and outside Europe,
enhance the impact of university-based research on SMES and regional innovation, and strengthen teams engaged in inter-and trans-disciplinary research and global research networks.
which aims to classify European universities according to a multidimensional methodology. Mutual Learning on Approaches to Improve the Excellence of Research in Universities,
an expert group launched by the Scientific and Technical Research Committee (CREST) in 2007. The overall objective was to conduct a mutual learning exercise on the scope,
objectives and measures of national policies to improve research excellence in universities, to learn more about the effect of these policies,
and to develop recommendations for improving the policies and their impact on research in universities.
European University Data Collection, a project studying the feasibility of a sustainable European system of data collection on the activities and performance of European higher education institutions in the areas of education, research and innovation.
The European University Association's (EUA) Institutional Evaluation Programme focuses on quality enhancement at institutional level;
However, the experience of rankings illustrates that they can promote a simplistic understanding of university-based research and its contribution to society and the economy.
universities and other stakeholders. University rankings have become an increasing influence on the higher education landscape since US News and World Report began providing consumer-type information about US universities in 1983.
Since then, national rankings have been created in over 45 countries by public media organisations, government agencies or independent organisations.
It was followed quickly by the Times-QS World University Ranking (henceforth Times QS, 2004), Webometrics or the Ranking Web of World Universities (2004), the Taiwan Ranking of Scientific Papers for World Universities (henceforth Taiwan, 2007),
but ignore the important role that universities have in knowledge and technology transfer. Concern has also been raised about the use of peer or reputation-based surveys
At a time of growing diversity of university mission and providers, rankings use a common set of indicators and weightings to measure all universities.
In addition, universities are complex organisations with strengths and weaknesses across various departments and activities. An aggregate score is unable to reflect this.
This has included efforts by governments and universities alike to reframe strategies and priorities, and make significant changes at the system
and institutional level in order to achieve a better ranking. 2.4 Remit of Expert Group on Assessment of University-based Research Terms of Reference Performance assessment of university-based research is increasingly important,
Political and societal support for university research can only be maintained by a system of quality assessment, performance enhancement and value-for-money.
and to enable European universities to manage strategically, effectively and efficiently. It also assists universities to advance their own modernisation agenda, taking into account specific European values and objectives.
In response, the European Commission established the Expert Group on the Assessment of University-Based Research to develop a multidimensional methodology to assess the quality of research produced in universities
with a European perspective, and taking into account the diversity of European universities performing research, research disciplines,
and the wide range of users. The aim is to enable institutional benchmarking, improvement in quality,
and comparative assessment of universities across Europe. The Terms of Reference were: 1) Review the needs of various types of users of measurement of research quality at universities;
2) Review main methodologies for assessing/ranking research quality of universities, covering existing international assessments/rankings and other methodologies being developed;
Preparation of a new framework for assessment of university-based research. Outcomes and Achievements Global rankings have achieved a high level of international popularity
and benchmarking of university-based research. The Expert Group concludes, however, that contrary to providing an accurate and useful assessment of research,
national governments and universities. 2.5 Format of the Report This report is divided into four main sections, plus appendices.
and more and more fundamental research is conducted in the context of application, both within and outside universities.
Universities are the primary organisation for this type of research. These developments have generated an important discussion on the definition of R&D,
Depending upon the university, scientific field or policy environment, some formats may be more important than others.
[Table fragment: research output formats — Conference Proceedings, Book Chapters, Monographs/Books, Artefacts, Prototypes — each marked as applicable.] 3.4 Users and Uses The assessment of university-based
'Users' include policymakers and government agencies, universities, public or private research organisations (PROs), researchers or graduate students, employers, civil society and the media.
Local and regional authorities are likely to be interested in the reputation of individual universities and of the system as part of a wider economic strategy to position the city or region as an important node in the global economy.
relevance and impact of research activity; improve integration/collaboration between universities, government and the private sector; improve attraction capacity; benchmark performance and quality of the HE system/institutions nationally and worldwide; serve as an indicator of national competitiveness. Attraction capacity:
universities and government are interested in improving performance and quality while industry and employer groups want to be able to identify potential employees. 3. Some of the required data may be readily available
which makes comparability across universities or countries difficult. For example, bibliometric data on peer-reviewed publications are available commercially,
which transparency and accountability of publicly-funded university-based research have become core requirements on higher education.
Chapter 4 will discuss some of these issues in more depth. 36 4 Measuring University-Based Research This chapter examines the most important characteristics of research assessment.
A typical example is the total of human resources employed by university departments, offices or affiliated agencies to support
and facilitate comparisons over time and across different types of universities. Indicators should be unaffected by any bias arising from the interests of the parties involved in the research assessment exercise.
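One standard way to keep a citation indicator comparable across universities of different size and disciplinary mix is field normalisation, the approach behind indicators such as those of the Leiden Ranking discussed in this report. The field baselines and paper counts below are invented for illustration only:

```python
# Illustrative field-normalised citation indicator: each paper's
# citations are divided by the world average for its field, so a
# large or medicine-heavy university is not automatically favoured.
# Field baselines (average citations per paper) are hypothetical.

FIELD_BASELINE = {"physics": 10.0, "history": 2.0}

papers = [
    {"field": "physics", "citations": 20},
    {"field": "history", "citations": 4},
    {"field": "physics", "citations": 5},
]

# Mean normalised citation score: 1.0 means "at world average".
scores = [p["citations"] / FIELD_BASELINE[p["field"]] for p in papers]
indicator = sum(scores) / len(scores)
```

Normalising per field lets a history paper cited 4 times count the same as a physics paper cited 20 times, which is one way an indicator can be made "unaffected by bias" across disciplines.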
Many governments and universities strongly support the interconnection between teaching and research as one of the core principles of higher education.
Knowledge Clusters There are many types of research assessment processes at the university and national level, focusing on different institutional or cognitive units.
universities organise themselves differently for a complex set of reasons, including history, mission, finance, alignment with national or regional priorities, interdisciplinary thematics, etc.
Today, completion of an ethical statement or formal ethical approval by a university or national Research Ethics Committee is required by most funding organisations,
and the university. This may include consideration of the impact and benefits of the research being conducted.
In response to the wider role and responsibility of university-based research more attention is being placed on its outcome and benefits, especially its social, economic, cultural and environmental impact.
The purpose of assessing the impact is to gauge the contribution that university-based research makes to society and the economy.
This arises from the need to assure stakeholders that publicly-funded university research is valuable
and universities from external sources, including competitive grants and research income from government, industry, business and community organisations.
Comparable data, verifiable through audit, is useful for comparing research performance across the system and within universities.
universities that do not have Medical Faculties will inevitably secure less funding than those with Medical Faculties.
in the case of funding by end users, because this information is not known to the university administration. Agree an international comparative database.
Comparable data, verifiable through audit, is useful for comparing research performance across the system and within universities.
Data needs to be adjusted to the scale and mission of the university. Employability of PhD graduates Industry employment of PhD graduates can be an indicator of the contribution of research to the highly skilled workforce. Used to measure the quality of the graduates,
A databank on university-related inventions should be developed. End-user Esteem Includes policy, technical or commissioned reports;
Number of Collaborations and Partnerships A count of national and international collaborations with other universities and/or with public and private bodies, NGOs, etc.
university-university, university-external stakeholder, national, European or international. Doctoral Completions The number of PhD and equivalent research doctorates and,
Data is verifiable by universities although there can be a time lag. Rates of completion may differ across disciplines.
or equivalent full-time (FTE) of 'research active' academics employed by a university. 'Research active' is established by setting threshold levels of performance for a specific period.
' Total R&D investment Total investment in university-based R&D (research and development) from all sources, including external research income
and university resourcing of research, including Investment in research is a strong predictor of research performance.
Can be difficult to fully calculate university resourcing of research. Research Infrastructure and Facilities Number of research laboratories, Books in the library and/or electronic journal access
Favours older, well-endowed universities. Develop appropriate comparative indicators. Research Ethics Comprehensive process ensuring good ethical practice is promoted and promulgated.
environmental and cultural context in which the university operates and the research is conducted. In addition, assessment of its impact and benefits should be included in recognition of the wider role and responsibility of university-based research.
Finally, knowledge clusters should be the basic unit of assessment, for five main reasons: 1. Universities differ in their internal structures;
2. Different disciplines tend to have different types of research units; 3. It enables new and emerging disciplines,
Consultation with Researchers and Universities: Development of both the Research Quality Framework (RQF) and the Excellence in Research for Australia (ERA) involves extensive consultation to ensure procedural fairness, transparency and acceptance by key stakeholders.
in Finland, France and the Netherlands, this also involves visits to the university. Peer review panels are also used to assess the quality of research outputs and outlets of individual researchers for career promotion,
Aalto and Helsinki Universities in Finland, and the Netherlands, include self-evaluation as a key component in the process.
Several countries and universities are experimenting with measuring societal impact, demonstrated through case studies, end-user opinion,
developed in 2005-07, Aalto University in Finland, and the Netherlands. Unit of Assessment: many case studies focus on the research discipline (Australia) or research unit (Germany,
Forschungsrating), making it possible to illustrate differences in research quality within individual universities. The CHE ranking in Germany deliberately does not enable aggregation across a whole university.
Although the case studies represent very different systems and objectives, they share some common positive attributes.
the Expert Group has developed a Multidimensional Research Assessment Matrix, discussed below. 5.2 Framework for Research Assessment Assessing university-based research is a complex process.
and interpreted in the context of the discipline, national circumstances, the university, etc. This can only be done by experts.
Involve the universities and other users, as appropriate, in Step 1 above; Step 3: Identify the appropriate indicators;
Self-evaluation. 6 Conclusion 6.1 Limitations and Unintended Consequences University-based research has become one of the most critical factors shaping national competitiveness and university reputation.
Some governments, public agencies and universities are drawing immediate and direct links between research assessment and resource allocation or accreditation.
Simple ranking of universities on the basis of bibliometrics or citations can ignore differences between disciplines and between university missions, resources and context.
and new universities. In other words, these indicators cannot easily measure potential. This underlines the need to combine indicators and expert knowledge,
While it might be appropriate to allocate resources to researchers or universities which have performed best,
to allocate resources to weaker universities in order to build up their capacity. Or, rather than using research assessment to drive differentiation,
it might be more useful to encourage university self-profiling. Finally, in recent years, concern has been expressed about the financial and human costs associated with research assessment.
or 'intelligent' presentation of data by universities and researchers anxious to ensure a good report.
and worldwide about the contribution and impact that university-based research is having on society and the economy.
The rise of worldwide ranking of universities is testament to this interest. This is the background against
which the Expert Group on Assessment of University-based Research was established by DG Research in 2008.
Because the results of research assessment can carry great significance for university, researcher and student reputation and status,
The Expert Group on Assessment of University-based Research has proposed a Multidimensional Research Assessment Matrix.
This would substantially enhance its user-friendliness. 6.3 Contribution to Future Research Assessment Exercises The AUBR Expert Group hopes that this report will raise awareness of the principles that need to be observed in assessment of university-based research
Believes that the Multidimensional Research Assessment Matrix provides the optimum basis for strategic decision-making by government and government agencies, universities and other stakeholder organisations;
and depth of university-based research can be fairly and appropriately assessed and benchmarked; Encourages a thorough assessment on a regular basis of university-based research along the lines proposed in this report rather than being guided merely by global rankings.
In sum, the Expert Group hopes that this report will serve as a guide to Users of information on the quality of university-based research,
Specialists engaged in assessment of university-based research, presenting them with a number of principles that need to inform assessment of university-based research.
The Expert Group also hopes that this report will provide inspiration to the European commission and Member State governments to launch initiatives
and projects designed to generate much needed comparable data and more appropriate and robust scenarios for the assessment of university-based research. 7 Appendix
I. Activities and Membership of Expert Group on Assessment of University-Based Research The members of the Expert Group were selected on the basis of their experience and knowledge of research assessment and higher education,
including a representative of the European Universities Association (EUA), plus two (2) international experts. MACKIEWICZ, Wolfgang (Chairperson); HAZELKORN, Ellen (Rapporteur); BERGHOFF, Sonja (Rapporteur); BONACCORSI, Andrea; BORRELL-DAMIAN, Lidia; EMPLIT, Philippe; INZELT, Annamaria; MARKLUND, Goran;
Paloma. The Expert Group was coordinated by Adeline Kroll (Scientific Officer, EC DG/RTD Directorate, Unit C4 Universities and Researchers;
he is the chair of the Expert Advisory Group FP7 Theme 8 and of the Assessment of University-Based Research Expert Group,
Wolfgang Mackiewicz studied English and German at FUB and at the University of Leeds, and wrote his PhD thesis on Daniel Defoe's 'Robinson Crusoe'.
and is associated also with the International Association of Universities (IAU). Ellen is Rapporteur for the EU Expert Group on Assessment of University-based Research,
and a member of National Digital Research Centre (NDRC) Management Board, the Arts, Humanities and Social sciences Foresight Working group Ireland,
She is responsible for the research indicators in the different CHE rankings and is especially the leader of the CHE Research Ranking for German universities.
She was responsible for the students' survey until 2006 and, from 2002, leader of the CHE Research Ranking of German universities.
or faculties in German universities that are most active in research. Sonja Berghoff was responsible for the bibliometric analyses in all disciplines as well as the aggregation and presentation of the results.
Sonja Berghoff studied Statistics at the University of Dortmund and wrote her PhD thesis on "Heteroscedasticity and autocorrelation consistent covariance matrix estimation within the linear regression model".
and a half years and in this way gained a lot of experience concerning the organisation of third-party funding at German universities, as well as evaluation aspects during the phase of (re-)application and examination.
Andrea is Professor of Economics and Management at the University of Pisa. He is author of papers in the most important journals in economics and policy of science, technology and innovation,
and the editor (with C. Daraio) of Universities and strategic knowledge creation (Edward Elgar, 2007).
He has pioneered the construction of datasets on universities in Europe, using microdata to carry out evidence-based policy analysis. He is a member of the HLEG on Assessment of University-Based Research and of the HLEG on the Future of Commission research policy at DG Research,
and advisor to the Italian Ministry of Economic Development on Structural Funds for innovation, as well as of several regional governments.
Lidia joined EUA in January 2006 and has been working within the areas of doctoral programmes and researchers' careers, university-industry collaborative research, knowledge transfer and also on the contribution of universities to regional innovation.
Lidia Borrell Damian holds a Doctorate in Chemical engineering from the University of Barcelona, where she was an assistant professor from 1990-1998.
She was a postdoctoral researcher in the US (North Carolina State University) and Canada (The University of Western Ontario
she moved to the public research management sector at the Pompeu Fabra University (Barcelona), where she was director of research services in 2003-2005, managing research projects, knowledge transfer activities and developing university research policy.
Email: Lidia.Borrell-Damian@eua.be EMPLIT, Philippe. Philippe is director of the Service OPERA-Photonics and Professor of Physics at the Université Libre de Bruxelles (ULB), in the Engineering Science and Human Sciences faculties.
advising, on a scientific basis, the university authorities in their strategy versus rankings development. He also is a member of the Governing board and of the Research Council of ULB
Philippe Emplit is a member of the Interuniversity Council CIUF of the Belgian Communauté française CFB, the organism responsible for collecting in a coherent way the statistics for all CFB universities.
to study the methodology of assessment and ranking of universities and business schools, as well as some aspects of the so-called econophysics.
She is a member of the Economic Advisory Council of Budapest Business School, a prime member of the doctoral school at the University of Szeged and a private professor at Budapest Corvinus University.
On various national and international training seminars she is teaching on STI indicators, HRST mobility and university-industry collaboration.
Her main research interest includes the theoretical and practical issues of innovation systems, and the innovative capabilities and performance of the different actors, business organisations and universities.
She is studying the 'third mission' of universities. The challenges imposed by the internationalisation of R&D are studied by her from the point of view of host countries.
and Observatory of European Universities working groups and author and co-author of several papers and articles originated from these projects.
He is also Associate professor in Economic History at Uppsala University with the focus on innovation and economic change.
He has also been a guest researcher at the Center for International Science and Technology Policy (CISTP) at George Washington University.
Henk is a senior staff member at the Centre for Science and Technology Studies (CWTS), Faculty of Social Sciences, Leiden University,
He obtained a PhD degree in Science Studies at the University of Leiden in 1989.
Arto is Professor of Russian Language at the University of Helsinki (UH). He has been involved in assessments and evaluations in different capacities.
which was responsible with two other universities for the Evaluation of Uppsala University, the Evaluation Committee for Estonian Departments of Russian Philology and the International Evaluation Panel for the Faculty of Arts of Jyväskylä University.
He has been a member of Steering committees for several university level research exercises. He was Vice-Rector at the UH in 1992-1998
He has published six monographs and numerous articles on scientific topics and a book on research ethics and another book on university administration.
Sir Howard Newby, KB, CBE, BA, PhD, AcSS, is Vice-Chancellor of the University of Liverpool. Sir Howard was previously Vice-Chancellor of the University of the West of England (UWE) in Bristol, having previously spent five years as the Chief Executive of the Higher Education Funding Council for England (HEFCE).
He was Vice-Chancellor of the University of Southampton from 1994 to 2001 and was previously Chairman and Chief Executive of the Economic and Social Research Council (ESRC).
From 1999 to 2001, Sir Howard was President of Universities UK, the UK body which represents the university sector.
He was also President of the British Association for the Advancement of Science for 2001-2002.
He was made a CBE in 1995 for services to social science and received a knighthood in 2000 for services to higher education.
He has been a corresponding member of the Academia Sinica of Taiwan since 2002, Adjunct Professor at the University of Science and Technology of Hong Kong since 2005, Honorary Professor at the University of Beijing since 2007,
Professor Rowley is Deputy Vice-chancellor Research (DVCR) at the University of Technology, Sydney, Australia. Prior to her appointment in 2004, she was the Australian Research Council's Executive director for Humanities and Creative Arts.
She began her academic career at the University of Wollongong in 1986 and was appointed UNSW Foundation Professor of Contemporary Australian Art and Head of the School of Art History and Theory in 1995.
and the Australian Council of University Art and Design Schools (ACUADS) and served on the Australia Council's National Infrastructure Committee.
President of the Wollongong Women's Centre in the 1980s and Convener of the Richmond Education Centre, established by Labor's Schools Commission in the 1970s) and university bodies (i.e. elected staff representative on the Councils of Prahran
Community College in the 1970s and the University of Wollongong in the 1980s). Her current roles include membership of various boards (Capital Markets CRC,
Sue is convener of the NSW D/PVCR Group and a member of the Executive of Universities Australia's D/PVCR Group.
Mr. Salmi has guided also the strategic planning efforts of several public and private universities in China, Colombia, Kazakhstan, Kenya, Madagascar, Mexico and Peru.
He also holds a Master's degree in Public and International Affairs from the University of Pittsburgh (USA) and a PhD in Development Studies from the University of Sussex (UK).
addresses the Challenge of Establishing World-Class Universities. E-mail: Jsalmi@worldbank.org SANCHEZ, Paloma. Paloma holds a PhD in Economics and is Professor of Applied Economics at the Autonomous University of Madrid (UAM).
She is currently the Director of an Interuniversity Master's and PhD Programme on Economics and Management of Innovation and the Director of a recently created Chair UAM-ACCENTURE.
She has been asked to evaluate research activities of University teachers, first, as a member of the National Committee to Assess Research Activities, in the Social sciences and Humanities area, of the Spanish Ministry of Education (2003-2004), second,
as a member of several evaluation committees appointed by different Spanish Universities. She has been Member of the Observatory of European Universities within the PRIME Network of Excellence.
She is currently the director of a research project commissioned by the Madrid Regional Government to assess the results of the Programas de Actividades de I+D entre Grupos de Investigadores de la Comunidad de
Over the last three years she has applied the intellectual capital framework to Universities and research institutions in order to help them to better measure, manage
Rankings compare universities using weighted indicators which are aggregated and then hierarchically ordered. Rating: ratings are a system of presenting the results of performance
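The weighted-aggregation step just described can be sketched in a few lines. This is a minimal illustration only; the indicator names, weights and scores below are invented for the example and do not come from any actual ranking methodology.

```python
# Hypothetical indicator weights and normalised scores (illustrative only).
indicator_weights = {"publications": 0.4, "citations": 0.4, "funding": 0.2}

universities = {
    "University A": {"publications": 0.9, "citations": 0.7, "funding": 0.5},
    "University B": {"publications": 0.6, "citations": 0.8, "funding": 0.8},
}

def composite_score(scores):
    """Weighted sum of normalised indicator scores for one university."""
    return sum(w * scores[name] for name, w in indicator_weights.items())

# Hierarchically order universities by their aggregated composite score.
ranking = sorted(universities,
                 key=lambda u: composite_score(universities[u]),
                 reverse=True)
```

Note that the ordering depends entirely on the chosen weights, which is one reason aggregate league tables should be read with caution.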
either by universities, or comes from an international database or increasingly from web-based technologies.
University-based data normally requires direct entry by researchers, often mediated through the Research Office.
verification is undertaken by universities and auditing may also be used to exclude publications which do not meet criteria for inclusion.
At national level, identifying categories for inclusion in data collection involves consultation with key discipline, research and university organisations and leaders,
Individual universities may be able to compile data on all categories of outputs but this needs a high degree of compatibility for cross-institutional and cross-national comparability.
Some universities have established major research centres in bibliometrics, including the Centre for Science and Technology Studies (CWTS) at Leiden University and the Research Evaluation and Policy Project (REPP) at the Australian National University.
These Centres use bibliometric data to undertake systematic evaluation and mapping of research at institutional, cross-institutional, national and international levels.
universities, discipline organisations and government agencies may decide to develop their own rankings of journals usually drawing on the journal impact factor reports.
and universities of the ranking assigned to journals, that is, consensus that 'tier 1 journals' are indeed the most prestigious journals for each field.
as appropriate, Research Masters degree Completions Universities and, in some cases, government agencies, collect data for this indicator.
If resources are attached to performance against this indicator, universities, faculties and centres and institutes are likely to proactively promote quality control
it is not even comparable across universities within one country. 9.6 RESEARCH ACTIVE ACADEMICS Description:
Number or Equivalent Full-Time (FTE) of 'Research Active' Academics Employed by a University. 'Research active' is established by setting threshold levels of performance for a specific period (e.g. the previous year
Some universities could require staff to achieve 2 or 3 out of these 4 indicators.
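The threshold rule described above (meeting, say, 2 of 4 indicators) is easy to sketch. The indicator names and threshold values below are hypothetical examples, since the report does not prescribe specific ones.

```python
# Hypothetical per-period thresholds for four research-activity indicators.
THRESHOLDS = {
    "publications": 2,       # peer-reviewed outputs in the period
    "external_income": 1,    # competitive grants won
    "phd_completions": 1,    # doctorates supervised to completion
    "conference_papers": 2,  # refereed conference contributions
}

def is_research_active(record, required=2):
    """True if the staff member meets at least `required` of the thresholds."""
    met = sum(record.get(name, 0) >= level
              for name, level in THRESHOLDS.items())
    return met >= required
```

Raising `required` from 2 to 3 corresponds to the stricter variant some universities could adopt as overall performance improves.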
Universities collect data for this indicator. Pro This indicator is useful to drive improvement in research performance in universities and in internal units (faculties, schools, departments).
Used as a proportion of total academic staff, this indicator can assist universities in building research capacity.
Con The threshold level can be raised as stronger performance is achieved across the institution. This approach to indicators can also be drilled down differentially to faculties and other units within universities. 9.7 RESEARCH OUTPUT PER ACADEMIC STAFF Description:
Number of publications and other outputs per academic staff member or Full-Time Equivalent (FTE). The total research output is divided by the respective number of staff.
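The calculation above is simple division, but part-time appointments mean the denominator should be summed FTE fractions rather than a head count. A minimal sketch, with illustrative figures:

```python
def output_per_fte(total_outputs, fte_fractions):
    """Divide total research output by the summed FTE of the staff list."""
    total_fte = sum(fte_fractions)
    if total_fte == 0:
        raise ValueError("no staff FTE recorded")
    return total_outputs / total_fte

# e.g. 45 outputs across three full-time academics and one half-time
# academic: 45 / 3.5, i.e. roughly 12.86 outputs per FTE.
rate = output_per_fte(45, [1.0, 1.0, 1.0, 0.5])
```

Using FTE rather than head count matters for the comparability caveat noted below, since universities and countries define the staff category differently.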
Pro This indicator provides a useful way to measure the contribution that individual active researchers make to the total university output.
Con Because universities are involved in a wide range of activities not all academic staff may be research active.
Number of Co-Publications within the Unit of Assessment or Collaborative Work with Researchers from other Universities/Organisations.
best done by individuals and reported by the university. Pro Collaborative work with researchers in other universities, both nationally and internationally, shows the extent of a research engagement.
The latter is an important indicator of internationalisation, which is itself a measure of peer esteem.
Level of funding attracted by researchers and universities from external sources. This indicator measures competitive grants and research income from government, industry, business and community organisations.
Universities usually collect data, although data may be provided by sponsoring organisations. Pro Research income is a useful indicator for measuring the scale of the research enterprise and its capacity to secure additional income through competitive grants and contract research, especially in science, technology and medicine.
and can be useful for comparing research performance across the system and within universities. The willingness of industry to pay for research is a good lead indicator of value-for-money and usefulness
and level within the university, faculty and other units in order to make internal comparisons. Competitive grants and funding are valuable indicators of past research performance
universities that do not have medical faculties will inevitably secure less funding. It may be difficult to collect data from end users
e.g. the number of academic staff or FTE, because of the way different universities and countries consider this category.
Total Investment in University-Based Research from All Sources This indicator includes all university allocations (e.g. investment) in research allocated from the government block or operating grants and externally-earned income, e.g. international
Universities collect data for this indicator and self-nominate levels of funding for research from consolidated revenues.
This indicator links university R&D investment to government and business expenditure on R&D (BERD) at regional and national level.
the environment the university provides for conducting research. Pro This indicator measures the research environment as a predictor of research capability
because it is difficult to characterise the strength of a research university without being able to gauge the quality of the infrastructure.
There are also differences between universities which are focused on the sciences rather than the arts and humanities.
the number of disclosures, indicated to the appropriate university office, e g. Technology Transfer Office, of possible inventions to be considered for patenting;
Universities usually report regularly on patents, LOAs, start-ups, etc. Verification is complicated but possible: income and equity data are included in audited financial reports
Industry Employment of PhD Graduates Universities track the career destinations of their PhD graduates and alumni via post-graduation Career Destination Surveys and Alumni Databases.
The information does not always specify the universities from which PhD holders graduated; hence their use for research assessment of universities is limited at this stage.
Moreover, employability may be a factor external to the university, e.g. the state of economic development or point in the economic cycle.
It is also difficult to align workforce information on PhD holders to the specific universities that trained them. 9.10.3 SCIENTIFIC PARTNERSHIPS AND COLLABORATIONS Description:
Number of Collaborative Projects with External Partners or Participation in Programmes Designed to Foster Collaboration.
This information is generally collated at the university level. Pro This is an important indicator for measuring scholarly involvement with other researchers and, in turn, the extent to
and the level of attractiveness to researchers in other universities and countries. Con The data is only in the early stages of being defined and collected.
so the university's data may not be complete. Results are often only internally published; the amount of money given is often not public. 9.11 END-USER ESTEEM Description:
and submitted by universities; discipline-appropriate assessment by 13 discipline-grouping panels of expert peers,
and accessed through digital repositories created by each university; the impact on the broader economy and society of the Research Group's research to be demonstrated primarily through Case studies;
and research training in universities (the actual quantum and programs had not yet been announced). ERA is intended to be differentiated from the RQF by its greater use of indicators and more 'streamlined' processes.
Universities will report on research outputs by Field of Research codes. ERA is currently under development by the Australian Research Council.
compare Australia's university research effort against international benchmarks; create incentives to improve the quality of research;
promote collaboration between institutions and between university researchers and end users; encourage scale and focus and thereby efficient use of research infrastructure and resources;
As with the previous RQF, ERA involves exhaustive consultation with researchers and the 39 universities in the Australian system.
The Government will report outcomes for each university by Fields of Research at the twodigit and four-digit levels,
It is not intending to generate an integrated ranking of Australian universities. Intended and Unintended Consequences:
Some or all of the university block grants, currently based on performance-based indicators, for infrastructure, research training
Greatly improved technical capacity for data collection across Australian universities. Greater concentration of research funding in universities that are 'research intensive' (a category that tends to coincide with larger, older universities with strength in natural sciences and medicine).
Greater concentration of resources (funding, staffing, scholarships etc) within universities in areas of recognised research strength and strategic fit with university profiles.
Raised international standing for Australian research and increased international collaboration by researchers. Unintended consequences: Destabilisation and 'churn' in the system as the pressure to recruit talented staff results in rapid-paced mobility.
10.2 UNIVERSITÉ LIBRE DE BRUXELLES, BRUSSELS, BELGIUM Executive Summary In Belgium, assessment of university-based research has not yet been undertaken at national or regional/community level.
The fields of research represented at ULB were divided by the university authorities into ten mutually exclusive disciplines, two
and obtain a proper understanding of the University's research output and its evolution; to improve research performance and quality,
to develop a new internal managerial and governing tool for the university authorities and for the discipline-specific research teams.
who will be in charge of all contacts (with the university and the peer review panel) during the assessment process;
an academic coordinator is nominated by the university authorities (usually from outside the country); a panel of 10 experts is convened by the academic coordinator;
The ULB Research Assessment Exercise is similar to the system in operation since 1996 at ULB's sister university, the Vrije Universiteit Brussel (VUB),
pp. 45-57. 10.3 FINLAND (AALTO UNIVERSITY) Executive Summary The research evaluation included some innovative ways of using the peer review method.
Research Evaluation of Aalto University 2009 (AALTO Evaluation) Policy Context, incl. circumstances under which the exercise came about:
The Helsinki University of Technology (TKK), the Helsinki School of Economics (HSE), and the University of Art and Design Helsinki (TaiK) are in the process of merging.
The new university will be called Aalto University. The idea underlying the merger is to create a world-class research university.
Aalto University is to become operational in August 2009. Policy Objective(s): According to the Charter of Foundation, Aalto University's activities are based on top-level research.
Thus, conducting a research evaluation even before the new university started its work was a logical decision.
As a result of the evaluation, senior management will know in which research areas Aalto University achieves the best results
or has the potential for reaching the highest international level, and in which areas additional support is needed.
At the same time, the evaluation will provide a benchmark for further development of research. Methodology, incl. time-frame, resources, costs, technologies:
The overall procedure of the evaluation was rather traditional: Thorough planning of the evaluation process by the Steering committee.
undertaken by nine international panels (all panels visiting the University at the same time). The panels were asked to prepare written assessments
Are the doctoral graduates of the Unit hired by the leading universities across the globe?
however, hope that compiling the documents urged them to think about these important things. 10.4 FINLAND (HELSINKI UNIVERSITY) Executive Summary The case study describes the research assessment exercise carried out by Helsinki University for its own purposes.
Research Assessment Exercise of the University of Helsinki 2005 (UH RAE) Policy Context, incl. circumstances under which the exercise came about:
In Finland several types of university assessments are in place: The Academy of Finland carries out international assessments of research fields at irregular intervals.
and teaching and in this context identifies, at three-year intervals, high quality educational university units, which are awarded substantial additional resources.
FINHEEC is also responsible for auditing procedures for universities and polytechnics. A number of individual universities carry out research assessments,
and evaluations of teaching and education on their own initiative. The first research assessment of this kind was carried out by the University of Helsinki in 1999.
Several other Finnish universities have organized since similar exercises. Their purpose varies: to enhance research within the university in question;
to gain visibility in the media and the opportunity to draw attention to the high level of research;
to acquaint foreign researchers (experts serving on evaluation panels) with the research carried out and the researchers working at the university in question.
Policy Objective(s): Helsinki University is one of the founders of the League of European Research Universities.
As part of its strategy, it regards research and the training of researchers as its main objectives.
including information on the Finnish university system and research policy. They then spent one working week in Helsinki,
Publication of results on the university web site: 1 March 2006 (Summary Report, Individual Evaluation Reports). Organization:
Panellists are aware that their judgment will have a negative or positive impact on the allocation of resources to their own fields at the university concerned.
Perhaps the only way to obtain more reliable quantitative assessment results is to compare the outputs of departments with those of similar units at other universities. 10.5 FRANCE Executive Summary The case study describes the research assessment
one section is concerned with the evaluation of higher education (including universities) and research institutions as a whole, the second with the research undertaken in these institutions,
The French Ministry of Higher Education and Research established the AERES national agency two years ago to undertake, at national level, the evaluation of research in both research organisations and universities.
The organisations (National Committee of the CNRS and the National Council of Universities for Higher education institutions) which used to be responsible for the evaluation have not been disbanded.
and by one of France's 85 universities, as well as the units only supported by the universities,
and not linked to the CNRS etc. The evaluation committees convened by AERES are asked to identify excellence among the teams
2. Number of full research fellows (chercheurs) in the lab, compared to other members (university faculty) who teach
The evaluation reports are sent to the research teams and to the presidents of the universities,
universities and non-university research institutions in Germany are evaluated in a single, comprehensive exercise. In a multi-step assessment process, the institutions were evaluated first by at least two experts independently before each rating was discussed in plenary sessions.
but universities do use it to benchmark performance. Both rankings have helped to make it clear that there are differences between German universities.
Name/Title of Research Assessment Exercise: CHE Universityranking, CHE Researchranking Policy Context, incl. circumstances under which the exercise came about:
and involving the universities in the development process. The CHE is an independent nonprofit organisation.
The CHE University ranking is designed to help prospective students make an informed choice of study program and university,
while the CHE Research Ranking of German Universities addresses scientific communities or universities. The latter is designed to identify research-active departments at German universities in specific disciplines.
It may also be used as a benchmarking instrument for universities. Both rankings are meant to develop transparency of university performance in different fields
and to stimulate competition between universities. Methodology, incl. time-frame, resources, costs, technologies: The CHE Rankings provide an indicator-based, multidimensional system.
Both rankings are discipline specific, covering about 25 disciplines (CHE Research Ranking: 15 disciplines), each of which is revisited every three years.
The indicators are based on different data sources: data collected directly at the universities, publication databases (Web of Science and national databases of scientific-scholarly publications) and a survey conducted among professors.
The main characteristics of the CHE Rankings are the following: no aggregation of indicators across the whole of a university,
but subject-specific data analysis and presentation of results; no weighted or non-weighted total value for the research performance of a given department,
but profiles of universities. Indicators used for research assessment: Amount of third-stream funding (per researcher):
Different sources of third stream funding may be reported by the universities; no distinction is made between different sources.
Since 2006, in Germany, all inventions made by researchers at universities are in the first instance owned by the universities.
Every researcher who wants to have an invention patented must first inform his/her university,
and only if the university declines to patent the invention can the researcher do so on his/her own.
This regulation makes it possible to ask for the number of inventions registered with the universities' offices dealing with the transfer of knowledge and technologies.
In the given case, CHE requested information about all the inventions reported to the universities in the previous three years by researchers from the fields included in the Ranking.
In the institutional survey, universities were asked for the number of doctorates completed in their departments over the previous three years.
The universities contribute to the exercise by completing the institutional questionnaires and helping with the logistics of the student survey.
Intended consequences: The ranking has had some influence on the choice of university made by specific groups of students;
the ranking helped to make it clear that there are differences between universities. Unintended consequences: The quality of the data is improving because of public pressure since the publication of the first CHE Research Ranking.
At the same time, universities have begun to provide data in an "intelligent" way in order to improve the position of their institutions in the Rankings.
The results of the Initiative were based on the assessment of universities' plans for the future, undertaken by an international panel.
The Initiative for Excellence of the German Federal government and the state governments aims at promoting excellent research at German universities.
The Initiative for Excellence aims at supporting top-level university-based research; improving its international visibility; creating outstanding conditions for young scientists at universities; deepening cooperation between disciplines and institutions; strengthening international cooperation in research; promoting equal opportunities for men and women; and intensifying scientific and academic competition.
1. Graduate schools for the promotion of young researchers 2. Clusters of Excellence for the promotion of top-level research 3. Institutional Strategies for advancing top-level university research.
In the pre-selection round, universities submitted Draft Proposals. These were reviewed by internationally appointed panels of experts.
The universities chosen from this stage subsequently presented their full proposals. These were assessed in an identical procedure.
In the first round of funding, 319 Draft Proposals were submitted by 74 universities. Of these, 90 proposals (39 Graduate Schools, 41 Clusters of Excellence
Of these, 38 proposals submitted by 22 universities were selected for funding. They will be funded up to November 2011 at a total of 873 million euro. 305 Draft Proposals were received in the second round of funding,
A total of 47 proposals submitted by 28 universities were selected for funding. They will have received a total of more than one billion euro by November 2012.
These funds are to be available for the universities and their partner institutions for research and the promotion of young researchers until 2012.
on the universities and other institutions participating in the Initiative, along with the names and contact details of the key people involved.
and the viability of university-based research. Intended and Unintended Consequences: Observations/Additional Comments: 10.9 HUNGARY Executive Summary The research assessment related exercise is based on the Act on Higher Education (2005) and its complementary law (2007).
After the recognition of systematic weaknesses of the Agreements, both the Ministry and the research-oriented universities wish to modify the indicative indicators for assessment.
The HAS research assessment is an up-to-date version of West European practices. 10.10 IRELAND Executive Summary The Sunday Times Irish Universities League Table is a relatively basic ranking system,
The Sunday Times Irish Universities League Table http://extras.timesonline.co.uk/tol_gug/gooduniversityguide.php http://extras.timesonline.co.uk/stug
The Sunday Times six years ago launched the first Irish universities league table in a 94-page University Guide.
Universities are ranked according to marks scored in six key performance areas: Average points required for HE entry (universities: HEA 2007 data; institutes: 2006 Department of Education and Science data). Completion rates: percentage of 2002 entrants who completed the courses for which they enrolled by conferring in 2007 (source: HEA 2007 data, universities only). Nonstandard entry: individual institutions' 2007 intake. Sports facilities: assessment by The Sunday Times in consultation with students' unions.
The Sunday Times University Guide is not a research assessment exercise but has been used as such because it highlights indicators on research.
The Sunday Times University Guide is not a significant sales driver, typically adding less than 3% on 1.4m sales.
etc. 10.11 ITALY Executive Summary In Italy, the evaluation of university-based research witnessed two main periods:
- in 2009, a new Agency for the Evaluation of the University System and Research (ANVUR) was established by the Government,
taking the role of CIVR (evaluation of research) and of CNVSU (National Committee for the Evaluation of University System-evaluation of universities).
Grades were aggregated at department level and then at university level. The experience was considered almost unanimously positive.
No implications for university funding were derived in subsequent years, until 2009, when the 2001-2003 data were used by the Government to allocate, together with other indicators, 7% of the overall university funding.
Name/Title of Research Assessment Exercise: Comitato di Indirizzo per la Valutazione della Ricerca (CIVR) Committee for Steering of Evaluation of Research Policy Context, incl. circumstances under which the exercise came about:
- university departments - Public Research Organisations (CNR, INFN, INFM, ENEA and many others). Policy Objective(s):
To support internal evaluation bodies at University level and at PRO level to follow common methodologies
Carry out systematic and periodic evaluation of research produced by Universities. Methodology, incl. time-frame, resources, costs, technologies:
First mandate to evaluate research by Minister of University and Research in 2002. Guidelines for evaluation of research presented by CIVR in 2003.
Approved by the Minister of University and Research in December 2003. Departments were asked to identify a small number of research outputs.
In particular, each University was asked to submit 0.5 products per full-time-equivalent researcher covering the last three years;
All universities and PROS were invited to submit their research products before June 2004, while complementary information on human resources, financial resources, other research outcomes,
Rather, in 2007 the new Government (Minister of Research Mussi) approved the creation of a new Agency, called the National Agency for the Evaluation of the University System and of Research (ANVUR).
The goal of making research evaluation a pillar of University funding has nevertheless been declared in Government initiatives. In fact, Decree no. 180 of November 10th,
2008, stated that a portion of Government funding to universities, not less than 7%, will be allocated following indicators regarding quality of education and research and efficiency in organization.
On 24th July 2009 the Government finally issued a Decree with which 7% of the block funding to universities (FFO, Fondo di Finanziamento Ordinario),
i.e. 525 million euro, was allocated to universities according to the following criteria: - teaching (1/3 of total):
50% in proportion to the grade received by the University from CIVR in 2006; 30% according to the share of EU funding (VI and VII Framework Programmes); 20% according to the share of funding from the Ministry of Research in competitive grants
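The allocation arithmetic described above can be sketched as follows. This is a minimal illustration, not the official formula: only the 525 million euro pot, the 1/3 teaching weight and the 50/30/20 research sub-weights come from the text; the 2/3 weight for the research component and the example university shares are assumptions introduced here.

```python
# Hedged sketch of the 2009 FFO performance-based allocation.
TOTAL = 525_000_000  # 7% of the FFO block grant, in euro (from the text)

def university_share(teaching_share, civr_grade_share, eu_funding_share, ministry_share):
    # Each argument is a university's share (0..1) of the national total for
    # that criterion; the 2/3 weight on research is an assumption here.
    teaching = (1 / 3) * teaching_share
    research = (2 / 3) * (0.50 * civr_grade_share    # CIVR 2006 grade
                          + 0.30 * eu_funding_share  # FP6/FP7 funding share
                          + 0.20 * ministry_share)   # competitive Ministry grants
    return teaching + research

# Hypothetical university holding 2%, 3%, 1% and 2.5% of the national totals.
allocation = TOTAL * university_share(0.02, 0.03, 0.01, 0.025)
```

With these illustrative shares the university would receive 11.55 million euro of the performance-based pot.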
The Ministry published a list of universities with the percentage of increase or decrease of funding resulting from the application of these criteria.
Meanwhile, a new Agency for the evaluation of industrial research, not having competence on University-based research,
Universities receive a score representing the proportion of departments ranked top. Names of experts are kept secret.
universities and academic communities in order to present methodology and discuss results. Intended and Unintended Consequences: The CIVR process had very positive consequences in terms of demonstration of feasibility of the evaluation exercise and credibility of methodology and quantitative scores.
leaving some frustration in the academic community supporting the evaluation culture; 2. some politics at the level of universities,
trying to immunize universities from the potential impact of evaluation (i.e. differential funding for good
the CIVR evaluation refers to the 2001-2003 period) and that universities with very different size, age and subject mix have been treated in the same way.
The three main Dutch organisations responsible for publicly funded research, the universities, the Royal Netherlands Academy of Arts and Sciences (KNAW) and the Netherlands Organisation for Scientific Research (NWO), defined a protocol for practical
i.e. the board of a specific university, of KNAW or of NWO, is responsible for the organisation of the evaluation of that institute
for example, researchers may work both in an Academy institute and in a university-based research school.
On the basis of the report, the advisory board's advice and discussions with the institute, the university/KNAW/NWO board will draw conclusions for the future of the institute.
evaluating all research groups in all Netherlands universities at the same time (e.g. computer science, chemistry). In other disciplines (e.g. physics), several evaluation committees were established,
Leiden University and Delft University of Technology in the field of physics. 10.13 NORWAY Executive Summary A new model for result-based university research funding was established in Norway in 2006.
The main policy objective was to stimulate increased research activities and allocate resources to centers performing excellent research.
'New Model for Result-Based University Research Funding'. Policy Context, incl. circumstances under which the exercise came about:
In 2006, Norway adopted a partly new system for allocating funding to universities and colleges.
and reallocated annually, based on the performance of universities and colleges during the two previous years, according to the four sets of indicators.
Allocations are made nationally to the universities and colleges, which then decide how to allocate these funds locally.
The indicator for scientific publications was developed by the Norwegian University and College Council, on commission by the Norwegian Government.
to fully understand the workings and consequences of the system, particularly in relation to strategic decisions at the level of universities and colleges.
The new Norwegian model for university funding has been one of several inspirations for the new Swedish model for block funding of universities.
and potentially important for research policy within the EU. 10.14 SPAIN Executive Summary The research outputs of university teachers in terms of publications are evaluated on a voluntary basis every six years.
In 1989, a new scheme was launched to increase the research activities of both university teachers and researchers in public organizations.
10.15 SWEDEN Executive Summary A new model for allocation of university block grants was established in Sweden in the Governmental Research Bill in 2008.
The main policy objective was to incentivise strategic university management for increased research quality. The methodology is based on two components:
It is on the basis of these two components that the increased funding of universities is going to be allocated.
Allocation of General University Funds (GUF) or block grants. Established in the Governmental Research Bill, October 2008.
'The resources that are allocated directly to universities and colleges represent the basis for the activities of higher education institutions.
but probably not substantial at the university level, i.e. in addition to the money that would anyway be needed for management purposes.
However, the detailed descriptions and analyses of the performance of the various universities have not been published
in order to facilitate improvements in university research. Intended and Unintended Consequences: Intended consequences are to generate incentives
and resources for universities to prioritize, manage and perform research in a way that improves the scientific quality and,
or in terms of its benefits for society and industry. 10.16 UNITED KINGDOM Executive Summary Since 1986 the UK national funding bodies have evaluated the quality of research in UK universities through peer review
and will inform research funding for universities in 2009-10. Inevitably, the RAE results are converted by the media into league tables for ranking the quality of subject areas and universities.
The RAE is developing into a new Research Excellence Framework which has the intention of blending a lighter touch peer review with bibliometric indicators where these are appropriate.
an initiative of the Cybermetrics Lab (Spain), has produced the 'World Universities' Ranking on the Web'
since 2004, measuring the web presence of universities around the world and comparing the size and scale of their web presence against expectations based on other rankings.
Webometrics produces the 'World Universities' Ranking on the Web', which calculates the web presence of universities around the world
and compares the size and scale of the web presence against expectations based on other rankings.
and visibility of the institutions and it is a good indicator of the impact and prestige of universities' (Webometrics, 2008).
The Webometrics world universities ranking are initiatives of the Cybermetrics Lab, which is a research group within the Centro de Información y Documentación (CINDOC).
or site design but instead on universities'web presence as illustrative of institutional outputs and web visibility.
university authorities should reconsider their web policy, promoting substantial increases of the volume and quality of their electronic publications.'
or page design but global performance and visibility of the universities. As other rankings focused only on a few relevant aspects, especially research results,
university authorities should reconsider their web policy, promoting substantial increases of the volume and quality of their electronic publications.'
not narrowing the analysis to a few hundreds of institutions (world-class universities) but including as many organizations as possible.
'the current objective of the Webometrics Ranking is to promote Web publication by universities, evaluating the commitment of these organizations to electronic distribution
which is evident even among world universities from developed countries. However, even when we do not intend to assess universities' performance solely on the basis of their web output,
the Webometrics Ranking is measuring a wider range of activities than the current generation of bibliometric indicators, which focuses only on the activities of the scientific elite.'
visibility and impact of the web pages published by universities, with special emphasis on the scientific output (refereed papers, conference contributions, pre-prints, monographs, theses, reports).
Search engines are important for measuring the visibility and impact of universities' websites. There are a limited number of sources that can be useful for Webometric purposes:
so only universities and research centres with an independent web domain are considered. If an institution has more than one main domain,
the economic factor is considered not a major limitation (at least for the 3,000 top universities).' 'The only source for the data of the Webometrics Ranking is a small set of globally available, free-access search engines.
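The general idea above, several search-engine-derived web-presence indicators combined into one position, can be illustrated with a short sketch. The indicator names, the rank-combination scheme and the weights below are assumptions introduced for illustration, not the Cybermetrics Lab's published formula.

```python
# Illustrative composite of web-presence indicators. Each indicator (here
# hypothetical page and inbound-link counts from search engines) is converted
# to a rank, and the weighted ranks are combined; lower combined rank = better.

def composite_ranks(universities, weights):
    # universities: {name: {indicator: raw value}}; higher raw value = better.
    scores = {u: 0.0 for u in universities}
    for ind, w in weights.items():
        ordered = sorted(universities, key=lambda u: -universities[u][ind])
        for rank, u in enumerate(ordered, start=1):  # 1 = best on this indicator
            scores[u] += w * rank
    return sorted(scores, key=scores.get)  # best overall first

unis = {
    "Univ A": {"pages": 90_000, "links": 40_000},
    "Univ B": {"pages": 50_000, "links": 70_000},
    "Univ C": {"pages": 20_000, "links": 10_000},
}
# Assumed weights, with link visibility weighted more heavily than size.
order = composite_ranks(unis, {"pages": 0.4, "links": 0.6})
```

With these assumed weights, the heavily linked Univ B outranks the larger Univ A, showing how the weighting, not raw size alone, drives the final position.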
of Education at Shanghai Jiao Tong University (China), was the first international league table, offering a scientific mechanism for comparing universities around the world.
ARWU uses several comparative and seemingly objective indicators of academic or research performance, including alumni and staff winning Nobel Prizes and Fields Medals,
Shanghai Jiao Tong University (SJTU) Academic ranking of world universities (ARWU) Policy Context, incl. circumstances under which the exercise came about:
developed to offer a scientific mechanism for comparing universities around the world. Starting in 2001, researchers in the Institute of Higher Education at Shanghai Jiao Tong University (SJTU) were compelled to develop a quantitative mechanism for comparing Chinese higher education,
to assess the gap between Chinese universities and world-class universities' (Liu and Cheng, 2005, 127).
As part of a comprehensive public agenda for development within the higher education sector, China was committed to promoting world-class standards among at least its best institutions.
as well as others not listed, are interested in rankings of universities for different purposes. At the time of its inception as an idea in 2001,
there was no ranking of world universities using multiple criteria to establish a single perspective on research output
The methodology used by Shanghai Jiao Tong University to develop its own World University rankings relies exclusively on seemingly objective indicators,
We have scanned every university that has any Nobel laureates, Highly Cited Researchers, or papers published in Nature or Science.
In addition, we scanned major universities of every country with significant amount of papers cited by SCIE and SSCI.
In total, we have collected data on about 2000 universities. http://ed.sjtu.edu.cn/rank/2003/FAQ.htm, retrieved November 5, 2008.
Universities are ranked by several indicators of academic or research performance, including alumni and staff winning Nobel Prizes and Fields Medals,
ARWU does not explore the overall operational capacity of universities with regard to such inputs as admission rates or the educational backgrounds of its faculty.
The researchers at SJTU are clear in disclaimers on their website that it would be impossible to have a comprehensive ranking of universities worldwide, because of the huge differences of universities, in the large variety of countries and funding capacities,
Liu, Nian Cai and Cheng, Ying (2005) 'The Academic Ranking of World Universities', Higher Education in Europe, vol. 30, no. 2, pp. 127-135. 10.19 GLOBAL - THE-QS WORLD UNIVERSITIES
the WUR intends to meet the needs of consumers (students, academic staff, researchers, policy makers) seeking reliable information about universities around the world.
THE-QS World Universities Ranking Policy Context, incl. circumstances under which the exercise came about:
These reviews evolved into data driven, comprehensive national institutional rankings (Times Good University Guide) in the 1990s.
and of new consumers who were seeking reliable information about universities across the country. The expansion from producing national rankings to developing an international one resulted from the recognition that student mobility was on the rise at a time
The methodology for this ranking supports a more holistic and subjective view of universities'relative strengths by seeking comparisons based on international reputation.
10.20 GLOBAL - PERFORMANCE RANKING OF SCIENTIFIC PAPERS FOR RESEARCH UNIVERSITIES Executive Summary First published online in 2007, the Performance Ranking of Scientific Papers for Research
Universities (PRSP) was developed by the Higher Education Evaluation and Accreditation Council of Taiwan to gauge the research productivity of the best universities in the world.
PRSP employs bibliometric methods to analyze and rank the scientific papers performance of the top 500 universities in the world from an overall listing of 3000 institutions.
The performance measures are composed of numerous indicators within three different criteria of scientific papers' performance:
Performance Ranking of Scientific Papers for Research Universities, Higher Education Evaluation and Accreditation Council of Taiwan. Policy Context, incl. circumstances under which the exercise came about:
First published online in 2007 and now in its second year, the Performance Ranking of Scientific Papers for Research Universities (PRSP) was developed to gauge the research productivity of the best universities in the world.
This performance ranking targeted research-oriented universities, especially those in newly developed countries. Through objective indicators which would also reflect short-term efforts,
each university would be able to understand its position and advantages in the world rankings,
and from there it would know how it fares against other universities, and it can track its annual progress in terms of the quality and quantity of its scientific papers' (HEEACT, 2007).
and rank the scientific papers performances of the top 500 universities in the world' (HEEACT, 2008).
The list then is reduced to exclude non-university institutions, and the remaining institutions are compared against other rankings, resulting in a final listing of approximately the top 500 institutions.
PRSP, finally, compares these universities'outputs using data from ISI's ESI, Web of Science (WOS),
and did not include other frequently used university evaluation indices such as teaching, research, and administration, nor did it emphasize academic performance indices such as reputation and extraordinary achievements.
The indices designed for this performance ranking study were suitable as reference especially for research-oriented universities in newly developed countries.
and the number of subject fields where the university demonstrates excellence (fields of excellence). The PRSP staff analyzes all the SCI/SSCI bibliographic records in
which the address field contains one of the known forms of the university name, removing duplicate records containing different forms of that university's name.
They obtain the total number of citations by adding the number of citations on each of the articles from that university from its inclusion in SCI/SSCI to date.
To determine which citations are from separate, independent campuses within a single university system, which are often combined into one SCI/SSCI listing,
PRSP staff researchers manually identify the actual number of articles and citations within the SCI/SSCI to identify those produced by each individual campus. This staff-intensive approach is also used to ensure the 'highly cited ESI' calculations are representative of overall institutional quality and not an outlier.
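The record-matching and citation-counting steps described above can be sketched roughly as follows. This is an illustrative reconstruction, not HEEACT's actual pipeline: the record structure, the field names and the toy list of name variants are assumptions introduced for the example.

```python
# Sketch of PRSP-style aggregation: match SCI/SSCI records whose address field
# contains a known name variant of the university, skip records already counted
# under another variant, and sum papers and citations.

records = [  # hypothetical bibliographic records
    {"id": "A1", "address": "Univ. of Exampleton, Dept. of Physics", "citations": 12},
    {"id": "A2", "address": "Exampleton University Medical Campus",  "citations": 7},
    {"id": "A1", "address": "University of Exampleton",              "citations": 12},  # duplicate of A1
]
variants = ["univ. of exampleton", "university of exampleton", "exampleton university"]

def aggregate(records, variants):
    seen, papers, citations = set(), 0, 0
    for rec in records:
        addr = rec["address"].lower()
        if any(v in addr for v in variants) and rec["id"] not in seen:
            seen.add(rec["id"])  # count each record once, whichever variant matched
            papers += 1
            citations += rec["citations"]
    return papers, citations
```

On the toy data this yields 2 papers and 19 citations: the third record is recognised as a duplicate of the first and not double-counted, which is the point of the name-variant cleaning step.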
When universities obtain similar scores, the slight differences in the final scores may not necessarily indicate superiority in scientific research.
This performance ranking is neither a 'reputation ranking' nor an 'academic ranking' of universities, so some people might feel that the outcome might be different from
In addition, for those universities in a given country that were included in the ranking, there might have been some discrepancies in their actual rank and
but the relative position in rank between universities of the same country may still match society's expectations.
Summary The Centre for Science and Technology Studies (CWTS) at Leiden University has developed a ranking system of universities entirely based on its own bibliometric indicators.
The work focuses on all universities worldwide with more than 700 Web of Science indexed publications per year.
Approximately the 1000 largest universities in the world (in terms of number of publications) are covered. Name/Title of Research Assessment Exercise:
During the past years, rankings of world universities have become increasingly important and popular. At a global level universities increasingly compete to attract the best students and research workers,
and the European Commission launched the concept of a European Research Area. The press started publishing rankings of higher education institutions
The Centre for Science and Technology Studies (CWTS) at Leiden University is specialised in the development and application of research assessment methodologies based on bibliometric indicators.
Its members developed a critique of existing rankings of universities such as the Shanghai (ARWU) Ranking and the QS World Universities Ranking
They applied the methodologies they had developed in many bibliometric studies of individual universities during the past two decades to a large collection of world universities.
reflect a university's recent research performance more adequately than the other ranking systems do.
and upon the criteria applied to select universities in the ranking. Methodology, incl. time-frame, resources, costs, technologies:
'because standardisation using the impact of publications in the given field prevents non-generalist universities such as engineering schools
or technical universities from being penalised by the citations-per-publication calculation. Total number of publications multiplied by the relative impact in the given field. The system presents these rankings for collections of universities:
Europe: the 100 and the 250 largest universities in Europe for the period 2000-2007. World: the 100 and the 250 largest universities worldwide for the period 2003-2007. These two different size thresholds illustrate clearly how smaller universities that are not present in the top 100 (in size) may take a high position in the impact ranking
if the size threshold is lowered. Bibliometric data are extracted from a bibliometric version of Thomson Reuters' Web of Science,
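The indicator named above, publication count multiplied by field-normalised impact, can be sketched as follows. This is a hedged toy illustration, not CWTS's exact computation: the field baselines, the toy publication list and the simple averaging of per-paper ratios are assumptions introduced here.

```python
# Sketch of a Leiden-style indicator: publication count P, field-normalised
# impact (each paper's citations divided by its field's mean citation rate,
# then averaged), and their product.

field_mean = {"engineering": 4.0, "medicine": 10.0}  # assumed mean citations/paper

pubs = [  # (field, citations) for one hypothetical technical university
    ("engineering", 8), ("engineering", 4), ("medicine", 10),
]

def indicators(pubs, field_mean):
    P = len(pubs)
    # Normalising by the paper's own field is why an engineering school with
    # low raw citation rates is not penalised against medical schools.
    rel_impact = sum(c / field_mean[f] for f, c in pubs) / P
    return P, rel_impact, P * rel_impact
```

Here the engineering papers score 2.0 and 1.0 relative to their field's baseline rather than looking weak against medicine's higher raw citation rates, which is the standardisation effect the quoted passage describes.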
Important factors in the interpretation of bibliometric rankings of world universities. Research Evaluation 17, 71-81. 11 Appendix V. Bibliography This list of publications does not represent a comprehensive bibliography.
universities for Europe's competitiveness in a global knowledge economy, 16096/1/07 REV 1. Retrieved 26 July 2009,
on approaches to improve the excellence of research in universities' Executive Summary. European Commission (2005a) European Universities.
Enhancing Europe's Research Base DG Research, Brussels. Report by the Forum on University-based Research.
Retrieved 26 July 2009, from http://ec.europa.eu/research/conferences/2004/univ/pdf/enhancing_europeresearchbase_en.pdf. European Commission (2005b) Working together for growth and jobs:
Retrieved from http://eurlex.europa.eu/LexUriServ/site/en/com/2005/com2005_0488en01.pdf. European Commission (2006) Delivering on the modernisation agenda for universities:
' Retrieved 26 July 2009, from http://ec.europa.eu/commission_barroso/figel/speeches/docs/06_04_25_eit_stakeh_en.pdf. Forum on University-based Research (2005) European Universities:
Salmi, J. (2009) The Challenge of Establishing World-Class Universities, World Bank: Washington, D.C. Spaapen, J., Dijstelbloem, H. and Wamelink, F. (2007) Evaluating Research in Context.
A Global Survey of University League Tables, Canadian Education Report Series, Educational Policy Institute: Toronto. Retrieved 11 August 2009, from http://www.educationalpolicy.org/pdf/World-of-Difference-200602162.pdf. Van Raan, A.F.J. (2005) 'Fatal Attraction:
Conceptual and methodological problems in the ranking of universities by bibliometric methods', Scientometrics, vol. 62, no. 1, pp. 133-143.
Webography Academic Ranking of World Universities, Shanghai Jiao Tong University. Retrieved 22 July 2009, from http://www.arwu.org/rank/2007/ranking2007.htm. Classifying European Universities, Centre for Higher Education Policy Studies,
University of Twente. Retrieved 22 July 2009, from www.utwente.nl/cheps/research/projects/ceihe/. Design and Testing the Feasibility of a Multidimensional Global University Ranking.
Retrieved 22 July 2009, from http://ec.europa.eu/education/programmes/calls/3608/index_en.html. Elsevier Scopus. Retrieved 11 August 2009, from http://www.info.scopus.com/overview/what/.
from http://www.esf.org/research-areas/humanities/researchinfrastructures-including-erih/erih-initial-lists.html. European University Association,
option=com_frontpage&itemid=1. Leiden World Ranking, Centre for Science and Technology Studies, University of Leiden. Retrieved 22 July 2009,
.org/document/22/0,3343,en_2649_35961291_40624662_1_1_1_1,00.html Performance Ranking of Scientific Papers for World Universities,
from http://ranking.heeact.edu.tw/en-us/2008/page/Background. QS World Universities (2008). Retrieved 12 August 2009,
Ranking Web of World Universities, Cybermetrics Lab, CSIC. Retrieved 22 July 2009, from http://www.webometrics.info/.
European Commission EUR 24187, Assessing Europe's University-Based Research, Expert Group on Assessment of University-Based Research. Luxembourg:
In 2008, the European Commission, DG Research, set up the Expert Group on Assessment of University-Based Research to identify the framework for a new
and more coherent methodology to assess the research produced by European universities. There is no single, correct methodology.
Any assessment of the quality of university-based research will have to take into consideration the multifunctional and diverse nature of universities
the diverse nature of disciplines, the level at which universities are assessed (i.e., identify the level at which knowledge is created and shared),
there is a need to design flexible and multidimensional methodologies that will adapt to the diverse and complex nature of research, disciplines and of our universities.