Keywords: business strategy, development, ICT strategy development, networking, information system, innovation
JEL Classification: M15

1. Introduction

The accentuated requirements for effectiveness have heightened the need for communication and for information technologies and software that support that communication. Information and communication technologies (ICT) have made possible new business models and even new business structures.
in many cases, customer relationship management (CRM), enterprise resource planning (ERP) and business intelligence (BI). Other problems arise when developing new methods
The core business is not always stable. Sometimes customer requirements change and suppliers, too, must change their business.
A dynamic business environment requires changes in core competencies. Core competence is one aspect of a company's business vision.
This vision usually moves as customer requirements and the business environment move. It depends on the business how far ahead the vision states are targeted.
Specialists have defined ICT as the hardware, software application programs, telecommunication networks and technical expertise that support information processing and communication activities at all levels of a company (Marchand et al. 2001).
ICT has had a big impact on product development. Products today are more and more intelligent, and it is not a new idea to provide extended products.
New emerging technologies such as smart materials, micro-mechanical sensors and wireless and faster data-transfer solutions have presented new opportunities to develop product features, especially those intangible features
and PDM (Product Data Management) can be used to manage product-related information. The main point is that ICT should fully support the business processes
Ward and Griffiths (1996) have presented the relationships between business, information system and information technology strategies. Figure 2. The relationship between business, IS and IT strategies (source: Ward and Griffiths, 1996).
Companies' chosen business models depend on their core business. Companies, furthermore, have individual structures, locations, types of organization and so on.
for example, business intelligence solutions to manage business data and information from marketing and customers. The market provides a huge number of different solutions for different needs.
Companies should clarify their data administration vision: which kind of solution best fits the company and
References
Chan, Y., Huff, S., Barclay, D., Copeland, D. - Business Strategic Orientation, Information Systems Strategic Orientation and Strategic Alignment. Information Systems Research, 1997, 8(2): 125-150.
Child, J., Faulkner, D. - Strategies for Cooperation: Managing Alliances, Networks, and Joint Ventures.
A Framework for Strategic Information Technology Management. Center for Information Systems Research, Working Paper No. 190, Massachusetts Institute of Technology, Cambridge, 2000.
Henderson, J., Venkatraman, N. - Strategic Alignment: Leveraging Information Technology for Transforming Organizations. IBM Systems Journal, 1993, 32(1): 4-16.
Kaplan, R., Norton, D. - The Strategy-Focused Organization: How Balanced Scorecard Companies Thrive in the New Business Environment. Harvard Business School Press, 2000.
Marchand, D., Kettinger, W., Rollins, J. - Information Orientation: The Link to Business Performance.
Reich, B., Benbasat, I. - Measuring the Link between Business and Information Technology Objectives. MIS Quarterly, Mar. 1996: 55-82.
Ward, J., Griffiths, P. - Strategic Planning for Information Systems, 2nd ed. Chichester: Wiley, 1996.
Assessing Europe's University-Based Research
Expert Group on Assessment of University-Based Research
EUR 24187 EN
European Commission, Directorate-General for Research, European Research Area, Science & Society

Contact: Adeline Kroll, European Commission, Office SDME 9/17, B-1049 Brussels. Tel. (32-2) 29-85812, Fax (32-2) 29-64287.
European Commission, Directorate-General for Research, Communication Unit, B-1049 Brussels. Fax (32-2) 29-58220. E-mail: research-eu@ec.europa.eu. Internet: http://ec.europa.eu/research/research-eu

A great deal of additional information on the European Union is available on the Internet. It can be accessed through the Europa server (http://europa.eu). Cataloguing data can be found at the end of this publication.
Luxembourg: Publications Office of the European Union, 2010. ISBN 978-92-79-14225-3. ISSN 1018-5593. doi 10.2777/80193. © European Union, 2010. Reproduction is authorised provided the source is acknowledged.
Moreover, global rankings tend to rely on quantitative indicator-based data, which tend to have an inbuilt bias in favour of the hard sciences and biosciences,
There is also a substantial lack of cross-national comparative data.
THE RAISON D'ÊTRE OF THE ASSESSMENT OF UNIVERSITY-BASED RESEARCH EXPERT GROUP
In this context, the Commission's Directorate-General for Research decided to convene an expert group on the assessment of university-based research.
and identifying data and indicator requirements. The Expert Group had 15 members from 12 EU Member States, Australia, a European association and an international organisation.
including experience and/or expertise in national and international rankings and bibliometrics, data collection and analysis, concrete research assessment exercises, the workings of leading national and European research funding organisations
While some of the data required may be readily available or relatively easy to obtain,
other data are either not available or only available in limited circumstances. This makes comparability across universities and countries difficult.
It studied both the value and limitations of bibliometric data, which are commonly used to measure research productivity and quality,
For example, the Group came to realise that even bibliometric indicators might be flawed due to manipulation of data.
The use of peer review panels, to ensure a broader understanding of the research being assessed, as well as of its contribution to knowledge,
2) Assessment of university-based research should combine quantitative indicator-based data with qualitative information, for example information based on expert peer assessment or validation,
It links specified users with their defined purposes and objectives to specific data, quantitative and qualitative indicators,
While some purposes and objectives require extremely detailed and robust data on research outputs, other requirements demand only a few, relatively simple indicators.
As user purposes and objectives frequently overlap, a comprehensive web-enabled and personalized tool-kit can be developed readily to meet different policy and university needs.
analyse and disseminate standardised data, so as to enable inter-institutional and cross-national comparisons. It was suggested also that the challenges
and comprehensive data. In view of this, the AUBR EG recommends that the European Commission:
- Take the lead in establishing a European Observatory for Assessment of University-Based Research to identify and prioritise data requirements of a European Research Assessment Framework, as well as to further develop and disseminate guidelines for use by universities, national agencies, government, and other stakeholders, based on the principles outlined in this report;
- Invest in developing a shared information infrastructure for relevant data to be collected, maintained, analysed, and disseminated across the European Union;
- Adapt the Multidimensional Research Assessment Matrix to web-based technologies in order to facilitate personalisation, thereby meeting different user requirements,
Moreover, in the absence of comprehensive, reliable and comparable cross-national data, rankings cannot be a valid tool to achieve the overarching aim of improving the quality of university-based research across the European Union.
2 Introduction
This chapter outlines the national
European University Data Collection, a project studying the feasibility of a sustainable European system of data collection on the activities and performance of European higher education institutions in the areas of education, research and innovation.
European Multidimensional University Ranking System, a pilot project funded by DG Education and Culture, aimed at mapping multiple excellences (e.g. teaching, innovation, community engagement and employability).
They usually use a combination of public or institutional data and/or peer or student surveys.
It was followed quickly by the Times QS World University Ranking (henceforth Times QS, 2004), Webometrics or the Ranking Web of World Universities (2004), the Taiwan Ranking of Scientific Papers for World Universities (henceforth Taiwan, 2007),
Considerable concern has been raised about their over-reliance on international bibliometric and citation databases, e.g. Thomson Reuters Web of Science or Elsevier's Scopus.
which occurs when respondents deliberately downgrade competitors or upgrade their own assessment in order to influence the outcome.
identification of peer institutions, improve data collection and increase participation in broader discussions about institutional success. Unintended consequences can occur
Sound, verifiable and comparable data is a necessary prerequisite for institutional autonomy and to enable European universities to manage strategically, effectively and efficiently.
identifying data and indicator requirements (if necessary, proposing different approaches for different types of users).
and identifies data and indicator requirements within a policy context. The concluding Chapter 6 identifies potential risks and unintended consequences
if simplistic interpretations of the data are made, and one-dimensional correlations are drawn between research assessment and policy choices.
which can affect the type of quantitative data and qualitative analysis. Depending upon the university, scientific field or policy environment,
the complexity of knowledge has led to a diverse range of output formats, inter alia, audiovisual recordings, computer software and databases, technical drawings, designs or working models,
or interpret the data. Likely Target Users, including: HE Governance and Management: These groups require a wide range of information to help
Because higher education is both a generator and a user of the data, its position is different from that of the other users. o Governing Bodies
data to assess the quality of research and HE performance and output and to support return-on-investment.
while QA agencies use institutional data to benchmark and assess quality and performance. o Funding Agencies o Enterprise and Development Agencies Academic Organisations and Academies In many countries,
Increasingly, employers use such data to identify likely sources of potential employees. o Private firms and entrepreneurs o Public organizations o Employers Civil Society and Civic Organizations
are likely to use benchmarking data to identify potential 'investment' opportunities, using the information as a proxy for value-for-money
Table 2. Users and Uses of Research Assessment Data. Columns: User Group; Why Research Assessment Data Is Required; What Research Assessment Data Is Required.
HE MANAGEMENT AND GOVERNANCE. Governing Bodies/Councils: policy and planning; strategic positioning; research strategy development/management; investor confidence/value-for-money and efficiency; quality assurance. Institutional and discipline/field data re. level of intensity, expertise, quality and competence; benchmarking against peer institutions, nationally and worldwide; efficiency level:
and efficiency Quality assurance Publicity Student and academic recruitment Improve and benchmark performance and quality Institutional and discipline/field data re. level of intensity, expertise,
HE Research Groups Strategic positioning Research strategy development/management Investor confidence/value-for-money and efficiency Student and academic recruitment Discipline data re. level of intensity, expertise,
and HEIs. Determine national/international competitiveness; quality, sustainability, relevance and impact of research activity. System and institutional data re. level of intensity, expertise,
development/management Investor confidence/value-for-money and efficiency Quality assurance Institutional and discipline/field data re. level of intensity, expertise, quality and competence Benchmarking against peer institutions
and quality; improve system functionality. System and institutional data re. level of intensity, expertise, quality and competence; performance of HE system and individual institutions; benchmarking nationally and worldwide; indicator of national competitiveness; attraction capacity:
professional and academic performance and quality Academic and discipline/field data re. level of intensity, expertise,
nationally and worldwide; quality of academic staff and PhD students. INDIVIDUALS. Academics and Researchers: identify career opportunities; identify research partners; identify best research infrastructure and support for research. Institutional and field data re. level of intensity,
Staff/student ratio Institutional research support Students Inform choice of HEI Identify career opportunities Institutional and field data re level of intensity, expertise, quality,
and best research partners Institutional and field data re level of intensity, expertise, quality, competence and sustainability Performance of individual institutions and researchers benchmarked against peers in field of interest Research capacity of institution
and expertise Identify potential employees Institutional and field data re level of intensity, expertise, quality,
technology transfer and knowledge transfer partners. Institutional and field data re. expertise, quality and competence; peer esteem indicators. MINISTRIES OF HIGHER EDUCATION IN DEVELOPING COUNTRIES: to help determine which foreign higher education institutions are suitable for overseas scholarship studies.
and technology transfer. Institutional and discipline/field data re. level of intensity, expertise, quality and competence; competitive positioning of institution and researchers; trends in graduate employment and competence; quality of academic staff and PhD students. SPONSORS AND PRIVATE INVESTORS. Benefactors/Philanthropists
relevance and impact of research activity; quality of academic staff and PhD students; contributor to own brand image. Institutional data re. level of quality and international competitiveness
and worldwide; quality of academic staff and PhD students. Alumni: determine institutional performance vis-à-vis national and international competitors. Institutional data re. level of quality and international competitiveness
choice and career opportunities Investor/parental confidence and value-for-money Institutional data re. level of intensity, expertise,
interregional and global networks, involving mono-disciplinary, inter-, multi- and trans-disciplinary forms of inquiry and teams of researchers.
and including translations, software, encyclopaedia entries, research or technical reports, legal cases and maps. Drawing on the experience of rankings and existing research assessment exercises, Table 2 presents a comprehensive survey of the wide range of stakeholders and uses to
and quality while industry and employer groups want to be able to identify potential employees. 3. Some of the required data may be readily available
while other data are either not available or only available in limited circumstances, which makes comparability across universities or countries difficult.
For example, bibliometric data on peer-reviewed publications are available commercially, but there is no similar information available for the wide range of research outputs
while research performance data may be collected for one purpose, it is often used by other stakeholder groups for very different purposes.
or re-tabulates research data as a 'league table' or ranking. These are significant findings,
Many governments and universities strongly support the interconnection between teaching and research as one of the core principles of higher education.
Science, Mathematics and Computing Engineering, Manufacturing and Construction Agriculture and Veterinary Sciences Health and Welfare Services In addition
This is especially important for international comparability.
4.4 Bibliometric Methods
Bibliometric data is an important method to quantify research activity in terms of
Thomson Reuters Web of Science covers over 9,000 international and regional journals and book series in the natural sciences, social sciences,
According to its website, 3,000 of these journals account for about 75% of published articles and over 90% of cited articles.
with an abstract and citation database of research literature and quality web sources covering almost 18,000 titles.
According to its webpage, the database includes extensive conference coverage (3.6 million conference papers), 600 trade publications and 350 book series, plus 23 million patent records
securely storing primary data, acknowledging the role of collaborators and other participants, and ensuring professional behaviour between supervisor and research students.
but also develop mechanisms to collect accurate and comparable data. The indicators can be quantitative and qualitative.
Suitable databases for a variety of disciplines and research-related outputs, especially in the social sciences and humanities.
Data must be verified and accurate. Although one of the most popular indicators, it is not always the most appropriate one.
Citations: Citation data are derived from citation indexes, i.e. databases that contain not only metadata on included publications but also their reference lists. Principal indexes are Web of Science, Scopus and Google Scholar. In the exact sciences, peers tend to consider citation impact a relevant aspect in assessments of research performance. Citations reflect intellectual influence but do not fully coincide with research quality. Citation indicators are widely used, especially in the exact sciences, which tend to be well covered, although the most popular indicators are not always the most appropriate ones. Data must be verified and accurate. Citation indicators are of limited value in disciplines not well covered by the citation indexes, especially certain parts of the social sciences, humanities and engineering. Expansion of existing databases and creation of new databases (e.g. based on data from institutional repositories) will improve the value of this indicator and coverage of disciplines. Theoretical research into the meaning of citations (clusters) in social sciences and humanities is also required.
Number of keynote addresses at national/international conferences: a count of the number of invited keynote addresses.
Data can be verified against the conference programme. No agreed equivalences that apply internationally and facilitate comparison across disciplines.
Data is verifiable. No agreed equivalences that apply internationally and facilitate comparison across disciplines. Unless lists are publicly available, this will require direct entry by researchers.
Data is verifiable. No agreed equivalences that apply internationally and facilitate comparison across disciplines. Unless lists are publicly available, this will require direct entry by researchers.
Comparable data, verifiable through audit, is useful for comparing research performance across the system and within universities.
Data collection may be difficult in the case of funding by end users, because this information is not known to the University administration.
Agree an international comparative database. Number and percentage of competitive grants won / level of funding won competitively: this is a sub-set of the indicator above.
Comparable data, verifiable through audit, is useful for comparing research performance across the system and within universities.
Agree an international comparative database. Research income per academic staff member or FTE: research income per academic staff member or FTE supports cross-institutional comparisons.
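A minimal sketch of how this indicator can be computed for comparison across institutions (the institution names and figures below are invented for illustration, not values from this report):

```python
# Illustrative sketch: research income per full-time-equivalent (FTE)
# academic staff member. Names and figures are hypothetical.

institutions = {
    # name: (research income in EUR, academic staff FTE)
    "University A": (42_000_000, 1_200.0),
    "University B": (15_500_000, 480.5),
}

def income_per_fte(income_eur: float, staff_fte: float) -> float:
    """Research income divided by academic staff FTE."""
    if staff_fte <= 0:
        raise ValueError("staff FTE must be positive")
    return income_eur / staff_fte

for name, (income, fte) in institutions.items():
    print(f"{name}: {income_per_fte(income, fte):,.0f} EUR per FTE")
```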
Data needs to be adjusted to the scale and mission of the university. Employability of PhD graduates: industry employment of PhD graduates can be an indicator of the contribution of research to the highly skilled workforce; used to measure the quality of the graduates.
Career paths and opportunities can differ across the disciplines for which data is collected. Commercialisation of research-generated intellectual property (IP): provides a measure of the extent of income from commercialisation of intellectual property created through patents, licences or start-ups.
Lack of agreed basis of capturing data and comparability could undermine legitimacy. Agree basis of international comparability and verifiability.
and verify the data due to lack of clarity as to what is being measured. Agree precise definition, inter alia:
Data is verifiable by universities although there can be a time lag. Rates of completion may differ across disciplines.
Difficult to get valid, comparable institutional data, even within the same institution. Agree a basis on which to calculate the full cost of research investment.
Difficult to get valid, comparable data. Favours older, well-endowed universities. Develop appropriate comparative indicators.
and web-enabled Toolkit. Examples are provided below in section 5.3 illustrating how it could be implemented by the various user groups identified above in section 3.4
Data Collection through Digital Repositories: Technology provides an easy way to store and access research for the actual research assessment process,
Although digital institutional repositories (Australia) and web-based tools (e.g. Webometrics and Google Scholar) currently cover only a limited part of university-based research outputs,
in the future they could become important sources of information and overcome some of the limitations of traditional bibliometric databases.
Digital repositories and web-based tools can facilitate scientific collaboration in line with the movement for open science.
They promote transparency in experimental methodology, observation, and collection of data; public availability and reusability of scientific data;
and public accessibility and transparency of scientific communication. Peer review Panels: Several case studies underscore the importance of peer review panels.
The process helps ensure a broader understanding of the research and its contribution to knowledge, including the importance of new disciplines and interdisciplinarity.
In Finland, France, the Netherlands and the UK, panels include international experts; in Finland, France and the Netherlands, this also involves visits to the university.
Peer review panels are used also to assess the quality of research outputs and outlets of individual researchers for career promotion,
such as in Spain. Indicators: All systems use bibliometric indicators, although many balance this with other information about the research environment, research strategy and management,
and impact on teaching. The Netherlands combines retrospective and prospective analysis. The UK and Australia are adopting an indicator-based system with lighter-touch peer review and social impact.
Each purpose requires different data. Some requirements demand extremely detailed and robust data on research outputs;
other requirements demand only a few, relatively simple indicators. All indicators have advantages and disadvantages, and there are limitations to all assessment exercises (see Chapter 6). Indicators designed to meet a particular objective
e.g. indicator-based data with peer or end-user review. There are several advantages to this approach:
The actual choice of indicators depends upon the purpose and the availability of data. There are four important steps:
AND SCALE RESEARCH INFRASTRUCTURE
Allocate Resources: research output/bibliometric data; citation data; peer review; keynotes, awards, etc.
Drive Research Mission Differentiation: research output/bibliometric data; output per research academic; peer review; self-evaluation; ratio of research income:
percentage funding from end-users; patents, licenses, spin-offs; number of collaborations and partnerships.
Improve Research Performance: research output/bibliometric data; citation data; number and percentage of publications in top-ranked,
Assess Value-for-Money or Cost-Benefit of Research: research output/bibliometric data; output per research academic; peer review and/or citation data; commercialisation data; end-user reviews; social, economic,
Encourage International Cooperation: data with a focus on European and international collaborations; percentage of research income from international sources; number of collaborations and partnerships.
Increase Multidisciplinary Research: research output/bibliometric data with a focus on interdisciplinary fields; peer review; self-evaluation; new research fields, interdisciplinary teaching programmes, etc.; research conducted by people from different disciplines.
Some illustrative scenarios follow.
and/or citation data to determine impact; some measure of research infrastructure/environment, e.g. libraries, equipment, postgraduate student numbers, etc.; 'esteem' factors, e.g. prizes, research income, etc.
Data on research outputs, including output per academic staff member; data on the ratio of research income to teaching income; data on the ratio of undergraduate students to master's and doctoral research students; peer review panels; self-evaluation reports.
IF you want to use research assessment to INCREASE REGIONAL/COMMUNITY ENGAGEMENT, then what is required is:
Data on cooperation agreements with local governments and organisations of the region; Data on agreements with other public or private institutions located in the targeted area of influence;
Indicators of results (publications, policy reports, patents, spin-offs...) coming from these agreements; ratio of business or other external funding of research:
Data on 'merit' of research as assessed by end users, rather than peer review; peer esteem, e.g. expert opinion, professional memberships, media visibility.
and/or citation data to determine impact. Option B: use holistic peer review assessment panels to benchmark performance against international comparators,
assisted by simple output indicators. IF you want to ASSESS VALUE-FOR-MONEY OR THE COST-BENEFIT OF RESEARCH,
Data on research outputs, including output per academic staff; Peer review and/or citation data to determine scholarly impact;
Indicators of commercialisation of IP; indicators of social, economic, cultural and environmental impact and benefits;
Data on employability of PhD graduates; data on collaborations and partnerships. IF you want to use research assessment to ENCOURAGE INTERNATIONAL COOPERATION,
then what is required is: Data on European and international cooperation agreements; Data on joint publications with scholars from other countries;
Proportion of research funding, domain by domain, coming from overseas research institutions. If you want to use research assessment TO INCREASE MULTIDISCIPLINARY RESEARCH,
then what is required is: Use knowledge clusters as unit of assessment by peer review panels; Data on output according to knowledge cluster perhaps using bibliometrics with focus on authors from different disciplines;
Data on other results, e.g. new research areas, courses or teaching programmes designed together by people from different disciplines (or Schools, or Faculties);
Peer review; self-evaluation.
6 Conclusion
6.1 Limitations and Unintended Consequences
University-based research has become one of the most critical factors shaping national competitiveness and university reputation.
This situation is likely to intensify as global competition increases further, (public) funding for research is reduced,
Likewise, the absence of appropriate, verifiable and trustworthy data can undermine the usefulness of cross-national comparisons
Bibliometric and citation data is by definition backward looking; in other words, it assesses past performance as a proxy for future performance.
However, the absence of verifiable and accessible cross-national data and common definitions raises questions as to the efficacy of this approach on an international basis given all the limitations that have been identified throughout this report.
or 'intelligent' presentation of data by universities and researchers anxious to ensure a good report.
Because of these unintended consequences, the choice of indicators, methodology and data sources is critical. Quantitative indicators can easily ignore differences between disciplines;
Reliance on data that is easily measured can distort research towards that which is more predictable;
open source and open repositories is vital. Although the added value of these approaches, including their implications for scientific-scholarly practice,
they could eventually also help overcome some of the limitations inherent in currently available bibliometric and citation databases.
'Good practice' suggests that research assessment should: 1. Combine indicator-based quantitative data with qualitative information, for example information based on expert peer assessment.
Adapting the Matrix to web-based technologies would enable different users to personalise the various dimensions
and projects designed to generate much-needed comparable data and more appropriate and robust scenarios for the assessment of university-based research.
7 Appendix
Email: erasmspr@zedat.fu-berlin.de. HAZELKORN, Ellen (Rapporteur). Professor Hazelkorn is Director of Research and Enterprise,
"She worked as manager for the collaborative research centre 475"Reduction of complexity in multivariable data structures"for about three
Email: Lidia.Borrell-Damian@eua.be. EMPLIT, Philippe. Philippe is director of the Service OPERA-Photonics and Professor of Physics at the Université Libre de Bruxelles (ULB), in the Engineering Science and Human Sciences faculties.
Email: annamaria.inzelt@uni-corvinus.hu. MARKLUND, Göran. Dr. Marklund is Deputy Director General (Acting) at VINNOVA,
the creation of bibliometric databases from raw data from Thomson Scientific's Web of Science and Elsevier's Scopus;
which was responsible, with two other universities, for the Evaluation of Uppsala University, the Evaluation Committee for Estonian Departments of Russian Philology and the International Evaluation Panel for the Faculty of Arts of Jyväskylä University.
She was Chair of the Research Quality Framework (RQF) Creative Arts Panel in its developmental phase.
Australian Technology Park Innovations (ATPI), INTERSECT (the NCRIS-funded NSW node for e-research services and computing infrastructure), the Sydney Institute of Marine Science (SIMS) and ARC Centre
Email: Susan.Rowley@uts.edu.au. SALMI, Jamil. Jamil, a Moroccan education economist, is the World Bank's tertiary education coordinator.
developing computer models of the atmosphere to improve weather forecasting, and examining the effects of the environment on human health or lifestyle.
Data usually comes from an international database, e.g. Thomson Reuters Web of Science or Elsevier's Scopus.
Blue-sky Research: Often referred to as fundamental or basic research, blue-sky research is aimed at gaining more comprehensive knowledge or understanding of the subject under study, without specific applications in mind.
non-refereed, extracts of paper) Edited volumes of conference proceedings Audiovisual recordings Computer software, databases Technical drawings, designs or working models Design (major works
industry or other Technical reports Legal cases Entries in a dictionary/encyclopaedia Maps Translations and editing of major works Case studies Data collection is undertaken
or comes from an international database or increasingly from web-based technologies. University-based data normally requires direct entry by researchers,
often mediated through the Research Office. Data usually comes from an international database, e.g. Thomson Reuters Web of Science or Elsevier's Scopus.
There are also various citation indices, the most important of which are the Science Citation Index Expanded, the Social Sciences Citation Index,
and the Arts & Humanities Citation Index. Increasingly, web-based interfaces, such as Google Scholar, institutional repositories or other standardised web-based technologies, are used.
In effect, researchers create and maintain online curricula vitae. In these cases verification is undertaken by universities
Pro Bibliometric data is collected on all research outputs in order to quantify the full extent of research activity.
At national level, identifying categories for inclusion in data collection involves consultation with key discipline, research and university organisations and leaders,
research contributions to innovation and socioeconomic benefit, e g. research and technical reports, patents and plant breeder rights, computer software, designs and prototypes,
The Australian research quality assessment framework (ERA) will employ ISI products for science disciplines and Scopus for its better coverage of the humanities and social sciences.
Individual universities may be able to compile data on all categories of outputs but this needs a high degree of compatibility for cross-institutional and cross-national comparability.
International bibliometric databases focus primarily on journals that publish full text in English or, at the very least, their bibliographic information in English.
They do not claim to have complete journal coverage, but rather to include the most important.
ISI coverage tends to be excellent in physics, chemistry, molecular biology and biochemistry, biological sciences related to humans, and clinical medicine;
A principal cause of non-excellent coverage is the importance of sources other than international journals, such as books and conference proceedings.
In fields with a moderate ISI coverage, language or national barriers play a much greater role than they do in other domains of science and scholarship.
Citations Data Derived from Standard Bibliometric Measures Citation counts are manifestations of intellectual influence, as represented by the adage:
Data is purchased from commercial bibliometric providers, the most significant of which are Thomson Reuters and Elsevier (Scopus).
Since 2004, Google Scholar has provided a freely-accessible Web search engine that indexes the full text of scholarly literature across an array of publishing formats and disciplines.
These Centres use bibliometric data to undertake systematic evaluation and mapping of research at institutional, cross-institutional, national and international levels.
Additionally, the databases from which citation counts are assembled draw upon comprehensive publication sets (9,000 for Thomson Reuters;
but extraneous factors can also impact on the data, including: Publication language; Coherence of research communities;
Coverage of a certain discipline in the database used: citations might be found in publications not counted there.
Their importance is assessed through a combination of an objective and truly unique internal monitor based on citation relationships among journals with assessments by experts from the various fields.
One of the indicators applied in the internal monitor is nowadays known as the journal impact factor.
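For reference, the journal impact factor is conventionally computed over a two-year citation window. A sketch of the standard definition (the monitor described here may use a different variant):

\[
\mathrm{IF}_y = \frac{C_y(y-1) + C_y(y-2)}{P_{y-1} + P_{y-2}}
\]

where \(C_y(t)\) is the number of citations received in year \(y\) by items the journal published in year \(t\), and \(P_t\) is the number of citable items the journal published in year \(t\).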
It is available on the ESF website. Pro: This indicator is a relatively recent innovation for systematic research assessment.
Con These data are verifiable but there is no systematic verification protocol or technology available for verifying claims across diverse indicators yet.
as appropriate, Research Masters Degree Completions: universities and, in some cases, government agencies collect data for this indicator.
Data is verifiable. The use of this indicator promotes quality postgraduate supervision and programs. If resources are attached to performance against this indicator, universities,
Universities collect data for this indicator. Pro: This indicator is useful to drive improvement in research performance in universities and in internal units (faculties, schools, departments).
and to get reliable data on this.
9.8 NUMBER OF CO-PUBLICATIONS
Description: Number of co-publications within the unit of assessment, or collaborative work with researchers from other universities/organisations.
However, collaborative research activity, especially with non-academic partners, is not easily reflected in the major international bibliometric databases,
Universities usually collect the data, although data may be provided by sponsoring organisations. Pro: Research income is a useful indicator for measuring the scale of the research enterprise and its capacity to secure additional income through competitive grants and contract research, especially in science, technology and medicine.
This indicator is comparable, and verifiable through audit, and can be useful for comparing research performance across the system and within universities.
It may be difficult to collect data from end users because this information may not be collected routinely by the Research Office.
It may also be difficult to find comparable data on research income.
9.9.3 TOTAL R&D INVESTMENT
Description:
Universities collect data for this indicator and self-nominate levels of funding for research from consolidated revenues.
comparable institutional data, because a significant proportion of institutional investment is cross-institutional subsidisation.
9.9.4 RESEARCH INFRASTRUCTURE
Description:
It is often difficult to find quantifiable and comparable data or to express the existing facilities in terms of money.
Typically, the indicator assembles data on: Invention Disclosures: the number of disclosures, indicated to the appropriate university office, e.g.
income and equity data are included in audited financial reports and information on company ownership and current value is in the public domain.
The key legitimating factors are the link between IP commercialisation and economic benefit, the availability of data to support broad comparisons across national and international systems,
and longitudinal analyses, and the in-principle verifiability of the data. Con: Patents are a very poor indicator.
Moreover, it excludes the move towards open science or open-source software and the embodied expertise of the social sciences.
9.10.2 EMPLOYABILITY OF PHD GRADUATES
Description:
Industry Employment of PhD Graduates: universities track the career destinations of their PhD graduates and alumni via post-graduation career destination surveys and alumni databases.
the data can be unreliable. The information does not always specify the universities from which PhD holders graduated;
Con The data is only in the early stages of being defined and collected. Many of the projects are conducted by researchers individually or privately
so the university's data may not be complete. Results are often only internally published; the amount of money given is often not public.
9.11 END-USER ESTEEM
Description:
The data can be collected quantitatively or qualitatively. The latter can be captured by involving key stakeholders and end-users directly in review panels or in written assessments.
It includes: Appointments to relevant national or international organisations, committees and research councils; Policy recommendations; Requests for expert services;
Scientific communication (public understanding of science, conferences for the public, events, media coverage). Normally this requires direct entry by researchers.
Con: Data is verifiable, but there is no systematic verification protocol or technology available yet for verifying claims across diverse indicators.
with Panels appointed, Submission Specifications released in September 2007, and submission date set for April 2008.
discipline-appropriate assessment by 13 discipline-grouping panels of expert peers, including research end-users and international researchers;
Peer panels will not be required to read publications accessed via repositories but will rely instead on discipline-appropriate indicators.
Eight ERA Panels will replace the 13 RQF Panels. Universities will report on research outputs by Field of Research codes.
Dissemination, incl. how much information is available regarding data and methods: As with the previous RQF, ERA involves exhaustive consultation with researchers and the 39 universities in the Australian system.
Following a complete cycle through the 8 panels over 3-4 years, the Government is likely to attach funding to outcomes.
Greatly improved technical capacity for data collection across Australian universities. Greater concentration of research funding in universities that are 'research intensive' (a category that tends to coincide with larger, older universities with strength in the natural sciences and medicine).
and an on-site visit of a panel of internationally recognised experts of the discipline in question.
and the peer review panel) during the assessment process; an academic coordinator is nominated by the university authorities (usually from outside the country);
a panel of 10 experts is convened by the academic coordinator; each team has to prepare a self evaluation document, based on a common template:
1.2. Presentation of the most important research results (2 pages max). 1.3. Outlook on future research activities envisaged (1 page max).
3.1. Staff. 3.2. Teaching activities (incl. size of the classes). 3.3. Financial data. 3.4. Third-mission activities. The documents prepared by the disciplinary
teams are compiled by the Research Department and sent to the expert panel; based on the analysis of this compilation, each expert assesses each team on seven indicators,
a one-day on-site panel meeting is organized, attended by the 10 members of the panel, the team leaders,
two assessment reports are written by the coordinator and the members of the panel: 1. General report on the discipline ("public"): 1.1. Context. 1.2. The discipline and its teams. 1.3. The assessment methodology. 1.4. Conclusions of the members of the panel. 1.5. Comments on the process of assessment. 1.6. Observations of the academic coordinator. Annexes: data collection on the teams, evaluation files, CVs of the members of the panel. 2. Report on the teams ("private"): 2.1. The team. 2.2. The team's research activities
incl. how much information is available regarding data and methods All documents concerning the ULB Research Assessment Exercise (goals,
The panels were asked to give their written statements and quantitative grades not only on research quality,
Data collection and self evaluations of departments according to strict rules laid down by the Steering committee. Evaluation, including one-week site visits,
undertaken by nine international panels (all panels visiting the University at the same time). The panels were asked to prepare written assessments
and to give grades (from 1 to 5) on the basis of the material submitted by the departments
The panels were asked to assess issues such as research leadership, research strategy, including human resources strategy and the focus of research,
the panels were asked to focus on such indicators of future research potential as the Unit's vision and plans for the future,
Dissemination, incl. how much information is available regarding data and methods: The Evaluation has its own website with all the necessary information in English: http://www.aaltoyliopisto.info/en/view/innovaatioyliopisto-info/research-evaluation. Intended and Unintended Consequences:
2) strict guidelines for collecting background data; 3) expert panels comprised of eminent foreign scholars/scientists;
4) site-visits of one week duration; 5) clear preference for quality over quantity of publications.
All of them are published on the website of the Academy of Finland. The evaluation report for biotechnology can serve as an example:
to acquaint foreign researchers (experts serving on evaluation panels) with the research carried out and the researchers working at the university in question.
and editorial assignments), budget and external funding The assessment was conducted by 21 international peer review panels (altogether 148 experts:
Panel members received all relevant materials ahead of the site visit, including information on the Finnish university system and research policy.
the Panel Chair finalized it afterwards. For quantitative evaluation a scale of 7 to 1 was used.
Site visits of the panels: May-November 2005. Publication of results on the university website: 1 March 2006 (Summary Report, Individual Evaluation Reports).
Dissemination, incl. how much information is available regarding data and methods: All materials of the UH RAE (terms of reference, guidelines for the departments, evaluation report, etc.)
The evaluation committees (or panels) are comprised of seven to ten members each; each committee has a chair.
Dissemination, incl. how much information is available regarding data and methods: Which outlets are considered ranked international scientific outlets is explained in detail on the AERES website.
Apart from the best journals in each discipline, monographs and chapters in books are taken into consideration in the Humanities and Social sciences,
10.6 GERMANY - FORSCHUNGSRATING (CONDUCTED BY THE WISSENSCHAFTSRAT)
Executive Summary: The German Science Council rating carried out a pilot study based on peer review, information from departments, metrics and a reviewers' panel.
Research quality is assessed by informed peer review on the basis of an extensive analysis of quantitative and qualitative data.
Criteria and data are defined in a discipline-specific manner by experts from the individual fields of research.
other research products, e g. databases, software; third-stream projects; prizes (chemistry: work-done-at principle;
different databases used for the two disciplines). Impact/effectiveness: number of publications; number of patents;
Dissemination, incl. how much information is available regarding data and methods: The final results of the pilot study were published in December 2007 (chemistry) and April 2008 (sociology).
The indicators are based on different data sources: data collected directly at the universities, publication databases (Web of Science and national databases of scientific-scholarly publications) and a survey conducted among professors.
The main characteristics of the CHE Rankings are the following: no aggregation of indicators across the whole of a university, but subject-specific data analysis and presentation of results; no weighted or non-weighted total value for the research performance of a given department, but examination of different indicators in a multidimensional ranking; no individual rank positions, but profiles of universities. Indicators used for research assessment:
but only a certain sample, which is selected on the basis of the database(s) used, the authors' names and the time window used.
If very heterogeneous databases are used, a weighting scheme is applied taking into account the number of pages and of authors.
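A minimal sketch of such a weighting scheme (the exact CHE formula is not reproduced here; the page normalisation and the split among co-authors are illustrative assumptions):

```python
# Illustrative sketch: weighting a publication by length and sharing the
# weight among co-authors when heterogeneous databases are mixed.
# The constants are assumptions, not the CHE values.

def publication_weight(pages: int, n_authors: int,
                       pages_per_unit: int = 10) -> float:
    """Per-author weight of one publication.

    pages_per_unit: page count treated as one 'full' publication
    (an illustrative normalisation).
    """
    if pages <= 0 or n_authors <= 0:
        raise ValueError("pages and n_authors must be positive")
    return (pages / pages_per_unit) / n_authors

# Example: a 20-page paper with 4 authors counts 0.5 towards each author.
print(publication_weight(pages=20, n_authors=4))  # 0.5
```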
In very few cases a set of core journals was established to identify important publications, which were then given more weight.
If the Web of Science is used, a citation analysis is carried out as well; the indicator is not weighted differently for different fields.
in each cycle, data are collected from the previous three years. The ranking team is comprised of six CHE members of staff;
additional human resources are provided by Die Zeit for the publication and programming of the online-version.
Dissemination, incl. how much information is available regarding data and methods: The results of the CHE University Ranking are published in "Der Studienführer" once a year;
The CHE Research Ranking is published as a pdf download, containing all indicators and additional figures.
A comprehensive paper on the methods used in the two rankings is available on the CHE website,
Unintended consequences: The quality of the data is improving because of public pressure. Since the publication of the first CHE Research Ranking,
At the same time, universities have begun to provide data in an "intelligent" way in order to improve the position of their institutions in the Rankings.
The results of the Initiative were based on the assessment of universities' plans for the future undertaken by an international panel.
These were reviewed by internationally appointed panels of experts. The reviews of the Graduate Schools and the Clusters were discussed in the Expert Commission,
Dissemination, incl. how much information is available regarding data and methods: A special brochure has been published, presenting all 85 institutions involved in the Initiative for Excellence.
Dissemination, incl. how much information is available regarding data and methods: Beyond the internal dissemination of information, nothing has yet been decided.
and the Ministry can also better monitor the processes and implement the allocation of the changing subsidy by norms.
It draws on information provided by HEIs and publicly available data. While its principal aim has been to assist third-level entrants and their parents
The Sunday Times Irish Universities League Table. http://extras.timesonline.co.uk/tol_gug/gooduniversityguide.php; http://extras.timesonline.co.uk/stug/universityguide.php. Policy Context, incl. circumstances under which the exercise came about: The Sunday Times is a leading newspaper in Britain and Ireland with 1.3m weekly sales.
The median Leaving Certificate points obtained by honours degree course entrants, weighted by the latest data on the number of students on each course.
CAO 2008, round 1 data. Source: calculated from CAO entry data 2007. Research: a measure of research efficiency which compares competitive research funding won in 2007 with the number of full-time-equivalent academic staff.
NUI Maynooth had the best ratio, which was scored 100 in the table. All other scores were then expressed as a percentage of the NUI Maynooth result.
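The scoring rule described here is a simple best-anchored normalisation; a sketch follows (the ratio values are invented for illustration; only the NUI Maynooth score of 100 follows from the text):

```python
# Sketch of the scoring rule above: the institution with the best
# research-efficiency ratio is scored 100 and all others are expressed
# as a percentage of it. Ratio values are hypothetical.

ratios = {
    "NUI Maynooth": 0.84,
    "Institution X": 0.63,
    "Institution Y": 0.42,
}

best = max(ratios.values())
scores = {name: 100.0 * r / best for name, r in ratios.items()}

for name, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {score:.0f}")  # NUI Maynooth: 100, X: 75, Y: 50
```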
Individual colleges extracted from latest available Higher Education Authority (HEA) data. Firsts/2:1s: the percentage of highest-quality degrees in 2007.
Universities: HEA 2007 data; institutes: 2006 Department of Education and Science data. Completion rates: percentage of 2002 entrants who completed the courses for which they enrolled, by conferring in 2007. Source:
Individual institutes. Trinity College: estimated figure. OTHER INDICATORS IN THE PROFILES. Undergraduates/postgraduates: full-time undergraduate and postgraduate enrolments.
HEA 2007 data, universities only. Non-standard entry: individual institutions' 2007 intake. Sports facilities: assessment by The Sunday Times in consultation with students' unions.
E-mail, special Excel templates, MS Access database. Dissemination, incl. how much information is available regarding data and methods:
Results are disclosed in the newspaper and online: http://www.timesonline.co.uk/tol/life_and_style/education/sunday_times_university_guide/article2497779.ece. Intended and Unintended Consequences:
and it has introduced a new competitive dynamic into the system, despite concern about the indicators and data.
No data is available on the extent to which the information is informing student choice.
However, because of the absence of good verifiable and comparable data the results are controversial. Sunday Times University Guide is not a research assessment exercise
The amount and quality of educational data available in Ireland is poor compared to that in the UK and other countries.
DVDs, etc. would do much better and be far cheaper to produce. Any increase in sales would be offset by the cost of advertising it
It gets them interested in the type of editorial coverage that the newspaper provides. It is for this reason that The Irish Times
Members of the disciplinary panels established by CIVR used at least 2 independent opinions from international experts,
when the 2001-2003 data were used by the Government to allocate, together with other indicators, 7% of the overall university funding.
to be submitted to the Panels for evaluation. In particular, each university was asked to submit 0.5 products per full-time-equivalent researcher covering the last three years (for example, a university with 200 FTE researchers would submit 100 products);
Each area was assigned to a Panel, with a Chairman and between 5 and 9 experts (151 in total),
or more external evaluations into a single score, which was submitted to the overall panel for consensus. The evaluation was published only after consensus. The first evaluation was carried out on research products for the period 2001-2003.
The Ministry of Research used data from CIVR, CNVSU, and Ministry sources. CIVR data refer to the evaluation of research in the period 2001-2003.
The Ministry published a list of universities with the percentage of increase or decrease of funding resulting from the application of these criteria.
Dissemination, incl. how much information is available regarding data and methods: Full transparency on methods and mandate to the external experts.
In only one case (economics) was a strong debate reported within the Evaluation Panel, based on a conflict between a view supporting international publications as the exclusive valuable research output,
On the basis of a yearly monitoring system, the institutes maintain data needed for these evaluations in a systematic way.
The three research organisations intend to create a national research information system, accessible through the Internet,
to store all relevant data. Policy Objective(s): The evaluation system aims at three objectives with regard to research and research management:
Data must be provided about funding and resources. The academic reputation of a given institute may be indicated in several ways.
Dissemination, incl. how much information is available regarding data and methods: The final evaluation reports are sent to the advisory boards of the institutes evaluated.
preferably, results will be made available on the Internet. Intended and Unintended Consequences: A systematic account of intended and unintended consequences would require a separate study.
evaluating all research groups in all Netherlands universities at the same time (e.g. computer science; chemistry). In other disciplines (e.g. physics), several evaluation committees were established,
incl. how much information is available regarding data and methods: All information regarding data and methods is available.
However, the overall system, as well as its indicator part, is quite complicated and therefore requires quite sophisticated competences on indicators
incl. how much information is available regarding data and methods: Because there is not complete transparency in the way each candidate is evaluated by the Committee,
whereas the aggregate data by faculty, scientific area, etc. are, and are used for comparisons. Intended and Unintended Consequences:
or other national or international databases, have improved the evaluation and peer review processes. As an unintended consequence we should highlight the fact that there has been a change in the behaviour of a substantial number of researchers,
and comparing it between science fields, using ISI Web of Science, taking into account both '(science) area adjusted' publication volume and 'field normalized' citations. 2) The external funding part essentially includes all external funding,
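As background to the 'field normalized' citations mentioned above, a widely used formulation is the mean normalised citation score; a sketch of one common definition (not necessarily the exact indicator used in this exercise):

\[
\mathrm{MNCS} = \frac{1}{N}\sum_{i=1}^{N}\frac{c_i}{e_{f(i)}}
\]

where \(c_i\) is the number of citations received by publication \(i\), and \(e_{f(i)}\) is the world-average number of citations for publications of the same field, publication year and document type. A value above 1 indicates citation impact above the world average for the field.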
incl. how much information is available regarding data and methods: All information about methods is, in principle, available,
and all data should be fully transparent in order to facilitate improvements in university research. Intended and Unintended Consequences:
RAE 2008-Peer review with panels meeting to determine a collective view on the quality of research for each submitted unit of assessment in each higher education institution.
Dissemination, incl. how much information is available regarding data and methods: The RAE 2008 assessment method is transparent and all aspects of the methodology are in the public domain.
Full information about RAE 2008 can be obtained from the HEFCE website www.hefce.ac.uk. The RAE results are made available publicly both by the funding bodies and the media. Intended and Unintended Consequences:
an initiative of the Cybermetrics Lab (Spain), has produced the 'World Universities' Ranking on the Web'
since 2004, measuring the web presence of universities around the world and comparing the size and scale of their web presence against expectations based on other rankings.
with data collection occurring in January and July, and the results of the data analysis are published a month later.
The indicators correlate web measures with traditional scientometric and bibliometric indicators used in other rankings.
Name/Title of Research Assessment Exercise: Webometrics, CINDOC, Spain
Policy Context, incl. circumstances under which the exercise came about:
Webometrics produces the 'World Universities' Ranking on the Web', which calculates the web presence of universities around the world and compares the size and scale of that web presence against expectations based on other rankings.
'Web presence measures the activity and visibility of the institutions and it is a good indicator of impact and prestige of universities' (Webometrics, 2008).
The Webometrics world universities ranking is an initiative of the Cybermetrics Lab, a research group within the Centro de Información y Documentación (CINDOC).
with data collection occurring in January and July, and the results of the data analysis are published a month later.
The indicators used correlate web measures with traditional scientometric and bibliometric indicators. The goal of the Webometrics project is to showcase the importance of the web for the academic community, for the dissemination of academic knowledge as well as for measuring scientific activities, performance, and impact.
Policy Objective(s): Initially, Webometrics' analysis aimed simply to highlight the significance of Web presence and publication.
CINDOC supports Open Access initiatives, promoting electronic access to scientific publications and to other academic material.
Using this data for a ranking was something of an afterthought, but ultimately made sense,
as the web indicators used in the analysis are not based on number of page hits
or site design but instead on universities'web presence as illustrative of institutional outputs and web visibility.
'We intend to motivate both institutions and scholars to have a web presence that accurately reflects their activities.
If the web performance of an institution is below the position expected from its academic excellence, university authorities should reconsider their web policy, promoting substantial increases in the volume and quality of their electronic publications.'
'However, web indicators are very useful for ranking purposes too, as they are not based on the number of visits or page design but on the global performance and visibility of the universities. As other rankings focus only on a few relevant aspects, especially research results, a ranking based on web indicators better reflects the whole picture, as many other activities of professors and researchers are shown by their web presence.
The Web covers not only formal (e-journals, repositories) but also informal scholarly communication. Web publication is cheaper,
maintaining the high standards of quality of peer review processes. It could also reach much larger potential audiences,
offering access to scientific knowledge to researchers and institutions located in developing countries and also to third parties (economic, industrial, political or cultural stakeholders) in their own community.
The Webometrics ranking has a larger coverage than other similar rankings (see table below). The ranking is focused on motivating both institutions and scholars to have a web presence that accurately reflects their activities. If the web performance of an institution is below the position expected from its academic excellence, university authorities should reconsider their web policy, promoting substantial increases in the volume and quality of their electronic publications.'
'The project intends to have true global coverage, not narrowing the analysis to a few hundred institutions (world-class universities)
but including as many organizations as possible. The only requirement in our international rankings is having an autonomous web presence with an independent web domain.'
'With these rankings we intend to provide extra motivation to researchers worldwide for publishing more
and better scientific content on the Web, making it available to colleagues and people wherever they are located.'
'the current objective of the Webometrics Ranking is to promote Web publication by universities, evaluating these organizations' commitment to electronic distribution,
and to fight a very concerning academic digital divide which is evident even among world universities from developed countries.
However, even when we do not intend to assess universities' performance solely on the basis of their web output,
and other repository-related initiatives can be represented roughly from rich file and Scholar data. The huge numbers involved with the PDF and DOC formats mean that not only administrative reports
PostScript and PowerPoint files are clearly related to academic activities.'
Methodology, incl. time-frame, resources, costs, technologies:
'This ranking has the largest coverage, with more than 16,000 higher education institutions worldwide listed in the Directory.'
visibility and impact of the web pages published by universities, with special emphasis on the scientific output (refereed papers, conference contributions, pre-prints, monographs, theses, reports)
but also taking into account other materials (courseware, seminar or workshop documentation, digital libraries, databases, multimedia, personal pages)
'Access to the Web information is done mainly through search engines. These intermediaries are free, universal, and very powerful even when considering their shortcomings (coverage limitations and biases, lack of transparency, commercial secrets and strategies, irregular behaviour).
Search engines are important for measuring the visibility and impact of universities' websites. There are a limited number of sources that can be useful for Webometric purposes:
7 general search engines (Google*, Yahoo Search*, Live (MSN) Search*, Exalead*, Ask (Teoma), Gigablast and Alexa) and 2 specialised scientific databases (Google Scholar* and Live Academic).
All of them have very large independent databases, but due to the availability of their data collection procedures (APIs), only those marked with an asterisk are used in compiling the Webometrics Ranking.'
'The unit for analysis is the institutional domain, so only universities and research centres with an independent web domain are considered.
If an institution has more than one main domain, two or more entries are used with the different addresses.
About 5-10% of the institutions have no independent web presence, most of them located in developing countries.'
'So the best way to build the ranking is combining a group of indicators that measure these different aspects.
Almind and Ingwersen proposed the first Web indicator, Web Impact Factor (WIF), based on link analysis that combines the number of external inlinks and the number of pages of the website, a ratio of 1
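As a sketch, and assuming the truncated 'ratio of 1' refers to weighting inlinks and pages equally (an assumption, since the sentence is cut off in the source), the WIF of a university site u can be written in LaTeX notation as

    \mathrm{WIF}(u) = \frac{L_{\mathrm{ext}}(u)}{P(u)}

where L_ext(u) is the number of external inlinks pointing to u and P(u) the number of pages on the site; the symbols are illustrative rather than taken from the source.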
Number of documents, measured from the number of rich files in a web domain, and the number of publications collected by the Google Scholar database.
As already noted, the four indicators were obtained from the quantitative results provided by the main search engines as follows:
Size (S). Number of pages recovered from four engines: Google, Yahoo, Live Search and Exalead.
For each engine, results are log-normalised to 1 for the highest value. Then for each domain, maximum and minimum results are excluded
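As an illustration of this normalisation step, here is a minimal Python sketch; it assumes log-normalisation of the raw counts and a rule of summing the remaining engine scores once each domain's maximum and minimum are dropped (the function names and the final summation are my assumptions, not taken from the source):

    import math

    def log_normalise(counts):
        # Scale raw engine counts so the largest value maps to 1.0 (log scale).
        logs = {domain: math.log(1 + c) for domain, c in counts.items()}
        top = max(logs.values()) or 1.0
        return {domain: v / top for domain, v in logs.items()}

    def size_indicator(per_engine_counts):
        # per_engine_counts: {engine: {domain: page_count}}.
        # Normalise each engine separately, then for every domain drop its
        # highest and lowest engine score and sum the rest (assumed rule).
        normalised = [log_normalise(c) for c in per_engine_counts.values()]
        domains = normalised[0].keys()
        scores = {}
        for d in domains:
            vals = sorted(n[d] for n in normalised)
            scores[d] = sum(vals[1:-1])  # exclude the min and the max
        return scores

With four engines, excluding the minimum and maximum leaves the two middle scores per domain, which dampens the effect of any single engine's irregular coverage.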
Visibility (V). The total number of unique external links received (inlinks) by a site can be confidently obtained only from Yahoo Search, Live Search and Exalead.
Rich Files (R). Adobe Acrobat (.pdf), Adobe PostScript (.ps), Microsoft Word (.doc) and Microsoft PowerPoint (.ppt). These data were extracted using Google, merging the results for each filetype after log-normalising in the same way as described before.
Scholar (Sc). Google Scholar provides the number of papers and citations for each academic domain.
These results from the Scholar database represent papers, reports and other academic items. The four ranks were combined according to a formula in which each has a different weight:
Webometrics Rank (position) = 4*Rank_V + 2*Rank_S + 1*Rank_R + 1*Rank_Sc'
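To make the combination rule concrete, the following minimal Python sketch applies the 4:2:1:1 weights quoted above; it assumes that each component rank uses 1 for the best institution and that the weighted sum is itself re-ranked in ascending order (both assumptions are mine, not stated in the source):

    def webometrics_rank(rank_v, rank_s, rank_r, rank_sc):
        # Weighted sum of the four component ranks (lower rank = better),
        # using the 4:2:1:1 weights from the formula above.
        composite = {
            d: 4 * rank_v[d] + 2 * rank_s[d] + 1 * rank_r[d] + 1 * rank_sc[d]
            for d in rank_v
        }
        # Re-rank ascending: the smallest weighted sum gets position 1.
        ordered = sorted(composite, key=composite.get)
        return {d: position for position, d in enumerate(ordered, start=1)}

For two hypothetical domains, 'uni-a.edu' (V=1, S=2, R=1, Sc=2) and 'uni-b.edu' (V=2, S=1, R=2, Sc=1), the composites are 11 and 13 respectively, so 'uni-a.edu' takes position 1; the visibility weight of 4 dominates the outcome, matching the emphasis the formula places on inlinks.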
Dissemination, incl. how much information is available regarding data and methods: Entirely web disseminated, with data and methodology clearly presented and articulated on their website. Intended and Unintended Consequences:
Limitations: The use of link farms and paid backlinks to improve the position in our Webometrics Rankings is not acceptable, as this is a non-academic practice
and it is contrary to the aims of this Ranking. The involved institutions do not have a place in our Ranking
and will not be classified in future editions. Random checks are made to ensure the correctness of the data obtained.'
'Currently identified biases of the Webometrics Ranking include the traditional linguistic one (more than half of the internet users are English-speaking people)
and a new disciplinary one (technology, rather than biomedicine, is at the moment the hot topic). Since in most cases the infrastructure (web space) and the connectivity to the Internet already exist,
the economic factor is not considered a major limitation (at least for the 3,000 top universities).' 'The only source for the data of the Webometrics Ranking is a small set of globally available, free-access search engines.
All the results can be duplicated according to the described methodologies, taking into account the explosive growth of web contents, their volatility and the irregular behaviour of the commercial engines.'
Observations/Additional Comments: 'This information is: of a general nature only and is not intended to address the specific circumstances of any particular individual or entity;
not necessarily comprehensive, complete, accurate or up to date; sometimes linked to external sites over which consortium members have no control and for
which they assume no responsibility; not professional or legal advice (if you need specific advice, you should always consult a suitably qualified professional).'
http://www.webometrics.info/disclaimer.html
10.18 GLOBAL - ACADEMIC RANKING OF WORLD UNIVERSITIES (ARWU)
Executive Summary
The Academic Ranking of World Universities (ARWU), first published in 2003 and updated annually by the Institute of Higher Education at Shanghai Jiao Tong University (China), was the first international league table, offering a scientific mechanism for comparing universities around the world.
According to SJTU's Academic Ranking of World Universities website, the objective of their ranking is to fill a gap in the global information on higher education.
'In total, we have collected data on about 2000 universities.' (http://ed.sjtu.edu.cn/rank/2003/FAQ.htm, retrieved November 5, 2008).
Dissemination, incl. how much information is available regarding data and methods: Dissemination is mainly through its website,
which contains clear links to and descriptions of the data and methodology used. The actual data analyzed are not made available, however.
Intended and Unintended Consequences: Its developers define the ARWU as an academic ranking and not a comprehensive one.
This limitation to more objective data is what also gives this SJTU-ARWU ranking its strength and reputation as the most reliable among the global rankings.
The researchers at SJTU are clear in disclaimers on their website that it would be impossible to have a comprehensive ranking of universities worldwide, because of the huge differences among universities, the large variety of countries and funding capacities,
and the technical difficulties in obtaining internationally comparable data. According to the SJTU ARWU website, 'People should be cautious about any ranking, including our Academic Ranking of World Universities.
Nevertheless, our Academic Ranking is based on internationally comparable data that everyone could check.' A 2007 article by Răzvan V. Florian, in Scientometrics, found, in fact,
that the results emerging from the ARWU data were not replicable, calling into question the comparability and methodology of the data used in the ranking.
One final bias that deserves mention is that related to the use of English as the language of international scholarship.
As citations depend on having access to published scholarship, and the preponderance of published scholarship occurs in English,
Using subjective inputs (peer reviews from academics and employers) and quantitative data, such as the numbers of international students and faculty,
These reviews evolved into data-driven, comprehensive national institutional rankings (Times Good University Guide) in the 1990s.
to produce the data used in the rankings. Together, QS and THE have published the WUR for 5 years
and global presence, with the quality of each determined by a combination of qualitative, subjective inputs (peer reviews from academics and employers) and quantitative data,
Dissemination, incl. how much information is available regarding data and methods: Annually, this ranking is disseminated in the following ways:
which is among the best-selling editions of THE in any given year, on the THE website,
and on the QS website. Information about the data used and the methodology is on the website.
The actual data analyzed are not made available, however. Intended and Unintended Consequences: The WUR's limitations lie in the same breadth of data that the QS/THE developers cite as its strengths: the inconsistency and variability of its findings year to year.
Over the five years of production, QS/THE have sought methods to tighten and strengthen the analysis. Over the past few years,
however, a disproportionate number of institutions from within the UK have risen to the top of the table,
giving a perception of bias in the methodology. It is also fundamentally biased toward historically significant institutions,
And, finally, the commercial nature of the WUR, with consumers required to buy the paper to access the data,
Instead, PRSP tracks academic outputs to provide some comparative data on the work produced by institutions and its utility to the community outside its campus. Policy Objective(s):
Based on objective data obtained to measure both the qualitative and quantitative impact of scientific papers, PRSP then utilizes quantitative analytical indicators to illustrate objective characteristics.
Once these quantitative data are generated, the PRSP staff use them in conjunction with concepts that capture the quality of the papers,
making the PRSP a quantitative, deductive ranking that incorporates qualitative assessments. Methodology, incl. time-frame, resources, costs, technologies:
PRSP, finally, compares these universities'outputs using data from ISI's ESI, Web of Science (WOS),
Dissemination, incl. how much information is available regarding data and methods: Dissemination is through the website,
and the data and methods used in this ranking are explained there. Specific data are not presented, however.
Intended and Unintended Consequences: Limitations: When universities obtain similar scores, the slight differences in their final scores may not necessarily indicate superiority in scientific research.
The hierarchical nature of the performance ranking may insinuate greater distinctions in quality than are accurate.
The work focuses on all universities worldwide with more than 700 Web of Science indexed publications per year.
Bibliometric data are extracted from a bibliometric version of Thomson Reuters'Web of Science, created at CWTS.
Dissemination, incl. how much information is available regarding data and methods: The CWTS ranking system is publicly available through the following website:
http://www.cwts.nl/ranking/Leidenrankingwebsite.html. Intended and Unintended Consequences: The actual use and the effects of the Leiden Ranking have not yet been analyzed systematically.
Retrieved from http://eurlex.europa.eu/Lexuriserv/site/en/com/2005/com2005_0488en01.pdf
European Commission (2006) Delivering on the modernisation agenda for universities:
and reporting on intangibles (Intellectual Capital Report), Vodafone Foundation: Madrid.
Committee on Facilitating Interdisciplinary Research (2004) Facilitating Interdisciplinary Research, National Academy of Sciences, National Academy of Engineering, Institute of Medicine, USA.
Retrieved 31 July 2009, from http://books.nap.edu/openbook.php?record_id=11153&page=2.
Garfield, E. (1979) Citation Indexing.
from http://www.arc.gov.au/era/journal_list.htm.
International Observatory on Academic Rankings and Excellence. Retrieved 22 July, from http://www.ireg-observatory.org/index.php?
Ranking Web of World Universities, Cybermetrics Lab CSIC. Retrieved 22 July 2009, from http://www.webometrics.info/.
In 2008, the European Commission's DG Research set up the Expert Group on Assessment of University-Based Research to identify the framework for a new