and generated energy data along with actionable commands to customers. With technologies such as Wi-Fi, Zigbee, and home area network (HAN) communication systems, smart meters can now act as interfaces for energy management entities, customers,
Data management is critical for the widespread operation of the smart grid in the near future
Reconfiguring the system in islanded mode may require a hitherto unknown rate and amount of data exchange, two-way communication links,
IEEE Standard 1451.4 requires analog sensors to have a transducer electronic data sheet (TEDS) to provide calibration information to the data acquisition system [37].
Fig. 5 shows how several communication technologies can be applied for such data, according to their characteristics.
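The mapping Fig. 5 describes, from data characteristics to suitable communication technologies, can be sketched as a simple feasibility check: a link technology qualifies for a given data class when it meets that class's rate and latency requirements. All class names, technology entries, and threshold values below are illustrative assumptions, not figures taken from the text.

```python
# Illustrative smart-grid data classes and their (assumed) link requirements.
REQUIREMENTS = {
    "meter_reading":   {"rate_kbps": 10,  "latency_s": 60.0},
    "outage_alarm":    {"rate_kbps": 1,   "latency_s": 1.0},
    "firmware_update": {"rate_kbps": 500, "latency_s": 300.0},
}

# Nominal capabilities of candidate technologies (assumed, order-of-magnitude).
TECHNOLOGIES = {
    "Zigbee": {"rate_kbps": 250,   "latency_s": 0.05},
    "Wi-Fi":  {"rate_kbps": 54000, "latency_s": 0.01},
    "PLC":    {"rate_kbps": 100,   "latency_s": 0.5},
}

def candidate_links(data_class):
    """Return technologies whose rate and latency satisfy the data class."""
    req = REQUIREMENTS[data_class]
    return sorted(
        name for name, cap in TECHNOLOGIES.items()
        if cap["rate_kbps"] >= req["rate_kbps"]
        and cap["latency_s"] <= req["latency_s"]
    )
```

On these assumed numbers, a bulk firmware update rules out the low-rate links, while routine meter readings could travel over any of the three.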
Data Sheet (TEDS) Formats, IEEE Standard 1451.4-2004, Dec. 2004. [38] Q. Zou and L. Qin, "Integrated communications in smart distribution grid," in Proc.
In a survey of European SME perspectives on cloud computing, the security of corporate data and potential loss of control featured highly among the concerns for SME owners (ENISA 2009),
even though 49% of SMEs already exchange data in an automated fashion with other ICT systems outside their own enterprise (Giannakouris and Smihily 2010).
and improving the data management of the chemicals. www.aga.com CAR2GO: selling new forms of mobility The next step for carsharing models may be the concept of CAR2GO.
including carbon footprint data of over 300 materials, energy carriers, and delivery of waste treatment and transport lca.jrc.ec.europa.eu/lcainfohub/datasetarea.vm Guide to PAS 2050: How to assess the carbon footprint of goods and services shop
EIO 2012 based on data from Demea (N=92) Figure 6: Yearly savings potential in SMEs introducing material efficiency solutions Eco-innovate production processes Eco-innovate!
The most in-demand information includes data on the origin of resources used in products and evidence on the social and environmental impacts of resource use across the supply chain. 3.3 Supply chains Supply chain management includes coordination and collaboration with suppliers,
page 44) What data and tools are available to assess the (quantified) environmental impacts in Key challenges for your business Design may be performed by product designers, design engineers,
Communicating data or information on a product's environmental impacts is not always a strong motivator for customers
A cleaner printed circuit board: Crawford Hansford & Kimber developed a cleaner printed circuit board (PCB), incorporated into equipment that interfaces with data loggers and now in use in higher education around the world.
How do we incorporate environmental performance related data in communications? What is the product's Unique Selling Proposition (USP) relative to competitive offers?
including reduced greenhouse gas emissions across the life cycle of bio-based products. www.biochem-project.eu The European Commission has funded a range of projects collecting data and good practice examples
EIO 2012 based on data from Demea (N=92) p. 28: Source: www.lisec.com/LPS/Glas-Vorspannanlagen-speziell-fuer-Duennglas p. 29:
meaning internet connections, web collaborative tools, sharing of open data and a process of bottom-up peer-supported activities and applications.
Examples are given on the novel use of information platforms, data from sensor networks and community use of mobile phones.
Our data comes from the EU activities and R&D grants awarded up to 2014. We describe the concept, the context,
to support stronger links (data exchange, visualization) and thus to multiply the potential effect of grass-root initiatives.
This study, in fact, mapped 590 organisations with 645 projects active in the field across Europe (data as of August 2014).
Actors and initiatives were crowd-mapped through the project platform digitalsocial.eu, where data are updated constantly.
the Self-Assessment Toolkit (SAT) and the User Data Gathering Interface (UDGI). The first is dedicated to CAPS project coordinators and partners and the second to CAPS users.
and human capital because its outputs and activities are not leading to this kind of impact. 5. At this point the SAT will show all the questions related to the impact dimensions selected by the project representatives. 6. The data inserted by CAPS representatives will be elaborated in real time by the SAT
IA4SI team will use all the gathered data for developing two impact assessment reports: one will include the assessment of each CAPS project
and one will analyse the data at aggregated, domain level. Besides this, a set of best practices will be identified
that more diverse sources of data improve impact measurement, but that ultimately it is stakeholder engagement that makes the difference to sustainable social innovation.
Twenty-first century social science needs access to new data-gathering resources to collect, to sample, and to validate hypotheses
The open data portal is experimenting with this distributed data resource. The findings can be reapplied to generate more collective intelligence.
"Guidelines for Collecting and Interpreting Innovation Data" Passani A., Monacciani F., Van der Graaf S., Spagnoli F., Bellini F., Debicki M.,
Available data are not reassuring. As shown below, in Figure 1, the gap between the EU15 and the United States in terms of expenditure on R&D has been in place since the early 1980s.
Kristian Uppenberg, presentation at the first Task Force meeting, 18 September 2009, OECD data. INTRODUCTION: TOWARDS EUROPE 2020 An important issue is the ability of the European Investment Bank (EIB) to reach dynamic and innovative small firms
Section 6 briefly concludes. 2. A NEW APPROACH TO INNOVATION POLICY IN THE EU 2.1 Innovation is a changing concept The data reported in the previous section show that Europe is facing a structural problem
'Available data testify to a European lag' vis-à-vis the United States, Asia and several emerging economies in terms of research, development and innovation (R&D&I).
Available data.6 See European Innovation Scoreboard 2009 at http://ec.europa.eu/enterprise/newsroom/cf/document.cfm?
Also, European companies perform about 30% of R&D outside the EU. Data presented at the Task Force meetings are based on evidence of collaboration between EU
and data using common indicators. 2.2.3 Taking innovation seriously: improving governance through accountability and coordination Putting innovation at the core of the EU policy-making process cannot be merely a declaration of intent.
and interpreting innovation data, at http://www.oecd.org/dataoecd/35/61/2367580.pdf. And see INNODRIVE's activities at http://innodrive.org/. partners,
Such data show that companies with a large share of their business in the US or Japan receive a substantial advantage from their own patent.17 As a matter of fact,
p. 25. 24 Data are available in the Communication from the Commission to the European Parliament and the Council, Enhancing the patent system in Europe, COM(2007) 165 final, p. 7. case of SMEs.
together with studies and empirical data, provide useful inputs for policy decisions regarding a future patent system,
and available empirical data on the fact that European universities are good at science and technology,
and interpreting innovation data (http://www.oecd.org/dataoecd/35/61/2367580.pdf). Polk Wagner, R. (2009), Understanding Patent Quality Mechanisms, Public Law and Legal Theory, University of Pennsylvania Law School, Research Paper No. 09-22, subsequently published as 157 U. Penn.
New emerging technologies such as smart materials, micro-mechanical sensors, and faster wireless data transfer solutions have presented new opportunities to develop product features, especially those intangible features
and PDM (Product Data Management) can be used to manage product-related information. The main point is that ICT should fully support the business processes
for example business intelligence solutions to manage business data and information from marketing and customers. The market provides a huge number of different solutions for different needs,
Companies should clarify their data administration vision: which kind of solutions best fits the company and
It can be accessed through the Europa server (http://europa.eu). Cataloguing data can be found at the end of this publication.
Moreover, global rankings tend to rely on qualitative indicator-based data, which tend to have an inbuilt bias in favour of hard sciences and biosciences,
There is also a substantial lack of cross-national comparative data. THE RAISON D'ÊTRE OF THE ASSESSMENT OF UNIVERSITY-BASED RESEARCH EXPERT GROUP In this context, the Commission's Directorate-General for Research decided to convene an expert group on assessment of university-based research.
and identifying data and indicator requirements. The Expert Group had 15 members from 12 EU Member States, Australia, a European association and an international organisation.
including experience and/or expertise in national and international rankings and bibliometrics, data collection and analysis, concrete research assessment exercises, the workings of leading national and European research funding organisations
While some of the data required may be readily available or relatively easy to obtain,
other data are either not available or only available in limited circumstances. This makes comparability across universities and countries difficult.
It studied both the value and limitations of bibliometric data which are used commonly to measure research productivity and quality,
For example, the Group came to realise that even bibliometric indicators might be flawed due to manipulation of data.
2) Assessment of university-based research should combine quantitative indicator-based data with qualitative information, for example information based on expert peer assessment or validation,
It links specified users with their defined purposes and objectives to specific data, quantitative and qualitative indicators,
While some purposes and objectives require extremely detailed and robust data on research outputs, other requirements demand only a few, relatively simple indicators.
analyse and disseminate standardised data, so as to enable inter-institutional and cross-national comparisons. It was suggested also that the challenges
and comprehensive data. In view of this, the AUBR EG recommends that the European Commission: Take the lead in establishing a European Observatory for Assessment of University-based Research to identify and prioritise data requirements of a European Research Assessment Framework
as well as to further develop and disseminate guidelines for use by universities, national agencies, government, and other stakeholders, based on the principles outlined in this report;
Invest in developing a shared information infrastructure for relevant data to be collected, maintained, analysed, and disseminated across the European Union;
Moreover, in the absence of comprehensive, reliable and comparable cross-national data, rankings cannot be a valid tool to achieve the overarching aim of improving the quality of university-based research across the European Union. 2 Introduction This chapter outlines the national
European University Data Collection, a project studying the feasibility of a sustainable European system of data collection on the activities and performance of European higher education institutions in the areas of education, research and innovation.
European Multidimensional University Ranking System, a pilot project funded by DG Education and Culture, aimed at mapping multiple excellences (e.g. teaching, innovation, community engagement and employability).
They usually use a combination of public or institutional data and/or peer or student surveys.
identification of peer institutions, improve data collection and increase participation in broader discussions about institutional success. Unintended consequences can occur
Sound, verifiable and comparable data is a necessary prerequisite for institutional autonomy and to enable European universities to manage strategically, effectively and efficiently.
identifying data and indicators requirements (if necessary propose different approaches for different types of users).
and identifies data and indicators requirements within a policy context. The concluding Chapter 6 identifies potential risks and unintentional consequences
if simplistic interpretations of the data are made, and one-dimensional correlations are drawn between research assessment and policy choices.
which can affect the type of quantitative data and qualitative analysis. Depending upon the university, scientific field or policy environment,
or interpret the data. Likely Target Users, including: HE Governance and Management: These groups require a wide range of information to help
Because higher education is both a generator and a user of the data, its position differs from that of the other users. o Governing Bodies
data to assess the quality of research and HE performance and output and to support return-on-investment.
while QA agencies use institutional data to benchmark and assess quality and performance. o Funding Agencies o Enterprise and Development Agencies Academic Organisations and Academies In many countries,
Increasingly, employers use such data to identify likely sources of potential employees. o Private firms and entrepreneurs o Public organizations o Employers Civil Society and Civic Organizations
are likely to use benchmarking data to identify potential 'investment' opportunities, using the information as a proxy for value-for-money
and Uses of Research Assessment Data User Group Why Is Research Assessment Data Required? What Research Assessment Data Is Required?
HE MANAGEMENT AND GOVERNANCE Governing Bodies/Councils Policy and planning Strategic positioning Research strategy development/management Investor confidence/value-for-money and efficiency Quality assurance Institutional
and discipline/field data re. level of intensity, expertise, quality and competence Benchmarking against peer institutions, nationally and worldwide Efficiency level:
and efficiency Quality assurance Publicity Student and academic recruitment Improve and benchmark performance and quality Institutional and discipline/field data re. level of intensity, expertise,
HE Research Groups Strategic positioning Research strategy development/management Investor confidence/value-for-money and efficiency Student and academic recruitment Discipline data re. level of intensity, expertise,
and HEIS Determine national/international competitiveness Quality, sustainability, relevance and impact of research activity System and institutional data re level of intensity, expertise,
development/management Investor confidence/value-for-money and efficiency Quality assurance Institutional and discipline/field data re. level of intensity, expertise, quality and competence Benchmarking against peer institutions
and quality Improve system functionality System and institutional data re level of intensity, expertise, quality and competence Performance of HE system and individual institutions Benchmarking nationally and worldwide Indicator of national competitiveness Attraction capacity:
professional and academic performance and quality Academic and discipline/field data re. level of intensity, expertise,
nationally and worldwide Quality of academic staff and PhD students INDIVIDUALS Academics and Researchers Identify career opportunities Identify research partners Identify best research infrastructure and support for research Institutional and field data re level of intensity,
Staff/student ratio Institutional research support Students Inform choice of HEI Identify career opportunities Institutional and field data re level of intensity, expertise, quality,
and best research partners Institutional and field data re level of intensity, expertise, quality, competence and sustainability Performance of individual institutions and researchers benchmarked against peers in field of interest Research capacity of institution
and expertise Identify potential employees Institutional and field data re level of intensity, expertise, quality,
and expertise Identify potential employees Institutional and field data re level of intensity, expertise, quality,
and expertise Identify potential employees Institutional and field data re level of intensity, expertise, quality,
technology transfer and knowledge transfer partners Institutional and field data re expertise, quality and competence Peer esteem indicators MINISTRIES OF HIGHER EDUCATION IN DEVELOPING COUNTRIES To help determine which foreign higher education institutions are suitable for overseas scholarship studies.
and technology transfer Institutional and discipline/field data re level of intensity, expertise, quality and competence Competitive positioning of institution and researchers Trends in graduate employment and competence Quality of academic staff and PhD students SPONSORS AND PRIVATE INVESTORS Benefactors/Philanthropists
relevance and impact of research activity Quality of academic staff and PhD students Contributor to own brand image Institutional data re level of quality and international competitiveness
and worldwide Quality of academic staff and PhD students Alumni Determine institutional performance vis-à-vis national and international competitors Institutional data re level of quality and international competitiveness
choice and career opportunities Investor/parental confidence and value-for-money Institutional data re. level of intensity, expertise,
and quality while industry and employer groups want to be able to identify potential employees. 3. Some of the required data may be readily available
while other data are either not available or available only in limited circumstances, which makes comparability across universities or countries difficult.
For example, bibliometric data on peer-reviewed publications are available commercially, but there is no similar information available for the wide range of research outputs
while research performance data may be collected for one purpose, they are often used by other stakeholder groups for very different purposes.
or re-tabulates research data as a 'league table' or ranking. These are significant findings,
This is especially important for international comparability. 4.4 Bibliometric Methods Bibliometric data are an important means of quantifying research activity.
securely storing primary data, acknowledging the role of collaborators and other participants, and ensuring professional behaviour between supervisor and research students.
but also develop mechanisms to collect accurate and comparable data. The indicators can be quantitative and qualitative.
Data must be verified and accurate. Although this is one of the most popular indicators, it is not always the most appropriate one.
Citations Citation data are derived from citation indexes, i.e. databases that not only contain metadata on included publications but also their citation links. In the exact sciences,
peers tend to consider citation impact a relevant aspect of quality. Citations reflect intellectual influence but do not fully coincide with research quality.
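As a concrete example of a citation-based indicator, the h-index summarises a publication list's citation counts: a researcher has index h if h of their papers have at least h citations each. A minimal sketch, assuming the input is simply a list of per-publication citation counts:

```python
def h_index(citations):
    """Return the h-index: the largest h such that h publications
    have at least h citations each."""
    # Sort citation counts in descending order.
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank  # the top `rank` papers all have >= rank citations
        else:
            break
    return h
```

For instance, a list of counts [10, 8, 5, 4, 3] yields h = 4: four papers have at least four citations, but not five papers with at least five. The sketch also shows why the indicator ignores the long tail of low-cited work, one of the limitations noted above.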
Expansion of existing databases and creation of new databases (e.g. based on data from institutional repositories) will INDICATORS DESCRIPTION PRO/POTENTIALITIES CON/LIMITATIONS
Data must be verified and accurate. Of limited value in disciplines not well covered by the citation indexes, especially certain parts of the social sciences,
Data are verifiable via the conference programme. No agreed equivalences that apply internationally and facilitate comparison across disciplines.
Data is verifiable. No agreed equivalences that apply internationally and facilitate comparison across disciplines. Unless lists are publicly available this will require direct entry by researchers.
Data is verifiable. No agreed equivalences that apply internationally and facilitate comparison across disciplines. Unless lists are publicly available this will require direct entry by researchers.
Comparable data, verifiable through audit, is useful for comparing research performance across the system and within universities.
Data collection may be difficult in the case of funding by end users because this information is not known to the university administration.
Comparable data, verifiable through audit, is useful for comparing research performance across the system and within universities.
Data needs to be adjusted to the scale and mission of the university. Employability of PhD graduates Industry employment of PhD graduates can be an indicator of the contribution of research to the highly skilled workforce. Used to measure the quality of the graduates,
Career paths and opportunities can differ across the disciplines for which data are collected. Commercialisation of research-generated intellectual property (IP) Provides a measure of the extent of income from commercialisation of intellectual property created through patents, licences or start-ups.
Lack of agreed basis of capturing data and comparability could undermine legitimacy. Agree basis of international comparability and verifiability.
Lack of agreed basis of capturing data and comparability could undermine legitimacy. Agree basis of international comparability and verifiability.
and verify the data due to lack of clarity as to what is being measured. Agree precise definition, inter alia:
Data is verifiable by universities although there can be a time lag. Rates of completion may differ across disciplines.
Difficult to get valid, comparable institutional data, even within the same institution. Agree basis on which to calculate full cost of research investment.
Difficult to get valid, comparable data. Favours older, well-endowed universities. Develop appropriate comparative indicators.
Data Collection through Digital Repositories: Technology provides an easy way to store and access research for the actual research assessment process,
and collection of data; public availability and reusability of scientific data; and public accessibility and transparency of scientific communication.
Peer review Panels: Several case studies underscore the importance of peer review panels. The process helps ensure a broader understanding of the research
Each purpose requires different data. Some requirements demand extremely detailed and robust data on research outputs;
other requirements demand only a few, relatively simple indicators. All indicators have advantages and disadvantages, and there are limitations to all assessment exercises (see Chapter 6). Indicators designed to meet a particular objective
e g. indicator-based data with peer or end-user review. There are several advantages to this approach:
The actual choice of indicators depends upon the purpose and the availability of data. There are four important steps:
AND SCALE RESEARCH INFRASTRUCTURE Allocate Resources Research output/bibliometric data Citation data Peer review Keynote, awards, etc.
Drive Research Mission Differentiation Research output/bibliometric data Output per research academic Peer review Self evaluation Ratio of research income:
Percentage Funding from Endusers Patents, Licenses, Spin-offs Number of collaborations and partnerships Improve Research Performance Research output/bibliometric data Citation data Number and percentage publication in topranked,
or Cost-Benefit of Research Research output/bibliometric data Output per research academic Peer review and/or citation data Commercialisation data End user reviews Social, economic,
data with focus on European & International collaborations Percentage of Research Income from International Sources Number of collaborations and partnerships Increase Multidisciplinary Research Research Output/Bibliometric data with focus on interdisciplinary fields Peer review Self evaluation
New research fields, interdisciplinary teaching programmes, etc. Research Conducted by People from Different Disciplines Some illustrative scenarios follow. 54
and/or citation data to determine impact Some measure of research infrastructure/environment, e.g. libraries, equipment, postgraduate student numbers, etc. 'Esteem' factors, e.g. prizes, research income, etc.
Data on Research Outputs, including output per academic staff Data on ratio of research income:
teaching income Data on ratio of undergraduate students: master & doctorate research students Peer review Panels Self evaluation Reports
Data on cooperation agreements with local governments and organisations of the region; Data on agreements with other public or private institutions located in the targeted area of influence;
Indicators of results (publications, policy reports, patents, spin-offs...) coming from these agreements; Ratio of business or other external funding of research:
Data on 'merit' of research as assessed by end users, rather than peer review; Peer esteem, e.g. expert opinion, professional memberships, media visibility.
and/or Citation data to determine impact Option B: Use holistic peer review assessment panels to benchmark performance against international comparators,
Data on research outputs, including output per academic staff; Peer review and/or citation data to determine scholarly impact;
55 Indicators of commercialisation of IP; Indicators of social, economic, cultural and environmental impact and benefits;
Data on employability of PhD graduates; Data on collaborations and partnerships. If you want to use research assessment to ENCOURAGE INTERNATIONAL COOPERATION,
then what is required is: Data on European and international cooperation agreements; Data on joint publications with scholars from other countries;
Proportion of research funding, domain by domain, coming from overseas research institutions. If you want to use research assessment TO INCREASE MULTIDISCIPLINARY RESEARCH,
then what is required is: Use knowledge clusters as unit of assessment by peer review panels; Data on output according to knowledge cluster perhaps using bibliometrics with focus on authors from different disciplines;
Data on other results, e.g. new research areas, courses or teaching programmes designed together by people from different disciplines (or Schools, or Faculties);
Peer review; Self evaluation. 6 Conclusion 6.1 Limitations and Unintended Consequences University-based research has become one of the most critical factors shaping national competitiveness and university reputation.
Likewise, the absence of appropriate, verifiable and trustworthy data can undermine the usefulness of cross-national comparisons
Bibliometric and citation data is by definition backward looking; in other words, it assesses past performance as a proxy for future performance.
However, the absence of verifiable and accessible cross-national data and common definitions raises questions as to the efficacy of this approach on an international basis given all the limitations that have been identified throughout this report.
or 'intelligent' presentation of data by universities and researchers anxious to ensure a good report.
Because of unintended consequences, the choice of indicators, methodology and data sources are critical. Qualitative indicators can easily ignore differences between disciplines;
Reliance on data that is easily measured can distort research towards that which is more predictable;
Good practice'suggests that research assessment should 1. Combine indicator-based quantitative data with qualitative information, for example information based on expert peer assessment.
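The principle of combining indicator-based quantitative data with qualitative peer assessment is often operationalised as a weighted composite: each indicator is normalised to a common scale and combined with a peer-review score. The indicator names, example values, and weights below are illustrative assumptions, not a scheme proposed by the report; in practice the weighting would be negotiated per assessment purpose.

```python
def composite_score(indicators, weights):
    """Combine normalised indicator scores (each on a 0-1 scale) using
    weights that sum to 1. 'indicators' can mix quantitative measures
    (e.g. publications, citations) with qualitative ones (peer review)."""
    if abs(sum(weights.values()) - 1.0) > 1e-9:
        raise ValueError("weights must sum to 1")
    return sum(weights[name] * indicators[name] for name in weights)

# Example: a unit strong on peer assessment, weaker on citation impact.
score = composite_score(
    {"publications": 0.8, "citations": 0.6, "peer_review": 0.9},
    {"publications": 0.3, "citations": 0.3, "peer_review": 0.4},
)
```

Even this toy version makes the report's caveat concrete: the final ranking is highly sensitive to the chosen weights, which is one reason indicator choice and methodology matter so much.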
and projects designed to generate much-needed comparable data and more appropriate and robust scenarios for the assessment of university-based research. 7 Appendix
Data usually comes from an international database, e.g. Thomson Reuters Web of Science or Elsevier Scopus.
industry or other Technical reports Legal cases Entries in a dictionary/encyclopaedia Maps Translations and editing of major works Case studies Data collection is undertaken
University-based data normally requires direct entry by researchers, often mediated through the Research Office.
Data usually comes from an international database, e.g. Thomson Reuters Web of Science or Elsevier Scopus.
Pro Bibliometric data is collected on all research outputs in order to quantify the full extent of research activity.
At national level, identifying categories for inclusion in data collection involves consultation with key discipline, research and university organisations and leaders,
Individual universities may be able to compile data on all categories of outputs but this needs a high degree of compatibility for cross-institutional and cross-national comparability.
Citations Data Derived from Standard Bibliometric Measures Citation counts are manifestations of intellectual influence, as represented by the adage:
Data is purchased from commercial bibliometric providers, the most significant of which are Thomson Reuters and Elsevier (Scopus).
These Centres use bibliometric data to undertake systematic evaluation and mapping of research at institutional, cross-institutional, national and international levels.
but extraneous factors can also impact on the data, including: Publication language; Coherence of research communities;
Con These data are verifiable but there is no systematic verification protocol or technology available for verifying claims across diverse indicators yet.
as appropriate, Research Master's degree completions. Universities and, in some cases, government agencies collect data for this indicator.
Data is verifiable. The use of this indicator promotes quality postgraduate supervision and programs. If resources are attached to performance against this indicator, universities,
Universities collect data for this indicator. Pro This indicator is useful to drive improvement in research performance in universities and in internal units (faculties, schools, departments).
and to get reliable data on this. 9.8 NUMBER OF CO-PUBLICATIONS Description: Number of Co-Publications within the Unit of Assessment or Collaborative Work with Researchers from other Universities/Organisations.
Universities usually collect data, although data may be provided by sponsoring organisations. Pro Research income is a useful indicator for measuring the scale of the research enterprise and its capacity to secure additional income through competitive grants and contract research, especially in science, technology and medicine.
This indicator is comparable, and verifiable through audit, and can be useful for comparing research performance across the system and within universities.
It may be difficult to collect data from end users because this information may not be collected routinely by the Research Office.
It may also be difficult to find comparable data on research income. 9.9.3 TOTAL R&D INVESTMENT Description:
Universities collect data for this indicator and self-nominate levels of funding for research from consolidated revenues.
comparable institutional data, because a significant proportion of institutional investment is cross-institutional subsidisation. 9.9.4 RESEARCH INFRASTRUCTURE Description:
It is often difficult to find quantifiable and comparable data or to express the existing facilities in terms of money.
Typically, the indicator assembles data on: Invention Disclosures: the number of disclosures made to the appropriate university office, e.g.
income and equity data are included in audited financial reports and information on company ownership and current value is in the public domain.
The key legitimating factors are the link between IP commercialisation and economic benefit, the availability of data to support broad comparisons across national and international systems,
and longitudinal analyses, and the in-principle verifiability of the data. Con Patents are a very poor indicator.
the data can be unreliable. The information does not always specify the universities from which PhD holders graduated;
Con The data is only in the early stages of being defined and collected. Many of the projects are conducted by researchers individually or privately
so the university's data may not be complete. Results are often only internally published; the amount of money given is often not public. 9.11 END-USER ESTEEM Description:
The data can be collected quantitatively or qualitatively. The latter can be captured by involving key stakeholders and end-users directly in review panels or in written assessments.
Con Data is verifiable, but there is no systematic verification protocol or technology yet available for verifying claims across diverse indicators.
Dissemination, incl. how much information is available regarding data and methods: As with the previous RQF, ERA involves exhaustive consultation with researchers and the 39 universities in the Australian system.
'Greatly improved technical capacity for data collection across Australian universities. Greater concentration of research funding in universities that are research intensive' (a category that tends to coincide with larger, older universities with strength in natural sciences and medicine).
3.1. Staff 3.2. Teaching activities (incl. size of the classes) 3.3. Financial data 3.4. Third-mission activities; the documents prepared by the disciplinary
data collection on the teams, evaluation files, CVs of the members of the panel 2. Report on the teams ("private") 2.1. The team 2.2. The team's research activities
incl. how much information is available regarding data and methods All documents concerning the ULB Research Assessment Exercise (goals,
Data collection and self evaluations of departments according to strict rules laid down by the Steering committee. Evaluation, including one-week site visits,
Dissemination, incl. how much information is available regarding data and methods: The Evaluation has its own web site with all the necessary information in English http://www.aaltoyliopisto.info/en/view/innovaatioyliopisto-info/research-evaluation Intended and Unintended Consequences:
2) strict guidelines for collecting background data; 3) expert panels comprised of eminent foreign scholars/scientists;
Dissemination, incl. how much information is available regarding data and methods: All materials of the UH RAE (Terms of Reference, guidelines for the departments, evaluation report, etc.)
Dissemination, incl. how much information is available regarding data and methods: Which outlets are considered ranked international scientific outlets is explained in detail on the AERES website.
Research quality is assessed by informed peer review on the basis of an extensive analysis of quantitative and qualitative data.
Criteria and data are defined in a discipline-specific manner by experts from the individual fields of research.
Dissemination, incl. how much information is available regarding data and methods: The final results from the pilot study were published in December 2007 (chemistry) and April 2008 (sociology).
The indicators are based on different data sources: data collected directly at the universities, publication databases (Web of Science and national databases of scientific-scholarly publications) and a survey conducted among professors.
The main characteristics of the CHE Rankings are the following: no aggregation of indicators across the whole of a university,
in each cycle, data are collected from the previous three years. The ranking team is comprised of six CHE members of staff;
Dissemination, incl. how much information is available regarding data and methods: The results of the CHE University Ranking are published in "Der Studienführer" once a year;
Unintended consequences: The quality of the data is improving because of public pressure. Since the publication of the first CHE Research Ranking,
At the same time, universities have begun to provide data in an "intelligent" way in order to improve the position of their institutions in the Rankings.
Dissemination, incl. how much information is available regarding data and methods: A special brochure has been published, presenting all 85 institutions involved in the Initiative for Excellence.
Dissemination, incl. how much information is available regarding data and methods: Beyond the internal dissemination of information, nothing has yet been decided.
It draws on information provided by HEIs and publicly available data. While its principal aim has been to assist third-level entrants and their parents
The median Leaving Certificate points obtained by honours degree course entrants, weighted by the latest data on the number of students on each course.
CAO 2008, round 1 data. Source: Calculated from CAO entry data 2007. Research: A measure of research efficiency which compares competitive research funding won in 2007 with the number of full-time equivalent academic staff.
NUI Maynooth had the best ratio, which was scored 100 in the table. All other scores were then expressed as a percentage of the NUI Maynooth result.
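The scoring rule described — competitive funding per full-time-equivalent staff member, rescaled so the best-performing institution scores 100 — can be sketched as follows. The institution names and figures below are illustrative placeholders, not the actual 2007 data:

```python
# Research-efficiency score: competitive research funding won, divided by
# full-time-equivalent academic staff, rescaled so the best ratio scores 100.
# All figures below are invented for illustration only.

funding = {"University A": 42.0e6, "University B": 30.5e6, "University C": 12.0e6}
fte_staff = {"University A": 600, "University B": 350, "University C": 400}

# Funding per FTE staff member for each institution.
ratios = {u: funding[u] / fte_staff[u] for u in funding}

# The best ratio defines the 100-point benchmark; every other score is
# that institution's ratio expressed as a percentage of the best.
best = max(ratios.values())
scores = {u: round(100 * r / best, 1) for u, r in ratios.items()}
```

Under these placeholder figures the institution with the highest funding-per-staff ratio receives 100.0 and the others a proportional percentage, mirroring the table's treatment of the NUI Maynooth benchmark.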
Individual colleges extracted from latest available Higher Education Authority (HEA) data. Firsts/2:1s: The percentage of highest-quality degrees in 2007.
Universities: HEA 2007 data; institutes: 2006 Department of Education and Science data. Completion rates: Percentage of 2002 entrants who completed courses for which they enrolled by conferring in 2007. Source:
Individual institutes. Trinity College: estimated figure. OTHER INDICATORS IN THE PROFILES Undergraduates/postgraduates: Full-time undergraduate and postgraduate enrolments.
HEA 2007 data, universities only. Non-standard entry: Individual institutions' 2007 intake. Sports facilities: Assessment by The Sunday Times in consultation with students' unions.
Dissemination, incl. how much information is available regarding data and methods: Results disclosed in the newspaper and online. http://www.timesonline.co.uk/tol/life_and_style/education/sunday_times_university_guide/article2497779.ece Intended and Unintended Consequences:
and it has introduced a new competitive dynamic into the system, despite concern about the indicators and data.
No data is available on the extent to which the information is informing student choice.
However, because of the absence of good verifiable and comparable data, the results are controversial. The Sunday Times University Guide is not a research assessment exercise
The amount and quality of educational data available in Ireland is poor compared to that in the UK and other countries.
when the 2001-2003 data were used by the Government to allocate, together with other indicators, 7%of the overall university funding.
The Ministry of Research used data from CIVR, CNVSU, and Ministry sources. CIVR data refer to the evaluation of research in the period 2001-2003.
The Ministry published a list of universities with the percentage of increase or decrease of funding resulting from the application of these criteria.
Dissemination, incl. how much information is available regarding data and methods: Full transparency on methods and mandate to the external experts.
On the basis of a yearly monitoring system, the institutes maintain data needed for these evaluations in a systematic way.
to store all relevant data. Policy Objective(s): The evaluation system aims at three objectives with regard to research and research management:
Data must be provided about funding and resources. The academic reputation of a given institute may be indicated in several ways.
Dissemination, incl. how much information is available regarding data and methods: The final evaluation reports are sent to the advisory boards of the institutes evaluated.
incl. how much information is available regarding data and methods: All information regarding data and methods is available.
However, the overall system, as well as its indicator part, is quite complicated and therefore requires sophisticated competence with indicators
incl. how much information is available regarding data and methods: Because there is not complete transparency in the way each candidate is evaluated by the Committee,
whereas the aggregate data by faculty, scientific area, etc. are, and are used for comparisons. Intended and Unintended Consequences:
incl. how much information is available regarding data and methods: All information about methods is, in principle, available,
and all data should be fully transparent in order to facilitate improvements in university research. Intended and Unintended Consequences:
Dissemination, incl. how much information is available regarding data and methods: The RAE 2008 assessment method is transparent and all aspects of the methodology are in the public domain.
with data collection occurring in January and July, and the results of the data analysis are published a month later.
Using this data for a ranking was something of an afterthought, but ultimately made sense,
and other repository-related initiatives can be represented roughly from rich file and Scholar data. The huge numbers involved with the PDF and doc formats mean that not only administrative reports
but due to the availability of their data collection procedures (APIs), only those marked with an asterisk are used in compiling the Webometrics Ranking.'
) These data were extracted using Google and merging the results for each filetype after log-normalising in the same way as described before.
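The log-normalising and merging step mentioned above can be sketched roughly as follows. The counts, the choice of `math.log1p`, and the placeholder component ranks are assumptions for illustration — the source describes only the general procedure and the published 4:2:1:1 weighting of the visibility (V), size (S), rich-file (R) and Scholar (Sc) component ranks:

```python
import math

# Hypothetical per-filetype counts for three institutions (illustrative only).
counts = {
    "pdf": {"Uni A": 12000, "Uni B": 3000, "Uni C": 500},
    "doc": {"Uni A": 2000,  "Uni B": 4000, "Uni C": 100},
}

def log_normalise(values):
    """Log-normalise counts to [0, 1] so one huge collection cannot dominate."""
    logs = {k: math.log1p(v) for k, v in values.items()}
    top = max(logs.values())
    return {k: v / top for k, v in logs.items()}

# Merge the log-normalised results for each filetype into one file indicator.
merged = {}
for filetype, per_uni in counts.items():
    for uni, score in log_normalise(per_uni).items():
        merged[uni] = merged.get(uni, 0.0) + score

def rank_positions(scores):
    """Convert indicator scores into rank positions (1 = best)."""
    ordered = sorted(scores, key=scores.get, reverse=True)
    return {uni: pos for pos, uni in enumerate(ordered, start=1)}

rank_r = rank_positions(merged)               # rich-file component rank
rank_v = {"Uni A": 1, "Uni B": 2, "Uni C": 3}  # placeholder visibility ranks
rank_s = {"Uni A": 2, "Uni B": 1, "Uni C": 3}  # placeholder size ranks
rank_sc = {"Uni A": 1, "Uni B": 3, "Uni C": 2} # placeholder Scholar ranks

# The published 4:2:1:1 weighted combination; lower total = better position.
webometrics = {
    uni: 4 * rank_v[uni] + 2 * rank_s[uni] + 1 * rank_r[uni] + 1 * rank_sc[uni]
    for uni in merged
}
```

This is only a sketch under stated assumptions; the actual Webometrics pipeline works on search-engine counts across many more filetypes and institutions.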
Webometrics Rank (position) = 4×RankV + 2×RankS + 1×RankR + 1×RankSc.' Dissemination, incl. how much information is available regarding data and methods:
with data and methodology clearly presented and articulated on their website. Intended and Unintended Consequences:
Random checks are made to ensure the correctness of the data obtained.' 'Currently identified biases of the Webometrics Ranking include the traditional linguistic one (more than half of internet users are English-speaking people),
'The only source for the data of the Webometrics Ranking is a small set of globally available, free access search engines.
In total, we have collected data on about 2000 universities. http://ed.sjtu.edu.cn/rank/2003/FAQ.htm retrieved November 5, 2008.
incl. how much information is available regarding data and methods: Dissemination is mainly through its web site,
The website contains clear links and descriptions of the data and methodology used. The actual data analyzed are not made available, however.
Intended and Unintended Consequences: Its developers define the ARWU as an academic ranking and not a comprehensive one.
This limitation to more objective data is what also gives this SJTU-ARWU ranking its strength and reputation as the most reliable among the global rankings.
and the technical difficulties in obtaining internationally comparable data. According to the SJTU ARWU website, 'People should be cautious about any ranking, including our Academic Ranking of World Universities.
Nevertheless, our Academic Ranking is based on internationally comparable data that everyone could check.' A 2007 article by Răzvan V. Florian, in Scientometrics, found, in fact,
that the results emerging from the ARWU data were not replicable, calling into question the comparability and methodology of the data used in the ranking.
One final bias that deserves mention is that related to the use of English as the language of international scholarship.
Using subjective inputs (peer reviews from academics and employers) and quantitative data, such as the numbers of international students and faculty,
These reviews evolved into data-driven, comprehensive national institutional rankings (Times Good University Guide) in the 1990s.
to produce the data used in the rankings. Together, QS and THE have published the WUR for 5 years
and global presence, with the quality of each determined by a combination of qualitative, subjective inputs (peer reviews from academics and employers) and quantitative data,
Dissemination, incl. how much information is available regarding data and methods: Annually, this ranking is disseminated in the following ways:
The information about the data used and the methodology is on the website. The actual data analyzed are not made available, however.
Intended and Unintended Consequences: The WUR's limitations lie in the same breadth of data that the QS/THE developers cite as its strengths: the inconsistency and variability of its findings year to year.
Over the five years of production, QS/THE have sought methods to tighten and strengthen its analysis. Over the past few years,
And, finally, the commercial nature of the WUR, with consumers required to buy the paper to access the data,
Instead, PRSP tracks academic outputs to provide some comparative data on the work produced by institutions and its utility to the community outside its campus. Policy Objective(s):
Based on objective data obtained to measure both the qualitative and quantitative impact of scientific papers, PRSP then utilizes quantitative analytical indicators to illustrate objective characteristics.
Once these quantitative data are generated, the PRSP staff use those data in conjunction with concepts that capture the quality of the papers,
making the PRSP a ranking that creates a quantitative deductive ranking utilizing qualitative assessments. Methodology, incl. time-frame, resources, costs, technologies:
PRSP, finally, compares these universities'outputs using data from ISI's ESI, Web of Science (WOS),
Dissemination, incl. how much information is available regarding data and methods: Dissemination is through the website,
and the data and methods used in this ranking are explained there. Specific data are not presented, however.
Intended and Unintended Consequences: Limitations: When universities obtain similar scores, the slight differences of the final scores may not necessarily suggest its superiority in scientific research.
Bibliometric data are extracted from a bibliometric version of Thomson Reuters' Web of Science, created at CWTS.
Dissemination, incl. how much information is available regarding data and methods: The CWTS ranking system is publicly available through the following website: