3.1 Introduction 49
3.2 Stakeholders' involvement 49
3.3 Overview of indicators 52
Teaching and learning 52
...assessment of data procedures and communication 107
Figure 5-3: Follow-up survey: assessment of data collection process...
but also parents and other stakeholders, to make informed choices between different higher education institutions and their programmes.
In a first phase running until the end of 2009 the consortium would design a multidimensional ranking system for higher education institutions in consultation with stakeholders.
interested and committed stakeholder representatives met with the project team over the life of the project.
The stakeholder consultations provided vital input on the relevance of potential performance dimensions and indicators
Stakeholder workshops were held four times during the project with an average attendance of 35 representatives drawn from a wide range of organisations including student bodies, employer organisations, rectors' conferences, national university associations and national representatives.
yet higher education and research systems are becoming more complex and at first sight less intelligible for many stakeholders.
and can cater to the different needs of a wide variety of stakeholders. An enhanced understanding of the diversity in the profiles and performances of higher education and research institutions at a national, European and global level requires a new ranking tool.
and it is user-driven (as a stakeholder with particular interests, you are enabled to rank institutions with comparable profiles according to the criteria important to you).
On the basis of an extensive stakeholder consultation process (focusing on relevance) and a thorough methodological analysis (focusing on validity, reliability and feasibility),
In addition the users are given the opportunity to choose the indicators on which they want to rank the institutions selected.
There are however clear signals that there would be significant continuing interest from outside Europe from institutions wishing to benchmark themselves against European institutions.
Many European stakeholders are interested in assessing and comparing European higher education and research institutions and programmes globally.
classifications, and rankings-from the point of view of the information they could deliver to assist different stakeholders in their different decisions regarding higher education and research institutions.
and research institutions have for different groups of stakeholders, to define explicitly our conceptual framework regarding the different functions of higher education institutions,
Users and stakeholders themselves should be enabled to decide which indicators they want to select to create the rankings that are relevant to their purposes.
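As a purely illustrative sketch of this user-driven selection (institution names, indicator names and all values are invented), a ranking tool can let users pick their own indicators and place institutions into rank groups per indicator, rather than forcing everything into one composite league-table score:

```python
# Hypothetical sketch of a user-driven, multidimensional ranking:
# the user picks indicators; institutions are placed into ordered rank
# groups per indicator instead of being merged into one composite score.
institutions = {
    "Institution A": {"graduation_rate": 82, "joint_publications": 140, "regional_income": 12},
    "Institution B": {"graduation_rate": 91, "joint_publications": 60, "regional_income": 30},
    "Institution C": {"graduation_rate": 70, "joint_publications": 210, "regional_income": 8},
}

def rank_groups(data, indicators, n_groups=3):
    """Per selected indicator, split institutions into ordered groups (1 = top)."""
    result = {}
    for ind in indicators:
        ordered = sorted(data, key=lambda name: data[name][ind], reverse=True)
        size = -(-len(ordered) // n_groups)  # ceiling division
        result[ind] = {name: pos // size + 1 for pos, name in enumerate(ordered)}
    return result

user_choice = ["graduation_rate", "joint_publications"]  # chosen by the user
print(rank_groups(institutions, user_choice))
```

Each user selection produces a different, equally valid view of the same data; no single ordering is privileged.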
They reflect a growing international competition among universities for talent and resources; at the same time they reinforce competition by their very results.
1 http://en.wikipedia.org/wiki/Three_points_for_a_win
On the positive side they urge decision-makers to think bigger and set the bar higher,
Transparency tools are instruments that aim to provide information to stakeholders about the efforts and performance of higher education and research institutions.
Quality assurance, evaluation or accreditation also produces information for stakeholders (review reports, accreditation status) and in that sense helps to achieve transparency.
global), Businessweek (business schools, USA + global), The Economist (business schools; global). The major dimensions along which we analysed the classifications, rankings and league tables included:
while that type of scientific communication is prevalent only in a narrow set of disciplines (most natural sciences, some fields in medicine) but not in many others (engineering, other fields in medicine and natural sciences, humanities
-methodology.html
http://www.topuniversities.com/university-rankings/world-university-rankings
http://www.socialsciences.leiden.edu/cwts/products-services/leiden-ranking-2010
Surveys among stakeholders such as staff members, students, alumni or employers. Surveys are strong methods to elicit opinions such as reputation or satisfaction,
Student satisfaction and to a lesser extent satisfaction of other stakeholders is used in national rankings, but not in existing global university rankings.
and give them the opportunity to verify the 'pre-filled' data as well. The U-Map test with 'pre-filling' from national data sources in Norway proved successful
Student demand. There is evidence that student demand and enrolment in study programmes rises after positive statements in national, student-oriented rankings.
Both in the US and in Europe, rankings are not used equally by all types of students (Hazelkorn, 2011:
In nations across the globe, global rankings have prompted the desire for 'world-class universities' both as symbols of national achievement and prestige and supposedly as engines of the knowledge economy (Marginson, 2006).
The problem of the reputation race is that the investments do not always lead to better education and research,
and to policy-makers to consider where in the higher education system investment should be directed for the system to fulfil its social functions optimally.
The point of the preceding observations was not that all kinds of stakeholders react to rankings,
The various functions of higher education and research institutions for a heterogeneity of stakeholders and target groups can only be addressed adequately in a multidimensional approach.
Involvement of stakeholders in the process of designing a ranking tool and selecting indicators is crucial to keep feedback loops short,
This includes statistical procedures as well as the inclusion of the expertise of stakeholders, rankings and indicator experts, field experts (for the field-based rankings) and regional/national experts.
Different users and stakeholders should be able to construct different sorts of rankings. This is one of the Berlin Principles.
and does not produce the information most valued by major groups of stakeholders: students, potential students, their families, academic staff and professional organizations.
These stakeholders are interested mainly in information about a particular field. This does not mean that institutional-level rankings are not valuable to other stakeholders and for particular purposes.
The new instrument should allow for the comparisons of comparable institutions at the level of the organization as a whole and also at the level of the disciplinary fields in
or its transfer to stakeholders outside the higher education and research institutions (knowledge transfer) or to various groups of 'learners' (education).
A fourth assumption refers to the different stakeholders or users of rankings. Ranking information is produced to inform users about the value of higher education and research,
So stakeholders and users have to rely on information that is provided by a variety of transparency tools and quality assessment outcomes.
Other stakeholders (students and institutional leaders are prime examples) are interested precisely in what happens inside the box.
Students might also be interested in the long-term impact of taking the program as they may see higher education as an investment
which are relevant to the different stakeholders and their motives for using rankings. The conceptual grid shown below must be applied twice:
For different dimensions (research, teaching & learning, knowledge transfer) and different stakeholders/users the relevance of information about different aspects of performance may vary.
Additional context information may be needed to allow for the valid interpretation of specific indicators by different stakeholders. Table 2-1:
The AUBR Expert Group (a.o.) underlines the importance of stakeholders' needs and involvement, as well as the principles of purposefulness, contextuality,
and availability of the various indicators in practice.

3.2 Stakeholders' involvement

The indicator selection process is illustrated in Figure 3-1. This process is highly stakeholder-driven.
Various categories of stakeholders (student organizations, employer organizations, associations and consortia of higher education institutions, government representatives, international organizations) have been involved in an iterative process of consultation to come to a stakeholder-based assessment of the relevance
This first list was exposed for feedback to stakeholders as well as to groups of specialist experts. Stakeholders were asked to give their views on the relative relevance of various indicators
presented to them as potential items in the five dimensions of U-Multirank (see 3.3). In addition,
The information gathered was fed into a second round of consultations with stakeholder organizations. In all some 80 national and international organizations participated in the consultation process.
To further support the stakeholder consultation process, an on-line questionnaire was used. Through this process an additional 40 organizations offered their views.
The stakeholders' consultation process led to the selection of a set of indicators based on the criterion of relevance (according to stakeholders' perspectives).
Figure 3-1: The indicator selection process: literature review, review of existing rankings and of existing databases; first selection; stakeholder consultation and expert advice; second selection; pre-test; revision; final selection.
Based on the various stakeholders' and experts' assessments of the indicators as well as on our analyses using the four additional criteria,
As one of the main objectives of our U-Multirank project is to inform stakeholders such as students,
peer learning, counselling services, etc.
8 Outputs are direct products of a process; outcomes relate to achievements due to the outputs.
9 http://www.oecd.org/document/22/0,3343,en_2649_35961291_40624662_1_1_1_1,
Students'opinions may derive from investment or from consumption motives but it is an axiom of economic theories as well as of civil society that persons know their own interest (and experience) best.
Therefore we have chosen indicators reflecting both. An issue might be whether student satisfaction surveys are prone to manipulation:
and findings that came out during the stakeholder/expert consultations and the pretesting phases of the selection process (Table 3-1). Table 3-1:
Stakeholders questioned relevance. 2 Graduation rate The percentage of a cohort that graduated x years after entering the program (x is the stipulated ('normal') time expected for completing all requirements for the degree times 1.5)
Graduation rate is regarded by stakeholders as the most relevant indicator. It shows the effectiveness of the schooling process. More selective institutions score better compared to (institutions in) open access settings.
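To make the definition concrete, here is a minimal sketch (with an invented cohort) of how the graduation rate could be computed, using the cutoff of 1.5 times the nominal program duration:

```python
# Hypothetical sketch: graduation rate as the share of an entry cohort
# graduating within x years, where x = 1.5 * nominal program duration.
def graduation_rate(years_to_degree, nominal_duration):
    """years_to_degree: completion time per student, None if not (yet) graduated."""
    cutoff = 1.5 * nominal_duration
    graduated = sum(1 for y in years_to_degree if y is not None and y <= cutoff)
    return 100.0 * graduated / len(years_to_degree)

# Invented cohort for a 3-year bachelor program (cutoff = 4.5 years):
cohort = [3.0, 3.5, 4.0, 5.0, None, 4.5, 6.0, 3.0]
print(round(graduation_rate(cohort, 3), 1))  # 62.5: 5 of 8 within 4.5 years
```

Note how the denominator is the whole entry cohort, which is why open-access institutions tend to score lower than selective ones on this indicator.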
Relevant indicator according to stakeholders: shows teaching leads to broadly-educated graduates. But sensitive to regulatory (accreditation) and disciplinary context.
of graduates 18 months after graduation) (for bachelor graduates and master graduates). Reflects the extent to which an institution is 'in sync' with its environment.
1.5) See above institutional ranking. 8 Investment in laboratories For Engineering FBR: investment in laboratories (average over the last five years, in millions in national currencies) per student High
opportunities for early participation in research and stimulation to give conference papers. 18 Student satisfaction:
support services/e-services. 20 Student satisfaction: Organization of program The satisfaction of students with the organization of a program,
Opportunities for a stay abroad Index made up of several items: The attractiveness of the university's exchange programs and the partner universities;
Student services Quality of a range of student services including: general student information, accommodation services,
financial services, career service, international office and student organizations/associations 27 Student Satisfaction: University webpage Quality of information for students on the website.
English-language information (for international students in non-English speaking countries) One indicator dropped from the list during the stakeholder consultation is graduate earnings.
Stakeholders'feedback on the student satisfaction indicators revealed that they have a positive view overall of the relevance of the indicators on student satisfaction.
and services provided by the institution to enhance the learning experience (e.g. laboratories, curriculum).

3.3.2 Research

Selecting indicators for capturing the research performance of a higher education and research institution or a disciplinary unit (e.g. department,
Impact indicators, referring to the contribution of research outcomes to society, culture, the environment and/or the economy.
Book chapters, Monographs/Books, Artefacts, Prototypes. Source: Expert Group on Assessment of University-Based Research (2010). Apart from using existing bibliometric databases,
along with some comments reflecting their assessment (by stakeholders and experts) against the criteria discussed in the first section of this chapter.
and in (inter)national cultural competitions, including awards granted by academies of science. Indicator of peer esteem.
However, stakeholders regarded them as relevant, even though data availability and definitions may sometimes pose a challenge.
The process by which the knowledge, expertise and intellectually linked assets of higher education institutions are applied constructively beyond higher education for the wider benefit of the economy and society, through two-way engagement with business, the public sector, cultural and community partners.
i.e. business and the economy, has now become a preoccupation of many governing and funding bodies, as well as policy-makers.
TTOs provide services in terms of assessing inventions, patenting, licensing IP, developing and funding spin-offs and other start-ups, and approaching firms for contract-based arrangements.
A typical classification of mechanisms and channels for knowledge transfer would include four main channels of interaction between higher education and research institutions and their environment:
together with, in the right-hand column, some of the pros and cons of the indicators expressed by experts and stakeholders during the indicator selection process.
as a proportion of all patents. See above institutional ranking. 12 Joint research contracts with private sector Budget and number of joint research projects with private enterprises per FTE academic
Cultural awards and prizes won in (inter)national cultural competitions would be an additional indicator that goes beyond the traditional technology-oriented indicators.
and increase international competition. The rationales that drive these activities are diverse. Among others, they comprise (IAU, 2005:
Foreign academic staff is academic staff with a foreign nationality. Considered to be relevant by stakeholders.
Some stakeholders see it as less relevant. Availability of data problematic. 4 International joint research publications Relative number of research publications that list one or more author affiliate addresses in another country relative to research staff
Stakeholders question relevance. 10 Student satisfaction: Internationalization of programs Index including the attractiveness of the university's exchange programs, the attractiveness of the partner universities, the sufficiency of the number of exchange places;
Stakeholders consider the indicator important. 13 Student satisfaction: International orientation of programs Rating including several issues:
This describes the opportunities for students to go abroad. Students' judgments about the opportunities to arrange a semester
or an internship abroad are an aspect of the internationalization of programs. This indicator is relevant for the field level.
but dropped during the stakeholders' consultation process is 'Size of international office'. While this indicates the commitment of the higher education and research institution to internationalization,
stakeholders consider this indicator not very important. Moreover, the validity is questionable as the size of the international office as a facilitating service is a very distant proxy indicator.
because a large majority of stakeholders judged this to be insufficiently relevant. At the field level this indicator was seen
However, it was dropped from the list during the stakeholder consultation as there is no clear internationally accepted way of counting partnerships.
a social dimension, an enterprise dimension and an innovation dimension. The latter two dimensions are covered in the U-Multirank dimension 'Knowledge Transfer'.
along with the comments made during the stakeholder and expert consultations. Table 3-6: Indicators for the dimension Regional Engagement in the Focused Institutional and Field-based Rankings Focused Institutional Ranking Definition Comments 1 Graduates working in the region The number of graduates working in the region,
Stakeholders like the indicator. No national data on graduate destinations.
19 http://epp.eurostat.ec.europa.eu/portal/page/portal/region_cities/regional_statistics/nuts_classification
20 http://www.oecd
New type of indicator. 5 Student internships in local/regional enterprises The number of student internships in regional enterprises as a percentage of total enrolment (with defined minimum of weeks
and/or credits) Internships open up communication channels between HEIs and regional/local enterprises. Stakeholders see this as an important indicator.
Definition of internship problematic and data not readily available. Disciplinary bias. Field-based Ranking Definition Comments 6 Degree theses in cooperation with regional enterprises Number of degree theses in cooperation with regional enterprises as a percentage of total number
of degree theses awarded, by level of program Reflects regional cooperation and curricular engagement. Indicator hardly ever used. 7 Graduates working in the region The number of graduates working in the region,
Indicator hardly ever used. 9 Student internships in local/regional enterprises Number of internships of students in regional enterprises (as a percentage of total students). See above institutional ranking,
and stakeholders did not particularly favor the indicator. Therefore it was dropped from our list. The same holds for measures of the regional economic impact of a higher education institution,
However, stakeholders did not consider this indicator relevant. A high percentage of new entrants from the region may be seen as the result of the high visibility of regionally active higher education and research institutions.
US; AR, CA, SA, US
Regional Engagement: income from regional sources (AU, CA, SA, ZA); student internships in local/regional enterprises (AU)
joint R&D projects with local enterprises; students: total number (female/international degree and exchange students;
degree theses in cooperation with local enterprises; regional engagement: continuing education programmes/professional development programmes;
This was the case for 'graduates working in the region' and 'student internships in regional enterprises'.
and learning experience/environment. Some students would have preferred more questions about the social climate at the institution
Based on approved instruments from other fields (e.g. surveys on health services) we have used 'anchoring vignettes' to test sociocultural differences in assessing specific constellations of services/conditions in higher education with respect to teaching and learning.
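A minimal sketch of the nonparametric recoding idea behind anchoring vignettes may help here (all ratings invented; ties between self-rating and vignette ratings, which the literature treats as intervals, are handled by a simple >= rule): each respondent rates the same fixed vignettes plus themselves, and the self-rating is re-expressed by its position among that respondent's own vignette ratings, absorbing individual differences in scale use:

```python
# Hypothetical sketch of nonparametric anchoring-vignette recoding:
# a respondent's self-rating is re-expressed by its position relative to
# the same respondent's ratings of fixed vignettes (1 = below all vignettes).
def recode(self_rating, vignette_ratings):
    """Return the interval of `self_rating` among the sorted vignette ratings."""
    ordered = sorted(vignette_ratings)
    position = 1
    for v in ordered:
        if self_rating >= v:  # simple tie rule; ties are intervals in the full method
            position += 1
    return position  # ranges from 1 to len(vignette_ratings) + 1

# Two respondents use a 1-5 satisfaction scale very differently, but
# after recoding their answers become comparable:
lenient = recode(4, [2, 3])   # rates self above both vignettes
strict = recode(2, [1, 1])    # also above both vignettes
print(lenient, strict)  # prints: 3 3
```

The recoded values are comparable across respondents precisely because each respondent serves as their own anchor.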
All institutions had clear communication partners from the U-Multirank team.

4.4 A concluding perspective

This chapter, providing a quick survey of existing databases,
One could even go beyond these stakeholder groups and include employers and other clients of higher education and research institutions,
respondents will always have to have the opportunity to provide footnotes and comments to the data they submit through the questionnaires.
an Institute for Water and Environment, an agricultural university, a School of Petroleum and Minerals, a military academy, several music academies and art schools, universities of applied sciences and a number of technical universities.
This provided the institutions with an opportunity for a second submission in which they could provide answers to the questions,
assessment of data procedures and communication. Other questions in the follow-up survey referred to the efficiency of data collection and the clarity of the questionnaires.
regional engagement) (see Figure 5-4).
Figure: ratings on a five-point scale (very good, good, neutral, poor, very poor) for 'General procedures' and 'Communication with U-Multirank'.
all universities had clear communication partners in the U-Multirank team. The main part of the verification process consisted of the data cleaning procedures after receiving the data.
A general and central feature of these procedures was the direct communication with the institutions.
www.socialsciences.leiden.edu/cwts/products-services/scoreboard.html
4) Regional joint research publications Frequency count of publications with at least one author address referring to the selected main organization
and the potential upscaling of U-Multirank to a globally applicable multidimensional ranking tool.

6.2 Feasibility of indicators

In the pilot study we analyzed the feasibility of the various indicators that were selected after the multi-stage process of stakeholder
the relative importance of the indicator according to the various stakeholders' perspectives; validity: the indicator measures
rating 'B' indicates that some stakeholders and/or experts have expressed some doubts regarding one or two selection criteria.
For this reconsideration process a special and final stakeholders' workshop was organized. For indicators with a problematic feasibility score there are two options:
The last column (In/Out) in the tables shows the respective conclusions on those indicators based on consultation with stakeholders
nevertheless as highly relevant by stakeholders. The indicator 'inclusion of work experience' is a composite indicator using a number of data elements (e.g. internships, teachers' professional experience outside HE) on employability issues;
Stakeholders, in particular representatives of art schools, stressed the relevance of this indicator despite the poor data situation.
initiatives should also come from providers of (bibliometric) databases as well as stakeholder associations in the sector.
staff with work experience outside HE A b
Joint research contracts with private enterprise A b
Patents awarded** C C Out
Co-patenting** B c Out
Annual income from licensing B c
There was agreement among stakeholders, therefore, that those indicators should be used for focused institutional rankings only.

6.2.4 International orientation

Most of the indicators on the dimension 'international orientation' proved to be relatively unproblematic in terms of feasibility.
A a-B
Opportunities to study abroad (student satisfaction) A b
International orientation of programs A b
International academic staff B A-B
International joint research publications
working in the region B c In
Research contracts with regional partners B b
Regional joint research publications* B A
Percentage of students in internships in local enterprises B c In
*Data source:
and the relevance of higher education and research to the regional economy and the regional society at large,
and stakeholders were strongly in favor of keeping the indicator (both for institutional and for field-based rankings).
education B c Out
Student internships in local enterprises B b-C In
Degree theses in cooperation with regional enterprises B b-C In
Summer schools C C
Based on feedback from institutions and stakeholders, this indicator cannot be seen as feasible; there is probably no way to improve the data situation in the short term.
While far from good, the data situation on student internships in local enterprises and degree theses in cooperation with local enterprises turned out to be less problematic in business studies than that found in the engineering field.
and knowledge of local higher education institutions to be utilized in a regional context, in particular in small-and medium-sized enterprises.
Although it implied a major investment of time by the project team, this procedure proved to be very efficient
and use ordinal response categories to evaluate services and social situations in general (cf. King et al. 2004; King and Wand 2006).
For communication purposes, authors may prefer to replace the name of the university with the name of a 137 network.
While this provides a unique opportunity to compare and benchmark with over 100 other institutions worldwide,
From their participation in the various stakeholder meetings, we can conclude that there is broad interest in the further development and implementation of U-Multirank.
We also expect that there will be continuing interest from outside Europe from institutions wishing to benchmark themselves against European institutions.
And we believe that there are opportunities for the targeted recruitment of groups of institutions from outside Europe of particular interest to European higher education.
some issues of up-scaling to other fields have been discussed in the course of the stakeholder consultation.
Following the user- and stakeholder-driven approach of U-Multirank, we suggest that field-specific indicators for international rankings should be developed together with stakeholders from these fields.
We encourage stakeholders and organizations to actively participate in the development of relevant field-specific indicators,
in particular in those areas and fields which so far have largely been neglected in international rankings due to the lack of adequate data
Figure: indicators shown per institution (% income from third-party funding; CPD courses offered; start-up firms; international academic staff; % international students; joint international publications; graduates working in the region; student internships in local enterprises; regional co-publications) for Institutions 1-9 across the dimensions Teaching & Learning, Research and Knowledge Transfer.
and discussed at a U-Multirank stakeholder workshop and there was a clear preference for the 'sunburst' chart similar to the one used in U-Map.
but it offers users the opportunity to create informed judgments of the importance of specific contexts
During the further development of U-Multirank the production of contextual information will be an important topic.
31 See www.lisboncouncil.net

7.5 User-friendliness

U-Multirank is conceived as a user-driven and stakeholder
the definition of the indicators, processes of data collection and discussion on modes of presentation have been based on intensive stakeholder consultation.
and a feasible business model to finance U multirank (see chapter 8). Another important aspect of user-friendliness is the transparency about the methodology used in rankings.
An authoritative ranking could be produced from the perspective of a specific stakeholder or client organization.
Despite this clear need for cross-national/European/global data there will be a continued demand for information about national/regional higher education systems, in particular with regard to undergraduate higher education.
The prototypes of the instrument will demonstrate the outcomes and benefits of U-Multirank. 2. Setting of standards and norms and further development of underdeveloped dimensions and indicators.
Business model 8. Business plan and marketing. If the objective is to establish U-Multirank as largely self-sustainable,
a business plan is required. It could be a good idea to involve organizations with professional business expertise in the next project phase in order to work out a business plan,
and to analyze the revenue-generating potential, development of marketable products, pricing issues etc. The business plan must address a fundamental contradiction:
the user-driven approach imbues U-Multirank with strong democratic characteristics and a role far from commercial interests,
Communication 10. Communication and recruitment drive. The features of and opportunities offered by U multirank need to be communicated continuously.
Since the success of U-Multirank requires institutions' voluntary participation, a comprehensive promotion and recruitment strategy will be needed,
requiring the involvement of many key players (governments, European Commission, higher education associations, employer organizations, student organizations).
A crucial issue related to communication is the user-friendliness of U-Multirank. This could be guaranteed by the smoothness of data collection
and the services delivered to participants in the ranking process. But user-friendliness also deals with the design of the web tool,
Elements of a new project phase (work package: products, deadline):
Database and web tool: functioning database, functioning web tool prototype (06/2012)
Standards
Pre-filling opportunities (including EUMIDA cooperation): pre-filled questionnaires, coordination with national rankings (06/2012, 12/2012, 12/2012)
Roll-out: invitation, targeted non-European recruitment
Business plan, user-friendliness and communication: advisory boards work; consortium, formal organization and business plan (including funding structure) established; development of marketable products (06/2012, 12/2013)
The criteria were derived from the analytical findings of the feasibility study, from the stakeholder consultation process,
The transparency tool must have the trust of participating institutions and other stakeholders. This means that the organization managing the instruments must be accountable and subject to continuous evaluation and assessment.
This will guarantee high standards of the planning, implementation, communication and further development of the instruments,
A key element of U-Multirank is the flexible, stakeholder-oriented, user-driven approach. The implementation has to ensure this approach,
for instance by integrating stakeholders into consultation structures, creating information products for stakeholder needs and service-oriented communication processes.
In general, the involvement of relevant actors in both the implementation of U multirank and its governance structure is a crucial success factor.
those parties taking responsibility for the governance of U-Multirank should be accepted broadly by stakeholders. Those who will be involved in the implementation should allow their names to be affiliated with the new instrument
In this model (a consortium of) private, for-profit organizations would run the instrument with for-profit objectives.
Products and services would be made available to users at market-based tariffs. The strategy, use of the instrument
and its further development would be driven by market demands. Potential organizations could be newly founded, but existing institutions could also take on the role,
Stakeholder model: In this model, major stakeholders, i.e. student organizations and associations of higher education institutions, would be responsible for the operation of the transparency instrument.
Independent nonprofit model: In this model, an existing or new cross-national/international organization (or alliance of organizations) independent of government or direct stakeholder interests would act as principal of the transparency tools.
The organization would work under nonprofit conditions and would have to find a funding structure covering the cost.
COMMERCIAL MODEL
Pro: Independence from direct political influence. Profit orientation is a good incentive to be efficient.
Con: Only services with sufficient demand are offered (market dependence); the multidimensional approach and inclusiveness are endangered. Profit orientation endangers quality and credibility. Doubts about financial feasibility: if higher education institutions experience high workloads in data collection, they expect free products in return.

GOVERNMENT MODEL
Con: Service orientation might not be the primary interest in a state-run system.

STAKEHOLDER MODEL
Pro: High legitimacy and acceptance among the stakeholders included. Good chance for international orientation.
Con: If not all stakeholders are represented, inclusiveness becomes difficult. Inefficiency because of difficulties in finding common ground between stakeholders. No independence from the interests of stakeholder organizations can be ensured, creating credibility problems from the point of view of the end user.

INDEPENDENT NONPROFIT MODEL
Pro: Institutions with a strong funding base, such as foundations, enhance sustainability.
Con: Institutions with a weak funding base, such as research institutes, endanger sustainability.
Table: Assessment of the four models for implementing U-Multirank (Commercial, Government, Stakeholder, Independent nonprofit) against the criteria inclusiveness, international orientation, independence, professionalism, sustainability, efficiency, service orientation and credibility, with each model rated + or - on each criterion.
The stakeholder model should be recognized insofar as an advisory board could guarantee the connection to the relevant stakeholder groups.
The business plan has to include opportunities to charge fees, sell products and involve commercial partners.
It is recommended, therefore, that rankings be operated (initially) on a project basis by existing professional organizations, with strong involvement of both stakeholder and expert advisory bodies.
Stakeholder and expert advisory councils should be installed in a form that can continue to operate after the two-year project phase, in order to support the development of a viable business plan and a partnership with professional organizations.
The professional organizations responsible for the first phase could establish the ranking unit as a joint venture with the stakeholder organizations.
This structure also allows the commercial unit to operate as a joint venture with for-profit partners.
Figure: Proposed governance structure: an operating project consortium with national and field-based ranking partner(s) and for-profit institution(s), advised by a stakeholder advisory council and an expert advisory council.
The tasks of the operating unit include:
- Methodological development and updates
- Communication activities
- Implementation of (technical) infrastructure
- Development of a database
- Provision of tools for data collection
- Data collection (again including communication)
- Data analysis (including self-collected data as well as analyses based on existing data sets, e.g. bibliometric analysis)
- Data publication (including development and maintenance of an interactive web tool)
- Basic information services for users
- Internal organization
This determines the volume of data that has to be processed and the communication effort required. A further factor is the number of countries/institutions that deliver data free of charge through a bottom-up system (this avoids costs).
An additional factor is the technological environment for the surveys; a student survey, for instance, is much more expensive if it cannot be administered electronically.
Fixed and variable cost factors, by step:
- Methodological developments and updates. Fixed: staff demand. Variable: cycle of revision/update of concepts; intensity of stakeholder involvement.
- Communication activities. Fixed: staff demand. Variable: number of countries and institutions covered; intensiveness of communication (written only, electronic, workshops, etc.).
- Implementation of (technical) infrastructure. Fixed: basic IT costs. Variable: … to present results.
- Information services for users. Fixed: staff; basic IT costs. Variable: number of countries and institutions covered; range of indicators and databases; scope of information services.
- Internal organization. Fixed: costs for internal communication/meetings. Variable: size of the operative unit (the size of the operative unit is itself dependent on all factors listed above).

The major fixed cost elements in overview include:
and performance measurement and in stakeholder communication processes (as head of the project/unit); (2) two junior staff members with experience in statistics, empirical research, large-scale data collection and IT; (3)
Marketing and communication: the design and development of information packages on ranking and the dissemination of the outcomes as well as the staff time needed to do this.
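The interplay of the fixed and variable cost factors described above can be illustrated with a small sketch. All figures below are hypothetical placeholders chosen only to show how a fixed cost block combines with per-institution variable costs as coverage grows; the report identifies the cost factors but not their magnitudes.

```python
# Illustrative sketch of the fixed-plus-variable cost structure.
# All figures are hypothetical placeholders, not estimates from the report.

FIXED_COSTS = {                      # per year, independent of scale
    "methodology_staff": 200_000,
    "communication_staff": 120_000,
    "basic_it": 80_000,
}

# Variable costs scale with the number of participating institutions.
VARIABLE_COST_PER_INSTITUTION = {
    "data_collection": 400,
    "data_analysis": 250,
    "information_services": 100,
}

def annual_cost(n_institutions: int) -> int:
    """Total annual cost = fixed block + per-institution variable costs."""
    fixed = sum(FIXED_COSTS.values())
    variable = sum(VARIABLE_COST_PER_INSTITUTION.values()) * n_institutions
    return fixed + variable

# A pilot with 150 institutions versus a 700-institution roll-out:
pilot = annual_cost(150)     # 400_000 + 750 * 150 = 512_500
rollout = annual_cost(700)   # 400_000 + 750 * 700 = 925_000
```

The point of the sketch is that the fixed block dominates at small scale, so average cost per institution falls sharply as coverage expands, which matters for any of the funding models discussed below.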
There are a number of possible funding sources for U-Multirank: a) Basic funding by the governing institutions in the form of a lump sum.
Funding sources per model:
- Commercial: …
- Government: governmental funding as basic source; further options (c), (d), (f), (g), (h), (i).
- Stakeholder: no basic funding source; options (b), (d), (e), (f), (g), (h), (i).
- Independent nonprofit: basic funding by the owners; further options (b), (d), (e), (f), (g), (h), (i).

Again, together with the stakeholder model, the independent nonprofit model has the broadest set of funding options and, in contrast to the stakeholder model, also a clear potential basic funding source.
If it is combined with the commercial model, all relevant funding options are available. The funding scenarios could be specified further,
but there is a possibility of some cross-subsidization from selling more sophisticated products, such as data support for institutional benchmarking processes or special information services for employers.
and a business plan has to be designed. After two more years, the roll-out of the system should include about 700 European higher education institutions and about 500 institutions in the field-based ranking for each of the three fields.
Organizational options such as market, stakeholder, government or independent nonprofit models should be seen as complementary approaches.