The so-called consumer doubles as a domestic producer (a cook, a mother, a carer, a shopper, a driver, a nurse, a gardener, a teacher or student), entailing so much of
Until recently, Un Techo para Chile had no legal status; it was simply a loose network of students, young professionals, and residents.
One example: organic farming students at Everdale, an organic farm and environmental learning centre.
giving students the experience of working in small social enterprises. These could play a critical role in training up a future cadre of social innovators. 249) Mutual help and mentoring by users.
a two-million-square-foot research centre that brings together scientific leaders and postdoctoral students, with a target of 4,000 researchers on-site by 2015,
where students are divided into action learning sets for the duration of the one-year course. 306) Membership organisations like the Royal Society for the Encouragement of the Arts, Manufactures and Commerce (RSA) in the UK
This is the West Philly Hybrid X Team, a group of students from West Philadelphia High School's Academy of Automotive and Mechanical Engineering with their entry, the EVX.
when a group of students set up their own visual arts studio. The students work
[Photo caption: Processing bamboo as part of Prosperity Initiative's plan to transform the bamboo sector in Northwest Vietnam.]
In two years the project has enabled 22,000 people to move out of poverty. The project's goal is to move 750,000 people out of income poverty across Vietnam, Lao PDR and Cambodia by 2020.
The students are responsible for running the studio and raising funds. In this way, it combines creative freedom, business practice,
and opening parts of parks or schools for residents and students to grow flowers, fruits, and vegetables.
which can then be eaten by students at lunchtime. 515) Community centres that merge into household activities: childcare, entertainment,
Secondly, the university's continuous capacity to provide students with new ideas, skills and entrepreneurial talent has become a major asset in the Knowledge Society.
Students are not only the new generations of professionals in various scientific disciplines, business, culture etc., but they can also be trained
and job creation (see, for example, StartX, Stanford's student start-up accelerator, which in less than a year trained 90 founders and 27 companies,
or the Team Academy, the Entrepreneurship Centre of Excellence of JAMK University of Applied Sciences in Jyväskylä, Finland, where students run their own cooperative businesses based on real-life projects).
The fashion department of the Antwerp Academy in Belgium encourages students to create and explore innovative forms,
In 2010, Kista Science City counted over 1,000 ICT companies and over 5,000 ICT students and scientists, a high concentration of expertise, innovation and business opportunities within ICT
transparent and comparable information would make it easier not only for students and teaching staff but also for parents and other stakeholders to make informed choices between different higher education institutions and their programmes.
Education and Culture but other experts drawn from student organisations, employer organisations, the OECD, the Bologna Follow-up Group and a number of associations of universities.
Stakeholder workshops were held four times during the project with an average attendance of 35 representatives drawn from a wide range of organisations including student bodies, employer organisations, rectors' conferences, national university associations and national representatives.
or field-based rankings. Student-staff ratio; graduation rate; qualification of academic staff; research publication output; external research income; citation index; % income third-party funding; CPD courses offered; start-up firms; international academic
e.g. students vs. institutional leaders vs. policy-makers. Methodology and producers: which methodological principles are applied and
and focusing on performance. Rankings for students, such as those of CHE and Studychoice123, which have a clear focus based on a single target group,
and with a transparent methodology. Qualifications frameworks and Tuning Educational Structures show that, at least qualitatively, it is possible to define performances regarding student learning
thus strengthening the potential information base for dimensions other than fundamental research. Comparative assessment of higher education students' learning outcomes (AHELO):
this feasibility project of the OECD to develop a methodology extends the focus on student learning introduced by Tuning and by national qualifications frameworks into an international comparative assessment of undergraduate students,
(4.5%); income per staff (2.25%); ratio of PhD awards to bachelor awards (2.25%); faculty-student ratio (20%). HEEACT 2010
and Fields Medals (20%); highly cited researchers in 21 broad subject categories (20%). Reputation: peer review survey (19.5 + 15 = 34.5%); international staff score (5%); international students score (5%)
staff and students (5%); industry income per staff (2.5%); international faculty (5%); international students (5%). Website: http://ranking.heeact.edu.tw
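The percentages quoted above are fixed weights that composite global rankings apply to normalized indicator scores before summing them into a single league-table score. The sketch below illustrates only that aggregation step; the indicator values are invented, and the weights shown are a subset of a full scheme (so they deliberately do not sum to 1). It is not the actual computation of any named ranking.

```python
# Illustrative weighted-composite calculation; weights echo those quoted
# above, indicator values are invented for the example.
weights = {
    "faculty_student_ratio": 0.20,     # 20%
    "income_per_staff": 0.0225,        # 2.25%
    "phd_bachelor_ratio": 0.0225,      # 2.25%
    "international_staff": 0.05,       # 5%
    "international_students": 0.05,    # 5%
}

def composite_score(indicator_scores: dict, weights: dict) -> float:
    """Weighted sum of already-normalized (0-100) indicator scores."""
    return sum(weights[name] * indicator_scores[name] for name in weights)

example_institution = {
    "faculty_student_ratio": 72.0,
    "income_per_staff": 55.0,
    "phd_bachelor_ratio": 60.0,
    "international_staff": 80.0,
    "international_students": 65.0,
}
print(round(composite_score(example_institution, weights), 2))  # -> 24.24
```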
Surveys among stakeholders such as staff members, students, alumni or employers. Surveys are strong methods to elicit opinions such as reputation or satisfaction,
Student satisfaction and to a lesser extent satisfaction of other stakeholders is used in national rankings, but not in existing global university rankings.
Student demand. There is evidence that student demand and enrolment in study programmes rise after positive statements in national, student-oriented rankings.
Both in the US and Europe, rankings are not used equally by all types of students (Hazelkorn, 2011:
less by domestic undergraduate entrants, more at the graduate and postgraduate levels. Especially at the undergraduate level, rankings appear to be used particularly by students of high achievement and by those coming from highly educated families (Cremonini, Westerheijden & Enders, 2008;
Heine & Willich, 2006; McDonough, Antonio & Perez, 1998). Institutional management. Rankings strongly impact the management of higher education institutions.
The majority of higher education leaders report that they use potential improvement in rank to justify claims on resources (Espeland & Sauder, 2007;
i.e. a situation where already strong institutions are able to attract more resources from students (e.g. by increasing tuition fees),
Most of the effects discussed above are rather negative for students, institutions and the higher education sector.
Similarly, rankings may provide useful stimuli to students to search for the best-fitting study programmes
For some target groups, in particular students and researchers, information has to be field-based; for others, e.g. university leaders and national policy-makers, information about the higher education institution as a whole has priority (related to the strategic orientation of institutions;
students, potential students, their families, academic staff and professional organizations. These stakeholders are interested mainly in information about a particular field.
Other stakeholders (students and institutional leaders are prime examples) are interested precisely in what happens inside the box.
For instance, students may want to know the quality of teaching in the field in which they are interested.
Students might also be interested in the long-term impact of taking the program as they may see higher education as an investment
budgets, personnel, students, facilities, etc. Then too, inputs and processes can be influenced by managers of higher education and research institutions.
but in the end it rests with the students to learn and, after graduation, work successfully with the competencies they have acquired.
the percentage of international students is a valid indicator only if scores are not heavily influenced by citizenship laws.
and therefore has a higher validity than using the citizenship of the student. Reliability. Reliability refers to the consistency of a set of measurements or of a measuring instrument.
This is particularly an issue with survey data (e.g. among students, alumni, staff) used in rankings.
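As a purely illustrative aside (the report itself does not prescribe any statistic), one common way to quantify this notion of consistency for a multi-item survey scale is Cronbach's alpha; the sketch below uses simulated ratings and invented parameters.

```python
# Cronbach's alpha as one common consistency measure for a multi-item scale;
# purely illustrative, with simulated survey data.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x items matrix of scores on the same construct."""
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)    # variance of summed scores
    return k / (k - 1) * (1 - item_var / total_var)

rng = np.random.default_rng(1)
construct = rng.normal(size=(100, 1))                # latent satisfaction per respondent
items = construct + 0.5 * rng.normal(size=(100, 5))  # five noisy items on that construct
print(round(cronbach_alpha(items), 2))               # high value -> consistent items
```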
if we know that doctoral students are counted as academic staff in some countries and as students in others,
Various categories of stakeholders (student organizations, employer organizations, associations and consortia of higher education institutions, government representatives, international organizations) have been involved in an iterative process of consultation to come to a stakeholder-based assessment of the relevance
students). Education can be conceived as a process subdivided into enablers (inputs, process) and performance (outputs and outcomes).
Students' learning outcomes after graduation would be a good measure of outcomes. However, measures of learning outcomes that are internationally comparable are only now being developed in the AHELO project (see Chapter 1). At this moment such measures do not exist,
As one of the main objectives of our U-Multirank project is to inform stakeholders such as students,
and student quality and quantity. The process of education includes the design and implementation of curricula, with formal teaching, self-study,
Besides, measures of students' progress through their programs can be seen as indicators of the quality of their learning.
Indicators for quality can be sought in student and graduate assessments of their learning experience. The student/graduate experience of education is conceptually closer to
what those same students learn than judgments by external agents could be. Students' opinions may derive from investment or from consumption motives
but it is an axiom of economic theories as well as of civil society that persons know their own interest (and experience) best.
Therefore we have chosen indicators reflecting both. An issue might be whether student satisfaction surveys are prone to manipulation:
do students voice their loyalty to the institution rather than their genuine (dis)satisfaction? This is not seen as a major problem, as studies show that loyalty depends on satisfaction (Athiyaman, 1997;
Brown & Mazzarol, 2009; OECD, 2003). Nevertheless, we should remain vigilant to uncover signs of university efforts to manipulate their students' responses;
in our experience, including control questions in the survey on how and with which additional information students were approached to participate gives a good indication.
Non-plausible student responses (for instance an extremely short time to complete the online questionnaire) could be eliminated.
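A minimal sketch of what such a plausibility filter could look like follows; the field names, the three-minute threshold and the control-question flag are assumptions for illustration, not the project's actual data model:

```python
# Hypothetical plausibility filter for online survey responses; the field
# names and the 3-minute threshold are illustrative assumptions.
MIN_COMPLETION_SECONDS = 180

def is_plausible(response: dict) -> bool:
    """Keep a response only if it passes basic plausibility checks."""
    # Eliminate extremely short completion times (rushed or automated answers).
    if response["completion_seconds"] < MIN_COMPLETION_SECONDS:
        return False
    # Control question: did the reported invitation suggest institutional
    # pressure or coaching on how to answer? If so, drop the response.
    if response.get("coached_by_institution", False):
        return False
    return True

responses = [
    {"completion_seconds": 45, "coached_by_institution": False},
    {"completion_seconds": 900, "coached_by_institution": False},
    {"completion_seconds": 700, "coached_by_institution": True},
]
clean_sample = [r for r in responses if is_plausible(r)]
print(len(clean_sample))  # -> 1
```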
Another issue about using surveys in international comparative studies concerns differences in culture that affect tendencies to respond in certain ways.
Field-based ranking (definition; comments). 6 Student-staff ratio: the number of students per FTE academic staff. Fairly generally available.
(1.5) See above, institutional ranking. 8 Investment in laboratories: for engineering FBR, investment in laboratories (average over last five years, in millions in national currencies) per student. High
joint courses/projects with business students (engineering); business knowledge (engineering); project management; presentation skills; existence of an external advisory board (including employers). Problems with regard to availability of data. 13 Inclusion of work experience into the program: rating based on duration (weeks/credits) and modality
access to computer support. Data easily available. 15 Student gender balance: number of female students as a percentage of total enrolment. Indicates social equity (a balanced situation is considered preferable).
Student satisfaction indicators: indicators reflecting students' appreciation of several items related to the teaching & learning process.
Student satisfaction is of high conceptual validity. It can be made available in a comparative manner through a survey.
whether student satisfaction surveys are prone to manipulation: do students voice their loyalty to the institution rather than their genuine (dis)satisfaction?
Global comparability is problematic: cross-cultural differences may affect the students' answers to the questions.
16 Student satisfaction: Overall judgment of program. Overall satisfaction of students with their program and the situation at their higher education institution. Refers to a single question giving an 'overall' assessment; no composite indicator.
17 Student satisfaction: Research orientation of educational program. Index of four items: research orientation of the courses, teaching of relevant research methods, opportunities for early participation in research and stimulation to give conference papers.
18 Student satisfaction: Evaluation of teaching. Satisfaction with regard to students' role in the evaluation of teaching, including prevalence of course evaluation by students, relevance of issues included in course evaluation, information about evaluation outcomes, and impact of evaluations.
19 Student satisfaction: Facilities. The satisfaction of students with respect to facilities. Classrooms/lecture halls: index including availability/access for students, number of places, technical facilities/devices. Laboratories: index including availability/access for students, number of places, technical facilities/devices. Libraries: index including availability of literature needed, access to electronic journals, support services/e-services.
20 Student satisfaction: Organization of program. The satisfaction of students with the organization of a program, including the possibility to graduate in time, access to classes/courses, class size, and the relation of examination requirements to teaching.
21 Student satisfaction: Promotion of employability (inclusion of work experience). Index of several items: students assess the support during their internships; the organization, preparation and evaluation of internships; and the links with the theoretical phases.
22 Student satisfaction: Quality of courses. Index including range of courses offered, coherence of modules/courses, didactic competencies of staff, stimulation by teaching, quality of learning materials, and quality of laboratory courses (engineering).
23 Student satisfaction: Social climate. Index including interaction with other students, interaction with teachers, attitude towards students in the city, and security.
24 Student satisfaction: Support by teachers. Included items: availability of teachers/professors (e.g. during office hours, via email); informal advice and coaching; feedback on homework, assignments, examinations; coaching during laboratory/IT tutorials (engineering only); suitability of handouts.
25 Student satisfaction: Opportunities for a stay abroad. Index made up of several items, including integration of the stay abroad into studies (no time loss caused by the stay abroad) and support in finding internships abroad.
26 Student satisfaction: Student services. Quality of a range of student services including general student information, accommodation services, financial services, career service, international office and student organizations/associations.
27 Student satisfaction: University webpage. Quality of information for students on the website. Index of several items including general information on institution and admissions, information about the program, information about classes/lectures, and English-language information (for international students in non-English-speaking countries).
One indicator dropped from the list during the stakeholder consultation is graduate earnings.
Although the indicator may reflect the extent to which employers value the institution's graduates,
Instead, the quality of the learning experience is reflected in the student satisfaction indicators included in Table 3-1. These indicators can be based on a student survey carried out among a sample of students from Business studies and Engineering.
Stakeholders' feedback on the student satisfaction indicators revealed an overall positive view of their relevance.
objective indicators are used in addition to the student satisfaction indicators. Most are similar to the indicators in the focused institutional rankings.
People, including students and researchers; artefacts, including equipment, protocols, rules and regulations; money. Texts are an obvious knowledge transfer channel.
and promote international mobility of students and staff; activities to develop and enhance international cooperation,
The increasing emphasis on the need to prepare students for international labor markets and to increase their international cultural awareness,
Nationality is not the most precise way of measuring international orientation. Nationality, employed by the institution or working on an exchange basis. 3 International doctorate graduation rate: the number of doctorate degrees awarded to students
but bias towards certain disciplines and languages. 5 Number of joint degree programs: the number of students in joint degree programs with a foreign university (including an integrated period at the foreign university) as a percentage of total
Field-based ranking (definition; comments). 6 Incoming and outgoing students: incoming exchange students as a percentage of the total number of students, and the number of students going abroad as a percentage of the total
number of students enrolled. An important indicator of the international 'atmosphere' of a faculty/department. Addresses student mobility and curriculum quality.
or in an international organization as a percentage of the total number of graduates employed. Indicates students' preparedness for the international labor market.
Stakeholders question its relevance. 10 Student satisfaction: Internationalization of programs. Index including the attractiveness of the university's exchange programs, the attractiveness of the partner universities, the sufficiency of the number of exchange places;
but no problems of disciplinary distortion because comparison is made within the field. 12 Percentage of international students: the number of degree-seeking students with a foreign diploma on entrance as a percentage of total enrolment in degree programs.
Reflects attractiveness to international students. Data available but sensitive to location (distance to border) of HEI.
Stakeholders consider the indicator important. 13 Student satisfaction: International orientation of programs. Rating including several issues:
existence of joint degree programs, inclusion of mandatory stays abroad, international students (degree and exchange), international background of staff
It should be pointed out here that one of the indicators is a student satisfaction indicator: 'Student satisfaction: Internationalisation of programs'. This describes the opportunities for students to go abroad. Students' judgments about the opportunities to arrange a semester
or an internship abroad are an aspect of the internationalization of programs. This indicator is relevant for the field level.
An indicator that was considered but dropped during the stakeholders' consultation process is 'Size of international office'.
learning and scholarship that engage faculty, students and region/community in mutually beneficial and respectful collaboration.
How much does the institution draw on regional resources (students, staff, funding) and how much does the region draw on the resources provided by the higher education and research institution (graduates and facilities)?
New type of indicator. 5 Student internships in local/regional enterprises: the number of student internships in regional enterprises as a percentage of total enrolment (with a defined minimum of weeks
Indicator hardly ever used. 9 Student internships in local/regional enterprises: number of internships of students in regional enterprises (as a percentage of total students). See above, institutional ranking,
but disciplinary bias not problematic at field level. 10 Summer school/courses for secondary education students: number of participants in schools/courses for secondary school students as a percentage of total enrolment
and from students. 4.2 Databases 4.2.1 Existing databases One of the activities in the U-Multirank project was to review existing rankings
US; AR, CA, SA, US. Regional engagement: income from regional sources: AU, CA, SA, ZA; student internships in local/regional enterprises: AU
three for the institutions and one for students. The four surveys are: the U-Map questionnaire; the institutional questionnaire; the field-based questionnaire; and the student survey.
students: numbers; modes of study and age; international students; students from region; graduates: by level of program;
subjects; orientation of degrees; graduates working in region; staff data: fte and headcount; international staff; income:
total income; income by type of activity; by source of income; expenditure: total expenditure; by cost centre;
students: enrolments; programme information: bachelor/master programmes offered; CPD courses; graduates: graduation rates; graduate employment;
students: total number (female/international); degree and exchange students; internships made; degree theses in cooperation with local enterprises;
regional engagement: continuing education programmes/professional development programmes; summer schools/courses for secondary education students; description:
accreditation of department; profile with regard to teaching & learning, profile with regard to research. A second part of the questionnaire asks for details of the individual study programmes to be included in the ranking.
number of students enrolled in the programme; number of study places and level of tuition fees;
number of exchange students from abroad; courses held in foreign language; special features; number of graduates;
4.3.2 Student survey For measuring student satisfaction (see section 3.3.1), the main instrument is an online student survey.
In order to ensure that students are not pressured by their institution/teachers to rate their own institution favorably,
the institutions were asked to invite their students individually to participate in the survey either by mail or email rather than having them complete the survey in the classroom.
The student questionnaire uses a combination of open questions and predefined answers and asks for the students'basic demographic data and information on their programme.
The main focus of the survey is on the assessment of the teaching and learning experience
Students were asked for information on how they received the invitation and whether there were any attempts by teachers,
As students were asked to rate their own institution and programme, students who had just started their degree programme were excluded from the sample.
Hence students from the second year onwards in bachelor and master programmes and from the third year onwards in long (pre-Bologna) programmes were meant to be included.
In order to have a sample size that allows for analysis, the survey aimed to include up to 500 students per institution and field.
4.3.3 Pretesting the instruments A first version of the three new data collection instruments (the institutional questionnaire,
department questionnaire and student questionnaire) was tested between June and September 2010. The U-Map questionnaire had already been tested.
and to distribute 20 student questionnaires. The selection was based on the list of institutions that had expressed their interest in participating in the project.
Questions about student numbers and study programmes seem to be unproblematic in most cases. Problems emerge, however, with some output-related data elements such as graduate employment,
This was the case for 'graduates working in the region' and 'student internships in regional enterprises'.
Information on international students and staff, as well as on programmes in a foreign language was largely available.
As expected, the question of how to define an 'international student' came up occasionally. In sum, the institutional questionnaire worked well in terms of its structure and usability.
The student survey was pretested on a sample of over 80 students. In general, their comments were very positive.
and captured relevant issues of the students' teaching and learning experience/environment. Some students would have preferred more questions about the social climate at the institution
and about the city or town in which it was situated; a number of reactions (also from pre-test institutions) indicated that the questionnaire should not be any lengthier, however.
A major challenge deduced from these comments is how to compare students' assessments of their institutions across cultures.
For the student questionnaire the conclusion was that there is no need for changes in the design.
one will have to rely to a large extent on data collected by means of questionnaires sent to representatives of institutions, their students and possibly their graduates.
institutions, representatives of departments in the institution and students. Sampling techniques (selecting/identifying institutions, departments/programmes,
their representatives and their students) are crucial, as is the intelligent use of technology (internet, visualisation techniques, supporting tools).
As has been explained, the field pilot study included a student satisfaction survey. Participating institutions were asked to send invitations to their bachelor
and master's students to take part in a survey. 106 departments agreed to do so. Some institutions decided to submit the information requested in the departmental questionnaire
students were on holiday or taking examinations during the pilot study survey window. In some cases the response rate was very low
770 students provided data via the online questionnaire. After data cleaning we were able to include 5,901 student responses in the analysis:
45% in business studies; 23% in mechanical engineering; and 32% in electrical engineering. 5.3 Data collection The data collection for the pilot study took place via two different processes:
Organising a survey among students on a global scale was one of the major challenges in U-Multirank.
but these usually focus on general aspects of student life and their socioeconomic situation. To the best of our knowledge there is no global survey asking students to assess aspects of their own institutions and programmes.
So we had no way of knowing whether students from different countries and cultures would assess their institutions in comparable ways.
In Chapter 8 (8.2) we will discuss the flexibility of our approach to a global-scale student survey.
They were asked to send invitation letters to their students, either by regular mail or by email.
We prepared a standard letter to students explaining the purpose of the survey/project and detailing the URL
No institution indicated that it did not participate in the student survey because of the cost of inviting the students.
Australia) the students were taking examinations or were on vacation at the time the survey started.
770 students participated in the survey; of this total, 5,901 could be included in the analysis. 5.3.1.2 Follow-up survey After the completion of the data collection process we asked those institutions that submitted data to share their experience of the process
Most pilot institutions reported no major problems with regard to student, graduate and staff data. If they had problems these were mostly with research and third mission data (knowledge transfer,
The student survey. For the student survey, after data checks we omitted the following groups from the gross student sample:
Missing data on the students' institution;
Missing data on their field of study (business studies, mechanical engineering, electrical engineering);
Students enrolled in programs other than bachelor/short national first degree programs and master/long national first degree programs;
Students who had spent too little time on the questionnaire and had not responded adequately (students had to answer at least part of the questions used to calculate indicators and give the necessary information about their institution);
Students who reported themselves as formally enrolled but not studying actively;
Students reporting that they had just moved to their current institution;
Students who obviously did not answer the questionnaire seriously.
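As an illustration of how the omission criteria above could be applied in practice, here is a minimal sketch using pandas; all column names are assumptions for the example, not the survey's actual variables:

```python
# Hypothetical sketch of the sample-cleaning steps listed above;
# column names are illustrative assumptions.
import pandas as pd

df = pd.DataFrame({
    "institution":    ["A", None, "B", "C"],
    "field":          ["business", "mech_eng", None, "elec_eng"],
    "program_level":  ["bachelor", "master", "bachelor", "phd"],
    "active_student": [True, True, True, False],
    "just_moved":     [False, False, False, False],
})

keep = (
    df["institution"].notna()                           # institution known
    & df["field"].notna()                               # field of study known
    & df["program_level"].isin(["bachelor", "master"])  # eligible program types
    & df["active_student"]                              # actively studying
    & ~df["just_moved"]                                 # not new to the institution
)
clean = df[keep]
print(len(clean))  # responses retained for analysis -> 1
```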
In addition we performed a recoding exercise for those students who reported their field of study as 'other'.
Based on their explanation and on the name of the programme they reported, the field was recoded manually in all cases where a clear attribution was possible.
As a result of these checks, the data of about 800 student questionnaires were omitted from the sample. 5.3.2 International databases The data collection regarding the bibliometric and patent indicators took place by studying the relevant international databases
the indicators that have been built using the information from departmental questionnaires and the indicators related to student satisfaction data. Table 6-2:
Rating; feasibility score (data availability, conceptual clarity, data consistency); recommendation. Student/staff ratio: A, a. Graduation rate: A, b. Qualification of academic staff
Teaching & Learning (student satisfaction scores). TEACHING & LEARNING: rating of indicators (pre-pilot) and feasibility score (post-pilot) for the field-based ranking and student survey; criteria: relevance, concept/construct validity, face validity, robustness, availability, preliminary
Feasibility score (data availability, conceptual clarity, data consistency); recommendation. External research income: A, a. Total publication output*: A, a. Student satisfaction:
publications*: A, a. Percentage of international staff: B, A. Percentage of students in international joint degree programs: A, b. International doctorate graduation rate: B, A. Percentage of foreign degree-seeking students: new indicator, B. Percentage of students coming in on exchanges: new indicator, A. Percentage of students sent out on exchanges: new indicator, A. *Data source:
There were some problems reported with availability of information on nationality of qualifying diploma and students in international joint degree programs.
Robustness; availability; preliminary rating; feasibility score (data availability, conceptual clarity, data consistency); recommendation. Percentage of international students: A, a. Incoming and outgoing students: A, a-B. Opportunities to study abroad (student satisfaction): A, b. International orientation of programs: A, b. International academic staff: B, A-B. International joint research publications
Not all institutions have clear data on outgoing students. In some cases only those students participating in institutional or broader formal programs (e.g.
ERASMUS) are registered and institutions do not record numbers of students with self-organized stays at foreign universities.
Availability of data was relatively low for the student satisfaction indicator, as only a few students had already participated in a stay abroad
and could assess the support provided by their university. The indicator 'international orientation of programs' is a composite indicator referring to several data elements;
working in the region: B, c, In. Research contracts with regional partners: B, b. Regional joint research publications*: B, A. Percentage of students in internships in local enterprises: B, c, In. *Data source:
education: B, c, Out. Student internships in local enterprises: B, b-C, In. Degree theses in cooperation with regional enterprises: B, b-C, In. Summer schools: C, C
While far from good, the data situation on student internships in local enterprises and degree theses in cooperation with local enterprises turned out to be less problematic in business studies than that found in the engineering field.
While a field period of four to six weeks after sending out invitations to students seems appropriate at individual institutions
whether the subjective evaluation of their own institution by students can be compared globally or whether there are differences in the levels of expectations or respondent behavior.
In our student questionnaire we used 'anchoring vignettes' to control for such effects. Anchoring vignettes is a technique designed to ameliorate problems that occur
(For a detailed description see Appendix 9.) Our general conclusion from the anchoring vignettes analysis was that no correlation could be found between the students' evaluation of the situation in their own institutions
This implies that the student assessments were not systematically influenced by differences in levels of expectation (related to different national backgrounds or cultures),
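To make the anchoring-vignettes idea concrete, here is a minimal sketch (not the project's actual analysis code, and with simulated ratings): every student rates both a fixed hypothetical scenario and their own institution on the same scale; self-ratings can then be recoded relative to the vignette, and the pilot's conclusion corresponds to finding no correlation between vignette ratings and self-ratings.

```python
# Illustrative anchoring-vignettes logic with simulated 1-5 ratings.
import numpy as np

rng = np.random.default_rng(0)
own = rng.integers(1, 6, size=200)       # rating of the student's own institution
vignette = rng.integers(1, 6, size=200)  # rating of the same fixed hypothetical case

# Nonparametric recoding: position each self-rating relative to the vignette
# (0 = below, 1 = equal, 2 = above), stripping out individual differences in
# how harshly the response scale is used.
relative = np.where(own < vignette, 0, np.where(own == vignette, 1, 2))
print("below/equal/above:", np.bincount(relative, minlength=3))

# Bias check: near-zero correlation between vignette and self-ratings suggests
# expectation levels do not systematically distort the assessments.
corr = np.corrcoef(own, vignette)[0, 1]
print(f"correlation own vs vignette: {corr:.3f}")
```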
While students can be asked about their learning experience in the same way across different fields
[Table: illustrative presentation of ranking results, sorted by indicator, for Institution 1 through Institution 9 across the dimensions Teaching & Learning, Research, Knowledge Transfer, International Orientation and Regional Engagement. Indicators shown: student-staff ratio; graduation rate; qualification of academic staff; research publication output; external research income; citation index; % income third-party funding; CPD courses offered; start-up firms; international academic staff; % international students; joint international publications; graduates working in the region; student internships in local enterprises; regional co-publications.]
Context factors that may affect decision-making processes of users of rankings (e.g. students, researchers) although not linked to the performance of institutions.
for prospective students intending to choose a university or a study program, low student satisfaction scores regarding the support by teaching staff in a specific university or program are relevant information,
although the indicator itself cannot explain the reasons behind this judgment. Rankings also have to be sensitive to context variables that may lead to methodological biases.
In addition, access to and navigation through the web tool will be made highly user-driven by specific 'entrances' for different groups of users (e.g. students, researchers/academic staff, institutional administrators, employers) offering specific information
In particular for 'lay users' (e.g. prospective students), the existence of various language versions of U-Multirank would increase usability.
which fields to add, it would make sense to focus on fields with significant numbers of students enrolled
or data related to students and graduates. EUMIDA could contribute to improving the data situation regarding employment-oriented outcome indicators.
Although mobility of students is increasing, the majority of students, in particular undergraduates, will continue to start higher education in their home country.
requiring the involvement of many key players (governments, the European Commission, higher education associations, employer organizations, student organizations).
i.e. student organizations and associations of higher education institutions, would be responsible for the operation of the transparency instrument.
Doubts about commitment to the social values of the European Higher Education Area (e.g. no free access for student users?).
Major cost factors are for instance the realisation of student and graduate surveys or the use of databases charged with license fees, e.g. bibliometric and patent data.
if universities have no e-mail addresses for their students, requiring students to be addressed by letter.
The frequency of the updating of ranking data. A multidimensional ranking with data from the institutions will not be updated every year;
For instance, the EC could take on the role of promoter of students' interests and could see the delivery of a web tool free of charge to students as its responsibility.
To ensure students' free access to U-Multirank data, the EC could also in the long run provide direct funding of user charges that would
otherwise have to be imposed upon the students. There are a number of opportunities to find funding sources for U-Multirank:
a) Basic funding by the governing institutions in the form of a lump sum. This is realistic for the government model (e.g. basic funding by the EU) and for the independent,
b) Funding/sponsorship from other national and international partners interested in the system. c) Charges from the ranking users (students, employers etc.).
To keep the web tool free of charge, especially for students, an equivalent to the charges could be paid by the EC.
Charges to the users of the U-Multirank web tool would seriously undermine the aim of creating more transparency in European higher education, for example by excluding students;
The EC could pay for student user charges. Project-based funding for special projects, for instance new methodological developments or rankings of a particular 'type' of institution, offers an interesting possibility with chances of cross-subsidization.
Athiyaman, A. (1997). Linking student satisfaction and service quality perceptions: the case of university education. European Journal of Marketing, 31(7), 528-540.
Brown, R. M. & Mazzarol, T. W. (2009). The importance of institutional image to student satisfaction and loyalty within higher education. Higher Education, 58(1), 81-95.
Determinants of student loyalty in higher education: a tested relationship approach in Latin America. Latin American Business Review, 10(1), 21-39.