Synopsis: Education:


JRC85353.pdf

Contents (excerpt):
5.1.1 Universities ranked in QS University ranking
5.1.2 Academic ranking of a Computer science Faculty
5.1.3 Employer Ranking of a Computer science Faculty
5.1.4 Citations Ranking of a Computer science Faculty
5.1.5 R&D Expenditures by ICT Firms
5.1.6 ICT FP7 Funding to Private Organisations
5.1.7 ICT FP7 Participations
5.1.8 ICT FP7 Funding to SMEs
8.1 QS WORLD UNIVERSITY RANKINGS by QS
8.2 ICT FP7 by EC DG Connect

EIPE ID card: München, Kreisfreie Stadt (excerpt)
Activity: R&D | Characteristic: Agglomeration
Name of Indicator                                   Indicator ID   Rank
Universities ranked in the QS University ranking    Agrd 1         32
Academic ranking of a Computer science faculty      Agrd 2         10
Employer ranking of a Computer science faculty      Agrd 3         11
Citations ranking of a Computer science faculty     Agrd 4         29
R&D …

EIPE ID card: Inner London East (excerpt)
Activity: R&D | Characteristic: Agglomeration
Name of Indicator                                   Indicator ID   Rank
Universities ranked in the QS University ranking    Agrd 1         18
Academic ranking of a Computer science faculty      Agrd 2         7
Employer ranking of a Computer science faculty      Agrd 3         3
Citations ranking of a Computer science faculty     Agrd 4         6
R&D …

EIPE ID card: Paris (excerpt)
Activity: R&D | Characteristic: Agglomeration
Name of Indicator                                   Indicator ID   Rank
Universities ranked in the QS University ranking    Agrd 1         37
Academic ranking of a Computer science faculty      Agrd 2         8
Employer ranking of a Computer science faculty      Agrd 3         8
Citations ranking of a Computer science faculty     Agrd 4         4
R&D expenditures by ICT …

The EIPE ID card (template, excerpt)
Activity: R&D | Characteristic: Agglomeration
Name of Indicator                                   Indicator ID   Nr
Universities ranked in the QS University ranking    Agrd 1         1
Academic ranking of a Computer science faculty      Agrd 2         2
Employer ranking of a Computer science faculty      Agrd 3         3
Citations ranking of a Computer science faculty     Agrd 4         4
R&D expenditures by ICT firms …

5.1.1 Universities ranked in QS University ranking

Table 13: Top ranking regions according to the Universities ranked in QS University ranking indicator
Rank  NUTS3 Code  Region name                   Indicator Value  EIPE Rank
1     UKL12       Gwynedd                       100              266
2     DE711       Darmstadt, Kreisfreie Stadt   83               7
3     DE125       Heidelberg, Stadtkreis        82               23
4     DE423       Potsdam, Kreisfreie Stadt     77               …
…     UKH12       Cambridgeshire CC             20               5

Indicator description
Indicator ID: Agrd 1
Name of indicator: Universities ranked in the QS University ranking
What does it measure? The number of universities in the QS university ranking based in a region
Unit of measurement: Region's share in the total number of EU ranked universities to the region's share in the EU population
Definition of ICT dimension: none
Unit of observation: NUTS 3
Source: QS WORLD UNIVERSITY RANKINGS by QS (see Section 8.1)
Reference year(s) considered: 2011
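Read literally, this unit of measurement is a population-normalised share, i.e. a location-quotient-style ratio. As an illustrative sketch (the symbols are introduced here, not in the source): with $U_r$ the number of QS-ranked universities in region $r$, $P_r$ its population, and the sums running over all EU regions,

$$\mathrm{Agrd1}_r = \frac{U_r / \sum_{s} U_s}{P_r / \sum_{s} P_s},$$

with the reported values apparently rescaled afterwards so that the top region (Gwynedd in Table 13) scores 100.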

Figure 22: Frequency of the Universities ranked in QS University ranking indicator values (histogram: frequency, 0 to 1500, against the number of universities ranked in QS, 0 to 100; 1248 of the 1303 regions fall in the lowest bin)

Table 14: Descriptive statistics of the Universities ranked in QS University ranking indicator
Number of observations: 1303 | Mean value: 1.16 | Standard deviation: 7.01 | Variance: 49.17
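As a consistency note, the variance in these descriptive-statistics tables is simply the squared standard deviation (7.01² ≈ 49.1, matching the reported 49.17 up to rounding). A minimal sketch of how such statistics are computed, with illustrative stand-in data rather than the actual EIPE values:

```python
import numpy as np

# Illustrative stand-in for the 1303 NUTS 3 indicator values; like the
# real distribution, almost all regions score 0 and a few score high.
values = np.zeros(1303)
values[:5] = [100, 83, 82, 77, 20]

mean = values.mean()
std = values.std()        # population standard deviation
variance = values.var()   # equals the squared standard deviation
print(f"n={values.size}, mean={mean:.2f}, std={std:.2f}, var={variance:.2f}")
assert np.isclose(variance, std**2)
```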

5.1.2 Academic ranking of a Computer science Faculty

Table 15: Top ranking regions according to the Academic Computer science faculty QS Ranking indicator
Rank  NUTS3 Code  Region name          Indicator Value  EIPE Rank
1     UKH12       Cambridgeshire CC    100              5
2     UKJ14       Oxfordshire          87               19
3     UKI22       Outer London-South   73               114
4     UKM25       Edinburgh, City of   …                …
…     UKE21       York                 27               63
29    ITD55       Bologna              27               76

Indicator description
Indicator ID: Agrd 2
Name of indicator: Academic ranking of a Computer science faculty
What does it measure? The performance of the Computer science faculty according to the academic ranking of QS
Unit of measurement: The highest rank of a Computer science faculty in the academic ranking
Definition of ICT dimension: Computer science faculty
Unit of observation: NUTS 3
Source: QS WORLD UNIVERSITY RANKINGS by QS (see Section 8.1)
Reference year(s) considered: 2011

Figure 23: Frequency of the Academic Computer science faculty QS Ranking indicator values (histogram: frequency, 0 to 1500, against the academic ranking of a Computer science faculty, 0 to 100; 1244 of the 1303 regions fall in the lowest bin)

Table 16: Descriptive statistics of the Academic Computer science faculty QS Ranking indicator
Number of observations: 1303 | Mean value: 1.38 | Standard deviation: 7.25 | Variance: 52.59

5.1.3 Employer Ranking of a Computer science Faculty

Table 17: Top ranking regions according to the Employer Computer science faculty QS Ranking indicator
Rank  NUTS3 Code  Region name          Indicator Value  EIPE Rank
1     UKH12       Cambridgeshire CC    100              5
2     UKJ14       Oxfordshire          95               19
3     UKI12       Inner London-East    68               2
4     UKI22       Outer London …       …                …
30    GR300       Attiki               28               49

Indicator description
Indicator ID: Agrd 3
Name of indicator: Employer ranking of a Computer science faculty
What does it measure? The performance of the Computer science faculty according to the employer ranking of QS
Unit of measurement: The highest rank of a Computer science faculty in the employer ranking
Definition of ICT dimension: Computer science faculty
Unit of observation: NUTS 3
Source: QS WORLD UNIVERSITY RANKINGS by QS (see Section 8.1)
Reference year(s) considered: 2011

Figure 24: Frequency of the Employer ranking of a Computer science faculty indicator values (histogram: frequency, 0 to 1500, against the employer ranking of a Computer science faculty, 0 to 100; 1244 of the 1303 regions fall in the lowest bin)

Table 18: Descriptive statistics of the Employer Computer science faculty QS Ranking indicator
Number of observations: 1303 | Mean value: 1.47 | Standard deviation: 7.63 | Variance: 58.27

5.1.4 Citations Ranking of a Computer science Faculty

Table 19: Top ranking regions according to the Citations Computer science faculty QS Ranking indicator
Rank  NUTS3 Code  Region name       Indicator Value  EIPE Rank
1     UKL12       Gwynedd           100              266
2     PL127       Miasto Warszawa   91               50
3     NL335       Groot-Rijnmond    77               72
4     FR101       Paris             75               3
…     …           Gent              37               94

Indicator description
Indicator ID: Agrd 4
Name of indicator: Citations ranking of a Computer science faculty
What does it measure? The performance of the Computer science faculty according to the citations ranking of QS
Unit of measurement: The highest rank of a Computer science faculty in the citations ranking
Definition of ICT dimension: Computer science faculty
Unit of observation: NUTS 3
Source: QS WORLD UNIVERSITY RANKINGS by QS (see Section 8.1)
Reference year(s) considered: 2011

Figure 25: Frequency of the Citations Computer science faculty QS Ranking indicator values (histogram: frequency, 0 to 1500, against the citations ranking of a Computer science faculty, 0 to 100; 1243 of the 1303 regions fall in the lowest bin)

Table 20: Descriptive statistics of the Citations Computer science faculty QS Ranking indicator
Number of observations: 1303 | Mean value: 1.94 | Standard deviation: 9.57 | Variance: 91.58

Summary of the R&D indicators (excerpt):

Name of indicator: (1) Universities ranked in the QS University ranking; (2) Academic ranking of a Computer science faculty; (3) Employer ranking of a Computer science faculty; (4) Citations ranking of a Computer science faculty; (5) R&D expenditures by ICT firms; (6) ICT FP7 funding …
What does it measure? (1) The number of universities in the QS university ranking; (2) the performance of the Computer science faculty according to the academic ranking of QS; (3) … according to the employer ranking of QS; (4) … according to the citations ranking of QS; (5) the average annual amount spent on R&D in the ICT sector; (6) the amount received for research in ICT R&D.
Unit of measurement: (1) Region's share in the total number of EU ranked universities to the region's share in the EU population; (2) the highest rank of a Computer science faculty in the academic ranking; (3) … in the employer ranking; (4) … in the citations ranking; (5) region's share in the R&D expenditures by ICT firms in the EU to the region's share in the EU population; (6) region's share in the total EU ICT FP7 funding to the region's share in the EU population.
Definition of ICT dimension: (1) None; (2)-(4) Computer science faculty; (5) based on NACE Rev. 2; (6) ICT areas of the FP7 programme.
Unit of observation: NUTS 3.
Source: (1)-(4) QS WORLD UNIVERSITY RANKINGS by QS (Section 8.1); (5) company-level information: Orbis by Bureau van Dijk (Section 8.7); (6) ICT FP7 by EC DG CONNECT (see Section 8.2).
Reference year(s): (1)-(4) 2011; (5) 2005-2011; (6) 2007-2011.
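Indicators (1), (5) and (6) above share the same population-normalised-share construction. A minimal sketch of that computation, assuming a simple per-region table; the column names are illustrative, not those of the EIPE dataset:

```python
import pandas as pd

def population_normalised_share(df: pd.DataFrame, value_col: str) -> pd.Series:
    """Region's share in the EU total of `value_col`, divided by the
    region's share in the EU population (the unit of measurement used
    for indicators (1), (5) and (6) above)."""
    value_share = df[value_col] / df[value_col].sum()
    pop_share = df["population"] / df["population"].sum()
    return value_share / pop_share

# Illustrative NUTS 3 rows (not real EIPE data)
regions = pd.DataFrame({
    "nuts3": ["UKL12", "DE711", "FR101"],
    "ranked_universities": [1, 2, 5],
    "population": [120_000, 160_000, 2_200_000],
})
regions["raw_indicator"] = population_normalised_share(regions, "ranked_universities")
print(regions)
```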

Data Sources

8.1 QS WORLD UNIVERSITY RANKINGS by QS

The Computer science and Electronic Faculties rankings originate from the QS WORLD UNIVERSITY RANKINGS, which was formed in 2008 to meet the increasing public interest in comparative data on universities and organisations, and the growing demand for institutions to develop deeper insight into their competitive environment. The QS WORLD UNIVERSITY RANKINGS currently considers over 2,000 universities in the world and evaluates over 700 of them, ranking the top 400. Like any ranking at the global level, it is constrained by the availability of data from every part of its scope.

To construct measures of faculty performance, QS uses its proprietary datasets, which allow it to drill down by subject area: the academic and employer reputation surveys and the Scopus data behind the Citations per Faculty indicator in the global rankings. These are combined to produce the results. In detail, each of the faculty ranking components can be described as follows:

The Academic reputation survey has been the centrepiece of the QS WORLD UNIVERSITY RANKINGS since their inception in 2004. In 2010, it drew upon over 15,000 respondents to compile the results. In the survey, respondents are asked to identify the countries, regions and faculty areas with which they are most familiar, and up to two narrower subject disciplines in which they consider themselves expert. For each of the (up to five) faculty areas they identify, respondents are asked to list up to ten domestic and thirty international institutions that they consider excellent for research in the given area.

The Employer reputation survey considers students' employability as a key factor in the evaluation of international universities and in 2010 drew on over 5,… The employer survey works on a similar basis to the academic one, only without the channelling into different faculty areas. Respondents are asked to list up to ten domestic and thirty international institutions they consider excellent for the recruitment of graduates. They are also asked to identify the disciplines from which they prefer to recruit.

Employers seeking graduates from any discipline are weighted at 0.1 and those from a parent category (i.e. …
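The weighting scheme is only partially visible in this excerpt: the any-discipline weight (0.1) is stated, but the parent-category weight is cut off. A minimal sketch under that caveat, with the missing weight left as an explicit parameter and all names illustrative rather than QS terminology:

```python
from collections import defaultdict

def weighted_mentions(responses, parent_category_weight):
    """Sum weighted institution mentions from employer survey responses.

    `responses` is a list of (institution, kind) pairs, where kind is
    'specific', 'parent_category' or 'any_discipline'.
    """
    weights = {
        "specific": 1.0,                            # assumed full weight (not stated in the source)
        "parent_category": parent_category_weight,  # value truncated in the source text
        "any_discipline": 0.1,                      # stated in the text
    }
    totals = defaultdict(float)
    for institution, kind in responses:
        totals[institution] += weights[kind]
    return dict(totals)

print(weighted_mentions(
    [("Uni A", "specific"), ("Uni A", "any_discipline"), ("Uni B", "parent_category")],
    parent_category_weight=0.5,  # illustrative value only
))
```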

Citations per Faculty takes into account the size of an institution while also capturing its penetration into the global research landscape. When aggregated, these totals per faculty and their associated citations provide an indicator of the volume and quality of output in the given discipline. For the aggregation, similarly to the approach used in the overall QS WORLD UNIVERSITY RANKINGS, a z-score is calculated for each indicator and the results are scaled between 0 and 100.
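How exactly QS implements this step is not spelled out here, but the described z-score-then-rescale operation is standard. A minimal sketch with illustrative values:

```python
import numpy as np

def zscore_scaled_0_100(values):
    """Standardise indicator values (z-score), then rescale the results
    to the 0-100 range, as described in the text above."""
    v = np.asarray(values, dtype=float)
    z = (v - v.mean()) / v.std()                    # z-score per institution
    return 100 * (z - z.min()) / (z.max() - z.min())  # scale between 0 and 100

# Illustrative raw scores for five hypothetical faculties
print(zscore_scaled_0_100([3.2, 4.1, 2.8, 5.0, 3.6]))
```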


JRC85356.pdf

Contents (excerpt):
5.1 QS WORLD UNIVERSITY RANKINGS by QS
5.2 FP7 database by EC DG Connect

Thus, R&D activity is an investment in knowledge accumulation and in the development of technologies (Parham, 2009).

In particular, the EIPE project builds up a measurement of ICT R&D activity by observing the actual presence of ICT technology producers (universities, companies, R&D facilities), their R&D expenditures and bibliometric data.

The EIPE project builds up this measurement by observing the level of agglomeration of technology producers (universities, companies, R&D facilities), R&D expenditures and bibliometric data.

o Computer science and engineering with respect to university faculties,
o Computer science with respect to scientific publications,
o ICT hardware and software with respect to R&D activity performed in R&D centres,

FP7 data on FP participation from EC DG Connect, REGPAT by OECD, QS WORLD UNIVERSITY RANKINGS by QS, Web of Science by Thomson Reuters, Design Activity Tool by IHS iSuppli, European …

The EIPE ID card (template, excerpt)
Activity: R&D | Characteristic: Agglomeration
Name of Indicator                                   Indicator ID   Nr
Universities ranked in the QS University ranking    Agrd 1         1
Academic ranking of a Computer science faculty      Agrd 2         2
Employer ranking of a Computer science faculty      Agrd 3         3
Citations ranking of a Computer science faculty     Agrd 4         4
R&D expenditures by ICT firms …

In particular, they acknowledge the importance given in EIPE to the presence and the quality of major knowledge production organisations, such as universities (and their computer science departments), private and public research centres (in particular those of multinational companies), and innovative SMEs …

Summary of the R&D indicators (excerpt):

Name of indicator: (1) Universities ranked in the QS University ranking; (2) Academic ranking of a Computer science faculty; (3) Employer ranking of a Computer science faculty; (4) Citations ranking of a Computer science faculty; (5) R&D expenditures by ICT firms; (6) FP7 funding …
What does it measure? (1) The number of universities in the QS university ranking; (2) the performance of the Computer science faculty according to the academic ranking of QS; (3) … according to the employer ranking of QS; (4) … according to the citations ranking of QS; (5) the average annual amount spent on R&D in the ICT sector; (6) the amount received for research in ICT R&D.
Unit of measurement: (1) Region's share in the total number of EU ranked universities to the region's share in the EU population; (2) the highest rank of a Computer science faculty in the academic ranking; (3) … in the employer ranking; (4) … in the citations ranking; (5) region's share in the R&D expenditures by ICT firms in the EU to the region's share in the EU population; (6) region's share in the total EU FP7 funding to the region's share in the EU population.
Definition of ICT dimension: (1) None; (2)-(4) Computer science faculty; (5) based on NACE Rev. 2 (see Table 1); (6) ICT areas of the FP7 programme (see Section 5.2).
Unit of observation: NUTS 3.
Source: (1)-(4) QS WORLD UNIVERSITY RANKINGS by QS (see Section 5.1); (5) company-level information: ORBIS by Bureau van Dijk (see Section 5.7); (6) FP7 database by EC DG Connect (see Section 5.2).
Reference year(s) considered: (1)-(4) 2011; (5) 2005-2011; (6) 2007-2011.

The performance of universities and computer science faculties across the world, as reported by the QS University ranking. For a detailed description of the data source, see Section 5.1. Information about the funding …

1. QS WORLD UNIVERSITY RANKINGS by QS,
2. FP7 database by EC DG Connect,
3. Bibliometrics: Web of Science by Thomson Reuters,
4. ICT R&D centres locations: …

each of the data sources is described.

5.1 QS WORLD UNIVERSITY RANKINGS by QS

The rankings of Universities and of Computer science and Electronic Faculties originate from the QS WORLD UNIVERSITY RANKINGS. It was formed in 2008 to meet the increasing public interest in comparative data on universities and organisations, and the growing demand for institutions to develop deeper insight into their competitive environment. The QS WORLD UNIVERSITY RANKINGS currently considers over 2,000 universities in the world and evaluates over 700 of them, ranking the top 400.

This list is used within the current project to build an indicator of the location of ranked universities in a region. In addition, because the QS ranking includes 52 subject disciplines, one of which is Computer science, additional faculty-level information is extracted for the purpose of the EIPE study. To construct measures of faculty performance, the EIPE study used QS proprietary datasets to investigate this subject area at three levels, namely the academic and employer reputation surveys and the Scopus data for the Citations per Faculty indicator.

In detail, each of the faculty ranking components can be described as follows: The Academic reputation survey has been the centrepiece of the QS WORLD UNIVERSITY RANKINGS since their inception in 2004. In 2010, it drew upon over 15,000 respondents to compile the results. In the survey, respondents are asked to identify the countries, regions and faculty areas with which they are most familiar, and up to two narrower subject disciplines in which they consider themselves expert. For each of the faculty areas they identify, respondents are asked to list up to ten domestic and thirty international institutions that they consider excellent for research in the given area. They are not able to select their own institution.

The Employer reputation survey considers students' employability as a key factor in the evaluation of international universities and in 2010 drew on over 5,… The employer survey works on a similar basis to the academic one, only without the channelling into different faculty areas. Respondents are asked to list the institutions they consider excellent for the recruitment of graduates. They are also asked to identify the disciplines from which they prefer to recruit. Employers seeking graduates from any discipline are weighted at 0.1 and those from a parent category (i.e. …

http://www.topuniversities.com (last accessed 01.02.2012)

Citations per faculty takes into account the size of an institution, and also observes its penetration into the global research landscape. When aggregated, these totals per faculty and their associated citations provide an indicator of the volume and quality of output in the given discipline.
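The normalisation described here, citation totals set against institution size, can be sketched as follows; the function and field names are illustrative, not QS's, and in the real ranking the inputs come from Scopus per discipline:

```python
def citations_per_faculty(total_citations: float, faculty_headcount: int) -> float:
    """Volume-and-quality proxy: discipline citations normalised by
    institution size, in the spirit of the indicator described above."""
    if faculty_headcount <= 0:
        raise ValueError("faculty_headcount must be positive")
    return total_citations / faculty_headcount

# Illustrative figures for one hypothetical Computer science faculty
print(citations_per_faculty(total_citations=12_400, faculty_headcount=310))
```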

In addition to the university ranking, it also offers the rankings described above by teaching subject, including Computer science. The main constraint is that it covers only a limited number of universities, which does not allow us to cover the entire population of European higher education institutions.

5.2 FP7 database by EC DG Connect

The Framework Programmes for Research and Technological Development, also called Framework Programmes or abbreviated FP1 through FP7, are funding programmes created by the European Union …



jrc88429 s3 05_2014_transnational_learning_mariussen_midtkandal_rakhmatullin_approved in pubsy.pdf

A Policymakers Guide to Transnational Learning in Smart Specialisation
Åge Mariussen*, Inger Midtkandal** and Ruslan Rakhmatullin**
* Botnia-Atlantica Institute, University of Vaasa (Finland), Umeå University (Sweden), Nordland Research Institute (Norway)
** European Commission, JRC-IPTS, Seville (Spain)
S3 Policy Brief Series No. 05/2014, January 2014

Transnational learning is a promising and potentially powerful tool. However, attempts to organize transnational learning may easily fail.

This brief explains how it can succeed, with reference to the six steps of the S3 Guide.

This typology is intended to illustrate how transnational learning may play different roles in S3 policymaking.

Keywords: Smart Specialisation, transnational learning, typology of RIS3, drivers of change, the six steps of the S3 Guide.

The views expressed are purely those of the authors and may not in any circumstances be regarded as stating an official position of the European Commission.

TRANSNATIONAL LEARNING AS A POWERFUL TOOL IN SMART SPECIALISATION

The point of departure for this policy brief is the opportunities and challenges in promoting transnational learning as a more forceful tool in Smart Specialisation. Core issues are 1…


The brief draws on Learning Transnational Learning, combined with experience from S3 peer review workshops throughout 2013,

Core components in these attempts at transnational learning so far have been Critical friends and peer review.

SMART SPECIALISATION IS NOT A FASHION

The concept of transnational learning is used in different ways. The result of this kind of learning is often the creation of what professors studying diffusion refer to as fads and fashions, where several countries and regions go in a similar direction. Fashions start with solutions (such as new public management forms of organization) which look for problems. Among industrial actors, this is called learning through monitoring. Industrial actors apply benchmarking and monitoring of what their competitors …

… and just skip any thought of adapting universities to regional development needs. These and other obstacles to triple helix connectivity might be frustrating, but if you are able to create new university-industry relations, or repair other flaws in your system of innovation, …

Using transnational learning to change your own region takes a coordinated effort, which involves several phases of analysis and planning.

THEN HOW CAN TRANSNATIONAL LEARNING BE EXPLOITED IN S3 POLICYMAKING?

The short answer is that it needs to be integrated into all parts of the S3 planning cycle.

The experimental S3 planning and policy-making cycle (table, recovered fragments)
Columns: Smart specialisation planning and policymaking | Objectives | Transnational learning opportunity | What can go wrong?
- Step 1 (analysis): transnational learning opportunity: learning from others in the analysis (an outsider perspective); what can go wrong: "we are perfect".
- Step 2, set-up of a sound and inclusive governance structure: objective: balancing inclusion and participation with a view of the entire region; … institutions in evaluation and exploitation of transnational learning; what can go wrong: transnational learning isolated in a project.
- … learning from others how to prioritize, evaluating good practices and selecting the best options for future development; what can go wrong: dominant regional actors take all the money for usual purposes.
- Step 5, establishment of suitable policy mixes: … policy mixes are determined by the model …

Failed attempts at transnational learning often derive from a series of missteps. The regional partnership may see the situation as perfect (step 1), …

Similarly, efficient use of transnational learning should, as explained above, include learning through monitoring in the analysis (step 1), involving the S3 regional leaders in transnational learning (step 2), and specifying a vision based on reflection upon the global position of your region (step 3), …

What is relevant in terms of transnational learning depends on your SWOT analysis where you have discovered your drivers of change.

… such as when the triple helix of universities and public policies does not support the industry and its suppliers sufficiently, …

the integration of transnational learning into this planning process may go like this (table, recovered fragments):
Driver of change: globalization is destroying our industrial base; our innovation system is inefficient; entrepreneurial discovery of own latent potentials; we should create a new paradigm of production.
Planning cycle with transnational learning: sharing visions with other regions; analysis: comparison with others, discovery of good practices; what is wrong with my framework conditions? …

… because your educational institutions and R&D framework are insufficient. If you have a strong regionally embedded knowledge base, …

… such as Education (lack of adaptation to globally competing existing industries), Infrastructure, and Labour market education. You should also consider to what extent the existing fragmentation of your triple helix is caused by the educational system, …

why is this university not generating more spin-offs? In seeking explanations for this question, you should look at spin-offs from other universities. Is there something wrong with the organization of the science park? Or perhaps there is a lack of connectivity between the universities and the industrial actors who have the potential to commercialize university-based innovations?

What about the absorptive capacity of your industries? Problems often have root causes, chains of causation …

A common problem is that the university institutions have strengths in terms of research which are not matched by regional industries.

… which are technologically related to the strengths of the University. With no industrial actors willing and able to industrialize university-based innovations, patents from your university are likely to be sold and industrialized in other regions. There might be ways around these structural problems, provided you discover and use appropriate policies. In this discovery process, there is a lot to learn from the history of other, more successful regions you identified in the first place.

Why are their industries better connected to regional universities? In many cases, what appears to be a structural problem today is the result of policy decisions made some years ago.

For instance, in comparing connectivity between universities and industries across the Nordic countries, we have found that an explanation for the excellent contacts between industries and universities in Finland lies in certain reforms of the Finnish school system, which created open doors from craft-based education to polytechnic university-level education. This is a good practice. The hypothesis is that it can be transferred to other regions in other countries through a reform of your existing educational institutions or through the setting up of new ones. This is a long-term strategy, and it should be implemented as such.

Another good practice is the frequent contact between some Norwegian universities and the tourism industry, where the universities try to support innovation in tourism through innovation programmes, in synergy with programmes encouraging tourist entrepreneurs to cooperate in promoting destinations. These are short-term strategies, …

The forms of transnational learning required in these strategies are sophisticated and include shared technology foresights

Table (recovered fragments): Driver of change | S3 focus | Characteristics of transnational learning
- Deindustrialization | Regional embedding of knowledge | GAP analysis, specific indicators, transfer of institutional solutions
- Innovation system … | … | either competitive learning through monitoring or deep and many-sided
- Transnational co-specialisation | Technology foresights, scenarios, shared strategy | Deep, many-sided

The different drivers are likely to be related to different …

Correspondingly, this raises different types of agenda for transnational learning. In some cases, such as transnational co-specialisation, transnational learning may be seen as many-sided and deep.

REFERENCES
Mariussen, Åge and Virkkala, Seija (2013) Learning Transnational Learning. Routledge Studies in Human Geography. Routledge.
Midtkandal, Inger and Rakhmatullin, Ruslan (2014) The S3 Platform Peer Review Methodology. JRC Technical Report.



