

WEF_GlobalInformationTechnology_Report_2014.pdf

The Global Information Technology Report 2014: Rewards and Risks of Big Data. Beñat Bilbao-Osorio, Soumitra Dutta,

which statistical data are maintained on a separate and independent basis.

Contents
Preface, Jennifer Blanke and Alan

1.4 Balancing the Risks and Rewards of Data-Driven Public Policy, Alex Pentland (MIT)
1.5 Managing the Risks and Rewards of Big Data, Matt Quinn and Chris Taylor (TIBCO)
1.6 Rebalancing Socioeconomic Asymmetry in a Data-Driven Economy, Peter Haynes (Atlantic Council) and M-H. Carolyn Nguyen (Microsoft)

1.8 Which Policies Will Lead to Leveraging Data-Driven Innovation's Potential? Pedro Less Andrade, Jess Hemerly, Gabriel Recalde, and Patrick Ryan (Google)

Data Tables
How to Read the Data Tables
Index of Data Tables
Data Tables
Technical Notes and Sources
About the Authors
Partner Institutes
Acknowledgments

The 13th edition of The Global Information Technology Report is released at a time

and variety of sources of the creation of new data. These essays also advise on the changes that organizations,

and obtain economic and social value from this vast quantity of newly generated data. In addition, the Report presents a wealth of data,

including detailed profiles for each economy covered and data tables with global rankings for the NRI's 54 indicators.

We would like to convey our sincere gratitude to the industry and academic organizations' experts who contributed outstanding chapters.

I believe we are currently experiencing the biggest fundamental change the world has seen since the initial development of the Internet as people, processes, data,

IoT and data analytics. The explosive expansion of IoT, or connections between context-aware machines and other physical objects, is changing how we utilize devices to improve our daily lives.

And the shift in data and analytics from being centralized, structured, and static to being distributed, mixed structured and unstructured,

and the rapid growth of IP-connected devices is driving exponential increases in data traffic. The migration to IP networks and the ability to turn big data into valuable

circuit switching to packet switching, fixed connectivity to mobile connectivity, dedicated resources to virtual ones, voice traffic to data and video traffic, PC connections to any-device connections,

such as analytic engines and cloud-based storage, have made it possible to gather these data in unprecedented amounts

They invest only in the data gathering that gives them privileged access to the customers they care about,

EXTRACTING VALUE FROM BIG DATA Data have always had strategic value, but with the magnitude of data available today and our capability to process them, they have become a new form of asset class.

In a very real sense, data are now the equivalent of oil or gold. And today we are seeing a data boom rivaling the Texas oil boom of the 20th century and the San Francisco gold rush of the 1800s.

It has spawned an entire support industry and has attracted a great deal of business press in recent years. This new asset class of big data is commonly described by what we call the three Vs.

Big data is high volume, high velocity, and includes a high variety of sources of information. Next to those traditional three Vs we could add a fourth:

Big data can take the form of structured data such as financial transactions or unstructured data such as photographs or blog posts.

or obtained from proprietary data sources. Big data has been fueled by both technological advances (such as the spread of radio-frequency identification, or RFID

and networks of social connections are now all data, and their scale is massive. What did we search for?

But succeeding with big data requires more than just data. Data-based value creation requires the identification of patterns from

Businesses need to decide which data to use. The data each business owns might be as different as the businesses themselves;

these data range from log files and GPS data to customer or machine-to-machine data.

Each business will need to select the data source it will use to create value. Moreover, creating this value will require the right way of dissecting

and then analyzing those data with the right analytics. It will require knowing how to separate valuable information from hype.

This world of big data has also become a source of concern. The consequences of big data for issues of privacy and other areas of society are not yet fully understood.

Data could become a new ideology. We are just at the beginning of a long journey where, with the proper principles and guidelines,

(5) rebalancing socioeconomic asymmetry in a data-driven economy; (6) the role of regulation and trust building in unlocking the value of big data; (

In addition, the Country/Economy Profile and Data Tables sections at the end of the Report present the detailed results for the 148 economies covered by the study

As exabytes of new data are created daily, a rising share of this data growth is flowing over IP networks as more people, places,

and things connect to the IoE. Proprietary networks are increasingly migrating to IP, facilitating the growth of big data,

and networks are fast becoming the key link among data generation, analysis, processing, and utilization.

The authors highlight four major trends driving data growth over IP networks and detail how networks are central to maximizing analytical value from the data deluge.

The chapter identifies critical technology and public policy challenges that could accelerate, or encumber, the full impact of big data

and the IoE, including standards and interoperability, privacy and security, spectrum and bandwidth constraints, cross-border data traffic, legacy regulatory models, reliability, scaling, and electrical power.

so that executives base their judgments on data rather than hunches. Research already indicates that companies that have managed this are more likely to be productive and profitable than their competition.

The ultimate maturity level involves transforming the business model to become data-driven, which requires significant investment over many years.

Policymakers should establish an environment that facilitates the business viability of the big data sector (such as data

and Rewards of Data-Driven Public Policy. Alex "Sandy" Pentland from the Massachusetts Institute of Technology (MIT) highlights in Chapter 1.4 that we are entering a big data world,

where governance is driven far more by data than it has been in the past. Basic to the success of a data-driven society is the protection of personal privacy and freedom.

Discussions at the World Economic Forum have made substantial contributions to altering the privacy and data ownership standards around the world in order to give individuals unprecedented control over data that are about them,

while at the same time providing for increased transparency and engagement in both the public and private spheres.

and in particular governments, may be tempted to abuse the power of the data that they hold. To address this concern we need to establish best practices that are in the interests of both large organizations and individuals.

1. Large data systems should store data in a distributed manner, separated by type (e.g., financial vs. health) and real-world categories (e.g.,

whose function is focused on those data, and with sharing permissions set and monitored by personnel from that department.

Best practice would have the custodians of data be regional and use heterogeneous computer systems. With such safeguards in place, it is difficult to attack many different types of data at once,

and it is more difficult to combine data types without authentic authorization. 2. Data sharing should always maintain provenance

and permissions associated with data and support automatic, tamper-proof auditing. Best practice would share only answers to questions about the data (e.g., by use of preprogrammed SQL queries known as Database Views) rather than the data themselves, whenever possible.
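A minimal sketch of this answer-only pattern, using Python's built-in sqlite3; the table, view, and column names here are invented for illustration and are not from the Report:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE transactions (user_id TEXT, region TEXT, amount REAL);
INSERT INTO transactions VALUES
  ('u1', 'north', 120.0), ('u2', 'north', 80.0), ('u3', 'south', 50.0);

-- Analysts query the view, never the raw table: it answers an
-- aggregate question without exposing individual records.
CREATE VIEW regional_spend AS
  SELECT region, COUNT(*) AS n_customers, SUM(amount) AS total
  FROM transactions GROUP BY region;
""")
for row in conn.execute("SELECT * FROM regional_spend ORDER BY region"):
    print(row)
```

If consumers are granted access only to the view, individual transaction rows never leave the custodian's system, which is the point of sharing answers rather than data.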

This allows improved internal compliance and auditing, and helps minimize the risk of unauthorized information leakage. 3. Systems controlled by partner organizations,

Otherwise data can be siphoned off at either the data source or the end consumer without the need to attack the central system directly. 4. The need for a secure data ecosystem extends to the private data of individuals and the proprietary data of partner companies.

As a consequence, best practice for data flows to and from individual citizens and businesses is to require them to have secure personal data stores

and be enrolled in a trust network data sharing agreement. 5. All entities should employ secure identity credentials at all times.

Best practice is to base these credentials on biometric signatures. 6. Create an open data commons that is available to partners under a lightweight legal agreement

Open data can generate great value by allowing third parties to improve services. Although these recommendations might at first glance seem cumbersome,

In many cases, the use of distributed data stores and management are already part of current practice,

Most importantly, these recommendations will result in a data ecosystem that is more secure and resilient, allowing us to safely reap the advantages of using big data to help set

and high-velocity data creates three key trends: Big data leverages previously untapped data sources to liberate information from places where it was hidden previously.

Big data management requires automation wherever possible, because volume and complexity eliminate the ability of humans to intervene

less fragile data systems because the sheer variety of structured and unstructured data breaks the old computational and transactional ways of writing logic.

and securing ever-larger amounts of data is much more complicated than the relatively simple problem of marshaling storage and computational resources.

Rebalancing Socioeconomic Asymmetry in a Data-Driven Economy Chapter 1. 6, contributed by Peter Haynes of the Atlantic Council

and M-H. Carolyn Nguyen at Microsoft, explains that an increasing amount of data is being generated by individuals who are handing potentially valuable information to commercial enterprises in exchange for free services.

or being recompensed for their data's monetary value, and with little or no control over its immediate or future use.

These socioeconomic asymmetries in the broad data ecosystem are a potential threat to the emerging data-driven economy

and analytics of data. The authors argue the need for a data ecosystem based on fair value exchange

and the ability of users to control the use of data related to them. The chapter also considers potential technology

and policy approaches by which this might be achieved, and presents the need for significant additional research and new thinking, in both technology and policy,

to enable a sustainable data-driven economy. Building Trust: The Role of Regulation in Unlocking the Value of Big Data. In Chapter 1.7, Scott Beardsley, Luís Enríquez, Ferry Grijpink, Sergio Sandoval, Steven Spittaels,

how to treat anonymous data, whether to allow the right to be forgotten, and the need to clarify the relevant jurisdictions and liabilities between parties.

Which Policies Will Lead to Leveraging Data-Driven Innovation's Potential? Chapter 1.8, contributed by Pedro Less Andrade, Jess Hemerly, Gabriel Recalde,

and Patrick Ryan at Google, focuses on the social and economic value of data, but from the point of view of use

As it has become axiomatic that more data are produced every year, commentators have been driven to call this revolution the age of big data.

the use of data to build successful products and services, optimize business processes, and make more efficient data-based decisions has already established a history.

the main features of big data (quantity, speed, variety) are technical properties that depend not on the data themselves but on the evolution of computing, storage,

This is why this chapter uses data-driven innovation to frame the discussion. High-value solutions that may not have quantifiable economic value are being developed using data,

and many sectors, from businesses to governments, benefit from data-driven innovation. Apart from producing

and using data for better policymaking processes, the public sector can also play its part in promoting

and fostering data-driven innovation and growth throughout economies by (1) making public data accessible through open data formats,

(2) promoting balanced legislation, and (3) supporting education that focuses on data science skills. Making Big Data Something More than the Next Big Thing. In Chapter 1.9, Anant Gupta, Chief Executive Officer at HCL Technologies Ltd, argues that big data analytics is not a passing fad.

COUNTRY/ECONOMY PROFILES AND DATA PRESENTATION Parts 2 and 3 of the Report feature comprehensive profiles for each of the 148 economies covered this year as well as data tables for each of the 54

Each part begins with a description of how to interpret the data provided. Technical notes and sources, included at the end of Part 3,

and information on the definitions and sources of specific quantitative non-Survey data variables included in the NRI computation this year.

As explained in more detail in Chapter 1.3, this new asset class of big data is commonly described by what we call the three Vs.

Google uses big data to predict the next wave of influenza. 2 IBM uses data to optimize traffic flow in the city of Stockholm,

Dr. Jeffrey Brenner, a physician in New Jersey, uses medical billing data to map out hot spots where you can find his city's most complex and costly healthcare cases as part of a program to lower healthcare


Chapter 1.7 provides guidelines for businesses to make this transition. To a large extent, mastering big data can also be compared to irrigation.

For many, data-driven has become the new management philosophy. The Economist Intelligence Unit released survey data showing that approximately two-thirds of executives feel that big data will help find new market opportunities

and make better decisions. 6 Nearly half of the surveyed respondents feel big data will increase competitiveness,


Although data are still scarce in terms of ICT impacts, policy interest in measuring ICTs has shifted from measuring ICT access to measuring ICT impacts.

At the moment, because of data limitations, this pillar focuses on measuring the extent to which governments are becoming more efficient in the use of ICTs

and the development of rigorous quantitative data to do so is still in its infancy. As a result, many of the dimensions where ICTs are producing important impacts especially

Therefore this subindex should be regarded as a work in progress that will evolve to accommodate new data on many of these dimensions as they become available.

COMPUTATION METHODOLOGY AND DATA In order to capture as comprehensively as possible all relevant dimensions of societies' networked readiness

the NRI 2014 is composed of a mixture of quantitative and survey data, as shown in Figure 3.

27, or 50 percent, are quantitative data collected primarily by international organizations such as the International Telecommunication Union (ITU), the World Bank,

and the United Nations. International sources ensure the validation and comparability of data across countries. The remaining 27 variables capture aspects that are more qualitative in nature or for

which internationally comparable quantitative data are not available for a large enough number of countries, but that nonetheless are crucial to fully measure national networked readiness.

These data come from the Executive Opinion Survey (the Survey), which the Forum administers annually to over 15,000 business leaders in all economies included in the Report.8 The Survey represents a unique source of insight into many critical aspects related to the enabling environment,

and data availability for indicators obtained from other sources, mostly international organizations. This year the Report includes 148 economies, four more than the 2013 edition.

because Survey data could not be collected this year. More details on variables included in the Index and their computation can be found in Appendix A

Figure 3: Breakdown of indicators used in the Networked Readiness Index 2014 by data source. Total: 54 indicators; 27 indicators (50%) from other sources and 27 indicators (50%) from the Executive Opinion Survey.


According to ITU, the increase in smartphone usage is leading to more handset data download because owners of smartphones are more likely to purchase goods,

data. Large amounts of data, often referred to as big data, are generated constantly in both structured and unstructured forms.

Thanks to advances in ICTs, the volume and velocity of generation of these data are unprecedented,

as is the capacity of organizations to capture and treat them, potentially generating great economic and social value.

treat, and interpret these data. This will frequently require new management philosophies and organizational structures capable of adapting

The numbering of the indicators matches the numbering of the data tables at the end of the Report.

Where data are missing for indicator 4.03 (i.e., Puerto Rico and Timor-Leste), the score on the affordability pillar,

How the Network Unleashes the Benefits of Big Data ROBERT PEPPER and JOHN GARRITY, Cisco Systems. Exabytes (10^18) of new data are created every single day.

First described by Clive Humby as the new oil, 1 this data growth is fueling knowledge economies,

But most of these data are unstructured and underutilized, flowing at a volume and velocity that is often too large and too fast to analyze.

If data do, in fact, comprise the new raw material of business, on par with economic inputs such as capital and labor, 2 then deriving insight

and analysis. A rising share of this data growth is flowing over IP networks as more people, places,

and fast becoming the key link among data generation, processing, analysis, and utilization. How can we effectively maximize value from this data explosion

and avoid the pitfall of diminishing marginal data value? This chapter details how IP networks underpin the IoE

and can accelerate big data's transformational impact on individuals, businesses, and governments around the world.

After first highlighting four major trends driving data growth over IP networks and detailing how networks are central to maximizing analytical value from the data deluge,

ACCELERATING DATA PRODUCTION AND DATA TRAFFIC Data growth is skyrocketing. Over 2.5 quintillion bytes of data are created each day;

90 percent of the world's stored data was created in the last two years alone. 3 To put this into context,

one hour of customer transaction data at Wal-Mart (2.5 petabytes) provides 167 times the amount of data housed by the Library of Congress.
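For scale, the comparison above implies a Library of Congress data volume of roughly 15 terabytes; this is a figure derived from the two numbers quoted, not one stated in the text:

```python
# Derive the implied Library of Congress size from the Wal-Mart
# comparison: 2.5 petabytes is said to be 167 times the LoC holdings.
walmart_pb = 2.5
loc_tb = walmart_pb * 1000 / 167   # 1 petabyte = 1,000 terabytes
print(round(loc_tb, 1))            # about 15 terabytes
```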

The research consultancy IDC estimates that the digital universe (all digital data created, replicated, or consumed) is growing by a factor of 30 from 2005 to 2020,

200 gigabytes for every person on earth. 4 Much of this data growth is traversing IP networks.

Mobile data traffic, however, is growing at an even faster pace: over the same period, mobile data will grow 13-fold,

with a CAGR of 66 percent, capturing a greater share of all data created and transmitted (Figure 1).
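The 13-fold figure and the 66 percent CAGR are mutually consistent if the forecast window is five years, as is usual for Cisco traffic forecasts (the window length is our assumption, not stated in this passage):

```python
# Check that a 66% compound annual growth rate over five years
# yields roughly 13-fold growth, as the text states.
cagr = 0.66
years = 5
multiple = (1 + cagr) ** years
print(round(multiple, 1))  # about 12.6, i.e. ~13-fold
```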

Despite the rapid growth in data production and transmission,

however, only a small fraction of all physical objects in the world are currently connected to IP networks.

Internet-enabled alarm clocks gather data on weather and traffic, combining that information with a user's schedule,

At an industrial level, applications using sensor technologies are capturing vast amounts of data to improve decision-making.

the data universe will continue to grow rapidly. The IoE will not only fuel the expansion of big data and data transmission,

but can also provide targeted, automatic, data-driven analysis for our day-to-day lives. CRITICAL DRIVERS OF DATA GROWTH In 1944

the first digital computer, the Colossus, was deployed in the United Kingdom to decipher codes during World War II. The Colossus was able to process data at 5,000 characters per second (25 Kb/s).7 Currently the world's fastest supercomputer,

the Milkyway-2, can process 54,902 × 10^12 operations per second (54,902 TFLOP/s).8 This intensive growth in data processing power continues today,

coupled with extensive growth in data production. This data growth also supports four major trends that lead to a rising share of data transmission over IP networks in the world of the IoE,

as described below. Internet protocol (IP) is becoming the common language for most data communication. Proprietary industrial networks are migrating to IP,

bringing previously isolated data onto public and managed IP networks. The Internet's history is built on the migration of proprietary networks to IP.

Proprietary data networks such as AppleTalk and IBM Systems Network Architecture (SNA) have migrated to IP over time,

Figure 1: Growth rates and rising share of mobile data. Sources: Cisco 2013b; EMC 2013; authors' calculations. (Indexed data traffic series, 2010 levels = 100; series shown: mobile data traffic, total data universe, total IP traffic.)

Voice over Internet Protocol (VoIP).

Each migration shifts a large amount of data production and transmission onto IP networks (see Box 1).

adding heavily to the endpoints collecting data and to the devices consuming information. Cisco's Visual Networking Index estimates that,

along with meta-information about the data itself (e.g., descriptive statistics, frequency distribution, dispersion, etc.). This digitization of information is leading to greater exchange of stored media and data over the Internet.

The introduction of Internet Protocol version 6 (IPv6) allows for trillions of trillions (10^38) of devices to connect to the Internet.
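The "trillions of trillions" figure follows from the 128-bit IPv6 address length; a quick check of the order of magnitude:

```python
import math

# IPv6 addresses are 128 bits wide, giving 2**128 possible addresses.
addresses = 2 ** 128
print(addresses)                          # a 39-digit number
print(math.floor(math.log10(addresses)))  # 38 -> on the order of 10^38
```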

Huge and growing data volume from industrial applications. Industrial applications of the Internet of Everything (IoE) generate immense data flows,

and the data flows that come with them. In the oil and gas industry, for example, data are utilized across the entire value chain, from exploration, production, refining,

and distribution to marketing and retail. Sensors and computing are used to capture and monitor seismic data,

borehole activity, environmental readings, weather, production utilization, storage capacity, spot pricing (trading), transportation, inventory levels, demand and forecasts,

and location data. In seismic exploration, the cost, size, and speed of data are all rising as exploration moves to 3D imaging.

Data capture amounted to around 300 megabytes per square kilometer in the 1990s. By 2006, data per square kilometer amounted to 25 gigabytes,

while today the amount per square kilometer is in the petabytes. 1 According to Chevron and industry-wide estimates,

a fully optimized digital oil field based on data utilization results in 8 percent higher production rates and 6 percent higher overall recovery. 2 In electric utility grids,

data utilization also improves efficiency. Current grids monitor data to control electricity flows (both to and from the grid) based on real-time demand,

thus improving generator efficiency and ensuring more-sustainable energy sources. Upgrading standard electric meters to smart meters allows information to be communicated over a network back to a control center

and increases the amount of data captured. While traditional meters are read once a month,

some smart meters can report usage rates in 15-minute intervals. For every million meters, this leads to 96 million measurements per day, an estimated 3,000-fold increase in data collection. 3 Conservative estimates of the total amount of data

that will be generated by smart meters by 2019 in the United States alone (assuming only two readings per day,
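The per-day figures quoted above can be reproduced directly from the stated intervals (15-minute reporting, one million meters, a once-a-month manual baseline):

```python
# Reproduce the smart-meter arithmetic: 15-minute reporting vs. a
# traditional once-a-month reading, for one million meters.
meters = 1_000_000
readings_per_day = 24 * 60 // 15             # 96 readings per meter per day
daily_measurements = meters * readings_per_day
print(daily_measurements)                    # 96,000,000 per day

# One manual reading per ~30-day month is 1/30 of a reading per day,
# so the increase is 96 * 30 = 2,880, in line with the ~3,000-fold estimate.
fold_increase = readings_per_day * 30
print(fold_increase)
```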

sensors on General Electric (GE) jet plane turbines illustrate the vast amount of data generated daily.

GE estimates that each sensor on a GE turbine generates approximately 500 gigabytes of data every day.

This aggregates to petabytes of data daily.5 Notes: 1. Beals 2013; see also note 4 at the end of this chapter. 2. Leber 2012. 3. IBM Software 2012. 4. Danahy 2009;
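The petabyte-scale aggregation mentioned above implies sensor counts in the thousands; the count below is derived for illustration, since the text gives only the per-sensor rate and the aggregate scale:

```python
# Back-of-envelope: with each sensor producing 500 GB/day, how many
# sensors does it take to reach one petabyte per day?
gb_per_sensor_per_day = 500
gb_per_petabyte = 1_000_000               # 1 PB = 1,000,000 GB
sensors_per_petabyte_per_day = gb_per_petabyte / gb_per_sensor_per_day
print(sensors_per_petabyte_per_day)       # 2,000 sensors per petabyte/day
```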

THE GAP BETWEEN DATA GROWTH AND DATA VALUE Current estimates suggest that only half a percent of all data is being analyzed for insights;

13 furthermore, the vast majority of existing data are unstructured and machine-generated. 14 Applying analytics to a greater share of all data can lead to productivity increases, economic growth,

and societal development through the creation of actionable insights. Data alone are not very interesting or useful.

It is when data can be used and become actionable that they can change processes and have direct positive impact on people's lives.

The Ioe generates data, and adding analysis and analytics turns those data into actionable information.

Building on the framework of the knowledge hierarchy,15 aggregated data become information that, when analyzed, becomes knowledge.

Knowledge can lead to insights and informed decision-making, which at the highest level is wisdom (Figure 2).
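As a toy illustration of this hierarchy (the sensor readings and threshold below are invented for illustration, not taken from the Report):

```python
# Data -> information -> knowledge -> informed decision, in miniature.
readings = [21.0, 22.5, 30.1, 29.8, 30.4]           # data: individual points
average = sum(readings) / len(readings)             # information: a metric
overheating = average > 25.0                        # knowledge: interpreted signal
action = "throttle" if overheating else "continue"  # informed decision
print(round(average, 2), action)
```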

For example, society at large can benefit from tracking trends observed from metadata such as anonymized mobile phone data used to track population migration after the earthquake and cholera outbreaks in Port-au-Prince,

Haiti. 16 Likewise, analyzing social media discussions can identify crises or flu outbreaks. At an industrial level, big data analysis can yield very large benefits.

For example, the value of modernizing the US electricity grid to be data-driven is estimated at US$210 billion.

A reconstituted electricity grid would be based on an architecture driven by technology selections to fully harness the convergence of data

controls and transactions.17 According to Bradley et al. in a recent Cisco White Paper, harvesting data for critical decision-making through the IoE can create approximately US$14.4 trillion of added value in the commercial sector over the next

however, may require concurrent investment in resources to manage the rise in data. It is forecasted that by 2020,

Figure 2: Turning data into insight. Sources: Ackoff 1989; authors' interpretation. (Pyramid, top to bottom: insight (wisdom), knowledge, information, data; annotated with process optimization, decision-making, metrics and scorecards, and individual data points.)

EQUIPPING IP NETWORKS TO DELIVER BIG DATA INSIGHTS Moving up the knowledge pyramid from data to insights

and informed decisions is a critical challenge facing businesses and governments. Equipping IP networks to better transmit data to processing centers as well as enabling the network to create,

analyze, and act on data insights is one comprehensive approach. Building this capability will require improving network infrastructure, building analytical capabilities and intelligence into the network,

and distributing computing and analytical capabilities throughout the network, particularly at the edge. Specifically, these are:

including unintelligent ones (those that are capable only of transmitting data, not receiving them); securing infrastructure;

This will require building in the ability to compute data in motion and host partner applications in an ecosystem where applications can be built to analyze data inflow, particularly enabling machine-to-machine (M2M) services.

Distributing computing and storage. Efficient distribution will require moving from analyzing data only in the data center to adding processing at the edge (or near the edge) of the network,

to prevent delays in processing caused by latency as well as delays caused by network congestion. TECHNICAL AND POLICY CHALLENGES Building a network that will maximize the impact of big data requires powerful and seamless interactions among sensors

Privacy issues arise with the growth of data, particularly with regard to data generated by or about individuals.

Policymakers must identify the appropriate balance between protecting the privacy of individuals'data and allowing for innovation in service delivery and product development.

Figure 3: Policy and technical issues facing big data and the IoE. Source: Authors. (Issues shown: standards & interoperability; privacy & security; spectrum & bandwidth constraints; reliability; scaling; electrical power; cross-border data traffic; legacy regulatory models.)

New technologies and services, such as location-based services, are bringing these privacy issues to the forefront, offering users enhanced experiences while raising concerns of identity protection.

Some policies such as transparency in the use of data and effective mechanisms for consumer control of personal data can help in this regard.

and access by unauthorized and unwanted users to large databases and data flows. In order to ensure a healthy ecosystem where users, consumers,

Over the next five years, the growth of mobile data traffic will require greater radio spectrum to enable wireless M2m,

Bandwidth constraints will also be an obstacle in transmitting data over existing networks. The examples cited in Box 1 reflect the volume of data being generated by proprietary networks

resulting in the need to move computing close to the network edge in a distributed intelligence architecture.

Data loads will be lumpy across various applications of the Ioe, and matching bandwidth needs to bandwidth availability will be a continuous challenge.

Any interruption to the transmission of data over networks negatively impacts these processes. The technological limits of electrical efficiency, computer memory, and processing power already constrain computing and data analysis.

IoE applications that collect and handle data across sovereign jurisdictions could be negatively affected by policies restricting cross-border data traffic and global trade in IoE-related services.

Emerging cross-border issues include national data protection rules and data transfers, data portability and interoperability standards,

humans have been processing data. We have been our own primary data machines. But today, with the advent of vast arrays of computing power, we increasingly rely on data processed by others,

and the Ioe and the era of big data are transforming our lives. Data flows and the ability to capture value from data are changing industries,

creating new opportunities while disrupting others. For example, the app economy (the business created by software applications running on smartphones) has created hundreds of thousands of jobs.21 One recent study estimates that the marginal impact of data utilization in the IoE could raise US gross domestic product

by 2 to 2.5 percent by 2025.22 The IoE, where more data are being captured by more devices,

interacting with more people and changing the processes by which we live, learn, work, and play, is having a profound impact on the world.

But the value derived from the IoE can be increased measurably if IP networks are able to facilitate the rise of big data

From Data to Wisdom. Journal of Applied Systems Analysis 16: 3-9.
Beals, B. 2013. The Big Deal about Big Data in Oil and Gas.
Improved Response to Disasters and Outbreaks by Tracking Population Movements with Mobile Phone Network Data: A Post-Earthquake Geospatial Study in Haiti.
The Coming Smart Grid Data Surge. October 5. Available at http://www.smartgridnews.com/artman/publish/News blogs news/The-Coming-Smart-Grid-Data-Surge-1247.html.
De Martini, P. and L. von Prellwitz. 2011. Gridonomics: An Introduction to the Factors Shaping Electric Industry Transformation. Cisco White Paper. Available at http://www.cisco.com/web/strategy/docs/energy/gridonomics white paper.pdf.
The Economist. 2010. Data, Data Everywhere. Managing Information, Special Report, February 25. Available at http://www.economist.com/node/15557443.
Smart Grid Data About to Swamp Utilities. October 12. Gigaom. Available at http://gigaom.com/2009/10/12/smart-grid-data-about-to-swamp-utilities/.
Gantz, J. and D. Reinsel. 2012. The Digital Universe in 2020: Big Data, Bigger Digital Shadows,
Data Is the New Oil. Blog post, November 3. Available at http://ana.blogs.com/maestros/2006/11/data is the new.html.
Taft, J., with P. De Martini and L. von Prellwitz. 2012. Utility Data Management and Intelligence. Cisco. Available at http://www.cisco.com/web/strategy/docs/energy/managing utility data intelligence.pdf.
Top500.org. 2013.

of structured and unstructured data generated by individuals, enterprises, and public organizations is multiplying exponentially;

90 percent of the total data stored today is less than two years old.1 So-called big data has the potential to improve

and analyze the deluge of data that threatens to drown companies. Although this technology is indeed necessary

so that senior executives make more judgments based on clear data insights rather than on intuition. They must build the necessary internal capabilities,

and human resources to interpret data in an astute manner. Moreover, because they rely on governments to provide the requisite environment,

and determine what they need to do to extract greater business and organizational benefits from the vast volume of data.

and improve their data-driven decision-making. It is characterized by what are known as the three Vs: large data volumes, from a variety of sources, at high velocity (i.e.,

real-time data capture, storage, and analysis). Besides structured data (such as customer or financial records), which are typically kept in organizations' data warehouses, big data builds on unstructured data from sources such as social media, text and video messages,

and technical sensors (such as global positioning system, or GPS, devices), often originating from outside the organization itself.

The authors wish to thank Dr. Andreas Deckert for his contribution to this chapter.

The magnitude and complexity of data being produced far exceed the typical capacities of traditional databases and data warehouses for the purposes of storing,

processing, analyzing, and deriving insights. Usage statistics emanating from social media sites illustrate the sheer volume of unstructured data.

For example, in 2012 Facebook reported that it was processing around 2.5 billion new pieces of content daily.2 Big data has the potential to infuse executive decisions with an unprecedented level of data-driven insights.

However, research indicates that many organizations are struggling to cope with the challenges of big data. For example

in 2012 the Aberdeen Group found that the proportion of executives who reported that their companies were unable to use unstructured data,

and who complained that the volume of data was growing too rapidly to manage, had increased by up to 25 percent during the previous year.3

EVOLUTION, NOT REVOLUTION
Despite the rapid growth of big data,

which ever-more-elaborate data have influenced decision-making. From organizations' first attempts at data analytics in the 1960s and 1970s,

this journey has proceeded through various stages, described by buzzwords such as data mining and business intelligence, all of which sought to transform raw data into meaningful information for business purposes (Figure 1).

However, the essential principles for exploiting its commercial benefit remain exactly the same as they were in previous moves toward increased data-driven decision-making.

Executives must harness this recent data explosion by focusing on carefully formulating the business questions that enable the swift and accurate identification of those nuggets of data that they believe can improve their organization's performance or allow them to gain access to new revenue pools.

Figure 1: Evolution of data-driven decision-making, 1970 to now and the future, plotting degree of sophistication against the volume/complexity of data. Techniques range from linear programming, operations research, credit scoring, standard reporting, management information systems/dashboards, decision support systems, and expert systems through data marts, data warehouses, data clusters, knowledge discovery, risk modeling, customer relationship management, and web analytics to cloud storage, crowdsourcing, the Internet of Things, neural networks, sentiment analysis, image analysis, web crawling, natural language processing, data visualization, Monte Carlo simulations, operational intelligence, and Industry 4.0. Source: Booz & Company.

The authors interviewed executives in 330 publicly traded companies in the United States. They then examined relevant performance data,

The more companies characterized themselves as data-driven, the better they performed on objective measures of financial and operational results.

In particular, companies in the top third of their industry in the use of data-driven decision-making were, on average, 5 percent more productive and 6 percent more profitable than their competitors.4 Despite these findings,

and deemed least valuable in human resources management.8

How big data is used
The big data maturity stages (Figure 2) depict the various ways in which data can be used,

This stage typically relies on internal data, with an organization establishing key performance indicators (KPIs) to evaluate its success at achieving stated goals.

functional area excellence, organizations start to experiment with internal and external data to improve selected facets of their business.

For example, one retailer analyzed data on the past purchasing behavior of individual customers in conjunction with the company's most recent sales to predict

In the public sector, a Canadian hospital observed previously unseen patterns in streaming data from monitoring of newborns,

In many instances this involves obtaining data from external sources and

Data-rich organizations, such as retailers or telecommunications companies, are better equipped than others to utilize their internally generated data in this way.

For instance, a global mass merchant was able to increase its profit per customer by 37 percent by applying advanced customer analytics,

These prices are based on in-depth choice modeling, fed with data from parking sensors, surveys, weather forecasts, information about holidays, local business activities,

Figure 2: The big data maturity stages, from experimenting/selective adoption to large-scale implementation, including value proposition enhancement (targeted advertising and customized recommendations in real time; preventive health monitoring and disease detection; data monetization; online telematics services; personalization of the customer experience and products) and Stage 4, business model transformation (selling data to open new revenue pools; data-centric business models, e.g., web search and web advertising; quantitative management of investment funds; crowdsourcing to augment internal data).

of parking space at all times.

data-driven business that aims to personalize ads. The ultimate goal is to deliver the right message to the right person at the right time.

This new ad tech world will be dominated by those major players that possess the most comprehensive data about consumers

and equipment will soon be loaded with sensors, making in-depth status data available both in real time and across longer time spans.

GE is investing more than US$1 billion in building up its data science capabilities to provide data

and analytics services across business functions and geographies. 11 Another showcase for the transformative potential of big data comes from the public sector. Regional and national-level policymakers around the world are launching open data initiatives,

making data available to the public via integrated web portals and automated interfaces. Recent examples involve the United Kingdom

the release of public data is an important environmental factor enabling organizations to use big data,

A data-driven business model has been integral to companies such as Google, Facebook, and Twitter, which have burst onto the scene in recent years

Obstacles to progress
Despite widespread interest in data-driven decision-making in one form or another, companies face many potential pitfalls in extracting the maximum commercial benefit from big data usage.

The most prominent obstacle is the shortage of available talent specializing in data analytics: data scientists with an advanced education in mathematics

Many organizations also suffer from poor-quality data that are fragmented across various systems, geographies, and functional silos.

Internal data have to be of high quality (consistent, accurate, and complete) and available across the organization.

executive instinct is challenged by the facts of hard data. However, while data can be of great assistance in solving an actual problem

it still holds true that senior management has to ask the right questions. Many of the external challenges that companies face revolve around data privacy considerations.

By building up these capabilities and integrating them effectively, organizations move further along the path of data-driven decision-making and position themselves to extract greater benefits from big data.

formulate a vision for the usage of data consistent with the public interest, fostering a common understanding with citizens and obtaining their buy-in;

enable a big data ecosystem by establishing policies to facilitate valid business models for third-party data, service,

Figure: The big data maturity framework. Enablers of environment readiness: a regulatory framework for data privacy, data availability and governance, ICT infrastructure, a big data ecosystem, and public perception and awareness. Success factors for internal capabilities: technical capabilities/infrastructure, sponsorship, organizational capabilities and resources, and a data-driven decision-making culture. The maturity stages in the usage of big data run from traditional applications (getting more out of data you already have: What can we read from the data? What can we learn from the data to become better?) to the new horizons of big data, up to business model transformation (How can we make data a value driver of our business? How can we use data to fundamentally reinvent our business?).

Priorities for policymakers will vary in different parts of the world.

Developing countries, for example, will concentrate on building up the required ICT infrastructure and education programs to prepare for large-scale demand from organizations intent on using big data.

and which data are forbidden explicitly by privacy regulations. If the scope of permissible data is to expand,

skeptical citizens must first be persuaded that big data will work in their favor by paving the way for better products and services.

The prevailing patchwork situation accentuates the lack of clarity on lawful data usage especially the question

if data are owned by a company in the European Union, but hosted on servers in the United States,

when an organization plans to outsource data operations to a foreign provider, yet some personal data are prohibited from being transferred out of the country concerned.

They formulate basic principles around the limitation of collection of personal data, the specification of the purpose of data collection, the protection of collected data, the prevention of data loss or unauthorized access,

and the right of individuals to obtain information about collected data. The guidelines have in the past influenced national legislation,

In less-advanced sectors, with executives still grappling with existing data, making intelligent use of

prove the value of data in pilot schemes; identify the owner for big data in the organization and formally establish a Chief Data Scientist position (where applicable); recruit/train talent to ask the right questions and tools to allow data scientists to answer those questions; position big data as an integral element of the operating model; and establish a data-driven decision culture and launch a communication campaign around it.

Quick wins
Organizations should resist expensive upfront infrastructure investments for overly ambitious big data projects.

Seeking out proprietary data that can be exploited immediately for commercial gain may provide

Help from outside External data providers can offer all types of data to organizations and can therefore complement existing data-gathering efforts.

Typical datasets offered by external providers include contact, lifestyle, and demographic information on (market segments of) individuals.

In addition to sourcing data from outside the organization, the selective use of external analytics service providers can also prove instrumental in establishing big data maturity quickly,

Decision makers already acknowledge the future influence of data-driven decision-making. However, organizations confront vast differences in their ability to utilize big data to good effect,

They will have to predict what the world of data-driven insights will look like in the medium term,

the initiative is available at http://data.gov.uk/; in New York City it is available at https://data.cityofnewyork.us/.

13 Polonetsky and Tene 2013. 14 OECD 2013.

REFERENCES
Aberdeen Group. 2013. Big Data Trends in 2013, February 1.

How Big Is Facebook's Data? 2.5 Billion Pieces of Content and 500+ Terabytes Ingested Every Day. Available at http://techcrunch.com/2012/08/22/how-big-is-facebooks-data-2-5-billion-pieces-ofcontent

The Evolving Role of Data in Decision-Making. Available at http://www.economistinsights.com/analysis/evolving-role-data-decision-making.

Gartner. 2013. Survey Analysis: Big Data Adoption in 2013 Shows Substance Behind the Hype. Available at http://www.gartner.com/id=2589121.

Balancing the Risks and Rewards of Data-Driven Public Policy
ALEX PENTLAND, MIT

In June 2013, massive US surveillance of phone records and Internet data was revealed by former National Security Agency (NSA) contractor Edward Snowden,

Data about human behavior, such as census data, have always been essential for both government and industry to function.

however, a new methodology for collecting data about human behavior has emerged. By analyzing patterns within the digital breadcrumbs that we all leave behind us as we move through the world (call records, credit card transactions,

The risk of deploying this sort of data-driven policy and regulation comes from the danger of putting so much personal data into the hands of either companies or governments.

The three main divisions within the spectrum of data control are: (1) data commons, which are available to all, with at most minor limitations on use; (2) personal or proprietary data, which are typically controlled by individuals or companies, and for which legal and technology infrastructure must provide strict control and auditing of use; and (3) the secret data of governments, which typically have less direct public oversight and more stringent controls.

The issues of data commons will be addressed first, followed by concerns about personal and proprietary data,

and, finally, issues of secret government data. The preferred lens for examining these issues is experimentation in the real world rather than arguments from theory or first principles,

because using massive, live data to design institutions and policies is outside of our traditional way of managing things.

In this new digital era we cannot rely only on existing policy tradition, or even laboratory science,

because the strengths and weaknesses of big data analysis are very different from those obtained through standard information sources.

Data commons The first entry in the data taxonomy is the data commons. A key insight is that our data are worth more

when shared because they can inform improvements in systems such as public health, transportation, and government. Using a digital data commons can potentially give us unprecedented ability to measure how our policies are performing so we can know

We already have many data commons available: maps, census data, and financial indices, for example. With the advent of big data, we can potentially develop many more types of data commons;

these commons can be both accessible in real time and far more detailed than previous, hand-built data commons (e.g., census data). This is because the new digital commons depend mostly on data that are already produced as a side effect of ongoing daily life (e.g.,

digital transaction records, cell phone location fixes, road toll records) and because they can be produced automatically by computers without human intervention.

One major concern with these new data commons is that they can endanger personal privacy.

Another, secondary, concern involves the tension between proprietary interests, both commercial and personal, and the goal of putting data in the commons.

Acceding to these proprietary interests might tend to reduce the richness of such a commons,

which would diminish the ability of such a data commons to enable significant public goods. To explore the viability of a big data commons,

what is perhaps the world's first true big data commons was unveiled on May 1, 2013. In this Data for Development (D4D) initiative, 90 research organizations from around the world reported hundreds of results from their analysis of data describing the mobility

and call patterns of the citizens of the entire African country of Côte d'Ivoire.1 The data were donated by the mobile carrier Orange, with help from the University of Louvain (Belgium) and the MIT Human Dynamics Laboratory (United States),

along with collaboration from Bouake University (Côte d'Ivoire), the United Nations' Global Pulse, the World Economic Forum,

The research projects conducted by the 90 participating organizations explored the use of this data commons,

An example of using the D4D data to improve social equality was highlighted by work done by researchers at University College London,

Another example of using the D4D data to enhance social equality is the mapping of ethnic boundaries by researchers from the University of California,

The D4D data were also utilized to understand and promote operational efficiency through an analysis of Côte d'Ivoire's public transportation system by IBM's Dublin laboratory.

Finally, examples of using D4D data to improve social resiliency include analysis of disease spread by groups from Novi Sad University (Serbia), École Polytechnique Fédérale de Lausanne (EPFL, Switzerland

Box 1: The future of big data and governance

The Data for Development (D4D) data commons is only a small first step toward improving governance by using big data.

because our current understanding of policy and human society is limited, based as it is on very limited data resources. Currently, most social science is based either on analysis of laboratory experiments or on survey data.

These approaches miss the critical fact that it is the details of which people you interact with,

The horizontal axis presents the duration of the data collection; the vertical axis shows the richness of the information collected.

almost all data from traditional social science (labeled 1 in the figure) are near the (0, 0) origin,

Recently, data scientists have developed living lab technologies for harvesting digital breadcrumbs, and are now obtaining much richer descriptions of human behavior.

or electronic name badges (sociometers) to collect data.2 The point labeled 9 is the D4D dataset that covers the entire country of Côte d'Ivoire.3 Just a brief

dense data that allow us to build quantitative, predictive models of human behavior in complex, everyday situations.

In just a few short years we are likely to have available incredibly rich data about the behavior of virtually all of humanity on a continuous basis. The data mostly already exist in cell phone networks

These selected results are just a small sample of the impressive work that is made possible by this rich and unique data commons.

From the point of view of Orange, it also demonstrates the potential for new lines of business that combine this data commons with customers' personal data:

The work of these 90 research groups also suggests that many of the privacy fears associated with the release of data about human behavior may be generally misunderstood.

In this data commons, the data were processed by advanced computer algorithms (e.g., sophisticated sampling and the use of aggregated indicators)

although the data were freely available for any legitimate research in which a group was interested,

the data were distributed under a legal contract that specified that they could be used only for the purpose proposed and only by the specific people making the proposal.
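The kind of processing described above can be illustrated with a small, hypothetical sketch (this is not the actual D4D pipeline; the threshold, names, and data are invented): raw individual records are reduced to aggregated indicators before release, and any aggregate covering too few people is suppressed so that small groups cannot be singled out.

```python
# Hypothetical sketch (not the actual D4D processing): raw (user, region)
# call records are reduced to per-region counts, and any region observed
# for fewer than MIN_CELL_SIZE distinct users is suppressed before
# release, so small groups cannot be singled out.

MIN_CELL_SIZE = 5  # assumed suppression threshold, for illustration only

def aggregate_by_region(records):
    """records: iterable of (user_id, region) pairs."""
    users_per_region = {}
    for user_id, region in records:
        users_per_region.setdefault(region, set()).add(user_id)
    # Publish only counts, and only where the count is large enough.
    return {
        region: len(users)
        for region, users in users_per_region.items()
        if len(users) >= MIN_CELL_SIZE
    }

raw = [(u, "Abidjan") for u in range(12)] + [(u, "Bouake") for u in range(3)]
print(aggregate_by_region(raw))  # the small Bouake cell is suppressed
```

Only the aggregated indicator leaves the commons; the raw records, and any cell small enough to identify individuals, never do.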

Personal and proprietary data The second category in the data taxonomy is personal and proprietary data

and cannot be done with the data and what happens if there is a violation of the permissions.

all personal data have attached labels specifying what the data can, and cannot, be used for. These labels are matched exactly by terms in a legal contract between all the participants stating penalties for not obeying the permission labels

and giving the right to audit the use of the data. Having permissions, including the provenance of the data,

allows automatic auditing of data use and allows individuals to change their permissions and withdraw their individual data.
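A minimal in-memory sketch of these ideas follows. It illustrates the concept of permission labels, access auditing, and withdrawal; it is not the actual openPDS API, and all names here are invented.

```python
# Hypothetical sketch (not the real openPDS API): every stored datum
# carries permission labels, every access attempt is written to an
# append-only audit log, and the owner can withdraw data at any time.

from datetime import datetime, timezone

class PersonalDataStore:
    def __init__(self):
        self._data = {}      # key -> (value, set of permitted purposes)
        self.audit_log = []  # append-only record of every access attempt

    def put(self, key, value, permitted_purposes):
        """Store a datum together with its permission labels."""
        self._data[key] = (value, set(permitted_purposes))

    def get(self, key, requester, purpose):
        """Release the datum only if the stated purpose is permitted."""
        self.audit_log.append((datetime.now(timezone.utc), requester, key, purpose))
        value, purposes = self._data[key]
        if purpose not in purposes:
            raise PermissionError(f"{purpose!r} is not a permitted use of {key!r}")
        return value

    def withdraw(self, key):
        """The individual can remove their data entirely."""
        self._data.pop(key, None)

pds = PersonalDataStore()
pds.put("location", "47.37,8.54", permitted_purposes={"traffic-research"})
print(pds.get("location", requester="lab-A", purpose="traffic-research"))
```

Because denied requests are logged as well as granted ones, the audit trail supports exactly the kind of after-the-fact accountability the legal contract relies on.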

Today, longstanding versions of trust networks have proven to be both secure and robust. The best-known example is the SWIFT network for inter-bank money transfer;

its most distinguishing feature is that it has never been hacked. When asked why he robbed banks,

the MIT Human Dynamics Laboratory (http://hd.media.mit.edu), in partnership with the Institute for Data Driven Design (http://idcubed.org), have helped build openPDS (open

which incidental data about human behavior must be included in the permissions and auditing framework. Such data are typically collected in the course of normal operations in order to support those operations (e.g.,

the location of a cell phone is required to complete phone calls), but without specific informed consent.

supported by Telecom Italia, Telefonica, the MIT Human Dynamics Laboratory, the Fondazione Bruno Kessler, the Institute for Data Driven Design,

whose goal is to invent a better way of living. 4 The objective of this living lab is to develop new ways of sharing data to promote greater civic engagement and information diffusion.

and use data about themselves.

For example, the openPDS system lets the community of young families learn from each other without the work of entering data by hand

or the risks associated with sharing through current social media. These data can then be used for the personal self-empowerment of each member,

or (when aggregated) for the creation of a data commons that supports improvement of the community for example,

a map that shows disposable income for each neighborhood can stimulate better distribution of community services.

The ability to share data safely should enable better idea flow among individuals, companies, and government;

and medical records to generate a useful data commons. It will also explore different user interfaces for privacy settings,

for configuring the data collected, for the data disclosed to applications, and for those data shared with other users, all in the context of a trust framework.

Although the Trento experiment is still in its early days, the initial reaction from participating families is that these sorts of data sharing capabilities are valuable,

and they feel safe sharing their data using the openPDS system.

Government data
The third category in the taxonomy is secret government data.

A major risk of deploying data-driven policies and regulations comes from the danger of putting so much personal data into the hands of governments.

But how can it happen that governments, especially authoritarian governments, choose to limit their reach?

The answer is that unlimited access to data about citizen behavior is a great danger to the government as well as to its citizenry.

Consider the NSA's response to the recent Snowden leaks: "This failure originated from two practices that we need to reverse," Ashton B. Carter,

with each different type of data separated and dispersed among many locations, using many different types of computer systems and encryption.

The logic behind this observation is that databases that have different types of data that are physically and logically distributed,

because unfettered access to data about citizen behavior can be a major aid to organizing a successful coup to overthrow the government.

Governments that structure their data resources in this manner can more easily monitor attacks and misuse of all sorts.

the requested records) remains hidden.

distributed data stores with permissions, provenance, and auditing for sharing among data stores. In this case,

however, the data stores are segmented by their referent for example, tax records for individuals, tax records for companies,

import records from country X to port Y, and so on rather than having one data store per person.

Because the architecture is so similar to the citizen-centric personal data stores, it enables easier and safer sharing of data between citizens and government.

For this reason, several states within the United States are beginning to test this architecture for both internal and external data analysis services.

where governance is driven far more by data than it has been in the past. Basic to the success of a data-driven society is the protection of personal privacy and freedom.

Discussions at the World Economic Forum have made substantial contributions to altering the privacy and data ownership standards around the world in order to give individuals unprecedented control over data that are about them,

while at the same time providing for increased transparency and engagement in both the public and private spheres.

in particular governments and corporations, may be tempted to abuse the power of the data that they hold.

1. Large data systems should store data in a distributed manner, separated by type (e.g., financial vs. health) and real-world categories (e.g.,

whose function is focused on those data, with sharing permissions set and monitored by personnel from that department.

Best practice would have the custodians of data be regional and use heterogeneous computer systems. With such safeguards in place, it is difficult to attack many different types of data at once,

and it is more difficult to combine data types without authentic authorization. 2. Data sharing should always maintain provenance

and permissions associated with data, and should support automatic, tamper-proof auditing. Best practice would share answers only to questions about the data (e.g.,

by using preprogrammed structured query language, or SQL, queries known as Database Views) rather than sharing the data themselves, whenever possible.
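This share-answers-not-data pattern can be sketched with SQLite (the table, columns, and data here are invented for illustration): the raw table stays private, and a partner system queries only a preprogrammed view that returns an aggregate answer, never the underlying rows.

```python
# Hypothetical sketch of "share answers, not data": the raw table stays
# private, and partners query a preprogrammed SQL view that exposes only
# an aggregate answer to a fixed question.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tax_records (taxpayer_id INTEGER, region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO tax_records VALUES (?, ?, ?)",
    [(1, "north", 100.0), (2, "north", 300.0), (3, "south", 250.0)],
)
# The view is the only object a partner system may query: it answers one
# fixed question (totals per region) without revealing individual rows.
conn.execute(
    "CREATE VIEW regional_totals AS "
    "SELECT region, COUNT(*) AS n, SUM(amount) AS total "
    "FROM tax_records GROUP BY region"
)
print(conn.execute("SELECT * FROM regional_totals ORDER BY region").fetchall())
```

In a production database the partner account would be granted SELECT on the view but not on the base table, so the minimum-information principle is enforced by the database itself.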

This allows improved internal compliance and auditing and helps to minimize the risk of unauthorized information leakage by providing the minimum amount of information required. 3. Systems controlled by partner organizations,

Without such safeguards, data can be siphoned off at either the data source or at the end consumer,

without even attacking the central system directly. 4. The need for a secure data ecosystem extends to the private data of individuals and the proprietary data of partner companies.

As a consequence, best practice for data flows to and from individual citizens and businesses is to require them to have secure personal data stores

Best practice is to base these credentials on biometric signatures.7 6. Create an open data commons that is available to partners under a lightweight legal agreement

Open data can generate great value by allowing third parties to improve services. Although these recommendations might seem cumbersome at first glance,

found within modern computer databases and networks.

In many cases, the use of distributed data stores and management are already part of current practice,

Most importantly, these recommendations will result in a data ecosystem that is more secure and resilient, allowing us to safely reap the advantages of using big data to help set

and the Institute for Data Driven Design, available at http://idcubed.org. 3 For details about openPDS, see http://idcubed.org/open-platform/openpds-project/.

ID3 (Institute for Data Driven Design, or idcubed). Available at http://idcubed.org. MTL (Mobile Territorial Lab). Available at http://www.mobileterritoriallab.eu/. OpenID Connect.

Toward a New Deal on Data. In The Global Information Technology Report 2008-2009: Mobility in a Networked World.

CHAPTER 1.5 Managing the Risks and Rewards of Big Data

For organizations accustomed to handling high volumes, high velocity, and data that lack traditional structure, working in an environment of big data is just business as usual.

In this chapter we will discuss how managing the growing challenge of data is not new for a regional healthcare organization in the Midwestern United States

which have neither integrated data nor built a strategy around their use, the term big data itself is a way to express the sudden digitization of many things that have been with us forever

and stored as data. For most companies, big data represents a significant challenge to growth and competitive positioning.

and, more recently, the addition of sensor data (data derived from devices that sense their environment) to the mix have pushed all the boundaries of how we think about data and its uses.

but also implies new tools and new ways of managing data. Like many things, data can be used to do positive things for the world,

but it can also be used to manipulate, embarrass, or repress. Data can be highly accurate and efficiently structured or unstructured, fragmented,

and highly suspect. Data can also be managed well or carelessly. Big data, with its outsized properties, amplifies those effects.

It is in those extremes that the risks and rewards of big data are decided.

THREE KEY BIG DATA TRENDS

As the world becomes more familiar with big data,

First and foremost, big data leverages previously untapped data sources. Those sources are of several types. The first includes wearable devices that stream data about an individual

and his or her surrounding environment on a moment-by-moment basis; such sensors include the applications on a smartphone that sense movement.

Those computer controls mean not only that data are constantly being fed into machines but that they are also coming out of machines at a quickly increasing rate.

The previously untapped information sources create a data ecosystem that can be modeled in a way that blends historical with in-the-moment information

Big data's effectiveness is coupled tightly to an organization's ability to bring the right data together in the right moments that allow for the right response and outcome.

the continued discovery of previously untapped data sources will continue to change and improve our models,

and acutely aware of the explosion of data.1 Hackathorn's curve describes the decreasing value of data over time as it passes through stages of use (Figure 1).

The challenge of the decreasing value of data over time has become even more meaningful in the age of big data.

and variety of data continue to push the curve down and to the right as organizations struggle to capture,

Added to this complexity is the increasing access to real-time data that leaves organizations in some industries attempting to reduce their response time to microseconds,

opening the door for better and better tools that can manage data far more quickly and efficiently than a human can.
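The decay that Hackathorn's curve describes can be sketched with a toy model. The exponential form and the half-life parameter below are our own illustrative assumptions, not a formula from the chapter:

```python
# Toy sketch: model the value of a business event as decaying exponentially
# while it waits in each latency stage (data latency, analysis latency,
# decision latency). The half-life is an assumed parameter.
def remaining_value(initial_value, latencies_sec, half_life_sec=60.0):
    total_delay = sum(latencies_sec)
    return initial_value * 0.5 ** (total_delay / half_life_sec)

# An event worth 100 units loses half its value per minute of total delay.
v = remaining_value(100.0, latencies_sec=[30, 20, 10])  # one minute in total
print(round(v, 1))  # -> 50.0
```

Under this model, shaving any of the three latencies pays off identically; what matters to the curve is the total delay between the business event and the action taken.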

Data exist in a moment, ready for decision and action, but there is a higher-level purpose for information.

Data comprise the digital representation of events, or things that happen in patterns that occur over time, in conjunction with other events or in isolation,

The idea of keeping track of what does not occur is a level of complexity higher than the old ways of waiting for data to arrive or change.

The big data conversation often centers on the use of machines as the best resource for the storage and analytic processing of vast amounts of data,

filter, and correlate each piece of data as it flows over the enterprise so that decisions can be made, some through automation

automation becomes the path for taking action in the shortest time frame possible before the value of data decays further.

systems need to absorb new information in an adaptable way that also adds value to existing data that have already been collected.

Adaptable systems treat new sources of data coming constantly as the means to improve analytical models,

The first is the need for powerful visualization that allows the business to explore data to find questions worth answering.

and ended with information technology structuring data to answer those questions in a very repeatable way,

Visualization instead begins with capturing all data available so that multi-structured and iterative discovery can take place that reveals information with

Visualization lets the data speak for themselves. Humans are extremely well suited to visual analysis. Our brains are wired to very rapidly assimilate

Visualized data and the human mind make for a highly efficient combination. Most importantly, visualized data have the effect of engaging the nontechnical

but business-savvy human in the iterative process of discovering exploitable insight. This lessens the organization's reliance on technical resources and, specifically, on data scientists.

The second hurdle that organizations face is the need to manage ever-larger amounts of data. Systems scoped for today's needs quickly become insufficient

when the data are increasing in size, speed, and complexity. Unfortunately, when people talk about big data, they often use the term to compartmentalize it

when data were processed as batches of transactions that represented a finite amount of information. Thinking of big data in those terms fails to take into account all of the data being created everywhere, every day.

This compartmentalized view can also deprecate data that may not appear useful or valuable or may be difficult to process.

At a point in the future, organizations will very likely look back and wish they had considered all data

when deciding what to store. When we consider data without specific boundaries we can focus our efforts on linking data together

and analyzing them more broadly. We will probably find the data have value for a wider range of people in the organization than originally anticipated.

When we consider all data, we can see the value of discovering the connectivity of data.

This brings into consideration different data types that are used to adorn our original data and make them more valuable as a source of visual, predictive,

and operational analytics. Why does that matter? We have grown accustomed to having instantaneous answers to our questions.

As data grow, they have the very real likelihood of slowing down how decisions are made.

Nonlinear growth taxes our systems and creates the scenario that every day we get bogged down more as untapped data sources become newly available,

our clever automations become less effective, and our systems seem less adaptable than before. An all-data approach allows the organization to see today's information as the best we have in the moment,

Figure 1: The value-time curve. Source: Hackathorn 2004. The curve traces the value of a business event as it decays through data latency (event to data stored), analysis latency (data stored to information delivered), and decision latency (information delivered to action taken), shown for process entry and exit and for process intermediate steps.

knowing that we will continue to layer on more data, not with the goal of having a larger dataset,

but instead with the goal of using all of the data available to gain the best outcome.

Rather than slowing down the results, using all available data takes into account data linkages

and permits a broad analysis that allows organizational clients to consistently arrive at the best possible outcome.

and the constantly additive benefits of all data allow experts to explore data to find their value.

that means being able to explore diverse data that include historical visits to the website as well as transactions completed or shopping carts abandoned;

The first is the need to ensure that data are not being used in a way that goes against the organization's best interests.

Data are very powerful, and organizations need to ensure that information is being collected, stored, analyzed,

Many of the risks and rewards of big data are coupled tightly to the use of all of those data.

On the reward side, data can be used to create far better customer service by knowing the customers' needs and histories.

data can be used to engage the customer and to create a better relationship that serves everyone's needs.

as clinical trials of sample patients give way to all data about every patient. Personalization and healthcare offer two standout opportunities for big data to reward us.

the need to protect data both at a discrete level and, maybe even more importantly, at an aggregate level.

and dissemination of data, but in our haste to build out the largest datasets and the maximum computational power, the need to put the right controls in place has been overlooked consistently.

Throughout the evolution of big data, the capability to govern data appropriately has existed, but unless organizations make the choice themselves

The system needed to track patient data regardless of the patient's location within the hospital and despite the siloed information technology systems that are all too common in healthcare.

the system needed to bring data together in a way that allows high-speed correlation, based on prior analysis of sepsis data,

It has become far more complicated in recent years because of the explosion of data that connect the customer's customer and the supplier's supplier.

For a global package delivery company, knowing its business means being able to access all available data to monitor not just the arrival

and departure of aircraft but also the aircraft altimeter and attitude in order to provide additional layers of data that provide better insight on the nuanced status of the flight. 3 In a similar fashion,

and other sensitive cargoes that require constant monitoring of all data. A global logistics company must monitor discrete data such as package temperature, location,

and time to delivery that continually describe a shipment's ambient conditions; furthermore, these data must be available alongside expiration data and acceptable data ranges.

Those aggregate data form the basis for ensuring nonstop compliance with local and international standards for moving items that require special handling.

Those same data ensure that contract terms are being respected and provide the basis for improving profitability while decreasing waste and inefficiency within a contracted service.

It is a gift that keeps on giving, as detailed historical shipment data allow better pricing of potential new contracts,

making the logistics carrier more competitive and reducing the risk of negotiating and accepting poor contracts.

Without the ability to manage all relevant data, logistics companies and their customers would be unable to effectively move cargoes that bring enormous benefits to all parts of the planet.

and visualization reveals the meaningful patterns in data that tell us what happened under a host of variables in the past.

along with data coming from mobile devices. That information is vitally important to knowing not only how to provide information

MITIGATING THE RISKS Managing the three key trends of leveraging previously untapped data sources, using automation wherever possible,

and securing ever-larger amounts of data. Big data has a remarkable ability to change the world.

and application logs, all of which generate machine data that provide insight into how, when, and why machines are communicating with other machines.

Server logs also provide indications of who accessed data and how these data were used, affording critical oversight into potential illegal or unethical access and use of data.
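A minimal sketch of this kind of oversight, using an invented log format (the field names and entries are our illustration, not a real system's output), shows how access events can be parsed and tallied per user and action:

```python
import re
from collections import Counter

# Hypothetical access log: each line records who touched which dataset, how.
log = """\
2014-03-01T09:12:04 user=jdoe action=read  dataset=patients
2014-03-01T09:12:09 user=jdoe action=read  dataset=patients
2014-03-01T09:13:55 user=msmith action=export dataset=patients
"""

# Parse each line into (user, action, dataset) and count events per user/action.
pattern = re.compile(r"user=(\w+) action=(\w+)\s+dataset=(\w+)")
events = pattern.findall(log)
by_user = Counter((user, action) for user, action, _ in events)
print(by_user[("jdoe", "read")], by_user[("msmith", "export")])  # -> 2 1
```

In practice such tallies feed anomaly detection: an unusual spike of `export` actions by one account is exactly the kind of signal that compliance and data-loss monitoring look for.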

Machine data are monitored by healthcare organizations to show compliance with Health Insurance Portability and Accountability Act (HIPAA) standards, by banks to prevent credit card fraud,

and by governments and corporations to watch for and prevent data loss. Today's public, legislative,

and legal sentiments may not be tomorrow's; these attitudes will continue to diverge by government and region.

Governments and other organizations need to balance the Facebook Effect, which entails the deliberate sharing of more and more personal information, with the requirements of security and

Although social acceptance of what data can and will be shared is changing and evolving, its impact on privacy

how, and why data will be used will become more important as organizations seek to provide better services and products at both the government and private levels.

CHAPTER 1.6 Rebalancing Socioeconomic Asymmetry in a Data-Driven Economy

Early Internet-era ICTs enabled more efficient and effective processing and use of data, resulting in information that was used,

along with what is being called the data-driven economy, may finally make possible a true knowledge economy by

Two decades after the emergence of the consumer Internet, the world is awash in data.

from 2012 to 2017, machine-to-machine data traffic is set to grow an estimated 24 times,

combining, and parsing data in groundbreaking ways; and individuals will be empowered because they will be able to draw on a wide range of yet-to-be invented data-based services

However, the knowledge economy relies on the availability of an adequate supply of data to enable the discovery of new knowledge.

This requires policy frameworks that permit data, including personal data, to be collected, analyzed, and exchanged freely,

and consent to restrict the collection of data predesignated as personal may overly restrict the supply of data available,

In reality, it is not the collection of data that is the source of potential harm but its unconstrained use.

for individuals to give express consent for all the data that may be generated about them. Together, the above factors necessitate a change in policy approach from a collection-based model toward a use-based model,

where individuals give permission for the use of data related to them. What is increasingly clear about an economy based on the collection, use,

users do consider fair value exchange in allowing the use of their data. 11 They have some expectation of

and they also illustrate one of the challenges of the data-driven economy. Most consumers understand that the discounts they receive via a loyalty card are provided in exchange for data they supply to the retailer.

But very few realize that the primary value to the retailer is the ability to track

As the global economy becomes increasingly grounded in the exchange of data, the ways in

which those data are collected and analyzed will become even more opaque to the consumer and the value exchange even harder to discern;

An individual may have only a vague idea of what data exist about him or her and

what is being done with these data. Some will have been volunteered actively by the consumer; some will have been obtained passively, with or without his or her explicit knowledge;

the real values of both the data provided and the service returned (in other words, the underlying exchange of value) may be almost impossible to determine.

Today little agreement exists about how best to value online data. The most comprehensive survey of valuation methodologies was presented in a recent OECD study (on which the authors of this chapter consulted) that identified numerous ways in

which data might be valued in the market (see Box 1).12 However, each of these methods has significant flaws,

while making significant profits from the sheer scale of its data holdings, has yet to find the Holy Grail of social media data monetization.

Distinguishing personally beneficial uses of data from socially beneficial uses is a further challenge because each creates separate and significant value.

attributing this benefit directly to data involves some inspired approximation. And even though one estimate puts the savings in this case at up to US$300 billion

13 most of the ways in which data are valued today would consider such benefits an externality to be ignored.

however, the various ways in which data might be valued are largely irrelevant today, because individuals have already given away their digital crown jewels for free.

Individuals are passing massive amounts of personal and other data to large corporations with little

Facebook users, for example, provide it with data that have the potential to generate immense long-term value for the company;

Box 1: Methods identified for valuing data in the market: ascertaining the revenues or net income per data record; establishing the market prices at which personal data are offered or sold;

establishing the economic cost of a data breach; determining prices for personal data in illegal markets;

and ascertaining how much individuals would be willing to pay to protect their data. Source: OECD 2013.

In other words, under the current model, the greater the role that data play in the global economy,

This could mean that a data-driven economy may become a contracting economy. Like Lanier, we believe that

if a truly sustainable data-driven economy is to be established, the way in which data are traded between individuals

and corporations will require a major reset. For a data-driven economy to thrive, individuals would have to receive fair and appropriate monetary compensation for each specific datum they provide,

a specific datum might gain value only when commingled with other data, for example, and any payment/micropayment system would have to be capable of keeping track of such complexities (assuming the individual has given permission for this to happen).

And a sustainable data-driven economy might also entail individuals paying fees (likely modest) for services they now consider (erroneously) to be free.

The importance of this evolution to our economic future, and to the entire concept of a data-driven economy, cannot be overstated.

Without fair value exchange for data, along with inherent trust in the data ecosystem, everyone will ultimately lose: consumers, corporations,

DESIGNING A TRUSTWORTHY AND ECONOMICALLY VIABLE DATA ECOSYSTEM We believe that an essential element of the foundation that can enable user trust

In such an architecture, data are accompanied logically by a metadata tag that contains references to the permissions

and policies associated with the data, along with related provenance information, specified in an extensible and interoperable markup language.
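One way such a logical binding could work is sketched below. The schema and field names are our invention, not a standard from the chapter, and a real deployment would rely on cryptographic signatures within a trust framework rather than a bare hash:

```python
import hashlib
import json
from dataclasses import dataclass, field

# Illustrative sketch: a metadata tag carrying permissions and provenance,
# logically bound to its datum via a hash so tampering is detectable.
@dataclass
class TaggedDatum:
    payload: dict        # the datum itself
    permissions: list    # e.g. ["research:aggregate-only"] (invented labels)
    provenance: list     # chain of custodians the datum has passed through
    binding: str = field(init=False)

    def _digest(self):
        blob = json.dumps([self.payload, self.permissions, self.provenance],
                          sort_keys=True)
        return hashlib.sha256(blob.encode()).hexdigest()

    def __post_init__(self):
        self.binding = self._digest()

    def is_intact(self):
        # Any change to payload, permissions, or provenance breaks the binding.
        return self._digest() == self.binding

d = TaggedDatum({"steps": 9200}, ["research:aggregate-only"], ["user:alice"])
assert d.is_intact()
d.permissions.append("marketing:resale")   # tampering by a downstream party
assert not d.is_intact()
```

The point of the sketch is the invariant, not the mechanism: whoever holds the datum also holds a verifiable statement of what may be done with it and where it came from.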

The metadata are logically bound to the data and cannot legally be modified or unbound, for the entire data lifecycle, by any parties other than the user

or as specified by, for example, a related policy or rules of a trust framework. More comprehensive consideration of these issues can be found in Realizing the Full Potential of Health Information technology to Improve Healthcare for Americans:

in a decentralized data ecosystem and consequently provides a foundation for both trustworthy data and fair value exchange.

and permissions over time, prevent undesirable use of previously collected data, address unanticipated uses, and adjust to changing norms.

in order to introduce the concept of fair value exchange (and sustainability) into a data-driven economy,

those data must be assigned monetary value, then metadata is the mechanism that will enable individuals to direct their labors

and reap the related benefits for the duration of its existence in the data ecosystem enabling a more enlightened society in the digital space.

and use of data remains unanswered, however, and requires considerably more research. Such an approach is technologically non-trivial.

although metadata can be logically bound to data, it can also be unbound by bad actors (a situation similar to the vulnerability of today's financial systems to hackers).

and policies that would govern how data can be used within and shared across trust boundaries,

and how those permissions and policies would be negotiated among the multiple parties with claims on the data or claims to its monetary value. 16 Yet another,

either directly or through other means (such as recommender systems or data intermediaries). Achieving all this will require the specification of an interoperable metadata-based architecture that can function at Internet scale.

The development of such an architecture needs to be a collaboration between multiple data stakeholders to ensure its feasibility and inherent security,

A metadata-based architecture offers value to all stakeholders in the data ecosystem, not only users.

Data controllers and processors can more easily understand and comply with permissions and policies defined for specific data.

They can also establish a dynamic, economically viable and sustainable marketplace in data that would ideally mirror the way in which fair value exchange is established in the physical world.

Solution providers can create applications and services that produce new business value

and track the associated value chain, yet still use data in privacy-preserving ways. Companies can develop metadata schemas that fully describe data use, codes of conduct,

and relevant policies to meet industry and regulatory requirements. And regulators can take advantage of greatly improved auditability of data,

along with a stronger and betterdefined connection between the data and those policies that govern its use.

Although metadata can help facilitate a data-driven economy, it cannot guarantee that entities handling the data will honor the permissions

and policies associated with them. However, when implemented as part of a principles-based policy framework that provides guidance on trustworthy data practices, supplemented by voluntary

but enforceable codes of conduct, and underpinned by legal redress, this is a flexible approach that holds the promise of satisfying the interests of regulators, individuals, and industry.

In addition, as noted above, the authors believe that metadata could also be a key to establishing a viable and sustainable economic ecosystem in a data-driven economy,

enabling the monetary value generated by data to be tracked, captured, and realized as payments to and from the ecosystem's participants.

CONCLUSION AND WAYS FORWARD There are many challenges here, and today we have more questions than answers.

But what is clear is that, in order to create a sustainable data-driven ecosystem, technology and policy must work symbiotically.

For that to happen, governments and their regulatory representatives need to partner closely with industry, academic researchers,

Data are for value-added labor productivity. 5 Bughin and Manyika 2013. 6 Gens 2011. 7 Cisco 2013. 8 Ericsson 2011. 9 United Nations

New Results from International Micro Data. Paper presented at the OECD Workshop on ICT and Business Performance, OECD, Paris, December 9.

Global Mobile Data Traffic Forecast Update, 2012-2017, February 6. Cisco. Available at http://www.cisco.com/en/US/solutions/collateral/ns341/ns525/ns537/ns705/ns827/white_paper_c11-520862.pdf. Drucker, P. F. 1969.

A User-Centred Approach to the Data Dilemma: Context, Architecture, and Policy. In Digital Enlightenment Yearbook 2013.

PCAST (President's Council of Advisors on Science and Technology).

CHAPTER 1.7 The Role of Regulation in Unlocking the Value of Big Data

Data is a precious thing...and that's why I've called data the new oil.

Because it's a fuel for innovation, powering and energizing our economy. 1 These were the words of Neelie Kroes, Vice-President of the European Commission responsible for the Digital Agenda,

As Kroes noted, data comprise a fuel we have only just begun to tap. This new oil is certainly plentiful.

Trillions of bytes of data are generated by companies that capture information about their customers, suppliers, and operations.

These sources of data do not even include the billions of individuals around the world generating the same fuel on their smartphones, personal computers, and laptops.

And the volumes of data are exploding. McKinsey recently estimated that the data collected globally will grow from some 2,700 exabytes in 2012 to 40,000 exabytes by 2020.2 To put this into context, a single exabyte of data equals a hundred thousand times all the printed material of the Library of Congress.

Definitions of big data vary greatly. Rather than put a number on what qualifies as big,

and adopting initiatives such as the European Union's open data directive, which aims to give both citizens

data.

and analytic capabilities for handling data, as well as the evolution of behavior among its users. Recent McKinsey research shows that enabling open data or liquid data across seven domains (education, transportation, consumer products, electricity, oil and gas, healthcare,

and consumer finance) can generate more than US$3 trillion in additional value a year.4 There is no guarantee,

Another prerequisite is a large enough pool of talent with the advanced analytical skills needed to put the data to good use.

whether ways can be found to protect information technology infrastructures and the data they carry from cyberattacks.

although the level of concern varies according to the type of data being considered. Consumers care more about their financial transactions and health-related information than about their online habits, for example.

The recent revelations by Edward Snowden disclosing US government data collection practices and the extraction of data from a number of large Internet companies have raised further public awareness about privacy issues

enabling individuals to be in control of data about their own person and preventing unnecessary listings and discriminatory behavior.

Selected poll findings on privacy: 72% of Internet users are worried about giving away too much personal data; 88% of Europeans believe that their data would be better protected in large companies that are obliged to name a data protection officer; companies that breach protection rules should be fined (51%), banned from using such data in the future (40%), or compelled to compensate the victims (39%).*

*These data are taken from the Special Eurobarometer poll published in 2011. Respondents were asked to select 4 out of 12 possible responses to the question of what should happen to companies that breach protection rules.

if those data are to be used, and for what purpose. Companies and organizations using their data are also required to protect them from unauthorized use.

There are strict measures in place to protect medical data and credit information. But the issue has become more complicated in the Internet era.

Some argue that this right should be safeguarded more strongly than ever when so many companies and organizations are seeking access to personal data

and personal benefits can arise from sharing data, and many consumers are perfectly happy to give up some of their privacy in return for certain goods or services.

The aim is to enable the easier transfer of data among economies where the level of data protection regulation varies greatly.

or disclosure of their data, and that the data collected should be accurate, complete, and up to date. 6 Strict ex-ante requirements.

Ex-ante requirements apply in Europe, where both the Council of Europe and the EU Commission have developed extensive frameworks to protect data

and privacy in their respective member countries. 7 These frameworks not only define what is regarded as personal data

and how such data can and cannot be used, but they also set organizational and technological requirements.

for example, implement technological and organizational measures to protect the data gathered. Furthermore, strict liabilities are in place relating to both companies and cooperation frameworks for regulators.

The frameworks stipulate that data from the European Union may be transferred only to countries that have an appropriate level of protection.8 All three regulatory archetypes are constantly evolving.

One example of this evolution is that the European Union is currently updating the existing data protection directive from 1995 to better meet the requirements of today's data-intensive world.9 In the United States

(Figure: regulatory archetypes by region. In the United States, sector-specific laws carry stricter regulations, with case-by-case enforcement of privacy statements; some countries follow the tradition of habeas data, the right to find out what data are held about oneself; the European Union requires the same level of protection for transmission to third countries, and a safe harbor agreement with the United States enables free data transfer between compliant companies in the two regions; Russia is also shown.)

data can drive, while maintaining customer trust and data protection.

Consent before data collection. A key principle in the European regulatory framework is the need to obtain personal consent before data are gathered.

Anyone wanting to use an individual's data must first seek his or her permission.

But with so much information now available and being gathered, seeking that approval can be a slow,

The suggested EU framework defines personal data as any data that can be attributed to an identifiable person either directly or indirectly.

Both these definitions mean that not only data clearly identifying a person with information such as a name

but also data that can be attributed to a person indirectly through some other measure, such as via a mobile phone number or an identity code, are covered.

In a big data world where a lot of data are interlinked, it can be difficult to know exactly

when data become personal. Is it only data that identify a person with certainty, or does it also include data that identify someone with high probability?

How about a person's actions? Performance? Or buying behavior? To give a concrete example, a US retail chain identified new parents as a very lucrative market segment.

The chain analyzed its customers via characteristics such as their shopping habits, age, or marital status to spot customers who were pregnant.

Closely linked to the dilemma of how to define which data are personal is the issue of data anonymization or sanitization.

Traditionally, anonymous data have not been subject to data protection laws. However, in a big data world where anonymized data can easily be linked up,

it is not very hard to build a profile of a person without traditional means of identification such as a name or address.

For example, a team at Harvard was able to identify individuals from anonymized data in a genetics database by cross-referencing it with other public databases.
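A toy linkage attack, with invented data, shows the mechanics: quasi-identifiers left in an "anonymized" file can be joined against a public roster to recover names:

```python
# Invented data, for illustration only: an "anonymized" medical file still
# carries quasi-identifiers (zip, birth date, sex) that can be joined
# against a public roster to re-identify individuals.
anonymized = [
    {"zip": "02138", "birth": "1945-07-21", "sex": "F", "diagnosis": "X"},
    {"zip": "02139", "birth": "1982-03-02", "sex": "M", "diagnosis": "Y"},
]
public_roster = [
    {"name": "J. Doe", "zip": "02138", "birth": "1945-07-21", "sex": "F"},
    {"name": "A. Roe", "zip": "02140", "birth": "1990-11-30", "sex": "M"},
]

quasi = ("zip", "birth", "sex")
# Index the public roster by its quasi-identifier combination.
index = {tuple(p[k] for k in quasi): p["name"] for p in public_roster}
# Any anonymized record whose combination appears in the roster is exposed.
reidentified = [(index[key], r["diagnosis"])
                for r in anonymized
                if (key := tuple(r[k] for k in quasi)) in index]
print(reidentified)  # -> [('J. Doe', 'X')]
```

No name or address is needed: the combination of a few innocuous attributes is often unique enough to single a person out.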

It can therefore be argued that the use of anonymous data can potentially constitute an intrusion of privacy.

Another question related to data anonymization is the right of companies to use the personal data already in their possession and turn them into anonymized data that they sell to others.

Some companies, such as telecommunications operators, sell customer data, including location and application data, to other companies in anonymized and aggregated form for marketing purposes.

Companies can target their marketing more effectively by using these data to learn about their customers.

Internet companies are also matching their customer data and online habits with data from other companies to better target their online advertising. 15 Several questions arise from a privacy perspective.

When can data be considered anonymized? Does using a pseudonym make data anonymous? Are companies allowed to use anonymized data without the customer's consent,

or must customers give their prior approval? Should that consent be granted before use, or is it enough to allow customers to opt out?

The right to be forgotten. The new EU data protection framework proposes introducing a right for users to request that data controllers remove their personal data from their files.

Although on paper it sounds easy to remove personal data relating to an individual upon request,

this may not be so easy in the real world. The European Union Agency for Network and Information Security (ENISA) states that a great deal of data are stored in different places in the cloud for security reasons,

and these data may have been aggregated or amended into new forms, such as statistical data. Thus specific data that must be removed upon request may be entwined with aggregated data across many systems.

Clearly this is not such a straightforward task in a virtual environment, and there is no single technical method to enable this easily.16

Relevant jurisdiction.

Data are increasingly used and stored across borders, but regulation is still largely national in its scope

and regulators lack jurisdiction in markets outside their own. The uncertainty about jurisdictions creates problems for companies and consumers alike.

Which regulations apply to companies from another country? Which judicial authority has the right to intervene in disputes?

In its recent proposal for the new EU data protection regulation, the European Union extends the applicability of its regulation to companies outside the European Union that handle data relating to European Union-based individuals.

which stores its data within a cloud service operated by yet another. If data are leaked,

it can be very difficult to decide which company is liable. The above remaining gray areas must be considered

that allows US-based companies to transfer data between the two regions without further approval from EU-based regulators.

what data they do or do not share. Providing transparent privacy policies or simply informing the customer of the scope of data handling as well as requesting clear consent declarations from customers also helps create customer trust without sacrificing big data business opportunities.

Technological tools help as they can allow customers to adjust their privacy settings and choose whether to opt in or out of services.

and the protection of personal data remain an area where several issues are unclear and require further consideration and clarification,

Directive 95/46/EC on the Protection of Individuals with Regard to the Processing of Personal Data and on the Free Movement of Such Data.

Open Data: Unlocking Innovation and Performance with Liquid Information. McKinsey Global Institute, McKinsey Center for Government,

Which Policies Will Lead to Leveraging Data-Driven Innovation's Potential?

Pedro Less Andrade, Jess Hemerly, Gabriel Recalde, and Patrick Ryan, Public Policy Division, Google, Inc.

Over the last few years, myriad examples of innovation in data analysis have emerged,

creating new business models for data-driven innovation. For example, businesses are developing ways for real-time weather information to be communicated to devices in the field that can advise farmers on pest activity, water supply,

and inclement weather.1 The Royal Netherlands Meteorological Institute has found a way to generate extremely accurate rainfall information using nothing more than existing data from cell-tower installations.2 The next phase of the Internet's evolution has us on a clear path toward a revolution of data.3 Every year,

and dissemination of data come down, making those data more readily available. This process is fueled by the increasing migration of many social

and economic activities to the web. 4 More data are generated today than ever before; this is a positive trend that will inevitably continue:

90 percent of the world's information generated through the history of mankind has been generated over the last two years,

5 while the volume of data generated per year is growing at a rate of 40 percent.6 In this chapter we will focus on the social and economic value of data,

We will therefore talk about data-driven innovation instead of big data, and will provide case studies from different areas,

with a special consideration of how data-driven innovation in the public sector could improve policymaking.

who can leverage the potential of data-driven innovation in their communities through forward-looking policies.

WHY SPEAK OF DATA-DRIVEN INNOVATION INSTEAD OF BIG DATA?

It has become axiomatic that more data are produced every year,

and somehow this phenomenon has driven commentators to call this revolution the age of big data. However, what is commonly known as big data is not a new concept,

as the use of data to build successful products and services, optimize business processes, or make more efficient data-based decisions already has an established history.

Innovative uses of data have been key to developing new products and making more efficient decisions for quite a long time,

Crunching data, statistics, and trends in new ways has always helped change the way that entire sectors operate.

Agriculture is one of the first major sectors to have benefitted from the aggregation and analysis of data:

and it sets up data as a negative because of the implication that big is bad.

variety) are technical properties that depend not on the data themselves, but instead on the evolution of computing, storage,

Thus, what is important about data is not their volume, but how they may contribute to innovation

Data alone do not possess inherent value; instead it is the processing of data in innovative ways that brings new economic and social benefits,

and this value creates a virtuous circle to feed into more use of data-based decisionmaking and analysis. 12 In other words,

it is the use of data that really matters. 13 One way to measure this value is to measure the socioeconomic metrics

(or to estimate the future potential) obtained from the use of data. The excitement that we are seeing with new deployments of data to fuel innovation is not just because of the volume of data

nor is it about the data themselves. As pointed out by the Software and Information Industry Association,

transformative data can be big or small, or even the 'needle' of data found in a giant haystack.14 The truth is that data are data,

and that has not changed for centuries. When big data is no longer a trendy concept, data will continue to drive innovation,

and solutions for new problems will come from new ways of analyzing and interpreting data,

regardless of volume or our technological capacities to manage it. In the next section, we will address what we see in the future for data-driven innovation.

THE BENEFITS OF DATA-DRIVEN INNOVATION

Many sectors benefit from data-driven innovation: healthcare (e.g., diagnosis and treatment), financial services (e.g., analyzing market trends and economic conditions), and transportation and public administration (e.g., metrics on what citizens want and where economic development is headed), to name a few. In one example, a philanthropic research center stores and analyzes the cancer genome

a university-based group of academics mined data from 60 years of historical weather records to identify the factors that are most predictive of hurricane activity. 16 In the private sector,

and other technical data to identify and prevent fraudulent activity in online payments, bolstering trust for commercial exchanges on the Internet. 17 A startup firm has developed a no-cost platform for users that helps travelers predict flight delays using an algorithm that scours data on every domestic flight for the past 10 years and matches

it to real-time conditions.18 Finally, the United Nations is working with governments around the world to understand global trends related to hunger,

because data-driven innovation takes place across various sectors of the economy and society, it is sometimes difficult to quantify its full economic impact.

as economists are now demonstrating that a fundamental problem exists in our ability to quantify the value of data,

'22 Data are neither a good nor a service and so they escape traditional economic analysis. This highlights the complication of discussing data:

although the value often creates an economic reward, such measurements are not easy to make.

but only in the past couple of years have economists undertaken serious attempts to quantify the Internet's impact on the world's economies.23 One example of innovative data use that has a difficult-to-quantify economic value proposition is Google's Flu Trends,

Flu Trends provides its analysis based on aggregated search queries. 24 Some of these estimates have been compared with official historic influenza data from relevant countries with surprisingly high levels of accuracy

the data from Flu Trends are open, available for everybody to download and use.
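The kind of comparison underlying such accuracy claims can be illustrated with a simple correlation between two weekly series, one of aggregated search queries and one of official case counts. The numbers below are invented for this sketch, and Flu Trends' actual methodology was considerably more elaborate.

```python
# Toy comparison of aggregated weekly search-query counts against official
# influenza case counts via Pearson correlation. All numbers are invented.
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

weekly_queries = [120, 150, 300, 520, 480, 260]   # hypothetical flu-related searches
official_cases = [10, 14, 31, 55, 49, 27]         # hypothetical reported cases

r = pearson(weekly_queries, official_cases)
assert r > 0.95  # in this toy data, the two series track each other closely
```

A high correlation in toy data like this only illustrates the idea; real validation must contend with noise, seasonality, and changing search behavior.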

A group of researchers from Johns Hopkins University, for example, used these data to develop a practical influenza forecast model designed to provide medical centers with advance warning of the expected number of flu cases,

and accessibility of data are crucial to keeping the wheel of innovation rolling by allowing others to access

and manipulate the data in transformative ways. Similarly, the rapid collection and processing of information has helped in recent natural disasters.

a group of researchers from the Karolinska Institute and Columbia University analyzed calling data of over 2 million mobile phones to detect the pattern of population movements across the country.

called Flowminder, suggest that population movements during disasters may be more predictable than had previously been understood.28 These examples show that there are ethical and responsible ways of analyzing big sets of data

High-value products and services and more efficient deployment of resources are not the only outcomes of data-driven innovation.

Studies suggest that there is a direct connection between data-driven decision-making in business and improved firm performance.

Firms that adopt data-driven decision-making have an output and productivity that is 5 percent to 6 percent higher than would be expected,

as they allow firms of all sizes to leverage data-driven analysis without needing to make huge investments in their IT infrastructure. 30 As is the case for businesses,

According to McKinsey, the US government had over 848 petabytes of data stored in 2009, second only to the manufacturing sector.31

What is usually known as data-driven policymaking involves the collection of information related to how roads are traveled,

such as census departments, have long been established to maintain data about the nation. Thus data-driven policymaking is not new

but the opportunities brought by advances in information and communication technologies make data-driven policymaking increasingly accessible to government officials.

Further, open government initiatives put these data into the hands of the public, facilitating a new kind of transparency and civic engagement for curious and interested citizens.

Data can benefit society when they are open. 33 By providing a way to check assumptions,

detect problems, clarify choices, prioritize resources, and identify solutions, data-driven policymaking injects data-based rationality into the policymaking process, all of

which could also create economic benefits. 34 According to the Organisation for Economic Co-operation and Development (OECD),

by fully exploiting public data, governments in the European union could reduce administrative costs by 15 percent to 20 percent,

data-driven policymaking moves policymaking out of the realm of intuition and dogma by creating a sound evidentiary basis for decisions.

However, studies suggest that the public sector still does not fully exploit the potential of the data it generates

and collects, nor does it exploit the potential of data generated elsewhere. The revolution of data still needs to make its way within government agencies.

Although the government is one of the sectors with the greatest potential to capture value from data-driven innovation,

it also has one of the lowest productivity growth rates because it lags behind business and industry in fully embracing data.

Box 1: Hong Kong Efficiency Unit

The Hong Kong Efficiency Unit acts as a single point of contact for handling public inquiries and complaints on behalf of many government departments.

its staff recognized the social messages hidden in the complaints data, which in fact provided important feedback on public service.

SETTING THE STAGE FOR A DATA-DRIVEN ECONOMY

Apart from producing

and using data for better policymaking processes, the public sector can also play its part by promoting

and fostering data-driven innovation and growth throughout economies. To realize the potential of data-driven innovation,

policymakers need to develop coherent policies for the use of data. This could be achieved by (1) making public data accessible through open data formats, (2) promoting balanced legislation, and (3) supporting education that focuses on data science skills.

Open data initiatives

The use of data across sectors can drive innovation and economic growth. However, many generators of data, including governments, do not share their data.

As we have seen, the public sector is one of the main producers and collectors of data.

Open data initiatives that make data in the public sector accessible to everyone contribute to data-driven innovation

and create value for governments. For example, aggregate public transport data may be used by developers to create useful applications for passengers (see Box 2).

This access to real-time information could result in a greater number of passengers and, subsequently, to more income for the transport authorities.

In addition, accessible public data usually lead to better data because data users can test their structure

and help to fix mistakes (see Box 3). Improvements in the quality of data mean better data-based solutions and, ultimately, better policy.

It is important to note that opening up public data does not necessarily lead to the disclosure of personal data.

Public data that may contain personal information of citizens should be shared in an aggregate or fully de-identified way to protect citizens'privacy.
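One common way to share data "in an aggregate or fully de-identified way" is to publish only group counts and suppress any group small enough to single people out. The sketch below is a minimal illustration of that threshold rule; the dataset, the field names, and the choice of k = 5 are all assumptions, and real de-identification standards involve much more than this.

```python
# Minimal sketch of aggregate release with small-cell suppression:
# publish counts per group, but drop any group with fewer than k members.
# The threshold k = 5 is an assumption for illustration only.
from collections import Counter

K = 5

def aggregate_for_release(records, key, k=K):
    """Count records per group and suppress groups smaller than k."""
    counts = Counter(r[key] for r in records)
    return {group: n for group, n in counts.items() if n >= k}

# Invented example: 7 records from one district, 2 from another.
records = [{"district": "North"}] * 7 + [{"district": "South"}] * 2
print(aggregate_for_release(records, "district"))  # the 2-person cell is dropped
```

Suppressing small cells reduces, but does not eliminate, re-identification risk; it is one building block among several (generalization, noise addition, access controls).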

We will go into more detail around the discussions on privacy and personal data in the following section.

How to get the best of data-driven innovation

The increasing ease of linking and analyzing information usually raises concerns about individual privacy protection.

Personal data are the type of data that have drawn the most attention from a regulatory point of view in relation to data-driven innovation.

The challenge is to achieve a reasonable balance between individuals'right to privacy and the emerging opportunities in data-driven innovation.

For this reason, in order to capitalize on opportunities for economic growth via innovation, flexible and adaptable policies are needed.

In other words, privacy protection frameworks should support secure and reliable data flows while enhancing responsible, risk-reducing behavior regarding the use of personal data.

Legislation should take into account the tension between data-driven innovation and the principle of data minimization.

but framing things in this way leads to the inevitable conclusion that fewer data are better.

A key dividend of data-driven innovation is the possibility of finding new insights by analyzing existing data and combining them with other data.

This can sometimes blur the lines between personal and nonpersonal data, as well as the uses for which consent may have been given. 36 A practical definition of personal data should be based on the real possibility of identifying an individual during the treatment of data. 37 This is why applying existing approaches to personal data may result in overly broad definitions that can

have unintended negative consequences for data-driven innovation. Because combining and correlating datasets is a key feature of data-driven innovation,

the full potential of data collected may not be clear at the time of collection. A consent model that is appropriate to the data-driven economy should provide a path for individuals to participate in research through informed consent.

In this model, they would become aware of the benefits of their participation as well as potential privacy risks.

For this reason, the legislative considerations for data collection should not assume that less is always more

and should take into consideration the data-intensive direction of some of the economy's growing sectors.

Building skills for the future

An economy where both public and private actors base their decisions on data analysis will demand highly skilled workers with backgrounds in

Box 2: Harvard Transparency Project

The Transparency Policy Project at Harvard's Kennedy School studied the relationship between transit data format and accessibility and the number of applications for that system.

Meanwhile, the agency most reluctant to adopt open data, Washington, DC's Metro, had only 10 applications serving its customers in 2012 (1 to 121,400).

and a gap between the supply and demand for these types of skills may hinder data-driven innovation's full potential.

The United States itself will need up to 190,000 more workers with deep analytical expertise by 2018.38 This clear demand for skilled workers is further evidence of data-driven innovation's potential benefits for economies.

which data may be generated, analyzed, and put to use. Thirty years ago we needed an army of data-entry clerks to feed information into a system;

today, the information is already available in a machine-readable format. We carry devices with sensors that can provide incredible amounts of information in real time.

however, misses the true potential of data. Instead, we should focus our discussion on data-driven innovation,

as this relates to the results and outcomes of data use, from generating innovative products and services to improving business and government efficiency.

Many other examples provided earlier have shown that data-driven solutions have transformative social impact as well. However,

achieving the full potential of data-driven innovation demands challenging the outdated paradigms established in a significantly less data-intensive world.

To achieve the maximum benefits from data-driven innovation, policymakers must take into account the possibility that regulation could preclude economic and societal benefits.

It is by looking at the big picture surrounding big data that we can create the right environment for data-driven innovation,

amount of data itself, but its analysis for intelligent decision-making. 13 Hemerly 2013. 14 SIIA 2013. 15 Burke 2012. 16 McCormick University 2012.

Box 3: Can open data lead to better data?

Moscow's city government published about 170 datasets with geo-coordinates at the Moscow open data portal.

After examining the data, Russian members of the OpenStreetMap community found many errors and mistakes, including wrong geo-coordinates.

while reviewing open statistical data from the United Kingdom's National Health Service, found that records said that 20,000 male patients required midwifery services between 2009 and 2010.

How Does Data-Driven Decisionmaking Affect Firm Performance? April 22. http://dx.doi.org/10.2139/ssrn.1819486.

The Promise of Data-Driven Policymaking in the Information Age. April. Center for American Progress.

Public Policy Considerations for Data-driven Innovation. Computer (IEEE Computer Society) 46 (6): 25 31.

The (Unmeasured) Rise of the Data-Driven Economy. Progressive Policy Institute, Policy Memo. October. Available at http://www.progressivepolicy.org/wp-content/uploads/2012/10/10.2012-Mandel beyond-Goods-and-Services the-Unmeasured-Rise-of-the-Data-Driven-Economy.pdf.

Manyika, J., M. Chui, B. Brown, J. Bughin, R. Dobbs, C. Roxburgh, and A. H. Byers. 2011. Big Data:

Exploring Data-Driven Innovation as a New Source of Growth: Mapping the Policy Issues Raised by 'Big Data'.

How Can Open Data Lead to Better Data Quality? September 3. Available at http://blog.okfn.org/2013/09/03/how-can-open-data-lead-to-better-data-quality/.

Platzman, G. W. 1979. The ENIAC Computations of 1950: Gateway to Numerical Weather Prediction. Bulletin of the American Meteorological Society 60 (4): 302–12.

Effective Disclosure through Open Data. Transparency Policy Project, Harvard Kennedy School, June. Available at http://www.transparencypolicy.net/assets/FINAL UTC TRANSITTRANSPARENCY 8%2028%202012.pdf. SIIA (Software and Information Industry Association).

Data-Driven Innovation: A Guide for Policymakers: Understanding and Enabling the Economic and Social Value of Data.

SIIA White Paper. Available at http://goo.gl/QWJGHY. Sims, D. 2011. Big Data Thwarts Fraud. O'Reilly Strata, February 8.

Yet this tidal wave of data, when channeled and filtered by an array of new information technologies, holds untold value for organizations,

data and the decisions driven by those data now represent the next frontier of innovation and productivity.

honed by the monthly clickstream data of 45 million online shoppers, tailors offerings to online shoppers, raising the rate of completed transactions by more than 10 percent. 4 But for most businesses,

because it requires simultaneously analyzing various types of information: transactions, log data, mail documents, social media interactions, machine data, geospatial data, video and audio data,

Traditional types of business data were available in a format that was structured and could be analyzed automatically: for example, a spreadsheet quantifying customer returns of different products at different stores over time.

Synthesizing unstructured data from numerous sources and extracting relevant information from it can be as much art as science.

Flawed data governance

Big data is not a substitute for, much less a solution to, flawed information management practices.

If anything, it requires much more rigorous data governance structures. Without those improvements, information technology (IT) systems that have not been upgraded to handle large volumes of data are likely to collapse under the sheer weight of the data being processed.

Surveys suggest that business leaders are often more excited about the potential of big data

Box 1: A user's glossary of key big data terms

As an organization plans its big data strategy,

A batch-oriented programming framework that supports the processing of large data sets in a distributed computing environment.

A column-oriented database stores data tables as sections of columns of data rather than as rows of data,

as in most relational databases, providing fast aggregation and computation of large numbers of similar data items.

providing data summarization, query, and analysis. It permits queries over the data using a familiar SQL-like syntax.

Flume: A tool for collecting, aggregating, and moving large amounts of log data from applications to Hadoop.

Mahout: A library of Hadoop implementations of common analytical computations. Oozie: A workflow scheduler system developed to manage Hadoop jobs.

The R language is used widely among statisticians and data miners for developing statistical software and data analysis.

A tool facilitating the transfer of data from relational databases into Hadoop. ZooKeeper: A centralized service for maintaining configuration information, naming, providing distributed synchronization,
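Several of the tools in the glossary build on the batch map/shuffle/reduce pattern that frameworks such as Hadoop implement at scale. The toy word-count sketch below runs those three phases in-process on a pair of invented documents; a real Hadoop job distributes the same phases across a cluster.

```python
# Toy in-process sketch of the map/shuffle/reduce batch pattern.
# Illustrative only: a real framework partitions the work across machines.
from collections import defaultdict

def map_phase(docs):
    """Emit (word, 1) pairs, as a mapper would."""
    for doc in docs:
        for word in doc.split():
            yield (word, 1)

def shuffle(pairs):
    """Group values by key (the framework's shuffle/sort step)."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Sum the counts for each word, as a reducer would."""
    return {word: sum(counts) for word, counts in groups.items()}

docs = ["big data big value", "data value"]
print(reduce_phase(shuffle(map_phase(docs))))  # word counts across both documents
```

The same structure generalizes: the mapper emits key-value pairs, the framework groups them by key, and the reducer folds each group into a result.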

Lack of a data-driven mind-set

Because mind-set can be hard to pin down,

if business leaders do not have a data-driven mind-set, that is, if they do not believe that it is important for decisions to be based on cold,

data-driven business leaders will have a tremendous incentive to treat data, and therefore the IT and analytics professionals who help deliver it in an understandable form,

And these leaders will make it a priority to ease the flow of data across organizational silos.

and in-memory databases (where data reside on main memory as opposed to disk storage).

machine learning (systems that learn from data) and data warehousing. Big data professionals are expected to be familiar with both disciplines,

The onus of collecting data should be shared by the IT and analytics teams, but analysis must be the sole responsibility of analytics professionals.

and how can data help you with it? are a good place to start. Such questions can spur the functional experts themselves to start asking the more fundamental questions that can unlock the value of data.

For instance, marketing professionals could ask, What is the value of a 'tweet' or a 'like'?

It also pays to keep in mind that big data is not about data themselves; it is about using data to discover insights that can lead to valuable outcomes.

Step 3: Take stock of all data worth analyzing. Valuable business insight can come from many sources,

including social media feeds, activity streams, and dark data (data that are currently unused but that have already been captured), machine instrumentation,

and operational technology feeds. It is important to explore these sources and to experiment with new ways of capturing information, such as complex-event processing, video search,

Organizations' data typically fit into four buckets: Operational data, such as data emanating from smart grid meters,

embedded systems (examples include microwave sensors and chips inserted in automobiles), transaction logs (such as payment transactions), radio-frequency identification (RFID) chips, navigation and location sensors, networks, and servers.

Streaming data, such as computer network data, phone conversations, and so on. Documents and content, such as PDFs, web content,

and analyze data and for which the potential payback is high. Functions such as marketing, customer service, supply chain management,

In comparing views of data from a traditional business intelligence perspective versus a big data one, consider the following questions:

What data are we capturing today? What are the limitations of this kind of structured data?

What extra value will we get by collecting external, context-specific, and unstructured data? Where will we find data

and how will we collect them? Would our business act upon the insights?

Figure 1: Potential payback of big data initiatives. Source: Gartner, 2013. (Figure quadrants: Easy pickings; Overeager; Invest here; Not ready, but who cares? Axis: data systems most fit for purpose.)

it is helpful to keep in mind the complexity of both the type of data and the type of analysis the data will require.

As we mentioned above, much of what is meant by big data is unstructured information: data that traditionally have been impossible to break down

and categorize as they are collected. Such data are not only difficult to analyze but can also be misinterpreted easily

when taken out of context. Thus it makes sense to experiment in the beginning with data that are relatively easy to analyze.

Different types of analysis also present varying degrees of complexity. Generally speaking, descriptive analytics (which answers what happened?

An organization's traditional information architecture may not accommodate massive, high-speed, variable data flows. Many traditional and even state-of-the-art technologies were not designed for today's or tomorrow's level of data volume, velocity, and variety.

Even as datasets grow exponentially along those dimensions, the investments required for scaling technologies (such as processors, storage, database management systems,

and train data scientists and analysts in Hadoop programming, or to buy an enterprise-ready version of Hadoop.

Every team member (business analyst, programmer, data scientist, and data visualizer) will need to have cross-functional familiarity.

Building this team is a five-step process: Break down your talent needs into four distinct areas:

possess data-crunching capabilities, and make data-driven decisions. Hire people with needed skills if they are not available

or cannot be acquired by cross-training existing employees. Hire people with related skills if the needed skills are unavailable within your organization

For instance, consider substituting statisticians for the much less common data scientists. Start small and scale up.

Some are even predicting that big data analytics will lead to the emergence of an entirely new set of CXO roles within enterprises: Chief Data Officer, Chief Digital Officer, Chief Analytics Officer, and so on.

Without clear line responsibilities, a CDO (whichever flavor, Data or Digital) or a CAO would have little leverage to execute the important tasks needed to increase the organization's big data capabilities

we recommend that governments create a vision and platform for public-sector open data. We believe that open data will be an essential characteristic of future public policy.

It is important that such a vision percolate down from the top to garner support from ministries

and civil servants alike so that open data initiatives function effectively. Communicating from the very top that open data is an essential characteristic of public policy is crucial.

Furthermore, governments should create an easy-to-use platform for the public to access the data in a form that is easily digestible and ready for analysis. It is also advisable to develop rules

and regulations for taxing the commercial use of open data. Governments should spearhead the effort to ensure the privacy and security of personal data.

The appropriate agency should take a leading role in working with all relevant private- and public-sector entities to develop

and combining data on Hadoop with data from traditional databases to turn its marketing staff from Mad Men to Math Men.

A US-based document management corporation is applying its decades of expertise in imaging technologies to transportation systems that can benefit from real-time analysis of data.

and for gathering data it sells to its clients. It now also sells this big data platform through its newly established subsidiary.

and crop protection data from its test sites to provide a service that generates field performance information for every acre

The plan should identify all government data worth analyzing, define data collection responsibilities, outline steps to ensure data quality,

and determine where big data technologies and analysis capabilities should be deployed first. Finally, each government should establish a big data center of excellence (BDCOE).

The BDCOE should be the focal point of expertise, long-range thinking and policy formulation, and training and development, but should also act as the government's leading authority on all matters related to data management.

The numbering of the variables matches that of the data tables in the next section of the Report,

However, the year of each data point is indicated in the corresponding data table. For more information on the framework and computation of the NRI, refer to Chapter 1.1.

ONLINE DATA PORTAL

Complementing the analysis presented in this Report,

an online data portal can be accessed via www.weforum.org/gitr. The platform offers a number of analytical tools and visualizations, including sortable rankings, scatter plots, bar charts, and maps, as well as the possibility of downloading portions of the NRI dataset.

For further details and explanation, please refer to the section How to Read the Country/Economy Profiles on page 97. Following a correction on the data for indicators 8.02 Government Online Service Index and 10.04 E-participation Index,

2014 World Economic Forum

Part 3: Data Tables

How to Read the Data Tables

The following pages provide detailed data for all 148 economies included in The Global Information Technology Report 2014.

The data tables are organized into 10 sections, which correspond to the 10 pillars of the Networked Readiness Index (NRI).

, the period to which the majority of the data corresponds) follows the description. When the period differs from the base period for a particular economy,

When data are not available or too outdated, n/a is used in lieu of the rank and the value.

Because of the nature of the data, ties between two or more countries are possible. In such cases, shared rankings are indicated accordingly.
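The shared-ranking convention described above corresponds to standard competition ranking: tied economies share the highest applicable rank, and the rank following a tie is offset by the number of tied entries. A minimal sketch of that convention, assuming the tables use it as described; the economy names and scores below are illustrative, not taken from the tables:

```python
def competition_rank(scores):
    """Assign standard competition ranks ("1224" style) to a
    mapping of economy -> score: tied entries share a rank, and
    the next entry skips ahead by the number of ties."""
    # Sort economies by score, best (highest) first.
    ordered = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
    ranks = {}
    for position, (economy, value) in enumerate(ordered, start=1):
        previous = ordered[position - 2]  # entry just above this one
        if position > 1 and value == previous[1]:
            ranks[economy] = ranks[previous[0]]  # share the rank
        else:
            ranks[economy] = position
    return ranks

# Illustrative values only.
print(competition_rank({"A": 6.1, "B": 5.9, "C": 5.9, "D": 5.5}))
# {'A': 1, 'B': 2, 'C': 2, 'D': 4}: B and C share rank 2; D is 4, not 3.
```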

Index of Data Tables: Environment subindex, 1st pillar: Political and regulatory environment; 1.01 Effectiveness of lawmaking bodies*

[Data tables omitted: per-indicator rankings (RANK / COUNTRY/ECONOMY / VALUE) for the 148 economies; only fragmentary top-of-table entries survived extraction.]

Technical Notes and Sources

The present section complements the data tables

The number next to the indicator corresponds to the number of the data table that reports ranks and scores for all economies on this particular indicator.

*The data used in this Report represent the most recent available figures from various international agencies and national authorities at the time when the data collection took place. It is possible that some data have been updated or revised since then.

1st pillar: Political and regulatory environment

1.01 Effectiveness of lawmaking bodies*
How effective is your national parliament/congress as a lawmaking institution?

data are available as of 2009, 2011, or 2012). The index is calculated as the average of points obtained in each of the 19 categories for which data are available. Full liberalization across all categories yields a score of 2, the best possible score.
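The averaging rule just described can be sketched as follows. Only the rule itself (the mean over categories with available data, with 2 points per fully liberalized category as the best score) comes from the text; the function name and the sample point values are hypothetical:

```python
def liberalization_index(category_points):
    """Average the points over the categories for which data are
    available (None marks a missing category); full liberalization
    in every category (2 points each) yields the best score of 2."""
    available = [p for p in category_points if p is not None]
    if not available:
        return None  # no data in any of the 19 categories
    return sum(available) / len(available)

# 19 categories with hypothetical point values, three of them missing.
points = [2, 2, 1, 2, 0, 2, 1, 2, 2, 2, 1, 2, 2, 0, 2, 2] + [None] * 3
score = liberalization_index(points)  # mean over the 16 available categories
```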

when data are missing, we apply a value of 99 percent for the purposes of calculating the NRI.

but excludes mobile broadband subscriptions via data cards or USB modems. Subscriptions to public mobile data services, private trunked mobile radio, telepoint or radio paging, and telemetry services are also excluded. It includes all mobile cellular subscriptions that offer voice communications.

Jess Hemerly
Jess Hemerly is a Senior Public Policy and Government Affairs Analyst at Google, focusing on privacy and security, data-driven innovation, and accessibility.

He specializes in customer interface and organizations, the development of strategies that capture value from data,

M-H. Carolyn Nguyen
Dr M-H. Carolyn Nguyen is a Director in Microsoft's Technology Policy Group, responsible for policy initiatives in data governance and personal data management.

In 2012 Forbes named Sandy one of the seven most powerful data scientists in the world,

In addition, the Report includes detailed profiles for the 148 economies covered this year together with data tables for each of the 54 indicators used in the computation of the NRI.



