
Understanding Digital Technology's Evolution and the Path of Measured Productivity Growth: Present and Future in the Mirror of the Past

by Paul A. David
Stanford University & All Souls College, Oxford

First draft: 20 May 1999
Second draft: 7 December 1999
This version: 10 January 2000

ACKNOWLEDGMENTS

This contribution represents a rather radical abridgement and recasting of my paper "Digital Technology and the Productivity Paradox: After Ten Years, What Has Been Learned?", originally prepared for the White House Conference on Understanding the Digital Economy: Data, Tools and Research, held at the U.S. Department of Commerce, Washington, D.C., 25-26 May 1999. It draws upon joint work with Edward Steinmueller, and with Gavin Wright, and has had the benefit of detailed editorial comments from Erik Brynjolfsson.

Forthcoming in Understanding the Digital Economy, E. Brynjolfsson and B. Kahin (eds.), MIT Press. Please do not reproduce without the author's expressed permission.

Contact Author: Professor Paul A. David, All Souls College, Oxford OX1 4AL, UK
Tel.: 44+(0) 1865+279313 (direct); ++279281 (sec'y: M. Holme); Fax: 44+(0) 1865+279299; E-mail: <paul.david@economics.ox.ac.uk>

Understanding the Digital Economy's Evolution and the Path of Measured Productivity Growth: Present and Future in the Mirror of the Past

1. The Computer Revolution, the "Productivity Paradox" and the Economists

Over the past forty years, computers have evolved from a specialized and limited role in the information processing and communication processes of modern organizations to become a general purpose tool that can be found in use virtually everywhere, although not everywhere to the same extent.
Whereas once "electronic computers" were large machines surrounded by peripheral equipment and tended by specialized technical staff working in specially constructed and air-conditioned centers, today computing equipment is to be found on the desktops and work areas of secretaries, factory workers and shipping clerks, often side by side with the telecommunication equipment linking organizations to their suppliers and customers. In the process, computers and networks of computers have become an integral part of the research and design operations of most enterprises and, increasingly, an essential tool supporting control and decision-making at both middle and top management levels. In the latter half of this forty-year revolution, microprocessors allowed computers to escape from their "boxes," embedding information processing in a growing array of artifacts as diverse as greeting cards and automobiles, and thus extending the "reach" of this technology well beyond its traditional boundaries. Changes attributed to this technology include new patterns of work organization and worker productivity, job creation and loss, profit and loss of companies, and, ultimately, prospects for economic growth, national security and the quality of life. Not since the opening of "the atomic age," with its promises of power too cheap to meter and threats of nuclear incineration, has a technology so deeply captured the imagination of the public.
Indeed, not since that era have hopes and doubts about the social usefulness of a new technological departure been coupled so closely as has been the case since the late 1980's. It was at that point, in the midst of the "personal computer revolution," that mounting concerns about the absence of an evident link between progress in digital information technologies and the productivity performance of the economy at large crystallized around the perception that the U.S., along with other advanced industrial economies, was confronted with a disturbing "productivity paradox." The precipitating event in the formation of this "problematic" view of digital information technology was an offhand (yet nonetheless pithy) remark made in the summer of 1987 by Robert Solow, Institute Professor at MIT and Economics Nobel Laureate: "You can see the computer age everywhere but in the productivity statistics."1 Almost overnight this contrasting juxtaposition achieved the status of being treated as the leading economic puzzle of the late twentieth century, and the divergence of opinion that emerged within the economics profession on this question has persisted, evolving more recently into disagreements over the basis for the claim that in the U.S. information and communications technologies have given rise during the latter 1990's to "a new economy" or "new paradigm" of macroeconomic behavior. It should not be surprising, therefore, that shifting understandings about the nature of the information revolution and the productivity implications of digital technologies are continuing to shape business expectations and public policies in areas as diverse as education and macroeconomic management. One indication of the wider importance of the subject matter taken up in this volume can be read in its connection with the rhetoric and, arguably, the substance of U.S. monetary policy responses to the remarkable economic expansion that has marked the 1990's.
For a number of years in mid-decade the Federal Reserve Board Chairman, Alan Greenspan, subscribed publicly to a strongly optimistic reading of the American economy's prospects for sustaining rapid expansion and rising real incomes without generating unhealthy inflationary pressures. Like many other observers, the Chairman of the "Fed" viewed the rising volume of expenditures by corporations for electronic office and telecommunications equipment since the late 1980's as part of a far-reaching technological and economic transformation in which the U.S. economy is taking the lead:2

"We are living through one of those rare, perhaps once-in-a-century events... The advent of the transistor and the integrated circuit and, as a consequence, the emergence of modern computer, telecommunication and satellite technologies have changed fundamentally the structure of the American economy."

Yet many economists continue to demur from this view, and there has been no lack of skepticism regarding the potential of the new information and communications technologies to deliver a sustained surge of productivity growth. According to Alan Blinder and Richard Quandt (1997: pp. 14-15), even if the information technology revolution has the potential to significantly raise the rate of growth of total factor productivity (TFP) in the long run, the long run is uncomfortably vague as a time-scale in matters of macroeconomic management. Instead, in their view, "we may be condemned to an extended period of transition in which the growing pains change in nature, but don't go away." Some diminution of skepticism of this variety has accompanied the quickening of labor productivity growth in the U.S. since 1997, and especially the very recent return of the rate of increase in real GDP per manhour to the neighborhood of 2 percent per annum.

1 Robert M. Solow, "We'd Better Watch Out," New York Review of Books, July 12, 1987, p. 36.
Among academic economists the consensus of optimistic opinion now holds to a wait-and-see attitude, on the argument that it remains premature to try reading structural causes into what may well be transient, or cyclical, movements that, in any case, have yet to materially reverse the profound "slowdown" in the economy's productivity growth trend since the mid-1970's. The long-run perspective on U.S. productivity performance provided by Abramovitz and David (1999) shows a refined measure of the TFP growth rate (adjusting for composition-related quality changes in labor and capital inputs) having been maintained in the near neighborhood of 1.4 percent per annum throughout the era from 1890 to 1966.3 From its 1.45 percent level over the 1929-1966 trend interval, the average annual growth rate then plummeted to 0.04 percent during 1966-1989. The "slowdown" was so pronounced that it brought the TFP growth rate all the way back down to the very low historical levels indicated by Abramovitz and David's (1999) statistical reconstructions of the mid-nineteenth century American economy's performance. More worrisome still, the post-1966 retardation stretched out and further intensified until the very end of the 1990's. Estimates of real gross output and inputs have become available (from the Bureau of Labor Statistics USDL News Release 98-187, May 6, 1998) that enable us to follow the path of measured productivity gains in the U.S. economy well into the 1990's. The figures relating to the private non-farm business economy are generally regarded as providing a more accurate picture of recent movements, because the deflation of the current value of output has been carried out by using price indexes that re-weight the component goods' and services' prices in accord with the changing composition of the aggregate.4 These "chain-weighted" output measures lead to productivity growth estimates that reveal two notable points about the "slowdown."
2 Testimony of Alan Greenspan before the U.S. House of Representatives Committee on Banking and Financial Services, July 23, 1996.

3 See Abramovitz and David (1999), esp. Part One, Table 1:IV. The estimates for the end-points of the indicated intervals are averages over 1888-1892 for "1890," and over 1965-1967 for "1966." The input-quality adjusted TFP growth rate for 1929-1966 was 1.43 percent per annum, only slightly above the 1890-1929 rate.

4 Moving from the private domestic economy to the private business economy concept also eliminates the distortions in the picture of productivity that arise from the inclusion of the imputed gross rents on the stock of dwellings in the real output series, and in the estimated flow of capital input services.

The first point is that the productivity growth rate's deviation below the trend that had prevailed during the 1950-1972 "golden age" of post-World War II growth became even more pronounced during the late 1980's and early 1990's, instead of becoming less marked as the oil shock and inflationary disturbances of the 1970's and the recession of the early 1980's passed into history. Measured labor productivity rose during 1988-1996 at only 0.83 percent per annum, 0.5 percentage points less rapidly than the average pace maintained during 1972-1988, and thus fully 2.25 percentage points below the average pace during 1950-1972. Correspondingly, the BLS's TFP growth rate estimate for 1988-1996 sank to 0.11 percent per annum, which represented a further drop of 0.24 percentage points from the 1972-1988 pace, and left it nearly a full 2 percentage points below the pace of TFP advance (1.87 percent per annum) that had been achieved during the post-World War II "golden age" of growth. This having been said, it is worth remarking that the conjuncture of high rates of innovation and slow measured growth of total factor productivity is not a wholly new, anomalous phenomenon in the history of U.S. economic growth.
Indeed, most of the labor productivity growth during the interval extending from the 1830's through the 1880's was accounted for by the increasing capital-labor input ratio, leaving residual rates of total factor productivity growth that were quite small by the standards of the early twentieth century and, a fortiori, by those of the post-World War II era. During the nineteenth century the emergence of technological changes that were biased strongly in the tangible-capital-using direction, involving the substitution of new forms of productive plant and equipment carrying heavy fixed costs and commensurately expanded scales of production, induced a high rate of capital accumulation. The capital-output ratio rose without forcing down the rate of return very markedly, and the substitution of increasing volumes of the services of reproducible tangible capital for those of other inputs, dispensing increasingly with the sweat and the old craft skills of workers in the fields and workshops, along with the brute force of horses and mules, worked to increase real output per manhour.5 Seen in longer historical perspective, therefore, the recent developments hardly appear unprecedented and paradoxical. It could be maintained that there is little that is really novel or surprising in the way in which the rise of computer capital, and OCAM (office, computing and accounting machinery) capital more generally, has been contributing to economic growth in the closing quarter of the twentieth century, except for the fact that this particular category of capital equipment only recently has begun to bulk large in the economy's total stock of reproducible capital.
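The growth-accounting logic behind these statements can be sketched numerically. In the standard decomposition, labor productivity growth equals the TFP residual plus capital's income share times the growth of the capital-labor ratio. The figures below are purely hypothetical, chosen only to show how a large capital-deepening contribution can leave a small residual of the nineteenth-century kind:

```python
# Standard growth-accounting decomposition (hypothetical numbers):
#   g_LP = g_TFP + s_K * g_KL
# where g_LP = growth of output per manhour (% per annum),
#       s_K  = capital's share of income,
#       g_KL = growth of the capital-labor input ratio (% per annum).

def tfp_residual(g_lp, s_k, g_kl):
    """TFP growth left over after capital deepening is accounted for."""
    return g_lp - s_k * g_kl

# Hypothetical nineteenth-century-style episode: labor productivity grows
# 1.0% per annum, capital's share is 0.35, capital-labor ratio grows 2.0%.
residual = tfp_residual(g_lp=1.0, s_k=0.35, g_kl=2.0)
print(round(residual, 2))  # 0.3 -- capital deepening does most of the work
```

With these (assumed) numbers, capital deepening contributes 0.7 of the 1.0 point of labor productivity growth, leaving a residual of only 0.3, which is the pattern the text attributes to the 1830's-1880's.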
Indeed, Daniel Sichel (1997) recently proposed a "resolution" of the productivity paradox in just such terms, arguing that the imputed gross earnings on hardware and software stocks amount to such a small fraction of GDP that the rapid growth of real computer assets per se can hardly be expected to be making a very significant contribution to the real GDP growth rate.6 But, however valid an observation that might be, it fails to satisfactorily dispel the surprise and mystery surrounding the collapse of the TFP growth rate.7

5 See Abramovitz and David 1973, David 1977, Abramovitz 1989. Thus, most of the labor productivity growth rate was accounted for by the increasing capital-labor input ratio, leaving residual rates of total factor productivity growth that were quite small by the standards of the early twentieth century and, a fortiori, by those of the post-World War II era. See Abramovitz and David (1997, 1999) for further discussion.

6 See Sichel (1997), esp. Ch. 4, Table 4-2. The 1987-1993 growth rates of inputs of computer hardware and software (allowing for quality improvements) are put at approximately 17 and 15 percent per annum, respectively, by Sichel; with the gross returns to these inputs estimated at 0.9 and 0.7 percentage points on the assumption that the assets earn a "normal" net rate of return, the combined contribution to the annual real output growth is about 0.26 percentage points.

7 The growth accounting calculations, moreover, assume that investments embodying information technology earn only a normal private rate of return and do not yield significantly higher "social rates of return" due to externalities and other spill-over effects.
Were this really the case, it would reconstitute the productivity paradox in the form of the puzzle of why there was not a large positive gap between the social and the private rates of return on this new information technology and all of its networked applications.

Economists' reactions to questions concerning the anomalous slowdown of TFP growth and its perplexing conjuncture with the wave of ICT-embodying investments in the U.S. economy thus have continued typically to be couched in terms of one or another of the three following explanatory claims: 1) the productivity slowdown is an artifact of inadequate statistical measurement of the economy's true performance; or 2) there has been a vast overselling of the productivity-enhancing potential of investments in computers and related information equipment and software--due in part to misplaced technological enthusiasm and also to exaggeration of the relative scale of those capital expenditures; or 3) the promise of a profound impact upon productivity has not been mere "hype," but optimism on that score has to be tempered by acknowledging that the transition to the techno-economic regime in which that potential will be realized is likely to be a much more arduous, costly, and drawn-out affair than was initially supposed. It is only reasonable to ask whether what has been learned during the past decade enables us to evaluate this array of explanatory hypotheses, and so better understand their bearing upon the likely future productivity performance of the digital economy. Having persisted since 1989 in advancing the latter, "regime transition" interpretation of the so-called productivity paradox, and therefore holding to the "cautious optimist" position in regard to the computer revolution's potential economic impact, I should make it clear from the outset that I have yet to see evidence that persuades me to alter that stance.
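Sichel's small-share arithmetic (footnote 6 above) can be reproduced directly: an input's percentage-point contribution to output growth is its gross income share multiplied by its own growth rate. The shares used below (0.9 and 0.7 percent of output for hardware and software) are an illustrative reconstruction inferred from the quoted figures, not Sichel's published worksheet:

```python
# Growth-accounting contribution of an input: income share x growth rate.
# Growth rates per Sichel (1997) as quoted in the text; the share values
# (0.9% and 0.7% of output) are a reconstruction, not taken from the source.

def contribution(income_share, growth_rate_pct):
    """Percentage-point contribution to annual real output growth."""
    return income_share * growth_rate_pct

hardware = contribution(income_share=0.009, growth_rate_pct=17)  # ~0.15 pp
software = contribution(income_share=0.007, growth_rate_pct=15)  # ~0.11 pp
total = hardware + software
print(round(total, 2))  # 0.26 -- matching the combined figure cited
```

The point of the exercise is visible at a glance: even with real computer inputs growing at 15-17 percent per annum, income shares below one percent of output cap the direct contribution at about a quarter of a percentage point.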
My approach to understanding the implications of the emerging digital economy continues to rest upon the idea that we are in the midst of a complex, contingent and temporally extended process of transition to a new, information-intensive techno-economic regime, and that useful insights into the dynamics of processes such as the one currently under way may be obtained by examining analogous historical episodes involving the elaboration and diffusion of previous "general purpose technologies."8

1.1 The "regime transition" hypothesis and its significance

Just as the systematic economic exploitation of the electric dynamo beginning in the last quarter of the nineteenth century eventually came to displace and transcend the "steam engine age," the temporally extended process that is now well underway seems destined eventually to accomplish the abandonment or extensive transformation of many features and concomitants of the previously established technological regime identified with "Fordism." The latter had assumed full-blown form first in the U.S. during the second quarter of the present century, coincident with the final stages in the electrification of industry. It was the mature form of the Fordist regime that may be said to have underlain the prosperity and rapid growth of the post-World War II era--not only in the U.S. but in other industrial countries of Western Europe, and Japan, where its full elaboration had been delayed by the economic and political dislocations of the 1920's and 1930's, as well as by World War II itself. The eventual supplanting of an entrenched techno-economic regime involves profound changes whose revolutionary nature is revealed better by the eventual breadth and depth of the clusters of innovation that emerge than by the pace at which they achieve their influence.
Exactly because of the breadth and depth of the changes entailed, successful elaboration of a new general purpose technology requires the development and coordination of a vast array of complementary tangible and intangible elements: new physical plant and equipment, new kinds of workforce skills, new organizational forms, new forms of legal property, new regulatory frameworks, new habits of mind and patterns of taste.

8 On the concept of a "general purpose technology" (GPT) and its historical and current relevance, see the remarks immediately below, and section 5 for further discussion (with references to the literature).

For these changes to be set in place typically requires decades, rather than years. Moreover, while they are underway there is no guarantee that their dominant effects upon macroeconomic performance will be positive ones. The emergence of positive productivity effects is neither assured nor free from being overwhelmed by the deleterious consequences of devoting resources to the exploration of "blind alleys"--or, more formally described, technological and organizational trajectories that prove to be economically nonviable and are abandoned eventually.
The rise of a new techno-economic paradigm may have transient but nonetheless dislocating, backwash effects upon the performance of surviving elements of the previous economic order. It should not be so surprising, therefore, that the supplanting of the Fordist regime by one developed around digital information processing and its distribution via electronic and electro-optical networks has turned out to be an affair in which the disruptive potentialities of the novel technologies and new modes of business organization have been at least as plainly manifest as the improvements in productive efficiency; it might well have been anticipated from the outset that this transition would entail some diminution in the productivity of old assets, and much new investment being allocated to ventures that remain experimental and adaptive in nature--more akin to learning than to the implementation of chosen routines. In short, the "productivity paradox" may be reflecting real phenomena, whose nature is paradoxical only to those who suppose that the progress of technology is autonomous, continuous and, being "hitchless and glitchless," bound to yield immediate cost-savings and measurable economic welfare gains. Thus, those who are found along with me in the "cautious optimist" camp share the view that the persistence of the slow trend rates of TFP growth experienced in the U.S. economy during the past two decades is unlikely. Instead, it is suggested that, with appropriate attention to problems of coordinating technological and organizational change with labor force training, the future may well bring a strong resurgence of the measured total factor productivity residual that could be attributed reasonably to the exploitation of digital information technologies.
Although intent on divining the early harbingers of a more widespread recovery in productivity growth, they acknowledge that such a renaissance is not guaranteed by any automatic market mechanism, and maintain that it is foolish to adopt a passive public policy stance and simply await its arrival. The development and exploitation of digital information, like previous profound historical transformations based upon new "general purpose engines," turns out to entail a complicated techno-economic regime transition whose success is contingent upon the coordination and completion of many complementary changes in methods of production, work modes, business organization, and supporting institutional infrastructures. Transformations of that sort, however, involve not only the obsolescence of skills, capital assets and business models; typically they are marked also by an accelerated rate of appearance of new goods and products. For a time, then, the latter developments are of a sort that will seriously challenge the ability of inherited statistical indicators to track and measure the performance of an economy which is undergoing significant and unprecedented structural changes. Thus, endogenous measurement biases may well be expected to add to the underlying "real" developments that tend to drag down the observed pace of productivity improvement. Furthermore, it should be understandable enough that observers whose attention becomes focused at an early stage of such an extended process upon its most salient, dynamic features may fail to appreciate how limited is the actual extent of the progress made toward fulfillment of the new technology's promise.
In reaction to the disappointment of excessively optimistic expectations, or of initial mis-perceptions by business enthusiasts about the direction of product and process innovation that will have the widest ultimate scope for application and greatest impact upon measured productivity, charges of "hype" are likely to be accompanied by dismissal of the new technology as merely a snare and delusion. In other words, the emergence of a disposition among some observers to embrace an unwarrantedly cynical and pessimistic stance about its long-run impact may be regarded as the other side of the tendency towards "technological presbyopia," in which a bright distant future state of the world is envisaged clearly while enthusiasm blurs and dims vision of the likely obstacles, blind alleys and pitfalls that bestrew the path immediately ahead.9 Consequently, in considering the three styles of explanatory response to the "productivity paradox" that are identified above, it seems to have been rather misleading for economists to have approached these as though they were necessarily independent, mutually incompatible and hence competing explanatory hypotheses. Rather than striving to build a case for according one greater favor than the others, it seems we shall come closer to the truth of the matter by recognizing that there is empirical support--from historical as well as contemporary evidence--for treating each of them as a significant facet of the larger phenomenon with which we must be concerned. What I endeavor to do in the following pages is, first, to examine (in section 2) some of the evidence relating to the more straightforward measurement problems that have been indicted as contributory factors in the slowdown of measured TFP growth, and to point out the respects in which some of these are not independent and coincidental factors but, instead, should be seen to be sequelae of the ICT revolution itself.
The same theme is pursued at a deeper conceptual level in section 3, by briefly considering the implications of the limited way in which a national income accounting system devised to deal with ordinary goods and services is able to cope with the shift towards integrating such commodities with the services of information. These considerations suggest the possibility that the past two decades have been marked by a more pronounced bias towards the underestimation of the growth of aggregate real output and, consequently, of measured productivity. In section 4 the discussion takes up some of the technological realities that justly can be said to underlie disappointments with the impact of computers upon the more readily measurable forms of task-productivity. The argument here is that the historical course of the development of the personal computer as a general-purpose machine has not been conducive to enhancing productivity of the sort that can be gauged by conventional measurement approaches. Section 5 returns to the regime transition hypothesis and indicates the ways in which historical experience, particularly that of the diffusion of the electric dynamo, may justifiably be used as a source of insights into the dynamics of the digital economy and its productivity performance. Section 6 concludes by looking to the future from the vantage point afforded us by an understanding of the past.

2. Measurement Problems

Those who would contend that the slowdown puzzle and computer productivity paradox are first and foremost consequences of a mismeasurement problem must produce a consistent account of the timing and magnitude of the suspected errors in measurement. The estimation of productivity growth requires a consistent method for estimating growth rates in the quantities of inputs and outputs.
With a few notable exceptions (e.g., electricity generation), the lack of homogeneity in industry output frustrates direct measures of physical output and makes it necessary to estimate physical output using a price deflator. Similar challenges arise, of course, in measuring the heterogeneous bundles of labor and capital services, but attention is being directed mainly to the problems that are suspected to persist in the measures of real product growth. Systematic overstatement of price increases will introduce a persistent downward bias in estimated output growth and, therefore, an understatement of both partial and total factor productivity improvements. Such overstatement can arise in several distinct ways. There are some industries, especially services, for which the concept of a unit of output itself is not well defined and, consequently, where it is difficult if not impossible to obtain meaningful price indices. In other cases, such as the construction industry, the output is so heterogeneous that it requires special efforts to obtain price quotations for comparable "products" both at one point in time and over time. The introduction of new commodities again raises the problem of comparability in forming the price deflators for an industry whose output mix is changing radically, and the techniques that statistical agencies have adopted to cope with the temporal replacement of old staples by new items in the consumer's shopping basket have been found to introduce systematic biases.

9 See David (1991) for further discussion of the condition I have labeled "technological presbyopia" (sharp vision of the distant technological future, coupled with inability to clearly see the nearer portion of the transition path toward that state), particularly the illustrative case of early twentieth-century electrical engineers' views on factory electrification.
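The mechanics of over-deflation can be made concrete. Real output growth is, to a first approximation, nominal growth minus measured inflation, so any overstatement of inflation passes point-for-point into an understatement of real growth and hence of the productivity residual. The numbers below are hypothetical:

```python
# Over-deflation in miniature (all figures hypothetical, in % per annum).
nominal_growth = 5.0       # growth of output in current prices
true_inflation = 2.0       # actual rate of price increase
measured_inflation = 3.1   # official deflator, overstated by 1.1 points

true_real_growth = nominal_growth - true_inflation          # 3.0
measured_real_growth = nominal_growth - measured_inflation  # 1.9

# The understatement of real growth equals the deflator bias:
bias = true_real_growth - measured_real_growth
print(round(bias, 1))  # 1.1 -- the overstatement passes through point for point
```

Since labor inputs are measured independently of the deflator, this shortfall in measured real output shows up one-for-one in measured labor productivity and TFP growth.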
These are only the simpler and more straightforward potential worries about mismeasurement but, before tackling less tractable conceptual questions, we should briefly review their bearing on the puzzle of the slowdown and the computer productivity paradox.

2.1 Over-deflation of output: will this account for the productivity slowdown?

That there is a tendency for official price indexes to overstate the true rate of inflation (understate the pace of price declines) is a point on which there seems to be broad agreement among economists. The magnitude of the bias, however, is another question. The Report of the Advisory Commission to Study the Consumer Price Index (the so-called Boskin Commission Report, 1997) concluded that the statistical procedures used by the BLS in preparing the CPI resulted in an average overstatement of the annual rate of increase in "the real cost of living" amounting to 1.1 percentage points. This might well be twice the magnitude of the error introduced by mismeasurement of the price deflators applied in estimating the real gross output of the private domestic economy over the period 1966-89.10 Were we to allow for this by making an upward correction of the real output growth rate by as much as 0.6 to 1.1 percentage points, the level of the Abramovitz-David (1999) estimates for the TFP growth rate during 1966-89 would be pushed back up into essentially that (0.64-1.14 percent per annum) range. Even so, that correction--which entertains the extremely dubious assumption that the conjectured measurement biases in the output price deflators existed only after 1966 and not before--would still have us believe that between 1929-66 and 1966-89 there had been a very appreciable slowdown in multifactor productivity growth.11 Moreover, there is nothing in the findings of the Boskin Commission (1997) to indicate that the causes of the putative current upward bias in the price changes registered by the CPI have been operating only since the end of the 1960's.12 Thus, the simplest formulation of a mismeasurement explanation for the productivity slowdown falls quantitatively short of the mark. This does not mean that there presently is no underestimation of the labor productivity or TFP growth rates. Perhaps the paradox of the conjunction of unprecedentedly sluggish productivity growth with an explosive pace of technological innovation can be resolved in those terms, without, however, accounting for the slowdown itself. Plainly, what is needed to give the mismeasurement thesis greater bearing on the latter puzzle, and thereby help us to resolve the information technology paradox, would be some quantitative evidence that the suspected upward bias in the aggregate output deflators has been getting proportionally larger over time. Martin Bailey and Robert Gordon (1988) initially looked into this and came away without any conclusive answer; subsequently, Gordon (1996, 1998) has moved towards a somewhat less skeptically dismissive position on the slowdown being attributable to the worsening of price index mismeasurement errors. But some further efforts at quantification are in order before that possibility is dismissed.

2.2 Has the relative growth of "hard-to-measure" activities enlarged the bias?

A reasonable point of departure is the question of whether structural changes in the U.S. economy have exacerbated the problem of output underestimation, and thereby contributed to the appearance of a productivity slowdown. In this connection Zvi Griliches' (1994) observation that there has been relative growth of output and employment in the "hard-to-measure" sectors of the economy is immediately pertinent. The bloc of the U.S. private domestic economy comprising Construction; Trade; Finance, Insurance, and Real Estate (FIRE); and miscellaneous other services has indeed been growing in relative importance, and this trend has been especially pronounced in recent decades.13

10 The Boskin Commission (1997) was charged with examining the CPI, rather than the GNP and GDP deflators prepared by the U.S. Commerce Department Bureau of Economic Analysis, and some substantial part (perhaps 0.7 percentage points) of the estimated upward bias of the former is ascribed to the use of a fixed-weight scheme for aggregating the constituent price series to create the overall (Laspeyres type) price index. This criticism does not apply in the case of the national product deflators, and consequently the 1.1 percent per annum figure could be regarded as a generous allowance for the biases due to other, technical problems affecting those deflators.

11 On that calculation, the post-1966 reduction in the average annual TFP growth rate would amount to something between 0.3 and 0.8 percentage points, rather than the 1.4 percentage point slowdown discussed above, in section 1.

12 The Boskin Commission's findings have met with some criticism from BLS staff, who have pointed out that the published CPI reflects corrections that already are made regularly to counteract some of the most prominent among the suspected procedural sources of overstatement--the methods of "splicing in" price series for new goods and services. It is claimed that on this account, the magnitude of the continuing upward bias projected by the Boskin Commission may well be too large (see Madrick, 1997). The latter controversy does not affect the present illustrative use of the 1.1 percentage point per annum estimate, because it is being applied as a correction of the past trend in measured real output; furthermore, in the nature of the argument, an upper-bound figure for the measurement bias is what is wanted here.
There is certainly a gap in the manhours productivity growth rates favoring the better-measured, commodity-producing sectors, but the impact of the economy's structural drift towards "unmeasurability" is not big enough to account for the appearance of a productivity slowdown between the pre- and post-1969 periods. The simple re-weighting of the trend growth rates lowers the aggregate labor productivity growth rate by 0.13 percentage points between 1947-1969 and 1969-1990, but that represents less than 12 percent of the actual slowdown that Griliches was seeking to explain.[14] A somewhat different illustrative calculation supporting the same conclusion has been carried out by Abramovitz and David (1999). For that purpose, they make the following extreme assumptions: (i) that an upward bias of 1.6 percent per annum was present in the price deflator for the U.S. gross private domestic product; (ii) that it arose entirely from deficiencies in the price deflators used to derive real gross product originating within the group of hard-to-measure sectors identified by Griliches; and (iii) that this condition prevailed since the early post-World War II era in the case of the hard-to-measure sector, whereas prices and real output growth were measured properly for the rest of the economy. Taking account of the increasing relative weight of the hard-to-measure sector in the value of current gross product for the private domestic economy, the implied measurement bias for the whole economy--under the conditions assumed--must have become more pronounced between the period 1948-66 and 1966-89. But, once again, the effect is found to be quantitatively too small: only 12 percent of the slowdown in the labor productivity growth rate that actually was observed could be accounted for in this way. Moreover, as the assumptions underlying this illustrative calculation are extreme, the implication is that even the comparatively minor mismeasurement effect found on this score represents an upper-bound estimate. It seems we must look elsewhere.

[13] Gross product originating in Griliches' "hard-to-measure" bloc averaged 49.6 percent of the total over the years 1947-1969, but its average share was 59.7 percent in the years 1969-1990. See Griliches (1995: Table 2) for the underlying NIPA figures from which the shares in the total private (non-government) product were obtained. These averages were calculated as geometric means of the terminal year values in each of the two intervals. Given the observed trend difference (over the whole period 1947-1990) between the labor productivity growth rates of the "hard-to-measure" and the "measurable" sectors identified by Griliches (1994, 1995), the hypothetical effect on the weighted average rate of labor productivity growth of shifting the output shares can be calculated readily.

[14] The gap between the measured and the hard-to-measure sector's long-term average rates of growth of real output per manhour amounted to about 1.40 percentage points per annum, but it was smaller than that during the 1947-1969 period and widened thereafter. The more pronounced post-1969 retardation of the average labor productivity growth rate found for the hard-to-measure sector as a whole was thus responsible, in a statistical sense, for a large part of the retarded growth of aggregate labor productivity. But it would be quite misleading to suggest that every branch of activity within the major sectors labeled "hard to measure" by Griliches participated in the slowdown, while the industries comprising the "measured" sectors did not; Gordon (1998a) presents more finely disaggregated data on labor productivity, which reveal the pervasiveness of the slowdown.
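The shift-share re-weighting just described can be reproduced in a few lines. The sector shares (49.6 percent rising to 59.7 percent) and the 1.40-point growth-rate gap are taken from footnotes 13 and 14; the absolute growth-rate levels are hypothetical, chosen only to embody that gap.

```python
# Sketch of the shift-share re-weighting of sectoral labor productivity
# growth. Shares and the 1.40 pp gap come from the text; the level of the
# measurable sector's growth rate (2.0 %/yr) is a hypothetical placeholder.

share_hard_4769 = 0.496        # hard-to-measure share of gross product, 1947-1969
share_hard_6990 = 0.597        # same share, 1969-1990
g_measurable = 2.0             # %/yr growth, measurable sectors (hypothetical level)
g_hard = g_measurable - 1.40   # the 1.40 pp gap favoring the measurable sectors

def aggregate(share_hard):
    """Weighted-average labor productivity growth for given sector weights."""
    return share_hard * g_hard + (1 - share_hard) * g_measurable

drag = aggregate(share_hard_6990) - aggregate(share_hard_4769)
print(f"re-weighting alone changes aggregate growth by {drag:.2f} pp")
# About -0.14 pp: close to the 0.13-point figure in the text, and far short
# of the full slowdown Griliches sought to explain.
```

Because only the difference in growth rates enters the calculation, the hypothetical level chosen for the measurable sector does not affect the result.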
2.3 The role of new goods in unmeasured quality change

The literature devoted to the thesis that real output and productivity growth are being systematically mismeasured has hitherto directed insufficient attention to the possibility that there has been a growing bias due to underestimation of the output quality improvements associated with new goods and services. The problem arises from the practice of waiting to "chain in" new products until they have acquired a substantial share of the market for the class of commodities for which they can be regarded as substitutes. During that waiting period, however, it is usually the case that the absolute and relative rate of decline in the new product's price is much more rapid than is subsequently the case. The aggregate price index therefore understates the true rate of price decline. The difficulties created for price index statisticians by the turnover of the commodity basket due to the introduction of new goods (before the old staples disappear) are quite ubiquitous across industries, and there is some basis for believing that during the past two decades these may well have become more pronounced in their effect on the accuracy of the official price deflators. This line of speculation is attractive to explore because the mechanism that is hypothesized as the cause of an enlarged understatement of the productivity growth rate--namely, the higher rate of appearance of new goods in the basket of commodities available to consumers--is one that also can be linked to the effects of the emerging information revolution.
In that way it might turn out that the technological regime shift itself has been contributing to the appearance of a slowdown in measured productivity, and hence to the creation of its own paradoxically weak impact upon macroeconomic growth. New information technologies and improved access to marketing data are indeed enabling faster, less costly product innovation, manufacturing process redesign, and shorter product life cycles. This has been a central theme in the business and economics literature on "modern manufacturing" at least since the 1980's.[15] The increasing proliferation of new goods and its connection with the application of computers, electronic networks and other new technologies has been identified as "forging a whole new paradigm that makes possible the delivery of custom-designed products to the masses--at ever lower prices"--a phenomenon for which the accepted descriptive phrase is "mass customization."[16] Leaving aside wholly new types of goods (e.g., personal computer models, which currently number over 400, or computer software titles, the count of which is in the neighborhood of a quarter of a million), the multiplication of the number of models available for consumers to choose among within preexisting product classes is a striking manifestation of this phenomenon. In the U.S. market between the early 1970's and the late 1990's, the number of automobile vehicle models increased from 140 to 260, and sport utility vehicle (SUV) models increased from 8 to 38; the styles of running shoes rose from 5 to 285, outstripping the rate of expansion of choice among breakfast cereal products (160 to 340), but a proportionately less impressive growth than that in the available number of types of contact lenses (from 1 to 36).[17]

[15] See, e.g., Milgrom and Roberts (1990, 1992), Milgrom, Qian and Roberts (1991).

[16] The phrase quoted is from Federal Reserve Bank of Dallas, 1998, p. 7.
See also Ruffin and Cox (1998); Schonfeld (1998). On the emergence of "mass customization" see Pine (1993).

[17] Federal Reserve Bank of Dallas (1998), Exhibit 1, p. 4.

While the absolute increase in the sheer variety of goods is staggering, that is not quite relevant to the issue at hand. Just how much welfare gain is attributable to the availability of each nominally "new" product is difficult to establish, and must vary widely, but there is some basis for suspecting that as the number of novel brands and styles has multiplied, the average value of the quality gain has been reduced. Beyond that consideration, what matters is whether the share of aggregate output (consumption) represented by newly introduced products has risen above its historical levels; in other words, whether the rate of turnover of the economy's output mix has increased.[18] Diewert and Fox (1997) present some evidence from Nakamura (1997) on the fourfold acceleration of the rate of introduction of new products in U.S. supermarkets during the period 1975-92, compared with the preceding period, 1964-75. By combining this with data from Bailey and Gordon (1988) on the rising number of products stocked by the average U.S. grocery supermarket, it is possible roughly to gauge the movement in the ratio between these flow- and stock-measures, thereby gauging the direction and magnitude of changes in the relative importance of new products (and the mean turnover rate). What this reveals is that a marked rise occurred in the new-product fraction of the stock between 1975 and 1992, in contrast with the essential stability of that ratio between the mid-1960's and the mid-1970's. If only half of new products were stocked by the average supermarket, the share they represented in the stock as a whole would have risen from about 0.09 to 0.46.
This fivefold rise in the relative number of new products in the total certainly is big enough to create the potential for a substantial growth in the relative downward bias in the measured real output growth rate, as a result of the standard delay in linking the prices of new goods to old ones. There is a further implication, which runs in the same direction. The broadening of the product line by competitors may be likened to a common-pool/over-fishing problem, causing crowding of the product space,[19] with the result that even the reduced fixed costs of research and product development must be spread over fewer units of sales. Moreover, to the extent that congestion in the product space raises the expected failure rate in new product launches, this reinforces the implication that initial margins are likely to be high when these products first appear, but will be found to have fallen rapidly in the cases of the fortunate few that succeed in becoming a standard item. Such a change would make the practice of delayed chaining-in of new products even more problematic than was previously the case, thereby contributing to enlarge the underestimation bias in measured output and productivity growth in a manner quite independent of the rising rate of product turnover.

The mechanism of product proliferation involves innovations in both marketing and the utilization of the distribution networks. Although, in the U.S., the mass-market distribution system was established early in this century, utilizing it for product and brand proliferation was frustrated by the high costs of tracking and appropriately distributing (and redistributing) inventory. Traditionally, new product introduction involved the high fixed costs of major marketing campaigns, and thereby required high unit sales. More recently, however, these costs have been lowered by the application of information and communication technologies, and by the adoption of marketing strategies in which the existing mass-market distribution system is being configured under umbrellas of "brand name" recognition for particular classes of products (e.g., designer labels, "lifestyle" brands, and products related to films or other cultural icons), or for high-reputation retailers and service providers (e.g., prominent department store brands, or financial services provided by large retail banks). The latter developments have been part and parcel of the proliferation of within-brand variety in "styles" that has characterized the rise of mass customization.

[18] Jack Triplett (1999), p. 14, correctly points out that "a greater number of new things is not necessarily a greater rate of new things," and notes that if the relative number of new products is to grow by additions to the product line, then the total number of products must grow faster and faster. He then dismisses the hypothesis of increasing underestimation of the contribution to productivity growth due to new product introductions, on the grounds that the actual rate of growth of the number of new products in U.S. supermarkets during 1972-1994 was substantially slower than was the case during 1948-1972. But the latter statistic is not relevant, because the relative number of new products also can be raised by replacing old products with new ones, shortening the mean product life (the inverse of the turnover rate). From the turnover rate figures discussed in the text, it is apparent that the assumption (in Triplett's illustrative calculations) of an infinite product life is quite inappropriate.

[19] During the 1970's, the Federal Trade Commission was actively interested in whether such product proliferation was a new form of anti-competitive behavior, and investigated the ready-to-eat breakfast cereal industry; see Schmalensee (1978).
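The relationship between turnover rates and product lives invoked in footnote 18 can be made concrete. If a fraction s of the products on the shelf in a given year are new introductions, then in a rough steady state the mean product life is about 1/s years; the steady-state approximation is an assumption made here for illustration, applied to the new-product stock shares (about 0.09 and 0.46) cited in the text.

```python
# Mean product life as the inverse of the turnover rate, using the
# new-product shares of the supermarket stock quoted in the text. The
# steady-state approximation (life ~ 1/turnover) is an illustrative
# assumption, not a claim from the source.

for year, new_share in [(1975, 0.09), (1992, 0.46)]:
    mean_life_years = 1 / new_share  # steady-state approximation
    print(f"{year}: new-product share {new_share:.2f} "
          f"-> mean product life ~{mean_life_years:.1f} years")
# 1975: roughly 11 years; 1992: roughly 2 years. A finite and shrinking
# product life, contrary to the infinite-life assumption in Triplett's
# illustrative calculation.
```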
It should not be surprising that the accuracy of a statistical system designed to record productivity in mass production and distribution should be challenged when the "business models" of the system change as the result of marketing innovation and the use of information and communication technologies.

Of course, some progress has been made in resolving the computer productivity paradox by virtue of the introduction of so-called "hedonic" price indexes for the output of the computer and electronic business equipment industries themselves. These indexes reflect the spectacularly rapid decline in the price-performance ratios of such forms of capital. Thus, the "hedonic" correction of computer and related equipment prices has done wonders as a boost to the growth rate of output and multifactor productivity in the producing industry; and via that effect, it has contributed increasingly to the revival of the manufacturing sector's productivity--simply as a result of the growing weight carried by that branch of industry in the sector as a whole.[20] By the same token, the hedonic deflation of investment expenditures on computer equipment contributes to raising the measured growth of computer capital services, which are used intensively as inputs in a number of sectors, including banking, financial services and wholesale trade within the service sector. The implied rise in computer-capital intensity, and therefore in overall tangible capital-intensity, supports the growth rate of labor productivity in those sectors. But, in itself, the substitution of this rapidly rising input for others does nothing to lift the economy's measured growth rate of TFP.

3. Conceptual Challenges: What Are We Supposed to Be Measuring?

Beyond the technical problems of the way that the national income accountants are coping with accelerating product innovation and quality change lie several deeper conceptual issues.
These have always been with us, in a sense, but the nature of the changes in the organization and conduct of production activities, and particularly the heightened role of information--and of changes in the information state--in modern economic life, may be bringing these problematic questions to the surface in a way that forces reconsideration of what our measures are intended to measure, and how they actually relate to those goals. Here space limitations allow only brief notice of two related issues of this kind.

3.1 Micro-level evidence on payoffs from IT investment--the "excess" returns puzzle

The first involves the surprising appearance of "excess" rates of return on computer capital. These appeared when economists sought to illuminate the macro-level puzzle through statistical studies of the impact of IT at the microeconomic level, using observations on individual enterprise performance.[21] This phenomenon points to the conceptual gap between task productivity measures, on the one hand, and profitability and revenue productivity measurements, on the other.

[20] The difference between the measured TFP performance of the "computer-producing" and the "computer-using" sectors of the economy, which emerges starkly from the growth accounting studies by Stiroh (1998), may be in some part an artifact of the distorting influence of the Bureau of Economic Analysis' use of hedonic price deflators just for the output of the industry producing computer equipment. See, e.g., Wykoff (1995) for an evaluation of other dimensions of the distortions this has created in comparisons of productivity performance.
The former are closer in spirit to the attempt to measure the productive efficiency of the economy by calculating TFP as the ratio of aggregate real output to the aggregate inputs of labor and capital services; whereas, in undertaking comparisons among organizational departments and firms engaged in quite different production activities, the micro-level performance measure shifts away from any physical, engineering notion of productivity and towards dimensions (revenue units per unit of real input cost) in which their outputs can be rendered commensurable. It will be seen that not only is there an important difference, but the relationship between the two conceptual measures may itself be undergoing a transformation, as a result of the way IT is being applied in businesses. The contrast between the strong (cross-section) revenue productivity impacts of observed computer investments, and the weaker (time series) effects gauged in terms of task productivity, might indicate simply that very high gross private rates of return are associated with such capital expenditures. In view of the rapid anticipated depreciation of capital value due to the high rate (circa 20 percent per annum) at which the price-performance ratio of new computer equipment has been falling, these seemingly "excess" private returns would be called for to equalize net private rates of return on the various assets held by the company. It also is the case that subsequent investigations along the same lines have found that there were additional intangible investments that were correlatives of high information technology-intensity. Much of the evidence for this is reasonably direct, being indicated by the presence of workers with high formal educational attainments and skill qualifications, company-run on-the-job training programs, and programs of company reorganization linked with computerization and worker retraining.
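The equalization argument above can be illustrated with a toy calculation. The circa 20 percent per annum fall in computer price-performance is the rate cited in the text; the normal net return and the physical depreciation rate used below are hypothetical figures chosen only for illustration.

```python
# Toy illustration of why seemingly "excess" gross returns on computer
# capital need not imply excess net returns. The 10% normal net return and
# 5% physical depreciation are hypothetical; the ~20%/yr fall in computer
# price-performance is the rate cited in the text.

normal_net_return = 0.10      # net return on other assets (hypothetical)
physical_depreciation = 0.05  # wear and tear (hypothetical)
price_decline = 0.20          # annual fall in computer price-performance ratio

# To leave the owner with the normal net return, the gross return on
# computer capital must also cover depreciation and the capital loss
# from falling equipment prices.
required_gross_return = normal_net_return + physical_depreciation + price_decline
print(f"required gross return on computer capital: {required_gross_return:.0%}")
# A gross return of 35% here merely equalizes net returns across assets.
```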
Taking those into account statistically leads to the substantial elimination of the apparent "excess" of the estimated returns on IT capital vis-à-vis the returns on capital of other kinds.[22] But there also is some indirect support, from the relationship between the reproduction value of company tangibles and the market valuation of computer-intensive firms, for concluding that the diffusion of information technologies among large business firms has entailed substantial levels of intangible asset formation.[23] The latter, of course, is reckoned neither on the output side (among the firms' revenue-generating products), nor are the service flows from those intangibles measured among the inputs in production function studies and growth-accounting exercises. The broader significance of this situation, which is becoming increasingly widespread as digital information technologies diffuse throughout the economy, deserves further consideration.

3.2 Leaving out investments in organizational change: the narrow scope of the NIPA

How should the investments made by organizations and individuals in learning to utilize a new technology be treated for national income accounting purposes? The factor payment side of the official National Income and Product Accounts (NIPA) includes the expenditures that this may entail--for labor time and the use of facilities--but the intangible assets formed in the process do not appear on the output side, among the final goods and services produced. This definition of the scope of GNP and GDP is not problematic so long as the relationship between marketed output and non-market investments in learning remains more or less unchanged. But that has not been the case. A major technological discontinuity, involving the advent of a new general purpose technology, is likely to induce more than the usual relative level of incremental learning activity; and the advent of digital information processing technologies in particular, having stimulated the creation of new software assets within the learning organizations, has been marked by a relative rise in the production of intangible assets that have gone unrecorded in the national income and product accounts. This carries a potential for the conventional statistical indicators to seriously distort our macroeconomic picture of what is being produced, and of the way that resources are being used. The problem of non-market production of intangibles in the form of computer software was relatively more serious in the mainframe era than it has subsequently become, but the same would not appear to be true of intangible investments in the retraining of workers and the reorganization of business operations that, as has been noted, are generally required if firms are to exploit effectively the enhanced capabilities of new information technologies. Thus, the narrow scope of conventional output measures may persist for some time in failing to register the relative rise of this form of asset production, and so contribute a downward drag to the measured productivity growth rate.

[21] See Brynolfsson and Hitt (1995, 1996), Lichtenberg (1995), Lehr and Lichtenberg (1998). The early studies used cross-section observations from samples of large corporations.

[22] See, e.g., Brynolfsson and Hitt (1997, 1998), Bresnahan, Brynolfsson and Hitt (1999a, 1999b). Greenan and Mairesse (1997), in a pioneering study of French manufacturing firms, found that controlling for the quality of the workforce eliminated the appearance of statistically significant "excess" returns in establishments that were making use of IT equipment.

[23] Brynolfsson and Yang (1997, revised 1999) report that computer usage is associated with very high calculated values of Tobin's "q", which they presume reflects the presence of large accumulations of intangible assets.
4. Troubles with Computers: Effects of General Purpose Machines on Task Productivity

Laying the whole burden of explanation on the notion that existing concepts and methods are inadequate for accounting for the effects of the computer revolution is, however, not satisfactory. Even if a large share of these effects vanish into territory inadequately mapped by existing statistical measurement approaches, it is puzzling why more conventional indices of productivity in branches of industry that previously were not regarded as "unmeasurable" have not been affected more positively by the advent of new information technologies. Here, we believe, there is a case to be made that the customary link between innovation in the development of technological artifacts and improvements in productivity for the users of those tools has indeed frayed; that there are real problems in delivering on the productivity promises of the computer revolution.

4.1 Component performance and system performance

A common focus of attention in the computer revolution is the rapidity with which the performance of microelectronic components has been enhanced. The widespread acceptance of Moore's Law shapes user expectations and technological planning, not only in the integrated circuit industry, but in all of the information and communication technology industries. For software designers, Moore's Law promises that new computational resources will continue to grow, and encourages the development of products embodying more features so that the diverse needs of an ever-growing user community can be fulfilled. It need not follow, however, that any particular user will experience performance improvement as the result of component improvement.
As has been pointed out, even if the user adopts the new technology, the learning time in mastering new software, the greater number of choices that may need to be made to navigate a growing array of options, and the longer times required for more complex software to be executed will offset part or all of the gains from increasing component performance. It is now widely recognized that the costs of personal computer ownership to the business organization may be tenfold the size of the acquisition costs of the computer itself.[24] Many of these costs are unrelated to the performance of microprocessor components, and for many applications the use of personal computers is therefore relatively unaffected by microprocessor performance improvements. From a productivity measurement standpoint, the relatively constant unit costs imposed by personal computer ownership have been further compounded by the costs of the continuing spread of the technology throughout the organization. To be sure, employees are being given general purpose tools that may be, and often are, useful for devising new ways to perform their work. At the same time, however, it is apparent to most sophisticated users of computers that the extension of these capabilities also creates a vast new array of problems that must be solved to achieve desired aims. Most organizations believe that learning to solve these problems will eventually create a greater range of organizational and individual capabilities that will improve profitability.

[24] Some of these costs are recorded directly, while others are part of the learning investments being made by firms in formal and informal "on the job" knowledge acquisition about information technology.
In any case, it is now expected that a modern organization will provide reasonably sophisticated information technology as part of the office equipment to which every employee is entitled. From a business process or activity accounting viewpoint, however, the spread of personal information and communication technologies has enormously complicated the task of maintaining coherence and functionality within the organization. A task such as the creation of a business letter involves a considerable range of choices, and efforts to define an efficient means of carrying out this operation seldom will be confined to the individual who executes the task. Company formats and style sheets, equipment maintenance and troubleshooting, file server support, and standards for archiving and backup of electronic copies of documents all now enter into the task of producing a business letter. The existence of new capabilities suggests the potential for creating greater order and precision, whereas the reality of deploying these capabilities may substantially raise the unit costs of executing the letter-writing task. These observations are not intended as a call for a return to the "bad old days" of smudged typescripts and hand-addressed envelopes. The point, instead, is that most organizations have neither the capability nor the interest to perform detailed activity accounting with regard to the new business processes arising from the use of information and communication technologies. Without attention to these issues, it is not surprising that they may often follow a version of Parkinson's Law ("work expands to fill the time available for its completion"): the ancillary complications of preparing to perform a computer-assisted task may fill the time previously allotted for its completion. Surely this is not the average experience, but much more careful attention would be given to the management of information and communication resources if their costs were recognized fully.[25]
Was this state of affairs a necessary, inescapable burden imposed by the very nature of the new information technology, and so destined to perpetuate itself as that technology becomes more and more elaborate? Those seeking an answer to this question may find it helpful to begin by stepping back and explicitly conceptualizing the recent and still unfolding trajectory along which the microelectronics-based digital computer has been developed and deployed, seeing it as a particular, contextualized instance of a more general class of historical processes.[26] Such an approach gives one a view of the path taken up to the present as not the only one conceivable but, on the contrary, a contingently selected course of technological development among a number of alternatives that were available. The actual path of computerization, seen in retrospect, led away from a tight coupling between the new technological artifacts and the task productivity of the individuals and work groups to whom those microelectronics-based tools were offered.

[25] Much greater attention therefore ought to be devoted to the "task productivity" of information technology use. Tasks that are performed repetitively using information and communication technologies are likely to be worthy of the same amount of analysis that is devoted to approval paths, logistics, or other features of the organization that are the focus of quality assurance and business process re-engineering activities. From the productivity measurement viewpoint, it will not be possible to gather meaningful statistics about these aspects of productivity until organizations are performing these sorts of measurements themselves.
Only when work processes are monitored and recorded are they likely to find their way into the adjunct studies that are performed to test the accuracy of more abstract productivity measurement systems.

[26] The following draws upon a more detailed treatment of the productivity implications of the general purpose formulation of computer technology that has characterized the personal computer revolution, provided by David and Steinmueller (1999), Section 7.

4.2 The general purpose computing trajectory, from mainframes to the PC revolution

The widespread diffusion of the stored program digital computer is intimately related to the popularization of the personal computer as a "general purpose" technology for information processing, and the incremental transformation of this "information appliance" into the dominant technology of information processing. The historical process by which this was achieved in the case of the personal computer has had major implications, not only for the success of personal computer technology and the hardware and software industries based upon it, but also for the economic functionality of the business organizations that have sought to utilize it profitably. For the personal computer, as for its parent the mainframe and its cousin the minicomputer, much adaptation and specialization has been required to apply a general purpose information processing machine to particular purposes or tasks. Such adaptations have proved costly, especially so in the case of the personal computer.
It is something of an historical irony that the core elements of the adaptation problems attending this GPT's diffusion into widespread business application may be seen to derive from the historical selection of a trajectory of innovation that emphasized the "general purpose" character of the paradigmatic hardware and software components. The origins of the personal computer lie in the invention of the microprocessor, which was a technical solution to the problem of creating a more "general purpose" integrated circuit to serve a specific purpose--a more flexible portable calculator--a foundational application that ultimately proved uneconomic due to the lower relative costs of more specialized integrated circuits. During the 1970's it was recognized that the microprocessor provided a general solution to the problem of the electronic system designer confronted by an ever-growing array of application demands. During the same period, efforts to down-scale mainframe computers to allow their use for specialized control and computation applications supported the birth of the minicomputer industry. These two developments provided the key trajectories for the birth of the personal computer. As microprocessors became cheaper and more sophisticated, and applications for dedicated information processing continued to expand, a variety of task-specific computers came into existence. One of the largest markets for such task-specific computers created during the 1970's was that for dedicated word-processing systems, which appeared as an incremental step in office automation, aimed at the task of producing documents repetitive in content or format--such as contracts, purchase orders, legal briefs, and insurance forms--that could be modified quickly and customized based upon stored formats and texts.
But dedicated word processors were displaced rapidly by personal computers during the mid-1980's, as the latter were perceived to be more "flexible" and more likely to be "upgrade-able" as new generations of software were offered by sources other than the computer vendors.[27] The dedicated word processor's demise was mirrored by developments in numerous markets where dedicated "task-specific" data processing systems had begun to develop. Digital Equipment Corporation, the leading minicomputer manufacturer, retreated from its vertical marketing strategy of offering computer systems specifically designed for newspapers, manufacturing enterprises, and service companies; it specialized instead in hardware production, leaving the software market to independent software vendors.[28] This process, which had begun in the late 1970's as an effort to focus corporate strategy, greatly accelerated during the 1980's with the advent of the large-scale personal computer platforms united under the IBM PC standard or utilizing that of Apple's Macintosh.

27. Outside sourcing of applications software represented a significant departure from the proprietary software strategy that the suppliers of dedicated word-processing systems had sought to implement during the 1970's, and which left them unable to meet the rapidly rising demands for new, specialized applications software. Moreover, personal computers could use many of the same peripherals, such as printers: because the widespread adoption of the new technology raised the demand for compatible printers, the dedicated word processors found themselves unprotected by any persisting special advantages in printing technology.

28. Similar decisions were made by all of the U.S. computer manufacturers. See the discussion in Steinmueller (1996).
The "general purpose" software produced for these two platforms not only discouraged task-specific software, it also created a new collection of tasks and outputs specifically driven by the new capabilities, such as "desk top publishing" (typeset-quality documents), "presentation graphics" (graphic-artist-quality illustrations for speeches and reports), and "advanced word processing" (the incorporation of graphics and tables into reports). All of these changes improved the "look and feel" of information communication, its quality and style, the capability for an individual to express ideas, and the quantity of such communications. But singly and severally they made very little progress in changing the structure of work organization or the collective productivity of the work groups employing these techniques. The disappearance of task-based computing in favor of general purpose personal computers and general purpose (or multipurpose) packaged software was thus largely completed during the 1980's.[29] The early evolution of the personal computer can therefore be seen as cutting across the path of development of an entire family of technically feasible information processing systems focused on the improvement of "task productivity" in applications ranging from word processing to manufacturing operations control. In many cases, it has also precluded the effective development of collective "work group" processes whose synergies would support multifactor productivity improvement. Instead of "breaking free" from the mainframe, these general purpose engines often wound up "slaved" to the mainframe, using a small fraction of their capabilities to emulate the operations of their less expensive (and less intelligent) cousins, the "intelligent" display terminals. By 1990, then, the personal computer revolution, while seizing control of the future of information processing, had left carnage in its wake, as many such movements tend to do.
The revolutionaries had kept their promise that the PC would match the computing performance of the mainframes of yesteryear. What was not achieved, and could not be achieved by this triumph, was a wholesale reconstruction of the information processing activities of organizations.[30] Rather than contributing to the rethinking of organizational routines, the spread of partially networked personal computers supported the development of new database and data entry tasks, new analytical and reporting tasks, and new demands for "user support" to make the general purpose technology deliver its potential. This is not to claim that the process should be regarded as socially sub-optimal, or mistaken from the private business perspective. A quantitative basis for such judgements, one way or the other, does not yet exist. It appears that what was easiest in an organizational sense tended to be the most attractive thing to undertake first. The local activities within the organization that were identified as candidates for personal computer applications often could and did improve the flexibility and variety of services offered internally within the company, and externally to customers who would, through the intermediation of personnel with appropriate information system access, receive an array of service quality improvements. Arguably, many of these improvements are part of the productivity measurement problem, because they simply are not captured in the real output statistics, even though they might enhance the revenue-generating capacity of the firms in which they are deployed. The availability of 24-hour telephone reservation desks for airlines, or the construction of worldwide networks for securing hotel, rental automobile, or entertainment reservations, represent welfare improvements for the customer that do not appear in the measured real GDP originating in those sectors, nor in the real value of expenditures on final goods and services.

There is a more evident "down-side" of the process by which general purpose personal computers came to be furnished with "general purpose" personal computer software. It may be accepted that general purpose hardware and software in combination did "empower" users to think of "insanely great" new applications--to use the rhetoric of Steve Jobs, one of Apple Computer's cofounders. On the other hand, however, the disruptive effects of relentless innovation are inimical to the stabilization of routine and the improvement in the efficiency of routine performance which that brings. Moreover, at best only a very small number among the innovative software programs turn out to address the sort of mundane tasks that are sufficiently common to permit them to make a difference to the performance of a large number of users. But the ubiquity and complementarity of these dual "general purpose engines"--personal computer hardware and packaged software--has the side-effect of foreclosing the apparent need for more specialized task-oriented software development.[31]

29. In the medium and large enterprises of 1990, what remained was a deep chasm between the "mission critical" applications embedded in mainframe computers and the growing proliferation of personal computers. The primary bridge between these application environments was the widespread use of the IBM 3270, the DEC VT-100 and other standards for "intelligent" data display terminals, the basis for interactive data display and entry to mainframe and minicomputer systems. From their introduction, personal computers had software enabling the emulation of these terminals, providing further justification for their adoption.

30. For an historical account of a potential alternative path of user-driven technological development, one that entailed the reorganization of businesses as an integral aspect of the computerization of their activities, see Caminer, Aris, Hermon and Land (1996).
Worse still, by the mid-1990's the competition among packaged software vendors for extending the generality of their offerings became a syndrome with its own name: "creeping featurism" or "featuritis." Making light of these developments in 1995, Nathan Myhrvold of Microsoft suggested that software is a gas that "expands to fill its container... After all, if we hadn't brought your processor to its knees, why else would you get a new one?"[32] Although offered in jest, this comment reflects the serious belief of many in the technological community that the continuous technological progress and upgrading of computers, with which they are centrally engaged, is ultimately for the benefit of the user. From their perspective, the key to future success lies in establishing increasingly powerful platforms for new generations of software; among users, these developments may be welcomed by some while loathed by others. What can be predicted reliably, however, is that the costs of adjustment, learning, and sheer "futzing around" with the new systems on the part of less skilled users will continue to severely constrain their contributions to productivity.

5. Dark Journey Towards the Brighter Future?--The Regime Transition Hypothesis

The so-called "regime transition hypothesis" owes much in its general conception to the work of Freeman and Perez (1986), who emphasized the many incremental technological, institutional and social adjustments that are required to realize the potential of a radical technological departure, and pointed out that, typically, those adaptations are neither instantaneous nor costless. David (1990, 1991a, 1991b) took up this idea, which fitted preconceptions derived from studies of the economic history of previous developments involving the introduction of "general purpose engines"--the vertical watermill, steam engines, the electrical dynamo, internal combustion engines.
These suggested the plausibility of the view that an extended phase of "transition" would be required to fully accommodate, and hence elaborate, a technological and organizational regime built around a new general purpose technology, the microelectronic digital computing engine--or, for simplicity, "the computer." Recent work in the spirit of the new growth theory has sought to generalize the idea, formulated by Bresnahan and Trajtenberg (1995), of "general purpose technologies" that transform an economy by finding many new lines of application, and by fusing with existing technologies to rejuvenate other, preexisting sectors of the economy. While the positive, long-run, growth-igniting ramifications of a fundamental technological breakthrough of that kind are stressed in the formalization of this vision by the new growth theory literature, the possible down-side of the process has not gone unrecognized. Mathematical models of such a multi-sector learning and technology diffusion process indicate that the resources absorbed in the increasing roundaboutness of the transition phase may result in the slowed growth of productivity and real wages.[33] The "regime transition hypothesis" therefore suggested itself quite naturally as a framework for examining the phenomenon of the productivity slowdown and the appearance of the "productivity paradox" to which it gave rise. By drawing an explicit analogy between "the dynamo and the computer," David (1991) sought to use the U.S. historical experience to give a measure of concreteness to the general observation that an extended phase of transition may be required to fully accommodate, and hence elaborate, the technological and organizational regime that eventually would emerge around the digital computer.

31. For a recent development of this theme, see Norman (1998), esp. Chs. 2-4, 12.

32. As quoted in W. Wayt Gibbs, "Taking Computers to Task", Scientific American, July 1997.
In following the story of the way in which the transmission of power in the form of electricity eventually came to revolutionize industrial production processes, one gains a sharper intuitive grasp of the point that far more is likely to be involved in the transition to a new general purpose technology than the simple substitution of a new form of productive input for an older alternative. The overall speed at which the transformation proceeds can be seen to be governed, in both the past and current regime transitions, by the ease or difficulty of altering many other technologically and organizationally related features of the production systems involved.

The earlier formulation of the regime transition argument was less ambitious, but also focused specifically upon the economic aspects of the initial phases of the transition dynamics that might contribute to slowing the measured growth of industrial productivity. There are two distinct facets of the "extended transition" explanation of the productivity paradox. The first is concerned to show that lags in the diffusion process involving a general purpose technology can result in long delays in the acceleration of productivity growth in the economy at large. The underlying idea is that productivity advances stem from the substitution of new (ICT-intensive) production methods for older ones, as well as from improvements and enhancements of the new technologies themselves, but that because those improvements and the innovations' diffusion are mutually interdependent processes, it is possible for this dynamic process to be a quite long-drawn-out affair. The second facet of the argument is that in the earlier phases of the transition process resources tend to be directed to applying the innovation to provide new, qualitatively superior goods and services, and so yield welfare gains that escape being reflected properly in the measured output and productivity indexes of the economy.
As this theme already has been aired well (in sections 2 and 3 above), only the first point will bear further elaboration and historical illustration on this occasion.

5.1 Diffusion, dynamos and computers

Although central generating stations for electric lighting systems were first introduced by Edison in 1881, electric motors still constituted well under one-half of one percent of the mechanical horsepower capacity of the U.S. manufacturing sector at the end of that decade. Electrification was proceeding rapidly at this time, especially in the substitution of dynamos for other prime movers such as waterpower and steam engines, so that between 1899 and 1904 the electrified portion of total mechanical drive for manufacturing rose from roughly 5 percent to 11 percent (see David (1991a: Table 3)). Yet it was not until the decade of the 1920's that this measure of diffusion, and the more significant measure of the penetration of secondary electric motors in manufacturing, both moved above the 50 percent mark. It was the transition to the use of secondary electric motors (the unit drive system) in industry that my analysis found to be strongly associated with the surge of total factor productivity in manufacturing during the decade 1919-1929.

Recent estimates of the growth of computer stocks and the flow of services therefrom are consistent with the view that when the "productivity paradox" debate began to attract attention, the U.S. economy could be said to have still been in the early phase of the deployment of ICT. Dale Jorgenson and Kevin Stiroh (1995) reveal that in 1979, when computers had not yet evolved so far beyond their limited role as information processing machinery, computer equipment and the larger category of office, accounting and computing machinery (OCAM) were providing only 0.56 percent and 1.5 percent, respectively, of the total flow of real services from the (nonresidential) stock of producers' durable equipment.[34] But these measures rose to 4.9 percent in 1985, ballooned further to 13.8 percent by 1990, and stood at 18.4 percent two years after that.[35] Thus, the extent of "computerization" that had been achieved in the whole economy by the late 1980's was roughly comparable with the degree to which the American manufacturing sector had become electrified at the beginning of the twentieth century. When the historical comparison is narrowed more appropriately to the diffusion of secondary motors, a proxy for the spread of the unit drive, the growth rate for 1899-1914 is almost precisely the same as that for the ratio of computer equipment services to all producers' durable equipment services in the U.S.

Does the parallel also carry over as regards the pace of the transition in its early stages? An affirmative answer can be given to this question, but the route to it is a bit tortuous, as may be seen from the following. If we consider just the overall index of industrial electrification referred to above, the pace of diffusion appears to have been rather slower during the "dynamo revolution" than that which has been experienced during the 1979-1997 phase of "the computer revolution": it took 25 years for the electrified percent of mechanical drive in manufacturing to rise from roughly 0.5 percent to 38 percent, whereas, according to the diffusion measure just presented, the same quantitative change has been accomplished for the computer within a span of only 18 years.

33. See, e.g., the chapters by Helpman and Trajtenberg (1998), Aghion and Howitt (1998), and other contributions in Helpman (1998).
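The diffusion comparisons above rest on simple compound-interest arithmetic: given a ratio's values at two dates, the implied average compound growth rate follows directly. The following is only an illustrative sketch (the `cagr` helper is ours, not from the text; the figures are the shares quoted above):

```python
def cagr(start: float, end: float, years: float) -> float:
    """Average compound annual growth rate implied by two endpoint values."""
    return (end / start) ** (1.0 / years) - 1.0

# Computer equipment services as a share of all producers' durable equipment
# services: roughly 0.56 percent in 1979, reaching about 38 percent by 1997
# (the 18-year span discussed in the text).
print(f"computerization, 1979-1997: {cagr(0.56, 38.0, 18.0):.1%} per annum")
```

Run backward, the same arithmetic shows why an 18-year span suffices: at roughly a quarter per annum, the ratio multiplies about 68-fold, which is just the distance from 0.56 percent to 38 percent.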
But that really is not quite the right comparison to make in this connection. The index of the computerization of capital services that has been derived here from the work of Jorgenson and Stiroh (1995) rises in part because the underlying estimates take into account the changing quality of the computer stock, whereas the electrification diffusion index simply compares the horsepower rating of the stock of electric motors with total mechanical power sources in manufacturing. The latter index neglects the significance for industrial productivity of the growth of secondary electric motors, as contrasted with prime movers; secondary motors were those which, being used to drive tools and machinery on the factory floor (and mechanical hoists between floors), are found to have had far more important impacts upon measured TFP growth in manufacturing.[36] Between 1899 and 1914 the ratio of secondary motor horsepower to the horsepower rating of all mechanical drive in U.S. manufacturing was rising at an average compound rate of 26.2 percent per year. It is therefore striking to observe that, over the period from 1979 to 1997, the estimated average rate of growth of the ratio of computer equipment services to all producers' durable equipment services in the U.S. turns out to be virtually the same, at 26.4 percent per annum.

Such considerations should, at the very least, serve as a constructive reply to commentators who have casually supposed that the computerization of the U.S. capital stock has been proceeding so much faster than the electrification of industry as to render illegitimate any attempt to gain insights into the dynamics of the computer revolution by examining the economic history of the dynamo revolution that took place in the half century before 1929. But that is not the only argument which has been advanced for dismissing the previous experience as quantitatively so different as to be irrelevant. It also has been suggested, by Jack Triplett (1998), that the pace at which the price-performance ratio of computer equipment has been plummeting so far exceeds the rate of fall in the real unit costs of electric energy that there is little if anything to be inferred from the time scale of the transition to the application of the unit-drive system in manufacturing. Yet Sichel (1997: Table 5-2) estimates the rate of change in real prices of computer services for 1987-1993 to have been -7.9 percent per annum, and compares that to the -7.0 percent per annum trend rate of decline in the real price of electric power over the period 1899-1948. A comparison between the rates of change for two such disparate time spans does, however, seem rather peculiar. So, instead, we might try to put the estimated rate of price decline in the case of electricity on a more appropriate footing by locating a comparably brief 6-10 year interval that also is positioned equivalently early in the evolution of dynamo technology. Taking electrification to have commenced with the introduction of the Edison filament lamp and the central power station in 1876-81, and noting that the 1987-1993 interval was situated 16 to 22 years after the introduction of the microprocessor and magnetic memory, the corresponding temporal interval would be 1892-1903.[37] The real prices of electric power declined at a rather slow rate of 1.3 percent per year during 1892-1902. Nevertheless, it may be remarked that the early evolution of electrical technology progressed at a pace more leisurely than that of modern computer technology; it was only around 1902-1903 that the main technical components for implementation of universal electrical supply systems (based on central generating stations and extended distribution networks bringing power to factories and transport systems) could be said to have been put in place. Over the decade that followed, the rate of decline in the price of electric power accelerated to 6.8 percent per year, and from 1912 to 1920 it was falling at an average rate of 10 percent per year.

34. The capital service flows in question are measured gross of depreciation, corresponding to the gross output concept used in the measurement of labor and multifactor productivity. Some economists who have voiced skepticism about the ability of computer capital formation to make a substantial contribution to raising output growth in the economy point to the rapid technological obsolescence of this kind of producer durables equipment, and argue that the consequent high depreciation rate prevents the stock from growing rapidly in relation to the overall stock of producer capital in the economy. The latter argument would be relevant were one focusing on the impact on real net output, whereas the entire discussion of the productivity slowdown has been concerned with gross output measures. See Sichel (1997: pp. 101-103) for a useful comparison of alternative estimates of net and gross basis computer service "contributions to growth."

35. If we extrapolate from the (slowed) rate at which it was rising during 1990-1992, the value of this index for 1997 would stand somewhat under the 38 percent level.

36. See the industry cross-section regression results, and the discussion of the multifactor productivity growth rate estimates relating to Table 5, in David (1991a).
This would seem sufficient to make the following, very limited, negative point: comparing the movements of the prices of electricity and quality-adjusted computer services hardly warrants dismissing the relevance of seeking some insights into the dynamics of the transition to a new general purpose technology by looking back at the dynamo revolution.

In arguing for the opposite view, Triplett (1998) suggests that Sichel's (1997) estimates of the price of computer services--and, by implication, the comparison just reported--may be misleading. He contends that the hedonic price indexes for computers that come bundled with software actually would have fallen faster than the (unbundled) price-performance ratios that have been used as deflators for investment in computer hardware. If so, Sichel's (1997) price indexes of quality-adjusted "computer services" (from hardware and software) would seriously underestimate the relevant rate of decline. But Triplett's argument seems to suppose that operationally relevant "computer speed" is indexed appropriately by CPU speed, whereas many industry observers have pointed out that the bundled PC operating system has grown so large that more processing power does not translate into more "effective operating power".
In other words, one should be thinking about the movements in the ratio TEL/WIN, instead of their product, WIN×TEL. Furthermore, in the same vein, it may be noticed that the slower rate of fall in computer services prices as estimated by Sichel (1997) is more in accord with the observation that applications software packages also have ballooned in size, through the addition of many features that typically remain unutilized; that CPU speed may be too heavily weighted in the hedonic indexes for hardware, inasmuch as the utility of (net) computer power remains constrained by the slower speed of input-output functions; and that over much of the period since the 1960's the stock of "legacy software" running on mainframes continued to grow, without being rewritten to optimally exploit the capacity available on the new and faster hardware.

Finally, a deeper, and equally deserved, comment may be offered regarding the casual dismissals of the regime transition hypothesis on the grounds that the analogy between computer and dynamo is flawed by the (putative) discrepancy between the rates at which the prices associated with electricity and computer services have fallen. Such attempts are themselves instances of the misuse of historical analogies. An understanding of the way in which the transmission of power in the form of electricity came to revolutionize industrial production processes tells us that far more was involved than the simple substitution of one, new form of productive input for an older alternative.

37. Fortuitously, these dates bound the period in which the possibility of a universal electrical supply system emerged in the U.S. as a practical reality, based upon polyphase AC generators, AC motors, rotary converters, electric (DC) trams, and the principle of factory electrification based upon the "unit drive" system. See David (1991) for further discussion.
The pace of the transformation must be seen to be governed, in both the past and the current regime transitions, by the ease or difficulty of altering many other, technologically and organizationally related features of the production systems that are involved.

5.2 The proper limits of historical analogy: computer and dynamo, once again

While there still seems to be considerable heuristic value in the historical analogy that has been drawn between "the computer and the dynamo," a cautious, even skeptical attitude is warranted in regard to the predictions for the future that some commentators have tried to extract from the existence of points of close quantitative resemblance between the two transition experiences. For one thing, statistical coincidences in economic performance are more likely than not to be just matters of coincidence, rather than indications that the underlying causal mechanisms really are identical. It is true that one can show, merely as a matter of algebra, that only after the 50 percent mark in the diffusion of a cost-saving technology will the latter have its maximum impact upon the rate of total factor productivity growth.[38] It is then pertinent to notice that, in the case of U.S. factory electrification, a surge of multifactor productivity growth occurred throughout the manufacturing sector during the 1920's, coincident with the attainment of the 50+ percent stage in that diffusion process. This observation is useful primarily to underscore the point that the biggest productivity payoffs should not be expected to come in the beginning phase of the regime transition, even though it is then that the pace of the new technology's diffusion is likely to be fastest. This sort of historical "anecdote" could be used quite legitimately when suggesting (in 1989-1990) that it perhaps was still too soon to be disappointed that the computer revolution had failed to unleash a sustained surge of readily discernible productivity growth throughout the economy.
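The algebra invoked for the 50 percent claim can be illustrated with the familiar logistic diffusion curve. The author's own demonstration appears in the Technical Appendix of David (1991a); the sketch below is only a schematic reconstruction under standard assumptions, with g the diffusion-speed parameter and δ an assumed proportional cost saving:

```latex
% Let s(t) be the diffused share of the cost-saving technology,
% following a logistic curve with growth parameter g:
\[
  s(t) = \frac{1}{1 + e^{-g(t - t_0)}}
  \qquad\Longrightarrow\qquad
  \frac{ds}{dt} = g\, s\,(1 - s).
\]
% If adoption lowers unit costs by the proportion \delta wherever it applies,
% the contribution to aggregate TFP growth is approximately proportional
% to the speed of diffusion:
\[
  \frac{d \ln A}{dt} \;\approx\; \delta\, \frac{ds}{dt}
  \;=\; \delta\, g\, s\,(1 - s),
\]
% and s(1 - s) is maximized where 1 - 2s = 0, i.e. exactly at s = 1/2:
% the TFP-growth impact peaks only as diffusion passes the 50 percent mark.
```

This, of course, formalizes only the mechanical part of the historical "anecdote" just described, not its substantive content.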
To say that, however, is not at all the same thing as predicting that the continuing relative growth of computerized equipment vis-à-vis the rest of the capital stock eventually must cause a surge of productivity growth to materialize; nor does it claim anything whatsoever about the future temporal pace of the computer's diffusion. Least of all does it tell us that the detailed shape of the diffusion path that lies ahead will mirror the curve that was traced out by the electrical dynamo during the early decades of the twentieth century. There is nothing foreordained about the dynamic process through which a new, general purpose technology permeates and transforms the organization of production in many branches of an economy. One cannot simply infer the detailed future shape of the diffusion path in the case of the digital information revolution from the experience of previous analogous episodes; the very nature of the underlying process renders that path contingent upon events flowing from private actions and public policy decisions, as well as upon the expectations that are thereby engendered--all of which still lie before us in time.

38. See David (1991a), Technical Appendix, for this demonstration.

6. Historical Perspectives on the Growth of Measured Productivity in the Digital Economy

The historical trajectory of computer technology development, long overdue for change, now appears to be about to undergo some profound and portentous change of direction.[39] At least three new directions are emerging strongly enough in commercial applications to deserve brief notice. None of these developments is likely to displace the use of personal computers in the production and distribution of information that must be highly customized, or that arises from ad hoc inquiries similar to the research processes for which the general purpose computer was originally invented.
What they do promise are greater and more systematic efforts to integrate information collection, distribution and processing. In attempting to take advantage of these opportunities, enterprises and other institutions are forced to reexamine workflow and to develop new methods for information system design.

Firstly, a growing range of information technologies has become available that are purpose-built and task-specific. Devices such as supermarket scanners were applied to a wide range of inventory and item-tracking tasks, and related "data logging" devices were to be found in the hands of maintenance, restaurant, and factory workers. The environmental niches in which these devices were able to achieve a foothold were ones where the mass-produced personal computer was neither appropriate nor robust. These more "task specialized" devices have become sufficiently ubiquitous to provide the infrastructure for task-oriented data acquisition and display systems, in which up-to-date and precise overviews of the material flows through manufacturing and service delivery processes can be maintained.

Secondly, the capabilities of advanced personal computers as "network servers" have become sufficiently well developed that it is possible for companies to eliminate the chasm between the personal computer and mainframe environments by developing the intermediate solution of client-server data processing systems. This development is still very much in progress and reflects the more complete utilization of the local area networks devised for information and resource sharing during the personal computer era.
In this new networked environment, the re-configuration of work organization becomes a central issue; strategic and practical issues surrounding the ownership and maintenance of critical company data resources must be resolved, and these often are compelling enough to force redesign of the organizational structure.

Thirdly, and related to the foregoing, the development of Internet technology has opened the door to an entirely new class of organization-wide data processing applications, as well as enormously enhancing the potential for collective and cooperative forms of work organization. Applications and their maintenance can be controlled by the technical support team who would previously have been responsible for the company's centralized data resources. The common standards defining Internet technology have the fortuitous feature that virtually all personal computers can be configured similarly, facilitating not only intra-company but also inter-company networking.

The "general purpose" trajectory followed by the spectacular development of personal computer technology has greatly reduced the price-performance ratio of the hardware, without effecting commensurate savings in the resource costs of carrying out many specific, computerized tasks. Some part of the limited resource savings clearly has been transitional, as personal computers were added to existing mainframe capacity rather than substituted for it and, indeed, were underutilized by being allocated the role of intelligent terminals. This aspect of the story bears some striking similarities with the early progress of factory electrification, wherein the use of the group drive system supplemented, without replacing, the distribution of power within factories by means of shafts and belting; this added capital to an already highly capital-using industrial power technology, without instigating any reorganization of factory layout and routines for materials handling.
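The growth-accounting arithmetic behind this point can be made explicit. The following is the conventional (Solow-residual) decomposition used throughout this literature, not a formulation specific to this paper, and the numerical illustration is hypothetical:

```latex
% Conventional growth-accounting decomposition of measured multi-factor
% productivity (MFP) growth:
\frac{\dot{A}}{A} \;=\; \frac{\dot{Y}}{Y} \;-\; s_K \frac{\dot{K}}{K} \;-\; s_L \frac{\dot{L}}{L}
% where Y is output, K capital input, L labor input, and s_K, s_L are the
% factor income shares. Hypothetical illustration: if capital-using investment
% raises K by 5 percent per year (s_K = 0.3) while output grows only
% 1.5 percent and labor input is unchanged, measured MFP growth is
% 1.5 - (0.3)(5) = 0 percent per year.
```

On these (assumed) numbers, adding computing capital without reorganizing production leaves no measured multi-factor productivity gain, which is the sense in which the group drive stage of electrification, and its IT analogue, can depress the conventional productivity statistics.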
It was not, however, until the dynamo could be integrated effectively into individual tools under the unit drive system that the major capital-saving contributions to multi-factor productivity growth from thoroughgoing factory redesign could be realized.

39 See David and Steinmueller (1999), and the formulation in David and Wright (October 1999), upon which the remainder of this section draws.

An analogous structural change has been envisaged, based on the development of digital "information appliances"--hand-held devices, or other robust specialized tools that are carried on belts, sewn into garments and worn as headgear--embedding advanced microprocessors and telecommunications components that allow them to be linked through sophisticated networks to other such appliances, mainframe computers and distributed databases, thereby creating complex and interactive intelligent systems. This may well be an emerging trajectory of ICT development that will impinge directly upon the specific task performance of workers equipped with such devices, and hence upon conventional measures of productivity improvement. 40

Other portents for the future, involving what eventually would amount to major structural transformations, may be seen in the expansion of inter-organizational computing for the mass of transactions involving purchase ordering, invoicing, shipment tracking, and payments, all of which otherwise will continue to absorb much specialist white-collar labor time. Such service occupations might be viewed as the modern-day counterparts of the ubiquitous materials-handling tasks in the manufacturing sector that became the target of mechanization innovations during the 1920's. 41 But, beyond these prospective sources of labor productivity gain in service activities, it is relevant to notice that "tele-working" in the U.S.
remains still far from fully deployed, with only about a fifth of large service sector firms providing data communications network links with employees' homes, and many of those are trying out "mixed systems" of central office and "outside" work. As was the case with the group drive system of factory electrification, substantial duplication of fixed facilities characterizes this stage in an innovation's diffusion. So, significant capital savings through reductions of required commercial office space and transport infrastructures are likely to result for the whole service sector as "tele-working" becomes much more widely and completely deployed.

Major organizational reconfigurations of this kind have a potential to yield capital savings that do not come at the expense of labor productivity gains. Coupled with the widespread diffusion of "information appliances," they appear to hold out a realistic prospect for the older branches of an increasingly digitalized economy to enjoy a pervasive quickening in the pace of conventionally measured multifactor productivity improvements, alongside the continuing proliferation of new branches of industry offering novel and qualitatively enhanced goods and services.

40 See Gibbs (1997), and especially Norman (1998), Ch. 11.

41 See David and Wright (April 1999), for fuller discussion of the interrelatedness of the mechanization of materials handling and factory electrification in the U.S. during the 1920's and 1930's.

References

Abramovitz, Moses, "Notes on Postwar Productivity Growth: The Play of Potential and Realization," Center for Economic Policy Research Publication No. 156, Stanford University, March 1989.

Abramovitz, Moses, and Paul A. David, "Reinterpreting Economic Growth: Parables and Realities," American Economic Review, 63 (2), May 1973.

Abramovitz, Moses, and Paul A. David, "The Rise of Intangible Investment: The U.S. Economy's Growth Path in the Twentieth Century," in D.
Foray and B. A. Lundvall, eds., Employment and Growth in the Knowledge-Based Economy, OECD Documents, Paris: OECD, 1996.

Abramovitz, Moses, and Paul A. David, "American Macroeconomic Growth in the Era of Knowledge-Based Progress: The Long-Run Perspective," Stanford Institute for Economic Policy Research, Discussion Paper Series No. 99-3, August 1999, 180 pp.

Aghion, Philippe, and Peter Howitt, in Elhanan Helpman, ed., General Purpose Technologies and Economic Growth, Cambridge, MA: MIT Press, 1998.

Baily, Martin, and Robert J. Gordon, "The Productivity Slowdown, Measurement Issues, and the Explosion of Computer Power," Brookings Papers on Economic Activity, 2: 347-420, 1988.

Beckett, Samuel, Waiting for Godot: A Tragicomedy in Two Acts, 2nd ed., London: Faber and Faber, 1965.

Blinder, Alan S., and Richard E. Quandt, "Waiting for Godot: Information Technology and the Productivity Miracle," Princeton University Department of Economics Working Paper, May 1997.

Boskin, M. J., Toward a More Accurate Measure of the Cost of Living, Final Report to the Senate Finance Committee from the Advisory Commission to Study the Consumer Price Index, 1996, pp. 1-97.

Bresnahan, Timothy F., Erik Brynjolfsson, and Lorin Hitt, "Information Technology and Recent Changes in Work Organization Increase the Demand for Skilled Labor," in M. Blair and T. Kochan, eds., The New Relationship: Human Capital in the American Corporation, Washington, D.C.: The Brookings Institution, 1999a.

Bresnahan, Timothy F., Erik Brynjolfsson, and Lorin Hitt, "Information Technology, Workplace Organization and the Demand for Skilled Labor: Firm-Level Evidence," National Bureau of Economic Research Working Paper No. 7136, May 1999b.

Bresnahan, Timothy F., and Manuel Trajtenberg, "General Purpose Technologies: Engines of Growth?" Journal of Econometrics, 65, 1995: pp. 83-108.

Brynjolfsson, Erik, and Lorin Hitt, "Information Technology as a Factor of Production: The Role of Differences Among Firms," Economics of Innovation and New Technology, 3 (3-4), 1995: pp. 183-99.

Brynjolfsson, Erik, and Lorin Hitt, "Paradox Lost? Firm-Level Evidence of High Returns to Information Systems Spending," Management Science, April 1996.

Brynjolfsson, Erik, and Lorin Hitt, "Information Technology, Organization, and Productivity: Evidence from Firm-Level Data," MIT Sloan School of Management Working Paper, 1997.

Brynjolfsson, Erik, and Lorin Hitt, "Beyond Computation: Information Technology, Organizational Transformation and Business Performance," MIT Sloan School of Management Working Paper, September 1998.

Brynjolfsson, Erik, and S. Yang, "The Intangible Costs and Benefits of Computer Investments: Evidence from Financial Markets," Proceedings of the International Conference on Information Systems, Atlanta, GA, December 1997; revised April 1999.

Caminer, D. T., J. B. B. Aris, P. M. R. Hermon, and F. F. Land, User-Driven Innovation: The World's First Business Computer, London: McGraw-Hill Book Co., 1996.

Cox, W. Michael, and Roy J. Ruffin, "What Should Economists Measure? The Implications of Mass Production vs. Mass Customization," Federal Reserve Bank of Dallas, Working Paper No. 98-03, July 1998.

David, Paul A., "Invention and Accumulation in America's Economic Growth: A Nineteenth Century Parable," in International Organization, National Policies and Economic Development, K. Brunner and A. H. Meltzer, eds., a supplement to the Journal of Monetary Economics, vol. 6, 1977, pp. 179-228.

David, Paul A., "The Dynamo and the Computer: An Historical Perspective on the Modern Productivity Paradox," American Economic Review, 80 (2), May 1990: pp. 355-61.

David, Paul A., "Computer and Dynamo: The Modern Productivity Paradox in a Not-Too-Distant Mirror," in Technology and Productivity: The Challenge for Economic Policy, Paris: Organisation for Economic Co-operation and Development, 1991a, pp. 315-48.

David, Paul A., "General Purpose Engines, Investment, and Productivity Growth: From the Dynamo Revolution to the Computer Revolution," in Technology and Investment: Crucial Issues for the 90s, E. Deiaco, E. Hörner and G. Vickery, eds., London: Pinter Publishers, 1991b.

David, Paul A., "Digital Technology and the Productivity Paradox: After Ten Years, What Has Been Learned and What Do We Need to Know?" prepared for the White House Conference on Understanding the Digital Economy, Washington, D.C., May 25-26, 1999.

David, Paul A., and W. Edward Steinmueller, "Understanding the Puzzles and Payoffs of the IT Revolution: The 'Productivity Paradox' after Ten Years," Ch. 1 of Productivity and the Information Technology Revolution, P. A. David and W. Edward Steinmueller, eds., Harwood Academic Publishers, forthcoming in 2000.

David, Paul A., and G. Wright, "General Purpose Technologies and Surges in Productivity: Historical Reflections on the Future of the ICT Revolution," University of Oxford Discussion Paper No. 31, September 1999.

Diewert, W. Edward, and Kevin J. Fox, "Can Measurement Error Explain the Productivity Paradox?" University of New South Wales, School of Economics Discussion Paper No. 27, 1997.

Federal Reserve Bank of Dallas, "The Right Stuff: America's Move to Mass Customization," 1998 Annual Report, December 1998.

Freeman, Christopher, and Carlota Perez, "The Diffusion of Technical Innovations and Changes of Techno-economic Paradigm," presented to the Conference on Innovation Diffusion, Venice, 17-22 March 1986.

Goldin, Claudia, and Lawrence Katz, "The Origins of Technology-Skill Complementarity," National Bureau of Economic Research Working Paper No. 5657, July 1996.

Gordon, Robert J., "Problems in the Measurement and Performance of Service-Sector Productivity in the United States," National Bureau of Economic Research Working Paper No. 5519, 1996.

Gordon, Robert J., "Is There a Tradeoff Between Unemployment and Productivity Growth?" in D. Snower and G. de la Dehesa, eds., Unemployment Policy: Government Options for the Labour Market, Cambridge: Cambridge University Press, 1997, pp. 433-63.

Gordon, Robert J., "Monetary Policy in the Age of Information Technology: Computers and the Solow Paradox," prepared for the conference on Monetary Policy in a World of Knowledge-Based Growth, Quality Change and Uncertain Measurement, Bank of Japan, June 18-19, 1998.

Gordon, Robert J., "Current Productivity Puzzles from a Long-Term Perspective," unpublished MS, Northwestern University, September 1998a.

Greenan, Nathalie, and Jacques Mairesse, "Computers and Productivity in France: Some Evidence," Monash Department of Econometrics and Business Statistics Working Paper No. 15/96, September 1997; National Bureau of Economic Research Working Paper Series, Fall 1997.

Greenspan, Alan, "Remarks Before the National Governors' Association," February 5, 1996, p. 1, quoted by B. Davis and D. Wessel, Prosperity: The Coming 20-Year Boom and What It Means to You (forthcoming in 1998), Ch. 10.

Griliches, Zvi, "Productivity, R&D, and the Data Constraint," American Economic Review, 84, March 1994, pp. 1-23.

Griliches, Zvi, "Comments on Measurement Issues in Relating IT Expenditures to Productivity Growth," Economics of Innovation and New Technology, 3 (3-4), 1995: pp. 317-21, esp. Table 2, Ch. 10.

Helpman, Elhanan, and Manuel Trajtenberg, in Elhanan Helpman, ed., General Purpose Technologies and Economic Growth, Cambridge, MA: MIT Press, 1998.

Jorgenson, Dale W., "Productivity and Economic Growth," in Ernst R. Berndt and Jack E. Triplett, eds., Fifty Years of Economic Measurement: The Jubilee of the Conference on Research in Income and Wealth, Chicago: University of Chicago Press, 1990, pp. 19-118.

Jorgenson, Dale, and Kevin Stiroh, "Computers and Growth," Economics of Innovation and New Technology, 3: 295-316, 1995.

Lehr, William, and Frank R. Lichtenberg, "Information Technology and Its Impact on Productivity: Firm-Level Evidence from Government and Private Data Sources, 1977-1993," Canadian Journal of Economics, 1998.

Lichtenberg, Frank R., "The Output Contributions of Computer Equipment and Personnel: A Firm-Level Analysis," Economics of Innovation and New Technology, 3 (3-4), 1995: pp. 201-17.

Madrick, Jeff, "The Cost of Living: A New Myth," The New York Review, March 6, 1997.

Milgrom, Paul R., and John Roberts, "The Economics of Modern Manufacturing: Technology, Strategy, and Organization," American Economic Review, 80 (3), June 1990: pp. 511-28.

Milgrom, Paul R., and John Roberts, Economics, Organization and Management, Englewood Cliffs, NJ: Prentice Hall, 1992.

Milgrom, Paul R., John Roberts, and Yingyi Qian, "Complementarities, Momentum, and the Evolution of Modern Manufacturing," American Economic Review, 81 (2), May 1991: pp. 84-8.

Nakamura, L. I., "The Measurement of Retail Output and the Retail Revolution," paper presented at the CSLS Workshop on Service Sector Productivity and the Productivity Paradox, Ottawa, April 1997.

Nelson, Richard R., "On Technology Capabilities and Their Acquisition," in R. E. Evenson and Gustav Ranis, eds., Science and Technology: Lessons for Development Policy, Boulder, CO: Westview Press in co-operation with the Economic Growth Center, Yale University, 1990: pp. 71-80.

Norman, Donald A., The Invisible Computer: Why Good Products Can Fail, the Personal Computer Is So Complex, and Information Appliances Are the Solution, Cambridge, MA: MIT Press, 1998.

Oliner, Stephen D., and Daniel E. Sichel, "Computers and Output Growth Revisited: How Big Is the Puzzle?" Brookings Papers on Economic Activity, 2: 273-318, 1994.

Pine, B. Joseph II, Mass Customization: The New Frontier in Business Competition, Boston: Harvard Business School Press, 1993.

Roach, Stephen S., "America's Technology Dilemma: A Profile of the Information Economy," Special Economic Study, Morgan Stanley, New York, September 22, 1987.

Roach, Stephen S., "White Collar Productivity: A Glimmer of Hope?" Special Economic Study, Morgan Stanley, New York, September 16, 1988.

Roach, Stephen S., "Services under Siege: The Restructuring Imperative," Harvard Business Review, 68, September-October 1991: pp. 82-91.

Schmalensee, Richard, "A Model of Advertising and Product Quality," Journal of Political Economy, 86 (3), June 1978, pp. 485-503.

Schonfield, Erick, "The Customized, Digitized, Have-It-Your-Way Economy," Fortune, September 1998: pp. 114-21.

Sichel, Daniel E., The Computer Revolution: An Economic Perspective, Washington, D.C.: The Brookings Institution Press, 1997: esp. Ch. 4, Table 4.2, pp. 101-3, Table 5.2.

Solow, Robert M., "We'd Better Watch Out," New York Times Book Review, July 12, 1987, p. 36.

Triplett, Jack E., "The Solow Productivity Paradox: What Do Computers Do to Productivity?" prepared for the meetings of the American Economic Association, Chicago, IL, January 1998.

Triplett, Jack E., "Economic Statistics, the New Economy, and the Productivity Slowdown," Business Economics, 34 (2), April 1999: pp. 13-17.

U.S. Department of Labor, Bureau of Labor Statistics, USDL News Release 98-187, May 6, 1998.

Wykoff, Andrew W., "The Impact of Computer Prices on International Comparisons of Labour Productivity," Economics of Innovation and New Technology, 3 (3-4), 1995: pp. 277-294.

