An estimated 175,000 patents were granted in the U.S. last year, including more than 25,000 software patents (a new record). Concomitantly, more than 2,500 new software patent infringement suits were filed last year (also a new record). The good news is that the number of software patents grew five-fold during the last 10 years, while the number of patent infringement suits only doubled. More significantly, the number of software patents granted each year is increasing five to six times faster than the total number of all patents granted. At current growth rates, within 10 years, software patents will make up over half of all patents granted in the U.S. each year, compelling evidence of America’s transition from industrial work to information work.
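That 10-year projection is simple compound-growth arithmetic, and a quick sketch can make it concrete. The starting figures come from the paragraph above; the specific annual growth rates are back-of-envelope assumptions (five-fold growth in software patents over a decade, with total patents growing at one-fifth that rate), not the author's actual model:

```python
# Back-of-envelope check of the extrapolation above (illustrative only).
software = 25_000   # software patents granted last year
total = 175_000     # all patents granted last year

g_software = 5 ** (1 / 10) - 1   # five-fold over 10 years ~= 17.5%/yr
g_total = g_software / 5         # total grows five times slower ~= 3.5%/yr

for _ in range(10):
    software *= 1 + g_software
    total *= 1 + g_total

share = software / total
print(f"Projected software share of all patents in 10 years: {share:.0%}")
```

Under these assumptions, software patents edge just past half of all patents in year 10, which is consistent with the claim in the text.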
Based on the make-up of our workforce, America has been an information economy since 1956, when the number of white collar workers first surpassed the number of industrial workers (by 33 percent to 31 percent). Today, information workers make up 55 percent of our workforce, while only 13 percent are industrial workers. Both the industrial and the information sectors, however, produce about the same share of our annual GDP—about one-third. So the average U.S. industrial worker still turns out three to four times more product per person than does the average information worker.
Entering Revolutionary Times
The low productivity of information workers is, in large measure, due to the fact that, although it has been over half a century since the first computer was switched on, we have only recently begun to use information technology (IT) to actually improve the per-capita output of white collar workers. Economic historians report that it typically takes innovative physical technologies—for example, the steam engine, the electric dynamo, and radio/television—about half a century to evolve from initial laboratory demonstration to marketplace maturity and systemwide adoption.
During its first 50 years, a new physical technology must evolve through multiple generations of trial and error, experiment and refinement, in order to become a generally useful, reliable, and affordable marketplace product. Essential elements of this maturation process include the creation of both an appropriate support infrastructure and appropriate social technologies.
During its first half-century (1946–1996), the computer matured through four generations of operating technology—vacuum tubes, transistors, integrated circuits, and silicon chips—evolving from million-dollar, room-filling machines to hundred-dollar hand-held devices. Simultaneously, we created a computer-support infrastructure—the Internet. Just as the construction of inter-city railway systems greatly amplified the economic benefits of the stationary steam engine, and the electricity distribution grid hugely increased the economic and social value generated by individual electric dynamos, the Internet infrastructure—or “info-structure”—is commonly expected to rapidly expand the use of IT to increase productivity by eliminating paperwork—and paper workers.
At the time that the computer and its Internet info-structure achieved marketplace maturity in the mid-1990s, we had not yet invented new social technologies to synergize with the specific productivity-enhancing capabilities of IT. (Note: Physical technologies are specific ways of organizing fabrication materials and scientific processes, while social technologies are specific ways of organizing people and capital. Machines and structures are physical technologies, while institutions and laws are social technologies.)
After Nobel Laureate economist Robert Solow famously concluded in the late 1980s that computers had not improved U.S. productivity—“We see computers everywhere in today’s economy, except in the productivity statistics”—academics flooded into the workplace to find out why. By the late 1990s, micro-economists in Europe and North America had concluded that the failure of IT to improve real productivity could be traced to our failure to invent new social technologies—new organizations, management systems, and jobs—to complement the specific new value-adding capabilities offered by computers.
Since the mid-1990s, Erik Brynjolfsson at MIT’s Sloan School of Management has studied the computer’s impact on corporate performance at more than 1,100 businesses in 41 industries. He has concluded that, in order for IT to generate a statistically significant long-term increase in multi-factor productivity, for every $1 invested in hardware and software, an employer must spend an additional $9 or $10 on training, business process reengineering, and job redesign. Researchers have also found that the overwhelming majority of private and public sector organizations simply add computers and information systems to their existing operations like a coat of paint, without changing their social technology.
As a consequence, most organizations fail to improve their performances in spite of massive investments in IT. Since 2000, however, a widening understanding of new organizational “best practice” has combined with an increasingly competitive marketplace environment to accelerate the reinvention of our traditional industrial-era social technologies. The basic social technology for industrial era production was the “vertically integrated hierarchical bureaucracy.” But we are currently replacing our industrial bureaucracies with “virtually integrated distributed collaboratives.” Adam Smith first demonstrated the superior performance of collaborative specialization more than 200 years ago, but it wasn’t until the creation of our Internet info-structure in the mid-1990s, argues Nobel Laureate economist Ronald Coase, that enterprises have been able to exploit the transformational, productivity-enhancing potential of information technology by outsourcing their noncore functions to outside specialists.
A “Wave of Creative Destruction”
Now that the computer has reached marketplace maturity and our broadband Internet info-structure is in place, the macro-economists believe that we are set for decades of soaring productivity and prosperity, as all enterprises adopt proven new best practices. In March 2003, the major IT companies—Microsoft, Cisco, Xerox, Accenture, HP, and so on—endowed MIT’s Sloan School with $4 million (plus $7 million from the National Science Foundation) to set up the Information Work Productivity Center to publish free online case studies and guidelines for “best practice” deployment of IT in 25 different industries. In their first published paper, Brynjolfsson and his colleagues report that IT applications which improve external relationships—for example, supply chain management, customer and vendor relations, and so on—consistently produce 7 percent higher return on investment (ROI) than information systems designed to improve internal data flows, such as management information systems, enterprise resource planning (ERP), human resource management, and so on.
Researchers at the Federal Reserve report that firms employing “best practice” are now, on average, 45 percent more productive than competing firms employing common practice. In 1990, the gap between best practice and common practice was only 15 percent. Given the size of the gap between “best” and “common” practices, the Fed’s macroeconomists believe that marketplace competition will force firms in all industries to adopt best practice simply in order to survive and that this, in turn, will sustain long-term increases in U.S. productivity and prosperity. But economic historians caution that improving the productivity of our existing enterprises is only half of the techno-economic equation—the destructive part of Joseph Schumpeter’s “wave of creative destruction.”
To satisfy the “creative” half of Schumpeter’s famous formula, we must also mass-produce a new generation of value-adding middle-income jobs, which we have not yet begun to do! At the same time, while practitioners in the emergent discipline of knowledge management agree that organizational restructuring and job redesign are essential for the productive use of IT, they also believe that such investments are, by themselves, insufficient to realize the full productive potential of information technology. Specifically, Craig Samuel, Chief Knowledge Officer at HP, has concluded that the budgetary formula for getting the maximum yield out of IT is 10 percent for hardware and software plus 20 percent for business process reengineering and 70 percent for cultural change. Indeed, even Erik Brynjolfsson agrees there can be no “cookbook” for the “informating” enterprise, because the internal culture of each organization is essentially unique.
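It is worth noting that these two rules of thumb agree with each other: Samuel's 10/20/70 split implies roughly the same 9-to-1 ratio of complementary spending to hardware-and-software spending that Brynjolfsson found. A minimal sketch makes the arithmetic explicit (the function name and the $1 million budget are hypothetical, chosen only for illustration):

```python
def split_it_budget(total_budget: float) -> dict[str, float]:
    """Allocate an IT investment per Craig Samuel's 10/20/70 rule of thumb."""
    return {
        "hardware_and_software": 0.10 * total_budget,
        "process_reengineering": 0.20 * total_budget,
        "cultural_change": 0.70 * total_budget,
    }

budget = split_it_budget(1_000_000)  # hypothetical $1M IT initiative
# Everything that is not hardware/software is "complementary" investment.
complementary = budget["process_reengineering"] + budget["cultural_change"]
ratio = complementary / budget["hardware_and_software"]
print(f"${complementary:,.0f} complementary per "
      f"${budget['hardware_and_software']:,.0f} of hardware/software")
```

The split yields $900,000 of complementary investment for every $100,000 of hardware and software, i.e., Brynjolfsson's $9 for every $1.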
While computers are clearly very good at sharing information, most organizations aren’t. For example, a recent Korn/Ferry International survey of managers in 10 high-tech firms found that:
- only one-quarter of respondents said that knowledge is ever reused across the company, and
- only 10 percent said they had access to lessons learned elsewhere in the company.
Studies by the Center for Research in Employment and Technology in Europe found that millions of dollars in information technology have been thrown at improving in-house collaboration and knowledge sharing in companies like BP, ICI, and Shell, but all have failed because the new information systems were “grafted onto cultures of excessive individualism.”
McKinsey & Co., the international consultancy, recently released the results of an internal audit in which they found that, in spite of a sophisticated in-house knowledge-sharing system, they had provided less-than-optimal advice to three recent major clients, even though the knowledge to have improved upon that advice was available elsewhere in the McKinsey organization at the time, but unknown to the consultants. (McKinsey did not reveal the identity of the clients or offer to reduce their fees.)
Many studies confirm that such “information constipation” is common to large organizations, but it is not at all clear whether this phenomenon arises from “excessive individualism” on the part of all employees or primarily from a desire to control information by the managers who are supposed to implement and enforce corporate knowledge-sharing policies. Indeed, a growing number of chief information officers (CIOs) and motivational experts have come to believe that “information constipation” is not a natural consequence of organizational compartmentalization, but is largely a product of management culture.
Insular “A’s” vs. Collegial “B’s”
Since the 1970s, behavioral scientists have routinely divided society into two personality types. In the workplace, “Type A” personalities are depicted as primarily motivated by extrinsic rewards, such as status, power, and money, to compensate for their putative internal feelings of inadequacy and insecurity. Meanwhile, “Type B” employees are characterized as being more motivated by the intrinsic rewards of work life—professional integrity, personal growth, collegiality, peer recognition, and so on—all of which reinforce the Type B’s ascribed sense of self-worth and sociality. Over the decades, workplace surveys have found that the great majority of managers in large organizations reflect Type A personalities.
There is, as yet, no data linking insecure “control-freak” Type A managers, as a class, to constricted in-house information flow. But there is clear evidence that the Type B employees, who make up more than 75 percent of the rank-and-file workforce, rely heavily on collegial knowledge-sharing to do their jobs well. In his recent book, Smart Business: How Knowledge Communities Can Revolutionize Your Company (Free Press, 1999), Harvard professor Jim Botkin shows that highly productive knowledge workers in all fields maintain “communities of practice”—collegial networks of past and present coworkers, former classmates and instructors, customers, and suppliers—as a readily available, trustworthy source of technical and professional information, guidance, and mutual assistance.
The world’s largest community of practice is the collaborative online network at sourceforge.net that develops and improves Linux open source software. With more than 300,000 registered members working on more than 10,000 projects, the open source community has demonstrated the enormous productive capacity of a self-policing, volunteer meritocracy based on peer collaboration and intrinsic rewards. In the process, the Linux open source model has become a new social technology for producing streams of innovation in every field of endeavor, including pharmaceuticals, automobiles, and bio-tech. What’s more, John Seely Brown, former director of the Xerox Palo Alto Research Center (PARC), asserts that online groupware is about to supercharge the way communities of practice solve problems, answer questions, and push forward the frontiers of knowledge.
The principal new groupware tools include:
- Peer-to-Peer File-Sharing (P2P): Invented for online workplace collaboration, but first used by 37 million teenagers around the world to nearly bankrupt the global music industry; now a mainstay of project management throughout business.
- Instant Messaging (IM): Initially created for use in large scattered work areas, such as construction sites, ports, and railyards, but first used by British women to “swarm” Prince William, and later by college students everywhere to cheat on tests. In the past 36 months, IM has become the fastest-growing technical innovation in the history of U.S. business; 84 percent of large North American firms report making formal use of IM. IM is now used by the U.S. military to enable scattered battlefield units to “swarm” an enemy; IBM uses its IM system to provide instant authoritative answers to technical questions from sales engineers, by permitting KWIC web-searches of 220,000 IBM employees worldwide to find subject-matter experts who are online at the moment.
- Web Logs (“Blogs”): Originally an online platform for IT wonks, blogware is now used by organizations to host online experts as in-house or customer information sources. Available both as software and as an online service, web logs also loom large as future sources of market research information.
- Wikis: Freely downloadable software for collaboratively creating new knowledge bases—dictionaries, glossaries, encyclopedias, and so on—for previously unexamined fields of study (see TWiki.org and Wikipedia.org). The need for such systems will increase exponentially throughout the foreseeable future.
Together, P2P file-sharing, instant messaging, swarms, blogs, and wikis provide the means for rapidly mobilizing “open knowledge” systems for any topic, project, or problem. But how many encyclopedias full of new knowledge will we need in the future? Millions!
A Never-Ending Avalanche of Innovation
While the dot-com bubble is already fading into memory, the marketplace innovations introduced during the late-1990s boom were sufficient to convince a majority of Americans that the computer really is a revolutionary technology that will ultimately transform daily life and work in unpredictable ways. Now that most people have accepted the idea that we are presently living through the “Information Revolution,” a question that is asked of futurists with increasing frequency is: “What’s the next revolutionary technology going to be? Will it be Bio-tech, or Nano-tech, or something different altogether?”
My response to those questions is that the Information Revolution will not be followed by a similar period of socio-economic transformation driven by a single dominant technology. Rather, the coming decades will witness an avalanche of scientific breakthroughs and technical innovations throughout the workplace, the marketplace, and our lives, as IT permits us to lift the veil of ignorance from every frontier of knowledge all at once! Techno-futurist Ray Kurzweil calls this moment in time not just a “revolution,” but a “singularity” in the history of humankind.
Whatever we end up calling it, for each new frontier we explore— immune systems, food chains, computer networks, geologic faults, engineering materials, and so on—we will discover new complexities and subtleties. There will be new questions to answer, new problems to solve, new linkages to understand—all of which will create new fields of study, new multi-disciplinary professions and hybrid technical specialties in the workplace.
For nearly half a century, pioneering British cyberneticist Stafford Beer argued persuasively that the principal challenge of the Information Age will be to “manage ever-increasing complexity.” We took 10 years, for example, to complete a “map” of the 34,000 genes that make up the human genome, only to find that the most important functions of those genes are actually performed by more than 300,000 different proteins that inhabit our cells, and whose behavior is so complex and multivariate that we cannot yet understand it. Because proteins are clearly crucial to our efforts to conquer disease, their study has already become a new scientific discipline—proteomics—whose growth is expected to create thousands of new professional and technical jobs, plus a rising demand for more powerful research tools and production technologies.
Like the completion of the human genome map, other scientific breakthroughs will give rise to further new fields of research, employing cadres of professional and technical personnel. The principal value-adding outputs of the Industrial Economy have been goods and services; the principal value-adding outputs of the Information Economy will be productive processes and decision-critical knowledge. Leaders will need to develop the capacity to manage the ever-increasing complexity of our circumstances, what we might call “complexipacity.” If the history of our previous techno-economic revolutions is a reasonable guide to the future, the transformational phase of the Computer Revolution has just begun. We’ve got at least two decades of hyper-productive workplace innovation ahead of us that can hugely enhance our capacity for purposeful creativity. Coming breakthroughs in computing technology promise to provide us with increasingly robust power tools with which to create a continuous stream of new applied knowledge for mastering the coming cornucopia of complexities:
- For example, today’s microprocessors are typically idle 70 percent of the time. Simultaneous multi-threading (SMT) will permit the creation of 8-processor chips capable of running up to 32 calculations at once, increasing the speed of today’s computers two- to four-fold over the next five years.
- Intel’s newly announced photonic silicon chip, set for marketplace rollout in four to five years, is expected to increase both computing and communications speeds 10-fold!
- And hyper-capacious molecular-scale circuits are on track to reach the marketplace by 2015–2020.
- Beyond that, of course, there’s DNA-based computing, which researchers believe will be able to process hundreds of trillions of instructions per second from a spoonful of sugar water!
Clearly, in the words of the immortal Al Jolson, “We ain’t seen nothing yet!”
- Evaluate how well your organization has supported new physical technologies (computer hardware and software) with social technologies (training, business process reengineering, job redesign) and cultural change. Have outmoded ways of working together prevented you from maximizing the value of new information systems? If so, what new practices and structural changes would lead to higher productivity?
- Assess how best practices are gathered and shared in your organization. Does your business suffer from “information constipation”? If so, what interferes with the sharing of knowledge from business unit to business unit and person to person? Is this a technological, cultural, structural, or management issue, or some combination of these?
- Look into how organizations can leverage the concept of “communities of practice.” Etienne Wenger’s book, Communities of Practice: Learning, Meaning, and Identity (Cambridge University Press, 1999) is a good place to start. In addition, Fred Nickols has compiled a list of other resources at http://home.att.net/~discon/KM/CoPReadings.htm.
- Consider how your organization is preparing leaders to manage the ever-increasing complexity of our circumstances. Will it be positioned to benefit from the scientific and technological breakthroughs yet to come? If not, what changes can your organization make to its leadership development processes?
David Pearce Snyder (firstname.lastname@example.org) has been in the forecasting business for more than 30 years, serving as the Lifestyles editor of The Futurist magazine for nearly two decades. He has compiled a database of trends and projections that he uses to create detailed scenarios—“instant preplays”—of the most probable combinations of economic, technologic, and social realities that specific industries, institutions, or individual communities or nations are likely to encounter during the next five to 15 years. He is principal partner of the Snyder Family Enterprise and has written two books and dozens of articles and reports on the future of major institutions and industries.
Adapted with permission from The Innovation Road Map Magazine, Volume 1, Issue 2 (April–June 2004).