The most broken process in enterprises today is the Byzantine mendicancy, sycophancy and outright idiocy associated with introducing new technology. If you don’t believe me, ask any Jack or Jill in the cubicles to rank the various silo captains for innovativeness. The CIO, and his IT organization, will trail the pack.
We need a forklift upgrade of the way new technology enters the organization. We need to demand Technology Adoption 2.0.
Running a Six Sigma digital factory is no longer good enough. If IT is to maintain (or obtain) a seat at the big table, it needs to do more than keep the digital lights on. It needs to explain, socialize and then deploy a charismatic and value-exuding portfolio of emerging technologies.
Why is technology adoption so hard? CIOs and the organizations they lead have rarely, if ever, been in front of the curve. We missed minicomputers, we lagged on PCs, we were anti-Web, we were bum-rushed by mobility, and we are now accused of being slow on the uptake of Web 2.0 technologies like Facebook, wikis and blogs. We are of the Nancy Reagan school of technology management: We just say no. Are we destined to always be the wet blanket on technology-enabled opportunities?
Perhaps the technology adoption problem is genetic. Paul R. and Anne H. Ehrlich, authors of The Dominant Animal: Human Evolution and the Environment, observe that our just-out-of-the-trees ancestors evolved decision-making algorithms designed to respond to sudden changes in the environment. Our prehistoric ancestors passed on their genes because they were the ones who reacted quickly to hungry predators, not the ones who pondered slow-moving, hard-to-see trends. Thus, the Ehrlichs argue, we are genetically predisposed toward short-term, easy-to-quantify investments.
But the world has changed, and we have to evolve to meet the new realities of nonlinear predators — or competitors. What sets us apart from the lower orders is not our ability to communicate, create tools or collaborate. It is the ability to simultaneously entertain, evaluate and imaginatively inhabit multiple future worlds. Successful executives will need to excel at the art of futuring.
Or perhaps the technology adoption problem is systemic. Paul Saffo, an insightful forecaster at the Institute for the Future, observes, “The amount of time required for new ideas to fully seep into a culture consistently has averaged three decades.” Can’t we accelerate this process? Must we wait patiently for the vendors to come up with the technologies we need? Is the technology future truly unpredictable?
Ray Kurzweil, whose many achievements include writing The Singularity Is Near: When Humans Transcend Biology, argues that “the overall progression of information technologies is remarkably predictable. The price-performance of computing has grown at a remarkably smooth, doubly exponential pace for over a century, going back to the data processing equipment used in the 1890 U.S. Census.”
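The difference between plain exponential and "doubly exponential" growth is easier to feel with numbers. The sketch below is illustrative only; the growth rates are invented placeholders, not Kurzweil's figures. The point is simply how quickly a growth rate that is itself growing pulls away from a fixed doubling time.

```python
import math

# Illustrative only: contrast a fixed doubling rate with "doubly exponential"
# growth, where the doubling rate itself grows over time. The rates below are
# invented placeholders, not Kurzweil's data.

def exponential(years, doublings_per_year=0.5):
    """Price-performance with a fixed number of doublings per year."""
    return 2 ** (doublings_per_year * years)

def double_exponential(years, initial_rate=0.5, rate_growth=0.03):
    """Price-performance whose doubling rate grows exponentially.

    The instantaneous rate is r(t) = initial_rate * e**(rate_growth * t);
    integrating it gives the cumulative number of doublings after `years`.
    """
    doublings = initial_rate * (math.exp(rate_growth * years) - 1) / rate_growth
    return 2 ** doublings

for years in (10, 30, 60):
    print(f"{years:>2} yrs: exponential x{exponential(years):,.0f}, "
          f"doubly exponential x{double_exponential(years):,.0f}")
```

Run over a 60-year horizon, the fixed-rate curve improves about a billionfold while the accelerating curve runs away by many more orders of magnitude, which is the crux of Kurzweil's predictability argument.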
The stuff that is happening in labs around the world is invisible to most CIOs. We need to stop talking to vendors about what they have on the back of the truck and start discussing the technologies on their drawing boards that could create a sustainable competitive advantage.
We need to create landing zones for near-term technologies, sandboxes for farther-out technologies and co-development labs for way-out-there technologies. We need to recapture the can-do attitude that placed a man on the moon. We live in a world soon to be populated with petaflop (1,000 trillion calculations per second), exaflop, zettaflop, yottaflop and xeraflop supercomputers. With this kind of technology waiting in the wings, upgrading to Technology Adoption 2.0 is not optional.
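For a sense of scale, the short sketch below spells out what those prefixes mean in raw operations per second. Peta through yotta are standard SI prefixes; "xera" is not an official prefix, so it is left out here.

```python
# Illustrative scale check for the supercomputer classes named above.
FLOPS_PREFIXES = {
    "petaflop": 10**15,   # 1,000 trillion calculations per second
    "exaflop": 10**18,
    "zettaflop": 10**21,
    "yottaflop": 10**24,
}

for name, ops_per_second in FLOPS_PREFIXES.items():
    print(f"{name:>10}: {ops_per_second:.0e} operations per second "
          f"({ops_per_second / 10**12:,.0f} trillion)")
```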
Thornton A. May is a longtime industry observer, management consultant and commentator. You can contact him at thorntonamay@aol.com.