How we work, and how we think about work, have been evolving since humans came into existence. The ancient Greeks and Hebrews viewed work as a bad thing: it was what slaves did. Over time, religious norms celebrating and encouraging hard work emerged and were gradually secularized, so that by the time of Ben Franklin’s Colonial America, idleness was viewed as a disgrace.
As the centuries passed, hunting and gathering gave way to agriculture, and what craftsmen once did by hand was automated. Industrial work, in turn, was replaced by information work: since 1956, white-collar workers in America have outnumbered blue-collar workers.
In the early Information Age of the 1950s, the objective of mainstream workers was to secure work that required considerable thinking, decision-making and judgment. A college degree was thought to give one access to these “good” jobs. Generally, workers were satisfied with their work and wanted to be successful.
But somewhere in the mid-’90s, work became very complicated. IT work in particular became hard to pin down. Is someone who translates medical notes into insurance codes an IT worker?
According to the federal government, there are 4 million IT workers in the U.S. David Foote, a frequently cited authority on IT labor issues, puts the figure at 20 million to 25 million professionals. The size of that gap shows just how sketchy hard, actionable data about the current IT labor pool really is.
What’s worse, there is no generally accepted, overarching framework for IT work. The traditional white-collar/blue-collar and manufacturing-economy/service-economy distinctions are no longer adequate. The business specialist/tech specialist framework set forth in Computerworld’s cover story is a step in the right direction, but it doesn’t go far enough. CIOs still lack a framework for understanding what’s happening in the IT workplace.
There are some things we can know about the future of the IT workforce. We know that in 2020, CIOs will oversee a bubbling masala of multigenerational attitudes and aptitudes. In the Internet Age, IT work has become globalized (it can be done anywhere), specialized (it is highly situation-specific), professionalized (certifications increasingly define IT skill sets) and atomized (required tasks are articulated at a granular level). These trends will continue.
But the trend that will most dramatically reshape the IT workforce and how it is deployed is cloud labor. The labor component of IT will be virtualized, just as servers, storage and desktops have been. Which vendor will become the VMware of this huge, emerging virtual IT labor market is still up in the air.
Finally, a universal skills vocabulary will emerge, making it easier to articulate the IT work to be done. CIOs and the project managers who work for them will go online to match skills, prices, availability and schedules. Sixty percent of the heavy lifting of IT work will be negotiated online (think eBay).
We will also see the emergence of analytically gifted innovators within IT. They can already be found in the innovation departments of future-focused enterprises like Kroger and McDonald’s. Working like venture capitalists, these fact-based, grounded-in-reality entrepreneurs look at what is possible on the edge of technology, then turn their ideas over to mainstream IT to be run at scale.
So don’t despair. Great jobs will still exist in the IT shop.