
Getting ahead of 2020 trends – Bracing for the skills shortage and why UI will disappear

Ahead of her book launch, ‘Duty of Care: An Executive Guide for Corporate Boards in the Digital Era’, certified corporate director Alizabeth Calder is publishing a five-part blog series on IT World Canada examining 10 trends that will be shaping discussions in corporate boardrooms in 2020. Today is Part 3 of 5.

A trend is a movement; a shift in thinking or direction; a change in perspective. The key trends that will be impactful going into the next decade are not about technology – they are about what will be different for business.

Skills – Moving back to investing in training and development

Shortly after Y2K, the industry saw a shift toward the use of contingent labour forces, and that trend has continued.  Moving into the next decade, two-thirds of technology leaders anticipate an extreme skills shortage, and yet 30 per cent of our human resources are still contingent labour. The gap is a self-fulfilling prophecy.

While experienced practitioners get the opportunities, those who are early in their careers continue to lack the experience they need to fill the next generation of positions. As we move toward the next decade, CIOs may need to go back to investing in talented people instead of fighting over the limited pool of talent available. We need to be more vigilant than ever about our investments in training.

According to CompTIA – the U.S.-based non-profit Computing Technology Industry Association – 40 per cent of IT professionals say they do not get the training they need. Time set aside for training is seen as time away from ‘real work’, and there is a continuing refrain of “if I train them, they will leave.” Consider, instead: what if I don’t train them and they stay?

AI – Moving to a continuum of immersive UX and zero UI

The trend in cognitive technology is a genuine transformation of the human/machine interface.  Consider the data:

There are three very different levels of AI capability: ‘Assisted Intelligence’ (improves what we are already doing), ‘Augmented Intelligence’ (enables things that couldn’t otherwise be done) and ‘Autonomous Intelligence’ (lets systems act on their own).  They are often discussed as discrete levels of capability, but use cases show that it is, instead, a continuum.

Changes at the Assisted and Augmented Intelligence end of the spectrum are widely accepted.  They represent most of the 1 + 1 = 3 business cases that support investments. The processes and requirements can be clearly understood, and it is relatively easy to get buy-in to the outcomes. These are the “human-in-the-loop” solutions. This end of the continuum is enabled by the increasing elasticity of computing resources, and we can expect to see more non-traditional applications.  For example, Capgemini predicts that by 2020 a quarter of companies will replace initial screening interviews with recruiting chatbots. Computers are adapting to their humans rather than the other way around.

At the other end of the continuum, the trend line for Autonomous Intelligence is less clear.  As more autonomous devices quietly listen for a call to action, some 87 per cent of consumers have concerns about their privacy, according to PwC.  Autonomous devices such as cars are evolving, but the human factor will be a barrier to full adoption. Innovators will feel more pressure from these fear-based human dynamics.  The advantage we technologists have is that we have been through this battle before.  Think of ATMs in the early 1980s, when customer responses ranged from skeptical to hostile.  Notable quotes reported in Smithsonian Magazine included “at least the girl behind the window doesn’t die in the middle of helping me.”

CIOs who need to stay ahead of this trend may want to look at how resistance was tackled in early technology use-cases.

Duty of Care: An Executive Guide for Corporate Boards in the Digital Era is available on May 7.