Ontario’s troubled SAMS information management system is “absolutely among the worst” examples of poor project management one expert says she has ever seen.
“Hard for me to believe any sentient human being could look at the danger signals on this project and press the ‘Go’ button,” Anna Murray, principal at TMG-emedia Inc., a New York City technology consulting firm, said in an interview. Murray has run large technology projects for 20 years, including major ERP rollouts, publishing systems for media companies and infrastructure systems.
She’s also the author of an upcoming book on software project management.
Murray made the comments after reviewing the 35-page report by Ontario’s auditor general into the disastrous launch of the Ministry of Community and Social Services’ Social Assistance Management System last year.
Almost three years behind schedule, over budget and riddled with serious defects and legacy data errors, the system’s failings were largely blamed by the auditor on the ministry for not overseeing the project more closely. But the report also noted that IBM, hired initially to convert two years of legacy data, delivered its work late and filled with errors. Curam Software, which had to adapt its case management suite and was bought by IBM after the project started (and was therefore managed by IBM), delivered software with major defects.
Fault lay ultimately with the ministry’s executive committee, said the auditor — a committee that included the CIOs of the province and the social services ministry — for knowingly assuming the significant risk of launching a new computer system that was not functioning properly.
The committee was told SAMS met only one of 18 go-live launch criteria, the auditor noted, and that the software had 418 serious defects, with workarounds for only 217 of them.
Still, the report added, the committee wasn’t told of many serious defects that project staff knew about.
Nor was the committee told of the actual number of user acceptance tests conducted and their results; that not all interfaces were tested; the lack of testing done to compare daily pay runs in SAMS with the previous system; and the lack of testing of converted data.
Ministry staff told the auditor the executive committee wasn’t told of serious defects because staff had already started developing solutions or fixes for them. However, SAMS launched before those fixes were implemented.
The report was written from an auditor’s perspective, and Murray acknowledged it left some questions unanswered, such as how certain decisions were made.
But she said there are still lessons to be learned:
–Things go wrong when bad decisions are made early. “Project management and project success is holistic — by the time the discovery, staffing and the decisions are made on a project, your fate is largely determined.”
In this case she notes the auditor pointed out flaws in SAMS’ design (see the main story).
–Be careful buying packaged software for customization. “There’s this siren song of packaged software, because it’s off the shelf and ‘it has this name that seems to match what we’re doing,’” Murray said. “But going that route is tricky, and if you don’t make that decision correctly you can find yourself at a late stage wishing you had written a custom piece of software, because the level of customization is quite high. Packaged pieces of software don’t like to be bent in a direction that they are not intended to go in.”
“Really carefully map the packaged software functions against the business requirements and ask what level of customization is needed. Second, ask if the vendor has done those customizations before.”
–Make sure decision-makers understand what risk means. Ordinary people “interpret the word as ‘there is a chance of harm but I likely will not endure it,’” said Murray. “But in software, risk means ‘a consequence will very likely happen’ – not might, but will.”
In approving SAMS, it seemed to her the executive committee didn’t understand the risk, even though two CIOs sat on the committee along with other senior bureaucrats.
On the other hand, the auditor’s report suggests the committee wasn’t given the full picture. Still, it decided to roll the dice.
And the old system was so balky it seems no one wanted to keep using it until all the bugs were ironed out of SAMS — a violation of Murray’s rule that there should always be a roll-back plan.
But Murray insists someone with technology knowledge and all the data should have been there to “look in the eyes of the executive committee and ask ‘Do they get it?’”
There are five risky things in a software project, Murray says: data migration, integration/interfaces, large project size, customization and new technology. “This project had four out of the five. That means the risk profile soars.”
Murray’s book, ‘The Complete Software Project Manager,’ will be published in January by Wiley.