I recently attended a Webinar entitled “Computing Professionalism: Do Good and Avoid Evil … and why it is complicated to do that in computing”. It was very informative and so was the Q&A session afterward. I think we all laughed when one participant asked “How do we break the temptation to just add more technology to solve the problem?”
The speaker was Don Gotterbarn from the Software Engineering Ethics Research Institute in Tennessee and I was very pleased that even in the U.S.A. he knew about our CIPS Code of Ethics.
He used the example of a bar code scanner at a supermarket positioned so that customers couldn’t see the clerk’s face. The answer was to add a screen higher up so the clerk would look there instead. But then the screen blocked the customer’s face.
We all laughed because it is in our nature to apply the technology we know to solve any problem we are given. If I understood the implications correctly, the solution here was simply to ease the pressure to maximize throughput, allowing the clerk time to look up and add personal value for the customer.
Don’s point was that we should not just “do whatever someone wants” (which he calls the Agency model). Nor should we just “do what we think is right” (which he calls the Paternalistic model). It should be a joint effort (Fiduciary model) and there should be a deliberate effort to think outside the frame of reference you are given and think of how other people will be impacted.
I particularly identified with his examples regarding making interfaces accessible. If we focus only on profit and deadlines, it is very hard to make a business case for a better interface. But Don encourages us to use our professional judgement and take those few extra steps during design that can make a big difference to someone using the system. Design is the cheapest time to do it, and clients are paying us for ALL of our talents – not just to build what they say. In his experience, if you suggest something “extra” or different to the users, they will either become your biggest fan (which is good) or they will say “you don’t understand our situation, you would have done this all wrong” (which is also good – find out early!).
Don warns that psychiatrists have found most people believe they are more moral than they actually are. Having said that, Don agrees with Socrates that if people know what the right thing to do is, they will do it. Or 99 per cent of them will. Let’s focus on making sure that majority knows what the right thing is. With this in mind, he recommends a Software Development Impact Statement that could become as standard for us as the Environmental Impact Statement has become for civil engineers.
How can we move forward with this suggestion? I suspect we need to educate business as much as we have to educate designers about this.