When the Internet and the World Wide Web caught on, moving well beyond a public curiosity to become firmly entrenched as a wave of the future, there was all manner of speculation about how our data would be stored out on the Web. Our documents and information would exist only in cyberspace, and the ASP and the thin client would rule.
Though that remains a possibility for the future, the reality is that companies and individuals are keeping their data close to their chests.
“There is a desire for computing as a utility,” said Irving Wladawsky-Berger at a recent Global Grid Forum in Toronto. “It is a wonderful concept but utilities need commonality to work…the reason that utilities and ASPs have gone close to nowhere is that they don’t have a common protocol.”
Wladawsky-Berger, often referred to as IBM’s CTO, equated the situation to countries and cities using completely different electrical systems. A Canadian laptop may need an adapter to plug into a French outlet, but that is the extent of the incompatibility. With the Internet, the situation is more complex.
Ten years ago there were no cultural standards, he said. “Everybody at that table was out to screw everybody else at that table,” he added. Today the situation is improving with the likes of WSDL, XML, SOAP and Linux. Now the culture is no longer vendor-specific but rather industry-specific, he added.
Network is the message
Wladawsky-Berger, whose official title is vice-president, technology and strategy, IBM server group, said IBM’s customers are telling the company that technology is not getting cheaper fast enough. And with technology proliferating at companies and costs becoming a bigger issue, there is a need to get more efficiency out of the technology they do own.
One way to help reduce costs is to use the Internet more for networking needs. But there is corporate trepidation. When a big Web site fails – for a variety of reasons – the response is, “Well, it’s the Internet, what do you expect?”
But a dependable Internet is exactly what companies want. “Make the damn thing (the Internet) really work as a computing platform,” he said, paraphrasing a demand he has heard more than once.
Wladawsky-Berger said the work being done by groups such as the Globus Project is helping.
According to the Globus Project Web site, “Grid computing is distinguished from conventional distributed computing by its focus on large-scale resource sharing, innovative applications, and – in some cases – high-performance orientation.”
And like many Grid folk, Wladawsky-Berger is a firm believer in open source. “[It] will do [a] tremendous amount to facilitate the integration of layers.”
Once there is protocol commonality, companies will be able to outsource the solutions they choose not to own, and computing can become a truly virtual experience. “You can choose what parts you want to buy, own or rent,” he said.
All of this is, of course, helped by the fact that increased bandwidth is available to users at increasingly affordable prices. In the early 1990s, “the cost of bandwidth was a joke,” Wladawsky-Berger said. Grid computing was simply not feasible given connectivity prices. Working with a virtual computer whose components are distributed over the Internet requires extremely fast connectivity, he added. “Grid, I think, is going to be a huge user of bandwidth.”
In the near future, IT managers are going to both need and want the ability to pass off some of the work to other systems.
In a few years, IT managers will be managing roughly an order of magnitude more technology, and that amount will continue to increase every five years, Wladawsky-Berger said. If they are having a tough time of it today, tomorrow it will be next to impossible, he added.
In the future, the main requirement of IT infrastructures will be adaptability, and key to this is an integrated world of grid computing, Wladawsky-Berger said.