Database maker InterSystems Corp. is expected to unveil integration software next week aimed at making it easier for users to build composite applications that cull data from existing legacy systems.
InterSystems’ new Ensemble platform combines the vendor’s database technology with new integration tools, an application server and a common development environment. Modeling and management features help users integrate data sources, map business processes, and build and monitor composite applications.
At the core of the Ensemble suite is an object database for managing and storing metadata, messages and process information. This persistent object engine, which is based on InterSystems’ Caché database technology, is what differentiates Ensemble from other application integration suites, said Roy Schulte, a vice president at Stamford, Conn.-based research firm Gartner Inc.
The object engine sits between systems and makes it easier for users to develop new applications by masking the complexity of the links to heterogeneous back-end systems, Schulte said.
“The new code, rather than going out to get data from existing databases, goes to this virtual object which is sitting in the middle of the network,” Schulte said. “Behind the scenes there may be a lot of sophisticated things happening to map that persistent object in the middle back to the actual original source application databases. But you don’t have to see all the ugliness.”
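To make that pattern concrete, the sketch below (in Python, purely illustrative and not InterSystems’ actual API) shows what such a virtual object can look like: application code reads attributes from one consolidated record, while hypothetical adapters fetch the underlying fields from the back-end systems the first time they are needed.

```python
# Illustration of the "virtual object" idea Schulte describes -- not
# InterSystems' actual API. ClientRecord and the source adapters are
# hypothetical stand-ins for the mapping Ensemble would do behind the scenes.

class EligibilitySystem:
    """Pretend adapter for one legacy back-end system."""
    def fetch(self, client_id):
        return {"benefit_status": "active"}       # stubbed legacy lookup

class CaseHistorySystem:
    """Pretend adapter for a second legacy back-end system."""
    def fetch(self, client_id):
        return {"open_cases": 2}                  # stubbed legacy lookup

class ClientRecord:
    """The 'virtual object sitting in the middle of the network': new code
    reads attributes here and never touches the source databases directly."""
    _sources = [EligibilitySystem(), CaseHistorySystem()]

    def __init__(self, client_id):
        self.client_id = client_id
        self._fields = {}

    def __getattr__(self, name):
        # On first access, pull the fields from whichever back ends own them.
        if name not in self._fields:
            for source in self._sources:
                self._fields.update(source.fetch(self.client_id))
            if name not in self._fields:
                raise AttributeError(name)
        return self._fields[name]

# Application code sees one object, not a collection of legacy systems.
record = ClientRecord("FL-000123")
print(record.benefit_status, record.open_cases)
```

The point of the design is that the mapping logic lives in the middle tier, so new composite applications never need to know which legacy system owns which field.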
While InterSystems’ approach is novel – neither BEA Systems Inc. nor IBM Corp. nor Microsoft Corp. offers anything similar – the technology is not mainstream and requires learning a new style of development, he said.
Moreover, InterSystems will have to work to gain recognition among users because it’s not known as a player in the integration market, said Mark Ehr, a senior analyst at Enterprise Management Associates Inc.
One customer that’s already on board is the State of Florida’s Department of Children and Families (DCF). The agency uses Ensemble to construct a consolidated view of data that’s scattered across dozens of disparate databases and accessed by 59 different systems.
The goal is to create a single view of all relevant data about a client, said Glenn Palmiere, IT director at DCF in Tallahassee. DCF conducted a pilot project last year to connect five systems and is working to extend its Ensemble deployment across all 59 systems, Palmiere said. Down the road, DCF plans to pull in information from other agencies that deal with health and human services in Florida.
“We’re basically going to create a single family interface so that regardless of which agency provides the service and regardless of the system that the data is stored on, an individual will be able to access all the information related to a person,” he said.
Ensemble’s real-time characteristics are critical, Palmiere said. Rather than collecting and storing information in a data warehouse, Ensemble lets DCF extract data elements at the moment a user needs the information, he said. “When you’re talking about data that’s relevant to an individual’s life or care, you do not want data that was refreshed a week ago,” he said.
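As a rough illustration of that difference (again in Python, with hypothetical function names rather than anything from Ensemble itself), an on-demand view queries the live source systems at the moment of the request instead of reading a copy loaded at the last scheduled warehouse refresh.

```python
# Hedged sketch of the real-time idea Palmiere describes: pull each data
# element from its live source when the user asks for it, rather than from
# a warehouse copy refreshed on a schedule. All names here are hypothetical.
from datetime import datetime, timezone

def query_live_source(system_name: str, client_id: str) -> dict:
    # Placeholder for a real-time call into one of the legacy systems.
    return {"system": system_name, "client_id": client_id}

def client_view_on_demand(client_id: str, systems: list[str]) -> dict:
    """Assemble the consolidated view at request time, so every field is as
    fresh as the source system itself."""
    view = {"retrieved_at": datetime.now(timezone.utc).isoformat()}
    for name in systems:
        view[name] = query_live_source(name, client_id)
    return view

# A warehouse-style lookup, by contrast, would return whatever was loaded at
# the last scheduled refresh -- potentially days old for a fast-moving case.
print(client_view_on_demand("FL-000123", ["eligibility", "case_history"]))
```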
Ensemble runs on Hewlett-Packard Co.’s Alpha OpenVMS and Alpha Tru64 Unix, HP-UX, IBM AIX, Linux, Sun Microsystems Inc.’s Solaris and Windows platforms. Pricing starts at US$125,000 per CPU.