Eric Knorr
Contributing writer

IBM’s master plan for data

news
Nov 21, 2005
3 mins

Big Blue takes its time stitching together three acquisitions to create a master-data offering

What’s the ugliest problem in IT? Many would say it’s the contradictory or incomplete data strewn around the enterprise in various databases and formats. Reconciling and normalizing all that data is hard, tedious work.

There’s no magic bullet, but “master data” solutions of the type IBM formally introduced last week can go a long way toward enabling enterprises to create a single version of the truth without driving IT insane.
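The reconciliation problem the article describes can be sketched in a few lines. The records, field names, and merge rule below are hypothetical illustrations of the general idea, not IBM's implementation: normalize formatting differences, then fold duplicate records into one "golden record," letting the most recently updated non-empty value win.

```python
# Hypothetical sketch: the same customer appears differently in two
# systems, and a master-data layer reconciles them into one record.

def normalize(record):
    """Smooth out trivial formatting differences before merging."""
    return {
        "name": record["name"].strip().title(),
        "email": record["email"].strip().lower(),
        "phone": "".join(ch for ch in record["phone"] if ch.isdigit()),
    }

def merge(records):
    """Build a golden record; newer non-empty values win."""
    golden = {}
    for rec in sorted(records, key=lambda r: r["updated"]):
        for field, value in normalize(rec).items():
            if value:
                golden[field] = value
    return golden

# Two divergent copies of the same customer (illustrative data only).
crm = {"name": "jane doe ", "email": "JDoe@example.com",
       "phone": "555-0123", "updated": "2005-06-01"}
billing = {"name": "Jane Doe", "email": "jane.doe@example.com",
           "phone": "(555) 0123", "updated": "2005-10-15"}

print(merge([crm, billing]))
# → {'name': 'Jane Doe', 'email': 'jane.doe@example.com', 'phone': '5550123'}
```

Real master-data products layer matching heuristics, survivorship rules, and workflow on top of this, but the core move — normalize, match, merge — is the same.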

Three years in the making, WebSphere Product Center, Customer Center Version 6, and Integration Center together offer a complete “information service” solution, said Dan Druker, IBM’s director of enterprise master data solutions. The new offerings are primarily a consolidation of products acquired in an IBM shopping spree that included the purchase of Trigo Technologies (product information middleware); DWL (customer data integration software); Ascential Software (enterprise information integration); and SRD (identity resolution software). Druker said IBM has been working with these products in the field for some time — with 500 customers using them in various combinations — and has drawn on that experience to evolve the products to their latest release versions.

Chuck Coleman, director of product support systems at Corporate Express, is one of those customers. He has been using Trigo Product Center — now WebSphere Product Center — “to manage our master data for the product side of the house.” Corporate Express’ customers were demanding more product catalogs and more frequent updates, and Coleman needed to “add consistency to the data to reduce our Q&A.” The IBM/Trigo solution also filled other needs, such as catalog versioning and batch update scheduling, and provided a clean workflow that divvied up data ownership to the appropriate parties. A big benefit, Coleman said, is that end users can make changes that previously had required formal requests to database administrators.

IBM’s Druker said a number of key drivers point toward the master-data approach. The first is simply cost savings through elimination of redundancy — what he calls the “once and done” method of updating. Next in line is compliance because it’s hard for companies to meet regulatory demands if there are multiple versions of the truth running around. Accurate, up-to-date credit and collections information is another motivation, as is the new wave of M&A activity, which has resulted in huge data consolidation problems.
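Druker's "once and done" point can be made concrete with a toy sketch. The service name and fields below are invented for illustration; the idea is simply that when every application reads from one shared master instead of keeping its own copy, a single update is visible everywhere.

```python
# Hypothetical sketch of "once and done" updating: one shared master
# record, so billing and CRM never hold divergent copies.

class CustomerMaster:
    """Single point of access for consolidated customer data."""

    def __init__(self):
        self._store = {}  # customer_id -> golden record

    def update(self, customer_id, **fields):
        # One update here is seen by every consuming application.
        self._store.setdefault(customer_id, {}).update(fields)

    def get(self, customer_id):
        return dict(self._store.get(customer_id, {}))

master = CustomerMaster()
master.update("c-100", name="Acme Corp", credit_limit=50000)

# Both billing and CRM read the same record -- no redundant copies
# to reconcile, and compliance reports draw on one version of truth.
print(master.get("c-100"))
```

With per-application copies, the same credit-limit change would have to be applied, and verified, in every system that stores it — the redundancy cost Druker describes.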

Druker also noted that IBM’s master data suite fits in nicely with today’s overarching IT trend, SOA, which demands that a full range of applications distributed across the enterprise have access to a consolidated set of services. And rigorously consistent information services — whether for product, customer, supplier, or employee data — are arguably the most important services of all.

Eric Knorr

Eric Knorr is a freelance writer, editor, and content strategist. Previously he was the Editor in Chief of Foundry’s enterprise websites: CIO, Computerworld, CSO, InfoWorld, and Network World. A technology journalist since the start of the PC era, he has developed content to serve the needs of IT professionals since the turn of the 21st century. He is the former Editor of PC World magazine, the creator of the best-selling The PC Bible, a founding editor of CNET, and the author of hundreds of articles to inform and support IT leaders and those who build, evaluate, and sustain technology for business. Eric has received Neal, ASBPE, and Computer Press Awards for journalistic excellence. He graduated from the University of Wisconsin, Madison with a BA in English.