Bob Lewis
Columnist

A new EDI architecture?

analysis
Apr 8, 2006 · 4 mins


Dear Bob …

I thought I remembered something from an old column but couldn’t find it. So I thought I’d drop a line to see if you remembered; see if you had $.02; or see if perhaps a column springs out of it.

Topic – electronic data interchange (without caps).

Our experience:

Generation 1 (1980s) – point to point (RJE)

Generation 2 (1990s) – hub / clearinghouse (xmodem)

Current – Generation 2 plus FTP to hub. AND FTP Everywhere (point to point)

The article I remembered was some company’s regret about this (essentially) unmanaged point-to-point stuff. Not talking security here … I’m thinking of data-flow integrity, controls, etc.

I also see this behaviour as a perpetuation and expansion of a de facto batch-processing mentality, and it will be a wrenching change to adapt these processes, when the time comes, to a services-based process.

I’m visualizing an architecture where the outward-facing FTP services connect to the service bus, which then manages delivery of the data to the internal service, which can be batch or real-time (depending).
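Sketched as code, the idea above looks something like this. This is purely an illustrative sketch, with Python standing in as pseudocode; every name here is invented, not drawn from any real EDI product or API:

```python
import queue
from pathlib import Path

# Hypothetical sketch: the outward-facing FTP endpoint drops files into a
# directory; a small "bus" routes each inbound EDI file either to a batch
# queue or to a real-time internal service, depending on the route's mode.

batch_queue: "queue.Queue[Path]" = queue.Queue()

def realtime_handler(path: Path) -> str:
    # Stand-in for an internal service invoked as the file arrives.
    return f"processed {path.name} immediately"

def route(path: Path, mode: str) -> str:
    # The "service bus" decision point: batch or real-time (depending).
    if mode == "realtime":
        return realtime_handler(path)
    batch_queue.put(path)  # held for the next batch run
    return f"queued {path.name} for batch"
```

The point of the sketch is only that the routing decision lives in one place (the bus), so an internal service can move from batch to real-time without touching the FTP-facing edge.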

But I can hear the arguments that this is just complicating life, etc…

Interesting topic?

– Connecting

Dear Connecting …

Very interesting, to me at least, although I don’t recall writing any columns about this.

My first involvement in the EDI wars was in the early 1990s. It’s where I first learned that the single biggest challenge with respect to EDI isn’t technical – it’s semantic (I think the current buzzword is “ontology,” although I lose track).

It’s also where I first embraced the notion of clearinghouse store-and-forward services as solutions to the logistical nightmare of managing point-to-point connections with the web of suppliers and customers most companies would otherwise have to deal with directly in a full EDI deployment.

Of course, if you have more than one or two clearinghouses, they’d need to exchange data. Otherwise, they’d just recreate the problem they’re trying to solve.

You’re right – the whole shebang is built on batch. The ballyhooed switch to XML formats (from ANSI X12 and EDIFACT) modernized the data format but didn’t change anything essential in the processing architecture.
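To make that concrete, here is a simplified, deliberately not standards-accurate illustration of the same purchase-order data in an X12-style segment and an XML-style document. The segment contents and field values are invented; the point is that swapping one wrapper for the other leaves the batch processing model untouched:

```python
# Simplified, illustrative comparison -- not a faithful rendering of
# ANSI X12 or of any real XML EDI schema. Values are made up.

x12_style = "BEG*00*NE*PO12345**20060408~"  # X12-style: elements split by '*'

xml_style = (
    "<PurchaseOrder>"
    "<Number>PO12345</Number>"
    "<Date>20060408</Date>"
    "</PurchaseOrder>"
)

def parse_x12_segment(segment: str) -> list:
    # Same information, different wrapper: either way the file sits in a
    # feed until the batch run picks it up.
    return segment.rstrip("~").split("*")
```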

I confess that I don’t see much advantage to the architecture you propose. When the incoming and outgoing data feeds are intrinsically batch there isn’t much point to connecting them to an internal real-time architecture. It won’t have any impact on either timeliness of data or business process design. So unless inbound EDI transactions have to negotiate a serious interface spiderweb, I’d say make sure you’re using a modern ETL tool to put the data stream in the right format and leave it at that.
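As a toy illustration of that normalization step (this is not any particular ETL tool’s API; the feed layout and field names are invented for the example):

```python
# A toy extract-transform-load pass over a batch EDI feed. Purely
# illustrative: a real ETL tool adds mapping, validation, and error
# handling that this sketch omits.

raw_feed = [
    "PO12345|20060408|ACME",    # extract: lines pulled from the inbound file
    "PO12346|20060409|GLOBEX",
]

def transform(line: str) -> dict:
    # Transform: reshape each flat line into the format the internal
    # systems expect.
    number, date, partner = line.split("|")
    return {"po": number, "date": date, "partner": partner}

loaded = [transform(line) for line in raw_feed]  # "load" (into memory here)
```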

What would, in some cases, be valuable would be changing the architecture of the external clearinghouse services from batch to quasi-real-time SOA. I’m not talking about true transaction processing, of course – a two-phase commit at Internet latencies would be an ugly thing. But connecting service buses across an extranet in a way that fed transactions to their proper destinations as they happen might be of significant interest in some industries.
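A minimal sketch of that quasi-real-time idea follows; every route, destination, and transaction-type code here is hypothetical, and the "send" is deliberately fire-and-forget rather than a two-phase commit:

```python
# Hypothetical sketch: instead of accumulating transactions into a nightly
# batch file, the clearinghouse forwards each one toward its destination
# as it arrives.

delivered = []  # records of (destination, transaction) sends

def deliver(destination: str, txn: dict) -> None:
    # Stand-in for a one-way send across the extranet -- fire-and-forget,
    # not true transaction processing.
    delivered.append((destination, txn))

ROUTES = {"850": "supplier_bus", "810": "buyer_bus"}  # txn type -> destination

def on_transaction(txn: dict) -> str:
    # Route the transaction the moment it shows up, rather than queueing
    # it for the next batch window.
    dest = ROUTES[txn["type"]]
    deliver(dest, txn)
    return dest
```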

I’d see two major challenges to this. The first is security: the worst thing bogus data can do in a batch feed is embezzle. In an SOA environment we’d be creating a brand-new, exciting Trojan horse architecture.

The second, and truly difficult, challenge is this: innovation is fast; infrastructure change-out is slow. We’re talking about a coordinated change in processing architecture among a critical mass of participants in a whole industry. That typically takes a decade or so.

Just about the only way this could happen, I’d think, would be for a dominant customer in some industry to mandate the change on a specified timetable. Wal-Mart would be a likely candidate; so might General Electric in some of the industries in which it is a major player.

But even with this kind of driver, the logistics involved in switching a whole industry from batch to real-time dwarf what was needed (or, in many cases, what is still needed) when shifting a single company’s internal architecture from batch to real-time.

– Bob