IT initiatives from recent years need to be made integration-ready

Rob Hailstone | Jan. 25, 2010
Integration processes have become as important as the functionality of the systems being integrated.

Other recent initiatives, such as complex event processing (CEP), are integration-ready: a detected condition can, for example, initiate a transaction or process in the traditional system world. However, CEP is just the tip of the iceberg of the broader issue of processing in-stream data. In-stream processing breaks all of our ingrained architectural expectations: instead of being built around the generic IT model (detect an event, store the captured information, process the data), it omits the data-capture step and performs all analysis and subsequent actions in-flight.
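To make the contrast concrete, here is a minimal Python sketch of the two models. Everything in it is hypothetical illustration (the Event record, the 100.0 threshold, the trigger_process action); it shows the pattern described above, not any particular CEP product's API.

    # Minimal sketch: traditional detect-store-process vs. in-stream processing.
    # All names (Event, trigger_process, the 100.0 threshold) are hypothetical.

    from dataclasses import dataclass
    from typing import Iterable

    @dataclass
    class Event:
        source: str
        value: float

    def trigger_process(event: Event) -> None:
        # Placeholder for initiating a transaction or business process
        # in the traditional system world once a condition is detected.
        print(f"(in-flight) initiating follow-up for {event.source}")

    def traditional_pipeline(events: Iterable[Event], database: list) -> None:
        # Generic IT model: detect the event, store the captured
        # information, then process the data in a later step.
        for event in events:
            database.append(event)            # data-capture step
        for event in database:                # separate processing step
            if event.value > 100.0:
                print(f"(batch) anomaly detected from {event.source}")

    def in_stream_pipeline(events: Iterable[Event]) -> None:
        # In-stream model: the data-capture step is omitted; analysis
        # and the resulting action both happen while the event is in flight.
        for event in events:
            if event.value > 100.0:           # condition detected on the wire
                trigger_process(event)        # close the loop immediately

    if __name__ == "__main__":
        stream = [Event("card-123", 42.0), Event("card-456", 180.0)]
        in_stream_pipeline(stream)                  # acts in-flight, stores nothing
        traditional_pipeline(stream, database=[])   # stores first, analyses later

The structural difference is the absence of the persistence step: in the in-stream path, detection and the follow-on action occur before, or instead of, any write to a data store.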

Many applications of in-stream processing appear to stand apart from mainstream IT, surfacing in areas such as fraud detection, patient health monitoring, security and production-line automation. In reality, every meaningful event has a role to play in a business process, and merely detecting these events without closing the loop to influence the process misses the point entirely.

We need to remember the build-to-integrate lesson

The same requirement applies to all IT innovations. Using humans as integrators should not be acceptable, so we need to raise our expectations of how easily new technologies can be integrated. IT should be alert to individuals investing in technologies that will cause ongoing integration problems, and should keep up the pressure on vendors to deliver only products that are integration-ready. How effectively investments in new technologies are converted into business advantage will depend on it.

Rob Hailstone is an Ovum analyst.