Following a presentation I gave last December as part of my work with the French association of software publishers, here is a series of articles covering the ideas developed in that talk.
3 good reasons to take an interest in interoperability
Interoperability itself isn't a new subject. Sitting at the boundary between systems, whether hardware or software, the problem has existed technically since the beginning of computer science.
Addressing this challenge is necessary for software publishers, today more than ever, because it is often complex and multifaceted, and not only on the technical side.
Reason 1: platform fragmentation
The first reason to take an interest is the underlying trend toward a multiplication of platforms.
Indeed, with the strong supremacy of the Wintel duo until the mid-2000s, interoperability had lost some of its prominence in certain areas.
But it is clear that Microsoft has lost ground on the platform side in recent years. Of course, Windows and the PC still dominate the market, but the multiplication of devices, especially mobile ones, has changed the rules of the game.
Tablets are certainly not going to simplify matters, and even though Microsoft is working hard on them with Windows 8, it seems almost certain that the era of a single dominant system provider is behind us… at least for the near future!
Even on the PC, the plurality of web browsers, something we could legitimately have believed was behind us, is back, with all the technical issues it implies for developers.
All of this raises technical interoperability challenges across platforms, with their intrinsic complexity.
Reason 2: specialization
The second reason for interoperability's major comeback is specialization. As each business grows, software offerings become more and more vertical and specialized.
Nowadays, it’s almost impossible to satisfy client needs within a single integrated offer, even for leading publishers who have large offerings. By the way, in most cases, these don’t have a consistent offer, since they grew through successive acquisitions, and interoperability issues appears even inside the product lines.
Moreover, largest solutions are penalized by a sort of inertia and are convicted to be late comparing to emerging offers which are more accurate. Consequently, clients are looking for the best of breed to answer their needs.
Ultimately, the optimal solution for a client involves composing software from several vendors, which raises the interoperability challenge for both data and processes.
Reason 3: cloud computing
The third and last reason is the emergence of cloud-based models. Even if this model is still in its early stages in many sectors, the underlying trend is under way, and cloud computing is going to raise new interoperability challenges at both the platform and the application level.
There is no doubt that Facebook is becoming a development platform in its own right, and, on the enterprise side, Salesforce has already been one in business-to-business for several years.
This transition to the cloud will probably go through plenty of hybrid solutions, at least as long as cloud offerings are not fully mature.
The result will be significant interoperability challenges between on-premise and cloud-based infrastructures, not to mention a likely distribution across different providers.
Different types of interoperability
Otherwise, it’s important to understand that there are different types of interoperability.
First, there are data formats. HTML, XML, ODF, CSV and PDF, for instance, are file formats; they allow software to recognize the type of data it is dealing with.
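To make this concrete, here is a minimal sketch in Python, with invented sample data, of the same record expressed in two of these formats; any tool that implements the format's specification can read it, regardless of who produced it.

```python
import csv
import io
import xml.etree.ElementTree as ET

# The same (invented) contact record expressed in two standard formats.
csv_text = "name,email\nAda Lovelace,ada@example.org\n"
xml_text = '<contacts><contact name="Ada Lovelace" email="ada@example.org"/></contacts>'

# Any software with a CSV parser can read the first representation...
row = next(csv.DictReader(io.StringIO(csv_text)))
print(row["email"])  # ada@example.org

# ...and any software with an XML parser can read the second.
root = ET.fromstring(xml_text)
print(root.find("contact").get("email"))  # ada@example.org
```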
Communication protocols also matter for interoperability. HTTP, FTP, POP, IMAP and SMTP are well-known protocols that allow communication between different systems.
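As an illustration, here is a minimal HTTP exchange using only Python's standard library; example.org is just a placeholder for any web server, since both sides only need to agree on the protocol.

```python
from http.client import HTTPSConnection

# Because both ends speak the same standardized protocol (HTTP),
# this client can talk to any web server, whatever its vendor,
# language or operating system.
conn = HTTPSConnection("example.org")  # placeholder host for any HTTP server
conn.request("GET", "/")
response = conn.getresponse()
print(response.status, response.reason)  # e.g. 200 OK
conn.close()
```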
On the development side, APIs, whether general-purpose like JSON/REST or provider-specific like those of the .NET platform or WinRT, allow developers to use an application's functionality without caring how the program works internally.
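Here is a hypothetical JSON/REST call sketched in Python: the endpoint URL and the response fields are invented for this example, but the point stands, as the consumer depends only on the published contract, not on the provider's internals.

```python
import json
from urllib.request import urlopen

# Hypothetical endpoint: the URL and the JSON fields below are assumptions
# made up for this example, not a real service.
with urlopen("https://api.example.com/v1/customers/42") as resp:
    customer = json.loads(resp.read())

# The caller relies only on the documented JSON contract; how the
# service is implemented behind the API is irrelevant to it.
print(customer["name"])
```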
Finally, some business-oriented functional models are more or less standardized, such as HL7 (Health Level 7) in the healthcare sector.
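To give a feel for what such a standard looks like at the wire level, here is a simplified HL7 v2-style patient identification segment parsed in Python; the field contents are invented, and real HL7 messages are normally handled with dedicated libraries.

```python
# Simplified, invented HL7 v2-style PID (patient identification) segment.
segment = "PID|1||12345^^^HOSP^MR||Doe^John||19700101|M"

fields = segment.split("|")           # HL7 v2 separates fields with '|'
family, given = fields[5].split("^")  # and name components with '^'
print(fields[0], family, given)       # PID Doe John
```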
In the end, interoperability comes in different flavors depending on the level at which you look at it.
Standards and limitations
To guarantee interoperability between providers, some of these notions are standardized through organizations such as the W3C, ECMA, ISO or IEEE. In many vertical sectors, consortiums exist to define data exchange processes and formats.
But standardization processes are necessarily slow and always lag behind the latest technologies, because of:
- The complexity of computing systems,
- The divergence of interests,
- The rapid pace of innovation that doesn’t wait for validated agreements.
In some cases, standards end up far removed from actual needs, or simply useless. CORBA is an example of a failed standard, and UML seems to be heading down the same path.