A Company and its Data 15
solve this problem, IT integration solutions need to be put
into place, with particular focus on data transport between
systems (EAI/ESB/ETL).
Most of the time, these data exchanges do not sufficiently
take into account the validation rules and the referential
integrity constraints that link the data together. It is
therefore necessary, in parallel with these data exchanges, to
develop complementary software to control these flows
during their transport.
Because this processing is not described in the data
model, added complexity arises in the software, which
undermines the transparency of these data exchanges:
everything that is not expressed in the data model becomes
hard-coded programming in the integration software.
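As a hypothetical illustration of such hard-coded rules (all names and values below are invented for the example, not taken from any real system), an integration layer might re-implement a referential-integrity check that exists nowhere in the shared data model:

```python
# Hypothetical sketch: a validation rule that lives only in the
# integration layer, invisible to the data model. Names are invented.

def validate_customer_update(record: dict) -> list[str]:
    """Hard-coded checks applied while the record is in transit."""
    errors = []
    # Referential-integrity check against a reference list that is
    # duplicated inside the integration software itself.
    known_countries = {"FR", "DE", "US"}  # local copy of a reference table
    if record.get("country") not in known_countries:
        errors.append("unknown country code")
    # Business rule possibly also enforced in source and target systems.
    if not record.get("customer_id"):
        errors.append("missing customer_id")
    return errors

# A record that passes here may still be rejected downstream, because
# the same rules are re-implemented separately elsewhere.
print(validate_customer_update({"customer_id": "C42", "country": "XX"}))
```

Nothing in the data model documents these checks; a reader of the model alone cannot know they exist.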
The duplication of data also generates duplication of its
validation rules. For instance, for the same data update, it is
not uncommon to have validation rules in the system that
is at the origin of the modification, as well as in the
integration layer (EAI/ESB/ETL) and in the target systems.
From then on, it is impossible to guarantee that these
validation rules are all compatible with one another. This can
lead to a situation where one system accepts a data
modification and another rejects it and keeps the old value.
A validation gap can then appear in the IT system. This is a
sure source of low quality.
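The divergence can be sketched as follows (a contrived example with invented systems and an invented discount rule, purely to show the mechanism):

```python
# Hypothetical illustration: the "same" validation rule duplicated in
# two systems, with slightly different implementations.

def source_system_accepts(update: dict) -> bool:
    # Source system: discounts up to 50% are valid.
    return 0 <= update["discount"] <= 0.50

def target_system_accepts(update: dict) -> bool:
    # Target system: an older copy of the rule caps discounts at 30%.
    return 0 <= update["discount"] <= 0.30

update = {"customer_id": "C42", "discount": 0.40}
print(source_system_accepts(update))  # True: source applies the change
print(target_system_accepts(update))  # False: target keeps the old value
```

After this exchange the two systems hold different values for the same data: exactly the validation gap described above.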
IT architects have a solution to avoid this gap. They try
to put a distributed transaction mechanism in place: if one
of the systems refuses the modification, then all of them refuse it.
Sadly, this mechanism is very complex and greatly reduces
the responsiveness of IT. It is better to avoid it. Indeed,
implementing a two-phase commit technology and/or specific
software to handle the distributed transaction is often
prohibitively expensive.
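The all-or-nothing behavior of such a mechanism can be sketched as a toy two-phase commit coordinator (a minimal sketch with invented participants; real protocols must also handle timeouts, coordinator failure and blocked participants, which is where the cost comes from):

```python
# Minimal two-phase commit sketch. Phase 1: every participant votes;
# phase 2: commit only if all voted yes, otherwise roll all back.

class Participant:
    def __init__(self, name: str, accepts: bool):
        self.name, self.accepts = name, accepts
        self.committed = False

    def prepare(self) -> bool:   # phase 1: vote on the modification
        return self.accepts

    def commit(self) -> None:    # phase 2: apply the modification
        self.committed = True

    def rollback(self) -> None:  # phase 2: discard the modification
        self.committed = False

def two_phase_commit(participants: list[Participant]) -> bool:
    if all(p.prepare() for p in participants):
        for p in participants:
            p.commit()
        return True
    for p in participants:
        p.rollback()             # one refusal makes all refuse
    return False

systems = [Participant("CRM", True), Participant("billing", False)]
print(two_phase_commit(systems))  # False: the billing system vetoed
```

Even this toy version shows the coordination overhead: every update must wait for every system to vote before anything is applied.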