billion per year in billing, accounting and inventory,
according to a survey of 599 companies by
PricewaterhouseCoopers in New York" [BET 01].
This lack of control worsens as a company inherits existing
systems that must integrate with new ones, which are ever
more numerous and more widely distributed. These systems
must also interact with others under the control of partners,
clients, providers, etc. Consequently, information exchanges
between databases multiply, and their quality is strategic for
the reliability of business processes. To manage these
exchanges, IT often has at its disposal technical integration
solutions2 which, unfortunately, often lack sufficient
modeling. Since the beginning of the 1990s, these solutions
have been deployed in a purely technical manner, without
taking data modeling into account. The consequences are
ever more complex integration software and poor data
exchange quality. Most of the time, no reliable
documentation of the validation rules exists; they remain
locked and scattered in the software, without being open
enough to the business itself. To restore the reliability of
exchanges, it is necessary to provide a data model that
business users can understand and leverage to improve the
data validation rules.
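By way of illustration only (this sketch is not taken from the
book; the record fields, rules, and the Rule helper are invented
for the example), such an approach declares each validation rule
once, pairing a business-readable description with its executable
form, instead of burying the rules in integration code:

    # Hypothetical sketch: declarative, business-readable validation
    # rules for records exchanged between systems.
    from dataclasses import dataclass
    from typing import Any, Callable

    @dataclass
    class Rule:
        field: str                    # attribute of the exchanged record
        description: str              # business-readable statement of the rule
        check: Callable[[Any], bool]  # executable form of the same rule

    # Invented rules for a hypothetical "customer" record.
    CUSTOMER_RULES = [
        Rule("customer_id", "Customer ID is mandatory",
             lambda v: v is not None and str(v).strip() != ""),
        Rule("country", "Country is a 2-letter ISO code",
             lambda v: isinstance(v, str) and len(v) == 2 and v.isalpha()),
        Rule("credit_limit", "Credit limit is a non-negative amount",
             lambda v: isinstance(v, (int, float)) and v >= 0),
    ]

    def validate(record: dict, rules: list[Rule]) -> list[str]:
        """Return the descriptions of every rule the record violates."""
        return [r.description for r in rules
                if not r.check(record.get(r.field))]

    if __name__ == "__main__":
        bad = {"customer_id": "", "country": "FRA", "credit_limit": -10}
        for violation in validate(bad, CUSTOMER_RULES):
            print("violation:", violation)

Because each rule carries its own description, the same list can
serve both as executable checks at the integration layer and as
documentation that business users can review and correct.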
Data repositories
It is in this context of concern about the quality of IT
systems that data modeling is taking center stage again,
after several years of neglect. In particular, reference and
master data (data shared by a large number of functions in
the company) are the object of particular attention. They
are often filed under the term
2. EAI: Enterprise Application Integration; ESB: Enterprise Service Bus;
ETL: Extract-Transform-Load.