Customer Master Data

14,000,000 Leading Edge Experts on the ideXlab platform

Scan Science and Technology

Contact Leading Edge Experts & Companies


The experts below are selected from a list of 45 experts worldwide, ranked by the ideXlab platform

Hubert Osterle - One of the best experts on this subject based on the ideXlab platform.

  • Controlling Customer Master Data Quality: Findings from a Case Study
    International Conference on Information Resources Management (CONF-IRM), 2013
    Co-Authors: Ehsan Baghi, Boris Otto, Hubert Osterle
    Abstract:

    Data quality management plays a critical role in all kinds of organizations. High-quality data is one of the most important prerequisites for making strategic business decisions and executing business processes. To assess data quality, ensure efficient process execution, and verify the effectiveness of data quality initiatives, data quality has to be monitored and controlled. This can be achieved by implementing a comprehensive controlling system for data quality. However, only a few organizations have managed to implement such a system. This paper presents a single-case study describing the process of implementing a comprehensive data quality controlling system. The study focuses on controlling activities as defined in the field of business management.
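Controlling, as the abstract describes it, presupposes measurable data quality indicators that can be monitored over time. As a hedged, minimal sketch (the field names and the two KPIs below are illustrative assumptions, not taken from the paper), such indicators for customer master records could be computed like this:

```python
# Minimal sketch of data-quality KPIs for customer master records.
# Field names, sample data, and the KPI choice are illustrative assumptions.

def completeness(records, field):
    """Share of records with a non-empty value for `field`."""
    filled = sum(1 for r in records if r.get(field))
    return filled / len(records) if records else 0.0

def uniqueness(records, field):
    """Share of distinct values among records that have `field` set."""
    values = [r[field] for r in records if r.get(field)]
    return len(set(values)) / len(values) if values else 1.0

customers = [
    {"id": 1, "name": "Acme GmbH", "vat_id": "DE123"},
    {"id": 2, "name": "Acme GmbH", "vat_id": "DE123"},  # possible duplicate
    {"id": 3, "name": "Beta AG",   "vat_id": None},     # missing VAT id
]

kpis = {
    "vat_completeness": completeness(customers, "vat_id"),
    "vat_uniqueness":   uniqueness(customers, "vat_id"),
}
print(kpis)
```

A controlling system would track such KPIs periodically and alert when a metric drops below a defined threshold; the paper's actual controlling activities are organizational as well as technical and are not reproduced here.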

Ehsan Baghi - One of the best experts on this subject based on the ideXlab platform.

  • Controlling Customer Master Data Quality: Findings from a Case Study
    International Conference on Information Resources Management (CONF-IRM), 2013
    Co-Authors: Ehsan Baghi, Boris Otto, Hubert Osterle
    Abstract:

    Data quality management plays a critical role in all kinds of organizations. High-quality data is one of the most important prerequisites for making strategic business decisions and executing business processes. To assess data quality, ensure efficient process execution, and verify the effectiveness of data quality initiatives, data quality has to be monitored and controlled. This can be achieved by implementing a comprehensive controlling system for data quality. However, only a few organizations have managed to implement such a system. This paper presents a single-case study describing the process of implementing a comprehensive data quality controlling system. The study focuses on controlling activities as defined in the field of business management.

Valter Šorli - One of the best experts on this subject based on the ideXlab platform.

  • Master Data Management – Customer Data Integration
    2014
    Co-Authors: Valter Šorli
    Abstract:

    In this Master's thesis we deal with the problem of managing customer master data. Organizations are often faced with inconsistent data scattered across various siloed applications. Since these silos live their own lives, the master data they hold remains uncoordinated, and business users often do not know which copy reflects the latest valid state. To raise the quality of customer master data, it is necessary to establish a system to manage it. The purpose of this work is to present and describe the properties of such systems. We introduce the three usage modes of master data management systems (collaborative, operational, and analytical) which, together with the four implementation styles, define a system for managing customer master data. All four implementation styles are considered, as they largely determine the properties of an MDM hub. We describe the operation of registry, consolidation, transactional, and coexistence hubs and compare them with one another. A method for determining the maturity level of master data management is also presented; an organization can use it to gauge the quality of its current solution and to identify further steps for improvement. Establishing such a governance system is an extensive and expensive undertaking which, because it interferes with the organization's business processes, involves certain risks. We therefore propose a suitable methodology for development and project management, along with guidelines for the team that will lead the project. Many vendors offer customer master data management solutions. To narrow the list of vendors and ease the selection for an organization, we present the results of two analyst firms that periodically survey the customer master data management market. For five of the leading vendors we set out the main advantages and disadvantages of their solutions. Information useful to an organization establishing a customer master data management system is thus gathered in one place.
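The registry, consolidation, transactional, and coexistence hubs compared in the thesis differ mainly in where the consolidated ("golden") record lives. As a hedged illustration of the registry style only (the class and method names below are invented for this sketch, not taken from the thesis), a registry hub can be reduced to a cross-reference table that maps source-system keys to one global customer ID, while the attribute data itself stays in the silos:

```python
# Illustrative registry-style MDM hub: the hub stores only key mappings;
# attribute data remains in the source systems (silos). All names are
# assumptions made for this sketch.

class RegistryHub:
    def __init__(self):
        self._xref = {}      # (source_system, local_id) -> global_id
        self._next_id = 1

    def register(self, source, local_id, global_id=None):
        """Link a local record to a global customer ID (a new one if None)."""
        if global_id is None:
            global_id = self._next_id
            self._next_id += 1
        self._xref[(source, local_id)] = global_id
        return global_id

    def resolve(self, source, local_id):
        """Global customer ID for a local record, or None if unknown."""
        return self._xref.get((source, local_id))

    def linked_records(self, global_id):
        """All (source, local_id) pairs belonging to one real-world customer."""
        return [k for k, v in self._xref.items() if v == global_id]

hub = RegistryHub()
gid = hub.register("CRM", "c-100")          # first sighting of the customer
hub.register("ERP", "4711", global_id=gid)  # same customer in the ERP silo

print(hub.linked_records(gid))
```

A consolidation or transactional hub would instead copy or own the attribute data centrally; the registry style trades weaker consistency guarantees for minimal intrusion into the source systems.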

Boris Otto - One of the best experts on this subject based on the ideXlab platform.

  • Controlling Customer Master Data Quality: Findings from a Case Study
    International Conference on Information Resources Management (CONF-IRM), 2013
    Co-Authors: Ehsan Baghi, Boris Otto, Hubert Osterle
    Abstract:

    Data quality management plays a critical role in all kinds of organizations. High-quality data is one of the most important prerequisites for making strategic business decisions and executing business processes. To assess data quality, ensure efficient process execution, and verify the effectiveness of data quality initiatives, data quality has to be monitored and controlled. This can be achieved by implementing a comprehensive controlling system for data quality. However, only a few organizations have managed to implement such a system. This paper presents a single-case study describing the process of implementing a comprehensive data quality controlling system. The study focuses on controlling activities as defined in the field of business management.

Andreas Obermeier - One of the best experts on this subject based on the ideXlab platform.

  • DESRIST - Anomaly-Based Duplicate Detection: A Probabilistic Approach
    Lecture Notes in Computer Science, 2019
    Co-Authors: Andreas Obermeier
    Abstract:

    The importance of identifying records in databases that refer to the same real-world entity ("duplicate detection") has been recognized in both research and practice. However, existing supervised approaches for duplicate detection need training data with labeled instances of duplicates and non-duplicates, which is often costly and time-consuming to generate. In contrast, unsupervised approaches can forego such training data but may suffer from limiting assumptions (e.g., monotonicity) and provide less reliable results. To address the issue of generating high-quality results using only easy-to-acquire duplicate-free training data, we propose a probabilistic approach for anomaly-based duplicate detection. Duplicates exhibit specific characteristics which differ significantly from those of non-duplicates and therefore represent anomalies. Based on the grade of anomaly relative to duplicate-free training data, our approach assigns a probability of being a duplicate to each analyzed pair of records while avoiding the limiting assumptions of existing approaches. We demonstrate the practical applicability and effectiveness of our approach in a real-world setting by analyzing customer master data of a German insurer. The evaluation shows that the results provided by the approach are reliable and useful for decision support and can outperform even fully supervised state-of-the-art approaches for duplicate detection.
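As a hedged, much-simplified illustration of the core idea in the abstract (learn from duplicate-free data how similar non-duplicate pairs typically look, then treat anomalously high similarity as evidence of a duplicate), the scoring might be sketched as follows. The paper's actual probabilistic model is not reproduced here; the similarity measure and all sample names are assumptions:

```python
# Hedged sketch of anomaly-based duplicate scoring: fit a baseline of
# similarity scores on pairs from DUPLICATE-FREE training data, then score
# a candidate pair by how anomalous its similarity is against that baseline.
# This is an illustration, not the paper's model.
from difflib import SequenceMatcher
from itertools import combinations

def similarity(a, b):
    """String similarity in [0, 1] via difflib's ratio."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def fit_nonduplicate_scores(clean_records):
    """Similarity scores of all pairs in duplicate-free training data."""
    return sorted(similarity(a, b) for a, b in combinations(clean_records, 2))

def duplicate_probability(pair_score, baseline):
    """Empirical fraction of non-duplicate pairs scoring below pair_score,
    used here as an anomaly-based duplicate score."""
    at_least_as_high = sum(1 for s in baseline if s >= pair_score)
    return 1.0 - at_least_as_high / len(baseline)

clean = ["Anna Schmidt", "Peter Mueller", "Clara Weber", "Jonas Becker"]
baseline = fit_nonduplicate_scores(clean)

# A near-identical name pair is anomalous relative to the baseline;
# a pair of clearly different names is not.
print(duplicate_probability(similarity("Anna Schmidt", "Anna Schmit"), baseline))
print(duplicate_probability(similarity("Anna Schmidt", "Jonas Becker"), baseline))
```

The appeal of the idea is that only duplicate-free data is needed to fit the baseline, which is far cheaper to obtain than labeled duplicate pairs.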

  • ECIS - Event-Driven Duplicate Detection: A Probability-based Approach
    2018
    Co-Authors: Bernd Heinrich, Andreas Obermeier, Mathias Klier, Alexander Schiller
    Abstract:

    The importance of probability-based approaches for duplicate detection has been recognized in both research and practice. However, existing approaches do not aim to consider the underlying real-world events resulting in duplicates (e.g., that a relocation may lead to the storage of two records for the same customer, one before and one after the relocation). Duplicates resulting from real-world events exhibit specific characteristics. For instance, duplicates resulting from relocations tend to have significantly different values for all address-related attributes. Hence, existing approaches focusing on high similarity of attribute values are hardly able to identify possible duplicates resulting from such real-world events. To address this issue, we propose an approach for event-driven duplicate detection based on probability theory. Our approach assigns the probability of being a duplicate resulting from real-world events to each analysed pair of records while avoiding the limiting assumptions of existing approaches. We demonstrate the practical applicability and effectiveness of our approach in a real-world setting by analysing customer master data of a German insurer. The evaluation shows that the results provided by the approach are reliable and useful for decision support and can outperform well-known state-of-the-art approaches for duplicate detection.
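The relocation example in the abstract can be sketched in a hedged, much-simplified form: two records with (near-)identical person attributes but entirely different address attributes fit the relocation pattern, so high name similarity combined with low address similarity is itself evidence of a duplicate. The paper derives actual probabilities from probability theory, which this illustration does not reproduce; the similarity measure, scoring formula, and sample records are all assumptions:

```python
# Hedged sketch of event-driven duplicate scoring for the relocation event:
# score is high when names match but addresses differ. Illustrative only;
# not the paper's probabilistic model.
from difflib import SequenceMatcher

def sim(a, b):
    """String similarity in [0, 1] via difflib's ratio."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def relocation_duplicate_score(rec_a, rec_b):
    """High when names are similar but addresses are dissimilar
    (the pattern a relocation leaves behind)."""
    name_sim = sim(rec_a["name"], rec_b["name"])
    addr_sim = sim(rec_a["address"], rec_b["address"])
    return name_sim * (1.0 - addr_sim)

before = {"name": "Maria Huber", "address": "Hauptstrasse 1, Berlin"}
after  = {"name": "Maria Huber", "address": "Seeweg 42, Hamburg"}
other  = {"name": "Karl Vogt",   "address": "Hauptstrasse 1, Berlin"}

print(relocation_duplicate_score(before, after))  # same person, new address
print(relocation_duplicate_score(before, other))  # different person
```

Note how this inverts the usual similarity logic: a classic similarity-based matcher would rank the `before`/`after` pair low precisely because the addresses disagree, which is the gap the event-driven approach targets.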