Data Quality Metric

14,000,000 Leading Edge Experts on the ideXlab platform

Scan Science and Technology

Contact Leading Edge Experts & Companies

The experts below are selected from a list of 66 experts worldwide, ranked by the ideXlab platform

Teresa Vidal-calleja - One of the best experts on this subject based on the ideXlab platform.

  • IROS - Combining multiple sensor modalities for a localisation robust to smoke
    2011 IEEE/RSJ International Conference on Intelligent Robots and Systems, 2011
    Co-Authors: Christopher Brunner, Thierry Peynot, Teresa Vidal-calleja
    Abstract:

    This paper proposes an approach to obtain a localisation that is robust to smoke by exploiting multiple sensing modalities: visual and infrared (IR) cameras. This localisation is based on a state-of-the-art visual SLAM algorithm. First, we show that a reasonably accurate localisation can be obtained in the presence of smoke by using only an IR camera, a sensor that is hardly affected by smoke, unlike a visual camera (operating in the visible spectrum). Second, we demonstrate that improved results can be obtained by combining the information from the two sensor modalities (visual and IR cameras). Third, we show that by detecting the impact of smoke on the visual images using a data quality metric, we can anticipate and mitigate the degradation in performance of the localisation by discarding the most affected data. The experimental validation presents multiple trajectories estimated by the various methods considered, all thoroughly compared to an accurate dGPS/INS reference.
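The gating idea in the third step can be sketched as follows. The gradient-energy proxy, the threshold value, and the frame dictionaries are illustrative assumptions, not the paper's actual metric:

```python
def quality_score(grad_energy, baseline=1.0):
    # Hypothetical data-quality proxy in [0, 1]: smoke washes out image
    # gradients, so gradient energy relative to a clear-air baseline
    # indicates how strongly a visual frame is affected.
    return max(0.0, min(1.0, grad_energy / baseline))

def fuse_observations(visual, infrared, threshold=0.5):
    """Per time step, discard the visual observation when its quality
    score falls below the threshold and rely on the IR frame alone;
    otherwise keep both modalities for the SLAM front end."""
    fused = []
    for vis, ir in zip(visual, infrared):
        if quality_score(vis["grad_energy"]) >= threshold:
            fused.append({"visual": vis, "ir": ir})   # clear enough: use both
        else:
            fused.append({"visual": None, "ir": ir})  # smoke-degraded: IR only
    return fused
```

The point of discarding rather than down-weighting is that a smoke-corrupted visual frame can actively mislead feature matching, so dropping it outright is the safer default.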


Christopher Brunner - One of the best experts on this subject based on the ideXlab platform.

  • IROS - Combining multiple sensor modalities for a localisation robust to smoke
    2011 IEEE/RSJ International Conference on Intelligent Robots and Systems, 2011
    Co-Authors: Christopher Brunner, Thierry Peynot, Teresa Vidal-calleja

Thierry Peynot - One of the best experts on this subject based on the ideXlab platform.

  • IROS - Combining multiple sensor modalities for a localisation robust to smoke
    2011 IEEE/RSJ International Conference on Intelligent Robots and Systems, 2011
    Co-Authors: Christopher Brunner, Thierry Peynot, Teresa Vidal-calleja

Mathias Klier - One of the best experts on this subject based on the ideXlab platform.

  • Metric-based Data Quality assessment - Developing and evaluating a probability-based currency Metric
    Decision Support Systems, 2015
    Co-Authors: Bernd Heinrich, Mathias Klier
    Abstract:

    Data quality assessment has been discussed intensively in the literature and is critical in business. The importance of using up-to-date data in business, innovation, and decision-making processes has revealed the need for adequate metrics to assess the currency of data in information systems. In this paper, we propose a data quality metric for currency that is based on probability theory. Our metric allows for a reproducible configuration and a high level of automation when assessing the currency of attribute values. The metric values represent probabilities and can be integrated into a decision calculus (e.g., based on decision theory) to support decision-making. The evaluation of our metric consists of two main steps: (1) we define an instantiation of the metric for a real-use situation of a German mobile services provider to demonstrate both the applicability and the practical benefit of the approach; (2) we use publicly available real-world data provided by the Federal Statistical Office of Germany and the German Institute of Economic Research to demonstrate its feasibility by defining an instantiation of the metric and to evaluate its strength (compared to existing approaches). We propose a well-founded probability-based data quality metric for currency. The metric values can be used in a decision calculus to support decision-making. The metric has successfully been applied in several real-use situations. The metric has significant advantages and can yield substantial practical benefit.
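The probabilistic core of such a currency metric can be sketched in a few lines. An exponential decline model is one common way to turn an assumed decline rate into a probability; the rate value used here is an illustrative assumption, not a figure from the paper:

```python
import math

def currency(age_years, decline_rate):
    """Probability that an attribute value acquired `age_years` ago is
    still up to date, assuming value changes arrive as a memoryless
    (exponential) process with the given decline rate per year."""
    return math.exp(-decline_rate * age_years)

# Example: an address attribute where roughly 10% of customers
# relocate per year (an assumed rate, for illustration only).
p = currency(age_years=2.0, decline_rate=0.1)
```

Because the result is a probability, it can be plugged directly into an expected-value calculation rather than serving only as an ad hoc score.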

  • ECIS - A Novel Data Quality Metric for Timeliness Considering Supplemental Data
    European Conference on Information Systems, 2009
    Co-Authors: Bernd Heinrich, Mathias Klier
    Abstract:

    It is intensively discussed in both science and practice how data quality (DQ) can be assured and improved. The growing relevance of DQ has revealed the need for adequate metrics, because quantifying DQ is essential for planning quality measures in an economic manner. This paper analyses how DQ can be quantified with respect to the DQ dimension timeliness. Based on an existing approach, we design a new metric to quantify timeliness in a well-founded manner that considers so-called supplemental data (additional data attributes that allow drawing conclusions about the timeliness of the data attribute considered). In addition, it is possible to take the values of the metric into account when calculating expected values, an advantage that in turn leads to improved and comprehensible decision support. We evaluate the presented metric briefly with regard to requirements for designing DQ metrics from the literature. Then, we illustrate the metric's applicability as well as its practical benefit: in cooperation with a financial services provider, the metric was applied in the field of customer valuation in order to support the measurement of customer lifetime values.
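One way to read "considering supplemental data" is that supplemental attributes adjust the decline rate before the exponential model is applied. The `profession` field and its segment factors below (students relocating more often, for instance) are illustrative assumptions, not values from the paper:

```python
import math

def timeliness(age_years, base_decline, supplemental=None):
    """Exponential timeliness sketch: supplemental attributes (here a
    hypothetical `profession` field) rescale the decline rate of the
    attribute under assessment before the probability is computed."""
    rate = base_decline
    if supplemental is not None:
        # Assumed per-segment factors inferred from supplemental data.
        adjustments = {"student": 2.0, "retired": 0.5}
        rate *= adjustments.get(supplemental.get("profession"), 1.0)
    return math.exp(-rate * age_years)
```

A customer-lifetime-value calculation could then weight each attribute-dependent cash flow by this probability instead of treating stored data as certainly correct.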


Alejandro A Vaisman - One of the best experts on this subject based on the ideXlab platform.

  • DaWaK - Rule-Based Multidimensional Data Quality Assessment Using Contexts
    Big Data Analytics and Knowledge Discovery, 2016
    Co-Authors: Adriana Marotta, Alejandro A Vaisman
    Abstract:

    It is an accepted fact that a value for a data quality metric can be acceptable or not, depending on the context in which data are produced and consumed. In particular, in a data warehouse (DW), the context for the value of a measure is given by the dimensions and external data. In this paper we propose the use of logic rules to assess the quality of measures in a DW, accounting for the context in which these measures are considered. For this, we propose the use of three sets of rules: one for representing the DW; a second one for defining the particular context for the measures in the warehouse; and a third one for representing data quality metrics. This provides a uniform, elegant, and flexible framework for context-aware DW quality assessment. Our representation is implementation-independent, and not only allows us to assess the quality of measures at the lowest granularity level in a data cube, but also the quality of aggregate and dimension data.
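The three rule sets can be mimicked in a few lines of ordinary code. The store/month schema, the plausibility bounds, and the encoding of rules as functions are all illustrative assumptions standing in for the paper's logic rules:

```python
# Rule set 1: facts representing the warehouse (a sales measure at the
# lowest granularity level of a store x month cube).
dw_facts = [
    {"store": "S1", "month": "2016-01", "sales": 120.0},
    {"store": "S2", "month": "2016-01", "sales": -5.0},
]

# Rule set 2: context rules. External data could assign different bounds
# per dimension value; here one plausibility band covers every cell.
def context(fact):
    return {"min_sales": 0.0, "max_sales": 10_000.0}

# Rule set 3: quality rules. A measure is acceptable iff it lies within
# the bounds its context assigns to it.
def acceptable(fact):
    ctx = context(fact)
    return ctx["min_sales"] <= fact["sales"] <= ctx["max_sales"]

assessment = {f["store"]: acceptable(f) for f in dw_facts}
```

The same quality rule can be reapplied after aggregation, which is how the approach extends from base cells to aggregate and dimension data.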