Data Mart

The experts below were selected from a list of 12,015 experts worldwide, ranked by the ideXlab platform.

Elaine Larson - One of the best experts on this subject based on the ideXlab platform.

  • Challenges associated with using large Data sets for quality assessment and research in clinical settings
    Policy Politics & Nursing Practice, 2015
    Co-Authors: Bevin Cohen, David K Vawdrey, Yoko E Furuya, Jianfang Liu, David Caplan, Frederick W Mis, Elaine Larson
    Abstract:

    The rapidly expanding use of electronic records in health-care settings is generating unprecedented quantities of Data available for clinical, epidemiological, and cost-effectiveness research. Several challenges are associated with using these Data for clinical research, including issues surrounding access and information security, poor Data quality, inconsistency of Data within and across institutions, and a paucity of staff with expertise to manage and manipulate large clinical Data sets. In this article, we describe our experience with assembling a Data Mart and conducting clinical research using electronic Data from four facilities within a single hospital network in New York City. We culled Data from several electronic sources, including the institution's admission-discharge-transfer system, cost accounting system, electronic health record, clinical Data warehouse, and departmental records. The final Data Mart contained information for more than 760,000 discharges occurring from 2006 through 2012. Using categories identified by the National Institutes of Health Big Data to Knowledge initiative as a framework, we outline challenges encountered during the development and use of a domain-specific Data Mart and recommend approaches to overcome these challenges.
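
    A minimal sketch of the kind of consolidation step such a Data Mart involves, assuming hypothetical source files and column names (the paper does not publish its schema): extracts from an admission-discharge-transfer feed, the cost accounting system, and a departmental system are reduced to discharge level and joined on an encounter identifier, with a couple of quality checks of the kind the authors emphasize.

```python
# Sketch: merging extracts from several hospital systems into one
# discharge-level Data Mart table. File names and columns are
# hypothetical; the paper does not publish its actual schema.
import pandas as pd

# Admission-discharge-transfer feed: one row per discharge
adt = pd.read_csv("adt_discharges.csv", parse_dates=["admit_date", "discharge_date"])

# Cost accounting extract: one row per encounter with total charges
costs = pd.read_csv("cost_accounting.csv")

# Departmental (e.g., laboratory) extract: many rows per encounter
labs = pd.read_csv("lab_results.csv")

# Reduce the many-rows-per-encounter source to encounter level before joining,
# so the grain of the Data Mart stays "one row per discharge".
lab_summary = (labs.groupby("encounter_id")
                   .agg(n_lab_tests=("test_code", "size"),
                        n_positive_cultures=("culture_positive", "sum"))
                   .reset_index())

data_mart = (adt.merge(costs, on="encounter_id", how="left")
                .merge(lab_summary, on="encounter_id", how="left"))

# Basic quality checks: duplicate keys and missing values are surfaced
# rather than silently kept.
assert data_mart["encounter_id"].is_unique, "duplicate discharges in ADT feed"
print(data_mart.isna().mean().sort_values(ascending=False).head())
```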

Matteo Golfarelli - One of the best experts on this subject based on the ideXlab platform.

  • Temporal Data Warehousing: Approaches and Techniques
    'IGI Global', 2011
    Co-Authors: Matteo Golfarelli, S. Rizzi
    Abstract:

    Data warehouses are information repositories specialized in supporting decision making. Since the decisional process typically requires an analysis of historical trends, time and its management acquire great importance. In this paper we consider the variety of issues, often grouped under the term temporal Data warehousing, implied by the need to accurately describe how information changes over time in Data warehousing systems. We recognize that, with reference to a three-level architecture, these issues can be classified into a few topics, namely: handling Data/schema changes in the Data warehouse, handling Data/schema changes in the Data Mart, querying temporal Data, and designing temporal Data warehouses. After introducing the main concepts and terminology of temporal Databases, we survey each of these topics in turn. Finally, we discuss the open research issues, also in connection with their implementation in commercial tools.
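
    One of the Data-change issues grouped under temporal Data warehousing is dimension versioning. Below is a minimal, self-contained sketch of a type-2-style update (the validity-interval layout is an illustrative assumption; the survey itself covers several alternative techniques).

```python
# Sketch of type-2-style dimension versioning: when an attribute of a
# dimension member changes, the current row is closed and a new row is
# opened, so facts keep pointing at the version that was valid when
# they occurred. Table layout is illustrative, not taken from the paper.
from datetime import date

customer_dim = [
    # sk = surrogate key; valid_to is None for the current version
    {"sk": 1, "customer_id": "C42", "city": "Bologna",
     "valid_from": date(2006, 1, 1), "valid_to": None},
]

def apply_scd2_change(dim, customer_id, new_city, change_date):
    """Close the current version of a customer and append a new one."""
    current = next(r for r in dim
                   if r["customer_id"] == customer_id and r["valid_to"] is None)
    if current["city"] == new_city:
        return  # nothing changed
    current["valid_to"] = change_date
    dim.append({"sk": max(r["sk"] for r in dim) + 1,
                "customer_id": customer_id,
                "city": new_city,
                "valid_from": change_date,
                "valid_to": None})

apply_scd2_change(customer_dim, "C42", "Milano", date(2009, 6, 1))
for row in customer_dim:
    print(row)
```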

  • A Survey on Temporal Data Warehousing
    International Journal of Data Warehousing and Mining, 2009
    Co-Authors: Matteo Golfarelli, Stefano Rizzi
    Abstract:

    Data warehouses are information repositories specialized in supporting decision making. Since the decisional process typically requires an analysis of historical trends, time and its management acquire great importance. In this paper we consider the variety of issues, often grouped under the term temporal Data warehousing, implied by the need to accurately describe how information changes over time in Data warehousing systems. We recognize that, with reference to a three-level architecture, these issues can be classified into a few topics, namely: handling Data/schema changes in the Data warehouse, handling Data/schema changes in the Data Mart, querying temporal Data, and designing temporal Data warehouses. After introducing the main concepts and terminology of temporal Databases, we survey each of these topics in turn. Finally, we discuss the open research issues, also in connection with their implementation in commercial tools.
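
    The "querying temporal Data" topic listed above amounts, in its simplest form, to resolving each fact against the dimension version that was valid on the fact's own date. The point-in-time lookup below continues the versioned-dimension sketch given earlier; all data and field names are invented.

```python
# Sketch: resolving each fact against the dimension version valid at
# the fact date (a point-in-time join). Data and columns are invented.
from datetime import date

customer_versions = [
    {"sk": 1, "customer_id": "C42", "city": "Bologna",
     "valid_from": date(2006, 1, 1), "valid_to": date(2009, 6, 1)},
    {"sk": 2, "customer_id": "C42", "city": "Milano",
     "valid_from": date(2009, 6, 1), "valid_to": None},
]

sales_facts = [
    {"customer_id": "C42", "sale_date": date(2008, 3, 10), "amount": 120.0},
    {"customer_id": "C42", "sale_date": date(2010, 7, 22), "amount": 80.0},
]

def version_at(versions, customer_id, when):
    """Return the dimension row valid for customer_id on the given date."""
    for v in versions:
        if (v["customer_id"] == customer_id
                and v["valid_from"] <= when
                and (v["valid_to"] is None or when < v["valid_to"])):
            return v
    raise LookupError(f"no version of {customer_id} valid on {when}")

for fact in sales_facts:
    dim = version_at(customer_versions, fact["customer_id"], fact["sale_date"])
    print(fact["sale_date"], dim["city"], fact["amount"])
# The 2008 sale is attributed to Bologna, the 2010 sale to Milano.
```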

  • Data warehouse design from XML sources
    2004
    Co-Authors: Matteo Golfarelli, Stefano Rizzi, Boris Vrdoljak
    Abstract:

    A large amount of Data needed in decision-making processes is stored in the XML Data format, which is widely used for e-commerce and Internet-based information exchange. Thus, as more organizations view the web as an integral part of their communication and business, the importance of integrating XML Data in Data warehousing environments is becoming increasingly high. In this paper we show how the design of a Data Mart can be carried out starting directly from an XML source. Two main issues arise: on the one hand, since XML models semi-structured Data, not all the information needed for design can be safely derived; on the other, different approaches for representing relationships in XML DTDs and Schemas are possible, each with different expressive power. After discussing these issues, we propose a semi-automatic approach for building the conceptual schema for a Data Mart starting from the XML sources.
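
    As a toy illustration of the structural inference involved (not the paper's actual DTD-based procedure): repeatable or nested child elements of a chosen fact element suggest one-to-many relationships and hence candidate dimensions, while single numeric leaves suggest candidate measures. The XML and the heuristic below are invented.

```python
# Toy sketch: inspect an XML document and report, for each fact element,
# which children look like dimension candidates (nested or non-numeric)
# and which look like measure candidates (numeric leaves).
import xml.etree.ElementTree as ET

doc = ET.fromstring("""
<sales>
  <sale>
    <customer><name>Acme</name><city>Bologna</city></customer>
    <product>P1</product>
    <qty>3</qty>
    <amount>30.0</amount>
  </sale>
</sales>
""")

def is_numeric(text):
    try:
        float(text)
        return True
    except (TypeError, ValueError):
        return False

def classify_children(fact_elem):
    """Rough heuristic: nested children -> dimension hierarchy candidates,
    numeric leaves -> measure candidates, other leaves -> dimension attributes."""
    dims, measures = [], []
    for child in fact_elem:
        if len(child) > 0:
            dims.append(child.tag)
        elif is_numeric(child.text):
            measures.append(child.tag)
        else:
            dims.append(child.tag)
    return dims, measures

for sale in doc.findall("sale"):
    dims, measures = classify_children(sale)
    print("dimension candidates:", dims)
    print("measure candidates:  ", measures)
```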

  • Integrating XML sources into a Data warehouse environment
    International Conference on Software Telecommunications and Computer Networks, 2001
    Co-Authors: Matteo Golfarelli, Stefano Rizzi, Boris Vrdoljak
    Abstract:

    A Data warehousing system is a collection of technologies and tools which enables knowledge workers to acquire, integrate and flexibly analyze information from different sources aimed at improving the knowledge assets of the enterprise. The importance of integrating XML Data in Data warehousing environments is becoming increasingly high as more organizations view the web as an integral part of their communication and business. In this paper we propose a semi-automatic approach for building the conceptual schema for a Data Mart starting from the DTDs describing the XML sources. The main issue arising is that, since XML models semi-structured Data, not all the information needed for design can be safely derived. In our approach, this issue is addressed by querying the source XML documents and, if necessary, by asking the designer's help.
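
    The document-querying step can be pictured as a small cardinality probe: where the DTD leaves the multiplicity of a sub-element ambiguous, the available documents are scanned and the observed maximum occurrence per parent is recorded, with the designer confirming doubtful cases. The sketch below is an invented approximation of that idea, not the authors' algorithm.

```python
# Sketch: probe XML documents to see whether a child element occurs at
# most once per parent (to-one, usable as a dimension attribute) or many
# times (to-many). Element names and documents are invented.
import xml.etree.ElementTree as ET
from collections import defaultdict

documents = [
    "<order><customer>Acme</customer><line/><line/></order>",
    "<order><customer>Foo</customer><line/></order>",
]

max_count = defaultdict(int)
for text in documents:
    root = ET.fromstring(text)
    counts = defaultdict(int)
    for child in root:
        counts[child.tag] += 1
    for tag, n in counts.items():
        max_count[tag] = max(max_count[tag], n)

for tag, n in sorted(max_count.items()):
    kind = "to-one" if n <= 1 else "to-many"
    print(f"order/{tag}: observed max {n} per parent -> {kind}")
    # Ambiguous cases would be passed to the designer for confirmation.
```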

  • WAND: a CASE tool for Data warehouse design
    ICDE Demo Sessions, 2001
    Co-Authors: Matteo Golfarelli, Stefano Rizzi
    Abstract:

    Reports on Data warehouse project failures indicate that a major cause lies in the absence of a structured design methodology. In this direction, our research is aimed at defining the basic steps required for a correct design. The goal of this demonstration is to present the main features of WAND, the prototype CASE tool we have implemented to support our methodology. WAND assists the designer in structuring a Data Mart, carries out conceptual design in a semi-automatic fashion, allows for a core workload to be defined on the conceptual scheme, and carries out logical design to produce the Data Mart scheme.

    1. Motivation and overview. Building a Data warehouse (DW) for an enterprise is a huge and complex task, which requires accurate planning aimed at devising satisfactory answers to organizational and architectural questions. There is substantial agreement that, when planning a DW, a bottom-up approach should be followed: the Data Mart playing the most strategic role for the enterprise should be identified and prototyped first in order to convince the final users of the potential benefits; other Data Marts are built progressively, to be finally integrated bottom-up into the global warehouse. Within this process, the phase of Data Mart design sometimes seems to be given relatively little importance. On the other hand, reports on DW project failures indicate that a major cause lies in the absence of a global view of the design process: in other words, in the absence of a structured design methodology [6]. We believe that a methodological framework for design is an essential requirement to ensure the success of complex projects. In this direction, our research is aimed at defining the basic steps required for a correct design. In particular, we considered the problem of conceptual design from the operational Database as well as some relevant issues related to logical design. It is well known among software engineers that devising a design methodology is almost useless if no CASE tool to support it is provided. The goal of this demonstration is to present the main features of WAND (Warehouse Integrated Designer), the prototype CASE tool we have implemented to support our methodology. WAND assists the designer in structuring a Data Mart; it carries out conceptual design in a semi-automatic fashion starting from the logical scheme of the operational Database (read via ODBC), allows for a core workload to be defined on the conceptual scheme, and carries out logical design to produce the Data Mart scheme.

    Table I. The six phases in our DW design methodology.

    | Step | Input | Output | Involves |
    | --- | --- | --- | --- |
    | Analysis of the operational Database | existing documentation | (reconciled) Database scheme | designer; managers of the information system |
    | Requirement specification | Database scheme | facts; preliminary workload | designer; final users |
    | Conceptual design | Database scheme; facts; preliminary workload | conceptual scheme | designer; final users |
    | Workload refinement, conceptual scheme validation | conceptual scheme; preliminary workload | validated conceptual scheme; workload | designer; final users |
    | Logical design | conceptual scheme; Data volume; workload | logical scheme | designer |
    | Physical design | logical scheme; target DBMS; workload | physical scheme | designer |
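
    The logical-design step that WAND automates can be pictured as translating a fact schema (fact, measures, dimension hierarchies) into a star schema. The sketch below performs such a translation on an invented fact schema; the naming conventions and generated DDL are illustrative, not WAND's actual output.

```python
# Sketch: derive star-schema DDL from a tiny fact-schema description.
# Schema contents and naming conventions are illustrative only.
fact_schema = {
    "fact": "sale",
    "measures": ["quantity", "revenue"],
    "dimensions": {
        "date":    ["day", "month", "year"],
        "product": ["product_name", "category"],
        "store":   ["store_name", "city"],
    },
}

def star_schema_ddl(schema):
    """Emit one dimension table per hierarchy and one fact table with
    foreign keys to every dimension plus the measures."""
    stmts = []
    fk_cols = []
    for dim, attrs in schema["dimensions"].items():
        cols = [f"  {dim}_key INTEGER PRIMARY KEY"] + [f"  {a} TEXT" for a in attrs]
        stmts.append(f"CREATE TABLE dim_{dim} (\n" + ",\n".join(cols) + "\n);")
        fk_cols.append(f"  {dim}_key INTEGER REFERENCES dim_{dim}({dim}_key)")
    measure_cols = [f"  {m} REAL" for m in schema["measures"]]
    stmts.append(f"CREATE TABLE fact_{schema['fact']} (\n"
                 + ",\n".join(fk_cols + measure_cols) + "\n);")
    return "\n\n".join(stmts)

print(star_schema_ddl(fact_schema))
```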

Paul Gray - One of the best experts on this subject based on the ideXlab platform.

  • Business Intelligence
    Handbook on Decision Support Systems, Vol 2, 2004
    Co-Authors: Solomon Negash, Schwain Schwepps, Paul Gray
    Abstract:

    Business intelligence (BI) is a Data-driven DSS that combines Data gathering, Data storage, and knowledge management with analysis to provide input to the decision process. The term originated in 1989; prior to that, many of its characteristics were part of executive information systems. Business intelligence emphasizes analysis of large volumes of Data about the firm and its operations. It includes competitive intelligence (monitoring competitors) as a subset. In computer-based environments, business intelligence uses a large Database, typically stored in a Data warehouse or Data Mart, as its source of information and as the basis for sophisticated analysis. Analyses range from simple reporting to slice-and-dice, drill-down, answering ad hoc queries, real-time analysis, and forecasting. A large number of vendors provide analysis tools. Perhaps the most useful of these is the dashboard. Recent developments in BI include business performance measurement (BPM), business activity monitoring (BAM), and the expansion of BI from being a staff tool to being used by people throughout the organization (BI for the masses). In the long term, BI techniques and findings will be embedded into business processes.
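
    The slice-and-dice and drill-down operations mentioned above can be illustrated on a toy sales cube; the data and column names below are invented.

```python
# Sketch: slice, dice, roll-up, and drill-down on a toy sales cube
# using pandas. Data and column names are invented for illustration.
import pandas as pd

sales = pd.DataFrame({
    "year":    [2003, 2003, 2003, 2004, 2004, 2004],
    "quarter": ["Q1", "Q1", "Q2", "Q1", "Q2", "Q2"],
    "region":  ["EU", "US", "EU", "US", "EU", "US"],
    "revenue": [100, 150, 120, 160, 130, 170],
})

# Slice: fix one dimension member (year = 2004).
slice_2004 = sales[sales["year"] == 2004]

# Dice: restrict several dimensions at once.
dice = sales[(sales["year"] == 2004) & (sales["region"] == "EU")]

# Roll-up / drill-down: revenue by year, then down to quarter level.
by_year = sales.groupby("year")["revenue"].sum()
by_year_quarter = sales.groupby(["year", "quarter"])["revenue"].sum()

print(by_year)
print(by_year_quarter)
```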

Sang Chan Park - One of the best experts on this subject based on the ideXlab platform.

  • Application of Data mining tools to hotel Data Mart on the intranet for Database marketing
    Expert Systems With Applications, 1998
    Co-Authors: Sang Chan Park
    Abstract:

    Data mining, which is also referred to as knowledge discovery in Databases, is the process of extracting valid, previously unknown, comprehensible and actionable information from large Databases and using it to make crucial business decisions. In this paper, we present the Data mining process from Data extraction to knowledge interpretation, and Data mining tasks and their corresponding algorithms. Before applying Data mining techniques to a real-world application, we build a Data Mart on the enterprise Intranet. RFM (recency, frequency, and monetary) Data extracted from the Data Mart are used extensively for our analysis. We then propose a new marketing strategy that fully utilizes the knowledge resulting from Data mining.
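
    A minimal sketch of the RFM extraction step described above, computed from stay-level guest records; the data, column names, and three-bucket scoring are illustrative assumptions, not the paper's actual procedure.

```python
# Sketch: compute recency, frequency, and monetary (RFM) values per guest
# from stay-level records, then bucket each into a 1-3 score (3 = best).
# Data and scoring scheme are illustrative only.
import pandas as pd

stays = pd.DataFrame({
    "guest_id": ["G1", "G1", "G2", "G3", "G3", "G3"],
    "checkout": pd.to_datetime(["1997-01-10", "1997-11-03", "1997-06-20",
                                "1997-02-14", "1997-07-01", "1997-12-05"]),
    "spend":    [300.0, 450.0, 120.0, 800.0, 650.0, 700.0],
})

as_of = pd.Timestamp("1998-01-01")
rfm = stays.groupby("guest_id").agg(
    recency_days=("checkout", lambda s: (as_of - s.max()).days),
    frequency=("checkout", "size"),
    monetary=("spend", "sum"),
)

# Lower recency is better, so its labels run in reverse; rank() breaks
# ties before binning the frequency counts.
rfm["R"] = pd.qcut(rfm["recency_days"], 3, labels=[3, 2, 1]).astype(int)
rfm["F"] = pd.qcut(rfm["frequency"].rank(method="first"), 3, labels=[1, 2, 3]).astype(int)
rfm["M"] = pd.qcut(rfm["monetary"], 3, labels=[1, 2, 3]).astype(int)

print(rfm)
```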

Stefano Rizzi - One of the best experts on this subject based on the ideXlab platform.

  • A Survey on Temporal Data Warehousing
    International Journal of Data Warehousing and Mining, 2009
    Co-Authors: Matteo Golfarelli, Stefano Rizzi
    Abstract: see the entry under Matteo Golfarelli above.

  • Data warehouse design from XML sources
    2004
    Co-Authors: Matteo Golfarelli, Stefano Rizzi, Boris Vrdoljak
    Abstract: see the entry under Matteo Golfarelli above.

  • Integrating XML sources into a Data warehouse environment
    International Conference on Software Telecommunications and Computer Networks, 2001
    Co-Authors: Matteo Golfarelli, Stefano Rizzi, Boris Vrdoljak
    Abstract: see the entry under Matteo Golfarelli above.

  • WAND: a CASE tool for Data warehouse design
    ICDE Demo Sessions, 2001
    Co-Authors: Matteo Golfarelli, Stefano Rizzi
    Abstract: see the entry under Matteo Golfarelli above.