Data Planning

The experts below are selected from a list of 231 experts worldwide, ranked by the ideXlab platform

Pin Pin Yeo - One of the best experts on this subject based on the ideXlab platform.

Robyn B. Reed - One of the best experts on this subject based on the ideXlab platform.

  • Diving into Data: Planning a Research Data Management Event.
    Journal of eScience Librarianship, 2015
    Co-Authors: Robyn B. Reed
    Abstract:

    The roles librarians play in data management vary depending on institutional need and support. While some libraries have established collaborations in these areas and have integrated themselves into data management activities, other libraries are in the beginning stages of assisting researchers with their data management challenges. The areas where librarians play roles also vary widely and may include consulting on and writing data management plans for grant applications, assisting with determining metadata standards, data curation and archiving, and finding and citing appropriate data repositories (Tenopir et al. 2012; Soehner 2010). Additionally, many academic research libraries are planning to offer data management services but have not yet initiated them (Tenopir et al. 2014). Since these services can be institution-specific, they can be implemented in many ways (Raboin et al. 2013). A challenge most libraries face is addressing the needs of a diverse clientele. The George T. Harrell Health Sciences Library (Harrell Library) supports the information, research, and education needs of almost 10,000 faculty, staff, students, and postdoctoral scholars across both the Penn State College of Medicine and the Milton S. Hershey Medical Center (Penn State Hershey). In addition to this large user population, the Harrell Library supports a wide range of research activities in clinical, biomedical, and translational areas, as well as providing support for medical and graduate education programs. With no formal mechanism to assist researchers with data management issues, most information was scattered throughout the institution. Many people relied on "word of mouth" or did not know where to turn when faced with questions related to data management. The action taken to initiate library involvement in data management activities was to host a half-day data management symposium, with the target audience being researchers: faculty, staff, and students at the Penn State Hershey and University Park campuses. The goals of this event were to assist researchers in identifying resources and information on data management and to highlight the library as a conduit of information.

Graeme Shanks - One of the best experts on this subject based on the ideXlab platform.

  • The challenges of strategic data planning in practice: An interpretive case study
    The Journal of Strategic Information Systems, 1997
    Co-Authors: Graeme Shanks
    Abstract:

    Many organisations have had difficulty with strategic data planning despite strong arguments about its value. A number of empirical studies of strategic data planning have identified various factors important to its success, but few have presented detailed contextual explanations. This paper reports an in-depth, interpretive case study which examines the strategic data planning process in a large Australian bank. The paper explains why strategic data planning is such a difficult undertaking and suggests three important implications for practitioners. First, both business managers and information systems staff find the output data architecture difficult to understand, and improved representations and explanations of the data architecture should be used. Second, strategic data planning is a complex social activity, and an understanding of the organisational context within which it takes place is critical to its success. Third, strategic data planning may not be the best way to build a data architecture, and other approaches which facilitate participation should be considered.

Nina Lewin - One of the best experts on this subject based on the ideXlab platform.

B F Asztalos - One of the best experts on this subject based on the ideXlab platform.

  • Improved allocation of costs through analysis of variation in data: planning of laboratory studies.
    Clinica Chimica Acta, 2001
    Co-Authors: T A Foster, B F Asztalos
    Abstract:

    Background: When developing a new laboratory test for the study of human diseases, it is important to identify and control internal and external sources of variation that affect test results. It is also imperative that the precision of the test not only meets pre-established requirements and does not exceed the allowable total error, but also that these objectives are reached without undue expenditure of either time or financial resources. Methods: This study applies statistical principles in designing a cost-effective experimental approach for determining the analytical precision of a new test. The approach applies the statistical concept of variance components to the problem of balancing a pre-established level of analytical precision against the expenses incurred in achieving this precision. Results: We demonstrated (1) estimation of variance components, (2) use of these estimates for improving the allocation of costs within the experiment, and (3) use of these estimates for determining the optimal number of replicate measurements. Conclusions: Although elimination of all sources of variation that can affect laboratory test results is unlikely, the application of analysis of variance (ANOVA) statistical techniques can lead to a cost-effective allocation of resources for estimating the precision of a laboratory test.
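
    The paper itself does not include code; the sketch below (in Python, with illustrative function names, simulated pilot data, and assumed cost figures) shows one way the approach in the abstract can be carried out, under the assumption of a balanced one-way random-effects layout: variance components are estimated from ANOVA mean squares, and the classical cost-ratio allocation rule then suggests how many replicate measurements per run balance precision against per-run and per-replicate expense.

    import numpy as np

    def variance_components_oneway(data):
        """Estimate between-run and within-run variance components from a
        balanced one-way random-effects layout (rows = runs, columns = replicates)."""
        k, n = data.shape
        run_means = data.mean(axis=1)
        grand_mean = data.mean()
        ms_between = n * np.sum((run_means - grand_mean) ** 2) / (k - 1)
        ms_within = np.sum((data - run_means[:, None]) ** 2) / (k * (n - 1))
        # Method-of-moments estimators; truncate the between-run component at zero.
        return max((ms_between - ms_within) / n, 0.0), ms_within

    def optimal_replicates(s2_between, s2_within, cost_run, cost_rep):
        """Classical allocation rule: replicates per run that minimise the variance
        of the overall mean for given per-run and per-replicate costs."""
        if s2_between <= 0:
            return float("inf")  # all variation is within-run
        return np.sqrt((cost_run * s2_within) / (cost_rep * s2_between))

    # Illustrative pilot study: 6 runs x 4 replicates, assumed true between-run
    # SD of 2.0 and within-run SD of 1.0 around an analyte level of 100.
    rng = np.random.default_rng(0)
    pilot = 100 + rng.normal(0, 2.0, (6, 1)) + rng.normal(0, 1.0, (6, 4))

    s2_b, s2_w = variance_components_oneway(pilot)
    n_opt = optimal_replicates(s2_b, s2_w, cost_run=50.0, cost_rep=5.0)
    print(f"between-run variance ~ {s2_b:.2f}, within-run variance ~ {s2_w:.2f}")
    print(f"suggested replicates per run ~ {n_opt:.1f}")

    Rounding the suggested replicate count to a practical integer and re-checking the implied variance of the mean against the pre-established precision target mirrors the cost-balancing step described in the abstract.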
