Data Acquisition Process

The experts below are selected from a list of 327 experts worldwide, ranked by the ideXlab platform.

Di Wang - One of the best experts on this subject based on the ideXlab platform.

  • An analysis of interaction between users and open government data portals in the data acquisition process
    Pacific Rim Knowledge Acquisition Workshop, 2018
    Co-Authors: Di Wang, Deborah Richards, Chuanfu Chen
    Abstract:

    Open government data (OGD) portals have developed rapidly in recent years because of the potential benefits of utilizing OGD. However, scholars have identified lack of use as a key obstacle to realizing these benefits. Although studies have examined the decisive factors in OGD utilization from the perspective of either portals or users, they have not considered the interaction between the two. Our study therefore analyzed the interaction between users and OGD portals during users’ data acquisition process from three aspects: data acquisition methods, data quality requirements, and help functions. We surveyed a Chinese population to collect data for analysis. Results show high user acceptance of keyword search as a method for data acquisition through OGD portals, but browsing showed higher usage frequency and was a more stable data acquisition behavior. Females showed better acceptance than males of regular recommendations (e.g., RSS) based on visiting histories. Users’ age, educational background, and occupation affect their demands for different data quality attributes. Our analysis also shows positive relationships between users’ data acquisition habits and their data quality demands, between users’ need for help and their perceived difficulty in using the portal, and between users’ need for help and their data quality demands. We suggest promoting OGD utilization by offering better help functions and improving data quality in the future development of OGD portals.
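
The gender and group differences reported above are the kind of survey result that a test of independence on a contingency table would support. A minimal sketch in Python, with illustrative counts that are not the paper’s data:

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical survey counts (not the paper's data): acceptance of
# history-based recommendations (e.g. RSS), split by gender.
#                 accept  reject
table = np.array([[58, 42],   # female respondents
                  [37, 63]])  # male respondents

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
```

A small p-value indicates that acceptance is associated with gender; analogous tables could relate age, education, or occupation to data quality demands.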

  • The Data Acquisition, Process and Reconstruction for a Reverse Engineering Study
    Advanced Materials Research, 2011
    Co-Authors: Zhi Hua Gao, Di Wang
    Abstract:

    As a modern design method that runs contrary to traditional design, reverse engineering (RE) technology uses special measuring apparatus to obtain dimensional data, enabling a design to start from an existing production model by combining computer technology, measurement technology, and CAD/CAM technology. The authors first list the important application fields of RE technology, then divide the RE process into four steps and describe the detailed working flowchart. The data acquisition procedure includes datum-point creation and model measurement. For the three-dimensional coordinate measuring machine, a contact measurement device, the measuring method and working principle are discussed in depth. The acquired data must also be processed to meet technical requirements, a step consisting of data splicing, noise elimination, data interpolation, data smoothing, and data filtering. Finally, the authors summarize surface reconstruction from the processed data and expect RE technology to find more extensive application in mechanical fields.
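
The abstract names a chain of point-processing steps (splicing, noise elimination, interpolation, smoothing, filtering) without giving algorithms. Below is a minimal sketch of two of those steps, statistical outlier removal and moving-average smoothing, on a simulated scan line; the parameters and data are illustrative assumptions, not the paper's method:

```python
import numpy as np

def remove_outliers(points, k=8, std_ratio=2.0):
    """Statistical outlier removal: drop points whose mean distance to
    their k nearest neighbours is far above the global average."""
    diffs = points[:, None, :] - points[None, :, :]   # pairwise vectors
    dists = np.linalg.norm(diffs, axis=2)             # pairwise distances
    knn = np.sort(dists, axis=1)[:, 1:k + 1]          # k nearest, skipping self
    mean_knn = knn.mean(axis=1)
    keep = mean_knn < mean_knn.mean() + std_ratio * mean_knn.std()
    return points[keep]

def smooth_profile(z, window=5):
    """Moving-average smoothing of a 1-D scan line."""
    kernel = np.ones(window) / window
    return np.convolve(z, kernel, mode="same")

# Simulated scan line: a curved profile with measurement noise and two spikes.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 200)
z = np.sin(x) + rng.normal(0.0, 0.02, x.size)
z[[50, 120]] += 1.0                                   # gross outliers
cloud = np.column_stack([x, z])

cleaned = remove_outliers(cloud)
smoothed = smooth_profile(cleaned[:, 1])
print(f"kept {cleaned.shape[0]} of {cloud.shape[0]} points")
```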

Jeanine Darmiento - One of the best experts on this subject based on the ideXlab platform.

  • Parenchymal airspace profiling: sensitive quantification and characterization of lung structure evaluating parenchymal destruction
    American Journal of Respiratory Cell and Molecular Biology, 2016
    Co-Authors: Rui Xiao, Monica P Goldklang, Jeanine Darmiento
    Abstract:

    Lung morphometry was introduced over 50 years ago to provide quantitative evaluation of lung structure. The existing parameters, such as mean linear intercept and destructive index, suffer from simplistic data interpretation and a subjective data acquisition process. To overcome these shortcomings, parenchymal airspace profiling (PAP) was developed as a more detailed and unbiased quantitative method. Following the standard protocols of fixation, embedding, and sectioning, lung micrographs were: (1) marked with nonparenchymal area, preprocessed, and binarized under the researcher’s supervision; (2) analyzed with a statistical learning method, the Gaussian mixture model, to provide an unbiased categorization of parenchymal airspace compartments corresponding to a single alveolus, alveolar sac, and ductal/destructive airspace; and (3) further quantified into morphometric parameters, including reference volume, alveolar count, and ductal/destructive fraction (DF), based on stereological principles.
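
The abstract specifies a Gaussian mixture model that partitions airspaces into three compartments. A minimal sketch using scikit-learn, assuming log airspace area as the single feature and simulated values (the paper's actual features and data are not reproduced here):

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Simulated log-areas for three airspace populations (illustrative values):
# single alveoli, alveolar sacs, and ductal/destructive airspaces.
rng = np.random.default_rng(1)
log_areas = np.concatenate([
    rng.normal(3.0, 0.3, 500),   # single alveoli
    rng.normal(4.2, 0.3, 300),   # alveolar sacs
    rng.normal(5.5, 0.4, 100),   # ductal/destructive airspaces
]).reshape(-1, 1)

# Fit a 3-component GMM and assign each airspace to a compartment.
gmm = GaussianMixture(n_components=3, random_state=0).fit(log_areas)
labels = gmm.predict(log_areas)

# Order components by mean so the last label is the largest-airspace class,
# then report a ductal/destructive fraction analogous to the paper's DF.
order = np.argsort(gmm.means_.ravel())
destructive = labels == order[-1]
print(f"ductal/destructive fraction: {destructive.mean():.2f}")
```

Ordering the components by their means gives a stable mapping from mixture labels to compartments regardless of the order in which the fit converges.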

Luz M Hernandez - One of the best experts on this subject based on the ideXlab platform.

  • Diffused matrix format: a new storage and processing format for airborne hyperspectral sensor images
    Sensors, 2010
    Co-Authors: Pablo Martinez, Alejandro Cristo, Magaly Koch, Rosa M Perez, Thomas Schmid, Luz M Hernandez
    Abstract:

    At present, hyperspectral images are mainly obtained with airborne sensors that are subject to turbulence while the spectrometer is acquiring the data. Geometric corrections are therefore required to produce spatially correct images for visual interpretation and change detection analysis. This paper analyzes the data acquisition process of airborne sensors. The main objective is to propose a new data format, called Diffused Matrix Format (DMF), adapted to the sensor's characteristics, including its spectral and spatial information. The second objective is to compare the accuracy of quantitative maps derived using the DMF data structure with those obtained from raster images based on traditional data structures. Results show that DMF processing is more accurate and straightforward than conventional image processing of remotely sensed data, with the advantage that the DMF file structure requires less storage space than other data formats. In addition, the data processing time does not increase when DMF is used.
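
The paper's exact DMF layout is not reproduced above; the core idea is to keep each measured spectrum together with its own ground coordinates rather than resampling everything onto a regular raster grid. A minimal sketch with an illustrative record layout (field names and sizes are assumptions):

```python
import numpy as np

# Illustrative record layout: each acquired pixel keeps its own ground
# position (perturbed by aircraft turbulence) alongside its full spectrum.
N_PIXELS, N_BANDS = 1000, 64
dmf_dtype = np.dtype([
    ("x", np.float64),                     # ground easting of this sample
    ("y", np.float64),                     # ground northing of this sample
    ("spectrum", np.float32, (N_BANDS,)),  # radiance in every band
])

rng = np.random.default_rng(2)
records = np.zeros(N_PIXELS, dtype=dmf_dtype)
records["x"] = rng.uniform(0, 500, N_PIXELS)
records["y"] = rng.uniform(0, 500, N_PIXELS)
records["spectrum"] = rng.random((N_PIXELS, N_BANDS), dtype=np.float32)

def spectra_near(recs, x0, y0, radius):
    """Quantitative maps query samples near a ground location directly,
    avoiding the resampling error introduced by rasterization."""
    d2 = (recs["x"] - x0) ** 2 + (recs["y"] - y0) ** 2
    return recs["spectrum"][d2 <= radius ** 2]

print(spectra_near(records, 250.0, 250.0, 20.0).shape)
```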

J M B Dias - One of the best experts on this subject based on the ideXlab platform.

  • Does independent component analysis play a role in unmixing hyperspectral data?
    Pattern Recognition and Image Analysis, 2003
    Co-Authors: Jose M P Nascimento, J M B Dias
    Abstract:

    Independent component analysis (ICA) has recently been proposed as a tool to unmix hyperspectral data. ICA is founded on two assumptions: (i) the observed data vector is a linear mixture of the sources (abundance fractions); (ii) the sources are independent. Concerning hyperspectral data, the first assumption is valid whenever the constituent substances are surface distributed. The second assumption, however, is violated, since the sum of the abundance fractions associated with each pixel is constant due to physical constraints in the data acquisition process; thus, the sources cannot be independent. This paper gives evidence that ICA, at least in its canonical form, is not suited to unmixing hyperspectral data. We arrive at this conclusion by minimizing the mutual information of simulated hyperspectral mixtures. The hyperspectral data model includes signature variability, abundance perturbation, sensor point spread function (PSF), the abundance constraint, and electronic noise. The mutual information computation is based on fitting mixtures of Gaussians to the observed data.
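
The dependence argument is short: if each pixel's abundance fractions satisfy s1 + s2 + ... + sn = 1, then any one source is fully determined by the others, so the sources cannot be mutually independent. A minimal sketch with simulated endmembers and Dirichlet abundances (all values illustrative) showing the induced correlation; scikit-learn's FastICA stands in here for the canonical ICA the paper evaluates:

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(3)

# Simulated linear mixing model x = M s: abundances drawn from a Dirichlet
# distribution so every pixel's fractions sum to one (illustrative data).
n_pixels, n_bands, n_sources = 2000, 50, 3
M = rng.random((n_bands, n_sources))              # endmember signatures
S = rng.dirichlet(np.ones(n_sources), n_pixels)   # abundances, rows sum to 1
X = S @ M.T + rng.normal(0.0, 1e-3, (n_pixels, n_bands))  # sensor noise

# The sum-to-one constraint makes the sources correlated, hence dependent:
print(np.corrcoef(S.T).round(2))                  # nonzero off-diagonal terms

# FastICA assumes independent sources, so its components are not expected
# to recover the true abundance fractions.
ica = FastICA(n_components=n_sources, random_state=0)
S_hat = ica.fit_transform(X)
for j in range(n_sources):
    best = max(abs(np.corrcoef(S[:, j], S_hat[:, k])[0, 1])
               for k in range(n_sources))
    print(f"source {j}: best |corr| with any ICA component = {best:.2f}")
```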

Michelle R Embry - One of the best experts on this subject based on the ideXlab platform.

  • A framework for cumulative risk assessment in the 21st century
    Critical Reviews in Toxicology, 2017
    Co-Authors: Angelo Moretto, Ammie N Bachman, Alan R Boobis, Keith R Solomon, Timothy P Pastoor, Martin F Wilks, Michelle R Embry
    Abstract:

    The ILSI Health and Environmental Sciences Institute (HESI) has developed a framework to support a transition in the way information for chemical risk assessment is obtained and used (RISK21). The approach is based on detailed problem formulation, where exposure drives the data acquisition process so that informed decisions on human health safety can be made as soon as sufficient evidence is available. Information is evaluated in a transparent and consistent way, with the aim of optimizing available resources. In the context of risk assessment, cumulative risk assessment (CRA) poses additional problems and questions that can be addressed using the RISK21 approach. The focus in CRA to date has generally been on chemicals that have common mechanisms of action. Recently, concern has also been expressed about chemicals acting on multiple pathways that lead to a common health outcome, and about other, non-chemical conditions (non-chemical stressors) that can lead to or modify a common outcome. Ac...
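
RISK21 itself is a conceptual framework rather than an algorithm. The following is a hypothetical sketch of its exposure-driven, tiered logic, in which data acquisition stops as soon as the evidence supports a decision; all tiers, bounds, and the threshold are invented for illustration and are not HESI's values:

```python
# Each tier narrows the exposure estimate (mg/kg bw/day) at increasing cost.
tiers = [
    ("default assumptions", (0.001, 1.0)),
    ("use-pattern data",    (0.01, 0.1)),
    ("biomonitoring",       (0.02, 0.04)),
]
toxicity_threshold = 0.5  # hypothetical health-based guidance value

for name, (low, high) in tiers:
    if high < toxicity_threshold:
        print(f"{name}: upper bound {high} < {toxicity_threshold}; "
              "evidence of safety is sufficient, stop acquiring data")
        break
    if low > toxicity_threshold:
        print(f"{name}: lower bound {low} > {toxicity_threshold}; "
              "risk indicated, act without further tiers")
        break
    print(f"{name}: range ({low}, {high}) straddles the threshold; refine")
else:
    print("highest tier reached without a clear decision")
```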