Open Source Technology

14,000,000 Leading Edge Experts on the ideXlab platform

Scan Science and Technology

Contact Leading Edge Experts & Companies


The Experts below are selected from a list of 3,609 Experts worldwide, ranked by the ideXlab platform

Min-hsin Chen - One of the best experts on this subject based on the ideXlab platform.

  • E-Learning: Creating a High Touch Learning Community with High Tech
    2009 Fourth International Conference on Innovative Computing Information and Control (ICICIC), 2009
    Co-Authors: Pat Lemay Burr, Annette E. Craven, Min-hsin Chen
    Abstract:

    E-Learning can be heightened as a learning experience through the use of free, Open Source Technology that allows students to create, produce, edit, compress, and post their final academic podcasts online. This paper discusses how the process might evolve and how students may be tutored with a multitude of readily available online resources. In addition, ideas are offered for appropriate topic development for the podcasts.

Oki Gumilar - One of the best experts on this subject based on the ideXlab platform.

  • Integrating Pipeline Data Management Application and Google Maps Dataset on Web Based GIS Application Using Open Source Technology Sharp Map and Open Layers
    2010 8th International Pipeline Conference Volume 3, 2010
    Co-Authors: Arie Wisianto, Hidayatus Saniya, Oki Gumilar
    Abstract:

    Development of web-based GIS applications often incurs high costs for base map datasets and software licenses. A web-based GIS Pipeline Data Management Application can instead be developed by combining Google Maps datasets with available local spatial datasets, resulting in comprehensive spatial information. Sharp Map is an easy-to-use mapping library for web and desktop applications. It provides access and spatial querying for many types of GIS data. The engine is written in C#, is based on the .NET 2.0 framework, and offers advantages for integration with pipeline data models such as PODS using .NET Technology. Sharp Map enables the development of WMS and web services for serving pipeline data management information in internet/intranet web-based applications. Open Layers is used to integrate the pipeline data model and the Google Maps dataset on a single map display with a user-friendly, dynamic user interface. Together, Sharp Map and Open Layers create a powerful web-based GIS Pipeline Data Management application by combining specific information from the pipeline data model with comprehensive Google Maps satellite datasets, without publishing private information from the local datasets. The combination of Sharp Map, Open Layers, Google Maps datasets, and .NET Technology results in a low-cost and powerful web-based GIS Pipeline Data Management solution. Given the impact zone of an event, we can then calculate its consequences and, finally, determine its risk. Copyright © 2010 by ASME
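
The abstract above describes serving local pipeline layers through a WMS service (built on Sharp Map) and overlaying them on Google Maps imagery in the browser (via Open Layers). As a minimal sketch of the server-facing half, the WMS GetMap request that a map client issues can be assembled as a plain query string per the WMS 1.1.1 standard; the endpoint and layer names below are hypothetical, invented for illustration.

```python
from urllib.parse import urlencode

def wms_getmap_url(base_url, layers, bbox, width, height,
                   srs="EPSG:4326", fmt="image/png"):
    """Build a WMS 1.1.1 GetMap request URL.

    A map client (such as Open Layers) issues requests of this shape to a
    WMS endpoint and composes the returned image over its base map.
    """
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.1.1",
        "REQUEST": "GetMap",
        "LAYERS": ",".join(layers),               # named layers to render
        "SRS": srs,                               # spatial reference system
        "BBOX": ",".join(str(c) for c in bbox),   # minx,miny,maxx,maxy
        "WIDTH": str(width),
        "HEIGHT": str(height),
        "FORMAT": fmt,
        "TRANSPARENT": "TRUE",  # lets the overlay compose over the base map
    }
    return base_url + "?" + urlencode(params)

# Hypothetical endpoint and layer names, for illustration only.
url = wms_getmap_url("http://example.com/pipelinewms",
                     ["pipelines", "valves"],
                     (106.0, -7.0, 108.0, -5.0), 512, 512)
```

A client like Open Layers generates requests of exactly this shape for each view extent, so keeping the local layers behind a WMS endpoint lets them be combined with a commercial base map without exporting the private datasets themselves.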

Pat Lemay Burr - One of the best experts on this subject based on the ideXlab platform.

  • E-Learning: Creating a High Touch Learning Community with High Tech
    2009 Fourth International Conference on Innovative Computing Information and Control (ICICIC), 2009
    Co-Authors: Pat Lemay Burr, Annette E. Craven, Min-hsin Chen
    Abstract:

    E-Learning can be heightened as a learning experience through the use of free, Open Source Technology that allows students to create, produce, edit, compress, and post their final academic podcasts online. This paper discusses how the process might evolve and how students may be tutored with a multitude of readily available online resources. In addition, ideas are offered for appropriate topic development for the podcasts.

Maciej Ciesielski - One of the best experts on this subject based on the ideXlab platform.

  • Parallel Multi-core Verilog HDL Simulation Using Domain Partitioning
    2014 IEEE Computer Society Annual Symposium on VLSI, 2014
    Co-Authors: Tariq B. Ahmad, Maciej Ciesielski
    Abstract:

    While multi-core computing has become pervasive, scaling single-core computations to multi-core computations remains a challenge. This paper aims to accelerate RTL and functional gate-level simulation in the current multi-core computing environment. This work addresses two types of partitioning schemes for multi-core simulation: functional and domain-based. We discuss the limitations of functional partitioning, which new commercial multi-core simulators offer to speed up functional gate-level simulation. We also present a novel solution that increases RTL and functional gate-level simulation performance based on domain partitioning. This is the first known work that improves simulation performance by leveraging Open Source Technology against commercial simulators.
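
The paper's implementation is not reproduced here, but the idea behind domain partitioning, as contrasted with functional partitioning, can be sketched in a few lines: design elements are grouped by the clock domain that drives them, each group can then be simulated concurrently on its own core, and only the signals crossing domains require synchronization between workers. The toy netlist representation and domain tags below are invented for illustration.

```python
from collections import defaultdict

def partition_by_domain(netlist):
    """Group design elements by clock domain.

    `netlist` maps an element name to the clock that drives it. Each
    resulting group is an independent simulation unit that can run on
    its own core.
    """
    domains = defaultdict(list)
    for element, clock in netlist.items():
        domains[clock].append(element)
    return dict(domains)

def crossing_signals(edges, netlist):
    """Signals whose driver and load sit in different clock domains;
    these are the only points where partitions must exchange events."""
    return [(src, dst) for src, dst in edges
            if netlist[src] != netlist[dst]]

# Toy design: elements tagged with the clock domain that drives them.
netlist = {
    "cpu_reg":  "clk_cpu",
    "cpu_alu":  "clk_cpu",
    "uart_tx":  "clk_uart",
    "ddr_ctrl": "clk_ddr",
}
parts = partition_by_domain(netlist)
edges = [("cpu_reg", "cpu_alu"), ("cpu_alu", "uart_tx")]
crossings = crossing_signals(edges, netlist)
```

The appeal over functional partitioning is that clock-domain boundaries are natural synchronization points: within a domain, events share one clock and need no inter-core coordination.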

R.g. Mark - One of the best experts on this subject based on the ideXlab platform.

  • The annotation station: an Open-Source Technology for annotating large biomedical databases
    Computers in Cardiology 2004, 2004
    Co-Authors: O.t. Abdala, G.d. Clifford, M. Saeed, A. Reisner, G. Moody, I. Henry, R.g. Mark
    Abstract:

    The authors present a new framework for annotating large databases of multichannel clinical data such as MIMIC II. MIMIC II is an ICU database which includes both regularly sampled (but often discontinuous) high-rate data (such as ECG and BP waveforms) and low-resolution data (such as waveform-derived averages, lab results, medication changes, fluid balances, and nurse-verified signal values) which are often sparse, asynchronous, and irregularly sampled. Because of the extremely rich, high-dimensional nature of MIMIC II medical data, a vast quantity of labeled data is required to test and validate ICU decision-support algorithms. MIMIC II presents a new annotation challenge which cannot be met by currently existing annotation structures due to the heterogeneous data types and unavailability of data. The authors have constructed a hardware/software configuration known as the "annotation station": a quad-monitor, time-synchronized viewing tool which displays all of this data in an organized fashion. The software gives the user the opportunity to produce annotations in a practicable format that serves the goals of the MIMIC II project. The annotation structure must apply to all the numeric signals in MIMIC as well as to non-numeric data such as nursing notes, discharge summaries, and patient histories. Furthermore, in order for the annotation framework to adequately represent the state of the patient to a human or machine, it must involve clinical coding using accepted medical lexicons and causal linkage of one annotation to another. This linkage is the basis of causal reasoning between significant events in different streams of the data. The annotations also include subjective expert assessments of a patient's hemodynamic state and trajectory. These assessments provide objective and subjective labels for assessing algorithms that track trends in the data with a view to producing intelligent alarms.
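
The causal linkage described above, where one annotation points at its antecedents in other data streams, can be sketched as a minimal data structure. The field names and the clinical example are hypothetical illustrations, not the actual MIMIC II annotation format.

```python
from dataclasses import dataclass, field

@dataclass
class Annotation:
    """A labeled event on one data stream, with optional causal links."""
    stream: str       # e.g. an ABP waveform channel or the nursing notes
    time: float       # seconds from the start of the record
    code: str         # term drawn from an accepted medical lexicon
    comment: str = ""
    causes: list = field(default_factory=list)  # antecedent Annotations

# Hypothetical example: a fluid bolus annotated on the fluid-balance
# stream, causally linked to an earlier hypotension event on a waveform.
hypotension = Annotation("ABP waveform", 3600.0, "hypotension")
bolus = Annotation("fluid balance", 3900.0, "fluid bolus",
                   causes=[hypotension])
```

Following the `causes` links across streams is what enables the causal reasoning between significant events that the abstract describes, and gives trend-tracking algorithms labeled event chains to validate against.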