Modeling Environment

The experts below were selected from a list of 211,443 experts worldwide, ranked by the ideXlab platform.

Vsevolod Yu Tanchuk - One of the best experts on this subject based on the ideXlab platform.

  • Online chemical modeling environment (OCHEM): web platform for data storage, model development and publishing of chemical information
    Journal of Computer-aided Molecular Design, 2011
    Co-Authors: Iurii Sushko, Sergii Novotarskyi, Robert Korner, Anil Kumar Pandey, Matthias Rupp, Wolfram Teetz, Stefan Brandmaier, Ahmed Abdelaziz, Volodymyr V Prokopenko, Vsevolod Yu Tanchuk
    Abstract:

    The Online Chemical Modeling Environment (OCHEM) is a web-based platform that aims to automate and simplify the typical steps required for QSAR modeling. The platform consists of two major subsystems: the database of experimental measurements and the modeling framework. The user-contributed database provides a set of tools for easy input, search, and modification of thousands of records. The OCHEM database is based on the wiki principle and focuses primarily on the quality and verifiability of the data. The database is tightly integrated with the modeling framework, which supports all the steps required to create a predictive model: data search, calculation and selection of a vast variety of molecular descriptors, application of machine learning methods, validation, analysis of the model, and assessment of the applicability domain. Compared to other similar systems, OCHEM is not intended to re-implement existing tools or models but rather to invite the original authors to contribute their results, make them publicly available, share them with other users, and become members of the growing research community. Our intention is to make OCHEM a widely used platform for performing QSPR/QSAR studies online and sharing them with other users on the Web. The ultimate goal of OCHEM is to collect all possible chemoinformatics tools within one simple, reliable, and user-friendly resource. OCHEM is free for web users and is available online at http://www.ochem.eu.
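
    The workflow the platform automates can be pictured in a few lines of code. The following is a minimal illustrative sketch only (OCHEM itself is a web platform; the descriptor set, SMILES strings, and endpoint values below are hypothetical): it computes molecular descriptors with RDKit, fits a machine-learning model with scikit-learn, and validates it by cross-validation.

        # Minimal QSAR-style pipeline sketch: descriptors -> model -> validation.
        # Data and descriptor choices are hypothetical placeholders, not OCHEM's.
        import numpy as np
        from rdkit import Chem
        from rdkit.Chem import Descriptors
        from sklearn.ensemble import RandomForestRegressor
        from sklearn.model_selection import cross_val_score

        def calc_descriptors(smiles):
            """Compute a small descriptor vector for one compound."""
            mol = Chem.MolFromSmiles(smiles)
            return [Descriptors.MolWt(mol),
                    Descriptors.MolLogP(mol),
                    Descriptors.TPSA(mol),
                    Descriptors.NumHDonors(mol),
                    Descriptors.NumHAcceptors(mol)]

        # Hypothetical training set: SMILES paired with a measured endpoint.
        smiles = ["CCO", "CCCCO", "c1ccccc1O", "CC(=O)O", "CCN(CC)CC", "CCOC(=O)C"]
        y = np.array([-0.30, 0.88, 1.46, -0.17, 1.45, 0.73])

        X = np.array([calc_descriptors(s) for s in smiles])

        # Fit a model and estimate predictive power by cross-validation,
        # the same validation step the platform performs automatically.
        model = RandomForestRegressor(n_estimators=100, random_state=0)
        print("cross-validated R^2:", cross_val_score(model, X, y, cv=3, scoring="r2").mean())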

Sergii Novotarskyi - One of the best experts on this subject based on the ideXlab platform.

  • Modeling of non-additive mixture properties using the Online CHEmical database and Modeling Environment (OCHEM)
    Journal of Cheminformatics, 2013
    Co-Authors: Ioana Oprisiu, Sergii Novotarskyi, Igor V Tetko
    Abstract:

    The Online Chemical Modeling Environment (OCHEM, http://ochem.eu) is a web-based platform that provides tools for automating the typical steps necessary to create a predictive QSAR/QSPR model. The platform consists of two major subsystems: a database of experimental measurements and a modeling framework. So far, OCHEM has been limited to the processing of individual compounds. In this work, we extended OCHEM with a new ability to store and model properties of binary non-additive mixtures. The developed system is publicly accessible, meaning that any user on the Web can store new data for binary mixtures and develop models to predict their non-additive properties. The database already contains almost 10,000 data points for the density, bubble point, and azeotropic behavior of binary mixtures. For these data, we developed models for both qualitative (azeotrope/zeotrope) and quantitative endpoints (density and bubble point) using different learning methods and specially developed descriptors for mixtures. The prediction performance of the models was similar to or more accurate than results reported in previous studies. Thus, we have developed and made publicly available a powerful system for modeling mixtures of chemical compounds on the Web.
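
    To make the idea of mixture descriptors concrete, the sketch below shows one generic way to derive a descriptor vector for a binary mixture from the descriptors of its two components: mole-fraction-weighted sums for the additive part, plus difference and interaction terms for non-additivity. This is an illustrative construction, not necessarily the exact scheme used in the OCHEM study, and the component descriptor values are hypothetical.

        import numpy as np

        def mixture_descriptors(d1, d2, x1):
            """Combine component descriptor vectors d1, d2 at mole fraction x1."""
            d1, d2 = np.asarray(d1, float), np.asarray(d2, float)
            x2 = 1.0 - x1
            return np.concatenate([
                x1 * d1 + x2 * d2,   # weighted sum: the additive baseline
                np.abs(d1 - d2),     # dissimilarity terms for non-additive effects
                [x1 * x2],           # interaction strength, maximal at x1 = 0.5
            ])

        # Hypothetical component descriptors (e.g. molecular weight, logP, TPSA).
        ethanol = [46.07, -0.31, 20.23]
        water = [18.02, -1.38, 25.26]
        print(mixture_descriptors(ethanol, water, x1=0.3))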

  • Online chemical modeling environment (OCHEM): web platform for data storage, model development and publishing of chemical information
    Journal of Computer-aided Molecular Design, 2011
    Co-Authors: Iurii Sushko, Sergii Novotarskyi, Robert Korner, Anil Kumar Pandey, Matthias Rupp, Wolfram Teetz, Stefan Brandmaier, Ahmed Abdelaziz, Volodymyr V Prokopenko, Vsevolod Yu Tanchuk
    Abstract: identical to the entry of the same title listed above under Vsevolod Yu Tanchuk.

Philip J. Maechling - One of the best experts on this subject based on the ideXlab platform.

  • Grid computing in the SCEC Community Modeling Environment
    Seismological Research Letters, 2005
    Co-Authors: Philip J. Maechling, Edward H. Field, David Okaya, Vipin Gupta, Nitin Gupta, Thomas H. Jordan
    Abstract:

    In our work on the Southern California Earthquake Center Community Modeling Environment (SCEC/CME) Project (Jordan et al., 2003), we are developing computer systems to support dynamic distributed scientific collaborations. Scientists participating in SCEC collaborations are often willing to share their computer resources, particularly if in return they can gain access to computing capabilities that they do not currently possess. Interorganizational computer sharing can be difficult to achieve due to the many organizational and technical differences between groups. Recently, however, a new software technology called grid computing (Foster et al., 2001) has emerged which is designed to help dynamic organizations, such as SCEC, share heterogeneous collections of computer and storage resources. Grid technology enables organizations to share computer resources with other organizations even when the shared computers are administered differently and have dissimilar hardware and operating systems. Organizations can create a grid environment to provide their users with computer resources such as CPU cycles, disk storage, and software programs that are available outside of their local computer administrative domains. This is done by creating a new computer administrative domain referred to as a virtual organization (VO). A VO has its own set of administrative policies that represents a combination of local computer policies, the computer policies of the groups you are sharing with, plus some administrative policies required by the VO itself. When we run a program on the “grid”, we are saying, in a sense, that our program is running outside of our own local administrative domain. Grid middleware is used to facilitate the execution of computer programs in a VO. In addition to creating multiorganizational administrative domains, grid middleware also strives to hide the heterogeneity of the shared computing environment. Grid software provides a set of commands to perform basic computing operations, and these commands are the same regardless …
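
    The essential abstraction, a uniform submission interface in front of heterogeneous, separately administered resources, can be caricatured in a few lines. The class and site names below are invented for illustration; real grid middleware such as the Globus Toolkit is of course far more involved.

        from dataclasses import dataclass

        @dataclass
        class Resource:
            name: str        # member site of the virtual organization (VO)
            scheduler: str   # local batch system the middleware must drive
            free_cpus: int

        class VirtualOrganization:
            """Uniform front end that hides per-site administrative differences."""
            def __init__(self, resources):
                self.resources = resources

            def submit(self, command, cpus):
                # Pick any member site with capacity; the user never needs to
                # know which local scheduler or operating system runs the job.
                for r in self.resources:
                    if r.free_cpus >= cpus:
                        r.free_cpus -= cpus
                        return f"job '{command}' -> {r.name} (via {r.scheduler})"
                raise RuntimeError("no resource in the VO has enough free CPUs")

        vo = VirtualOrganization([Resource("scec-cluster", "PBS", 16),
                                  Resource("partner-site", "Condor", 64)])
        print(vo.submit("run_wave_propagation", cpus=32))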

  • Simplifying construction of complex workflows for non-expert users of the Southern California Earthquake Center Community Modeling Environment
    International Conference on Management of Data, 2005
    Co-Authors: Philip J. Maechling, Vipin Gupta, Hans Chalupsky, Maureen Dougherty, Ewa Deelman, Sridhar Gullapalli, Carl Kesselman, Gaurang Mehta, Brian Mendenhall, Thomas A Russ
    Abstract:

    Workflow systems often present the user with rich interfaces that express all the capabilities and complexities of the application programs and the computing environments that they support. However, non-expert users are better served by simple interfaces that abstract away system complexities while still enabling them to construct and execute complex workflows. To explore this idea, we have created a set of tools and interfaces that simplify the construction of workflows. Implemented as part of the Community Modeling Environment developed by the Southern California Earthquake Center, these tools are integrated into a comprehensive workflow system that supports both domain experts and non-expert users.
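
    A toy sketch of this interface-simplification idea: the non-expert user describes a workflow as a plain sequence of named steps, and the tool expands that linear recipe into the explicit dependency graph a full workflow system requires. The API and step names below are invented for illustration and are not the actual SCEC/CME tools.

        class SimpleWorkflow:
            """Linear workflow recipe that hides DAG construction from the user."""
            def __init__(self, name):
                self.name = name
                self.steps = []   # ordered (step_name, program) pairs

            def then(self, step_name, program):
                # Each step implicitly depends on the previous one, so the
                # user never writes dependency edges by hand.
                self.steps.append((step_name, program))
                return self       # enable chaining

            def to_dag(self):
                """Expand the recipe into explicit (parent, child) edges."""
                names = [s for s, _ in self.steps]
                return list(zip(names, names[1:]))

        wf = (SimpleWorkflow("hazard-map")
              .then("mesh", "build_velocity_mesh")
              .then("simulate", "run_wave_propagation")
              .then("plot", "render_hazard_map"))
        print(wf.to_dag())   # [('mesh', 'simulate'), ('simulate', 'plot')]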

  • The SCEC Community Modeling Environment: an information infrastructure for system-level earthquake science
    Seismological Research Letters, 2003
    Co-Authors: Thomas H. Jordan, Philip J. Maechling
    Abstract:

    In the July/August issue of SRL , Tom Owens, writing his first column for the Electronic Seismologist (ES), asked “The Question”: Can we as a community develop the long-term relationships with the computer science community, the financial resources, and the culture necessary to build and maintain a state-of-the-art information technology (IT) structure that will really make a difference in the way we do our science? He discussed the tough practical challenges posed by The Question and highlighted some of the IT efforts underway by IRIS, EarthScope, and other geoscience organizations. In particular, he mentioned a new IT project by the Southern California Earthquake Center (SCEC) that involves an interesting mix of geoscientists and computer scientists. As a follow-up, he invited us to contribute an article to ES on how this project was formulated, how the collaborations have been set up, and how it's going. Tall order, but we'll try! We begin with a little background. In October 2001, SCEC was awarded a five-year, $10 million grant by the NSF Information Technology Research (ITR) Program to develop a Community Modeling Environment (CME)—what IT aficionados would call a “collaboratory”—for system-level earthquake science. In addition to a number of participating SCEC institutions, the project involves IRIS (http://www.iris.edu), the USGS' Pasadena office (http://pasadena.wr.usgs.gov), and two major IT research organizations, the Information Sciences Institute (ISI) of the University of Southern California (http://www.isi.edu) and the San Diego Supercomputer Center (SDSC) of the University of California at San Diego (http://www.sdsc.edu). The overarching goal of the CME project is to improve the information infrastructure for seismic hazard analysis (SHA), which has historically been the major focus of SCEC's research program. SHA seeks to describe the maximum level of shaking that can be expected at a specified site on the Earth's surface due to earthquakes anticipated over a …

Thomas H. Jordan - One of the best experts on this subject based on the ideXlab platform.

  • Grid computing in the SCEC Community Modeling Environment
    Seismological Research Letters, 2005
    Co-Authors: Philip J. Maechling, Edward H. Field, David Okaya, Vipin Gupta, Nitin Gupta, Thomas H. Jordan
    Abstract: identical to the entry of the same title listed above under Philip J. Maechling.

  • OpenSHA: a developing community-modeling environment for seismic hazard analysis
    Seismological Research Letters, 2003
    Co-Authors: Edward H. Field, Thomas H. Jordan, C. Allin Cornell
    Abstract:

    Probabilistic seismic hazard analysis (PSHA) provides the conceptual framework for estimating the likelihood that something of concern related to earthquake shaking will occur over a specified time period. Based on more than thirty years of research and development (e.g., Cornell, 1968; Algermissen et al., 1982; SSHAC, 1997), PSHA has become a standard tool for combining information on earthquake occurrence, seismic radiation, and shaking response to produce hazard estimates, including the U.S. Geological Survey's national seismic hazard maps (Frankel et al., 1996, 1997). PSHA methods, while now mature, continue to evolve as scientists improve the characterization of earthquake hazards and engineers develop new measures of seismic shaking for performance-based design. The Southern California Earthquake Center (SCEC), in collaboration with the USGS, the California Geological Survey (CGS), and other partners, has undertaken a series of studies aimed at improving the regional application of PSHA methods. Phase I examined the implications of the 1992 Landers earthquake sequence for regional seismic hazards (WGCEP, 1992), Phase II developed a probabilistic earthquake forecast model (WGCEP, 1995), and Phase III assessed the wave-propagation and site effects that give rise to local variations in seismic shaking (see Field et al. [2000] for an overview). The Phase III study found that accounting for some site attributes (i.e., the 30-meter shear-wave velocity and basin depth) can lead to significant improvements in PSHA. It was also found that making such corrections does not significantly reduce the prediction uncertainty associated with empirical ground-motion relations. In fact, 3D waveform modeling presented in the same report (Olsen, 2000) implied that this residual uncertainty represents an intrinsic variability caused by complex propagation effects that are unique to each earthquake-rupture/site combination. The Phase III study therefore concluded that significant improvements in PSHA will require replacing the standard empirical-regression …
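
    The computation at the heart of PSHA can be stated compactly. In the standard textbook formulation (after Cornell, 1968), the mean annual rate at which a ground-motion intensity measure IM exceeds a level x at a site aggregates over seismic sources, magnitudes, and source-to-site distances:

        % Standard PSHA hazard integral (textbook form, after Cornell, 1968)
        \lambda(\mathit{IM} > x) \;=\; \sum_{i=1}^{N_{\mathrm{src}}} \nu_i
          \iint P(\mathit{IM} > x \mid m, r)\, f_{M_i}(m)\, f_{R_i}(r)\, \mathrm{d}m\, \mathrm{d}r

    where \nu_i is the rate of earthquakes on source i, f_{M_i} and f_{R_i} are the magnitude and distance probability densities for that source, and P(IM > x | m, r) comes from a ground-motion model, the empirical-regression ingredient whose replacement the Phase III study motivates.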

  • The SCEC Community Modeling Environment: an information infrastructure for system-level earthquake science
    Seismological Research Letters, 2003
    Co-Authors: Thomas H. Jordan, Philip J. Maechling
    Abstract: identical to the entry of the same title listed above under Philip J. Maechling.

Arthur M. Geoffrion - One of the best experts on this subject based on the ideXlab platform.

  • The design and implementation of a prototype structured modeling environment
    Annals of Operations Research, 1992
    Co-Authors: L. Neustadter, Arthur M. Geoffrion, Sergio Maturana, Yao-chuan Tsai, F. Vicuña
    Abstract:

    This paper describes the design and implementation of the prototype structured modeling environment FW/SM. The underlying design principles provide the central focus. Other points of interest include discussions of FW/SM's delivery platform, its interface to external packages, and its optimization interfaces. The intended audience is all modeling system evaluators, designers, and implementors, including those who do not happen to take a structured modeling approach.
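
    The central idea of structured modeling, that a model is a typed, acyclic graph of definitional elements (entities carrying data, attributes, and functions defined over them), can be sketched briefly. The representation below is a simplification invented for illustration; it is not FW/SM's actual data structure or the SML language.

        class Element:
            """One node in a definitional dependency graph."""
            def __init__(self, name, genus, calls=(), value=None, rule=None):
                self.name = name
                self.genus = genus   # 'entity', 'attribute', or 'function'
                self.calls = calls   # names of elements this definition references
                self.value = value   # data carried by attribute elements
                self.rule = rule     # computation carried by function elements

            def evaluate(self, model):
                if self.genus == "function":
                    args = (model[c].evaluate(model) for c in self.calls)
                    return self.rule(*args)
                return self.value

        # Tiny cost model: two attribute elements feed one function element,
        # mirroring the acyclic definitional structure of a structured model.
        model = {
            "SUPPLY": Element("SUPPLY", "attribute", value=100.0),
            "COST": Element("COST", "attribute", value=2.5),
            "TCOST": Element("TCOST", "function", calls=("SUPPLY", "COST"),
                             rule=lambda s, c: s * c),
        }
        print(model["TCOST"].evaluate(model))   # 250.0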

  • FW/SM: A Prototype Structured Modeling Environment
    Management Science, 1991
    Co-Authors: Arthur M. Geoffrion
    Abstract:

    A research prototype implementation has been evolving for several years in parallel with the vision, theory, language, and applicative aspects of structured modeling. The objective of this article is to describe the capabilities of this implementation as it now stands, and to comment on how it contributes toward fulfilling a previously published agenda for the development of a new generation of modeling environments. The intended audience is all modeling system designers and evaluators, including those who do not happen to take a structured modeling approach.