Paradigm Science

14,000,000 Leading Edge Experts on the ideXlab platform

The Experts below are selected from a list of 70,005 Experts worldwide, ranked by the ideXlab platform.

Janos Vegh - One of the best experts on this subject based on the ideXlab platform.

  • The need for modern computing Paradigm: Science applied to computing
    arXiv: General Literature, 2019
    Co-Authors: Janos Vegh
    Abstract:

More than a hundred years ago, 'classic physics' was at the height of its power, with just a few unexplained phenomena; these, however, led to a revolution and the development of 'modern physics'. Today, computing is in a similar position: it is a sound success story with exponentially growing utilization, but with a growing number of difficulties and unexpected issues as it moves towards extreme utilization conditions. In physics, studying nature under extreme conditions led to the understanding of relativistic and quantal behavior. Quite similarly, in computing, some phenomena encountered under extreme (computing) conditions cannot be understood on the basis of the 'classic computing Paradigm'. The paper draws attention to the fact that under extreme conditions qualitatively different behaviors may be encountered in both physics and computing, and points out that certain formerly unnoticed or neglected aspects make it possible to explain new phenomena as well as to enhance computing features. Moreover, an idea for implementing a modern computing Paradigm is proposed.

  • The Need for Modern Computing Paradigm: Science Applied to Computing
    2019 International Conference on Computational Science and Computational Intelligence (CSCI), 2019
    Co-Authors: Janos Vegh, Alin Tisan
    Abstract:

More than a hundred years ago, classic physics was at the height of its power, with just a few unexplained phenomena; these, however, led to a revolution and the development of 'modern physics'. The breakthrough was made possible by studying nature under extreme conditions, which finally led to the understanding of relativistic and quantal behavior. Today, computing is in a similar position: it is a sound success story with exponentially growing utilization but, as it moves towards extreme utilization conditions, with a growing number of difficulties and unexpected issues that cannot be explained on the basis of the 'classic computing Paradigm'. The paper draws attention to the fact that under extreme conditions computing behavior can differ from that under normal conditions, and points out that certain unnoticed or neglected features enable the explanation of the new phenomena and the enhancement of some computing features. Moreover, a new implementation idea for a modern computing Paradigm is proposed.

Alin Tisan - One of the best experts on this subject based on the ideXlab platform.

  • The Need for Modern Computing Paradigm: Science Applied to Computing
    2019 International Conference on Computational Science and Computational Intelligence (CSCI), 2019
    Co-Authors: Janos Vegh, Alin Tisan
    Abstract:

More than a hundred years ago, classic physics was at the height of its power, with just a few unexplained phenomena; these, however, led to a revolution and the development of 'modern physics'. The breakthrough was made possible by studying nature under extreme conditions, which finally led to the understanding of relativistic and quantal behavior. Today, computing is in a similar position: it is a sound success story with exponentially growing utilization but, as it moves towards extreme utilization conditions, with a growing number of difficulties and unexpected issues that cannot be explained on the basis of the 'classic computing Paradigm'. The paper draws attention to the fact that under extreme conditions computing behavior can differ from that under normal conditions, and points out that certain unnoticed or neglected features enable the explanation of the new phenomena and the enhancement of some computing features. Moreover, a new implementation idea for a modern computing Paradigm is proposed.

Bartha Maria Knoppers - One of the best experts on this subject based on the ideXlab platform.

  • Beyond ELSIs: Where to from Here? From “Regulating” to Anticipating and Shaping the Innovation Trajectory in Personalized Medicine
    Pharmacogenomics, 2013
    Co-Authors: Vural Ozdemir, Yann Joly, Emily Kirby, Denise Avard, Bartha Maria Knoppers
    Abstract:

Postgenomics personalized medicine Science is experiencing a “data deluge” from emerging high-throughput technologies, a myriad of sensors, and exponential growth in electronic records. This new data-intensive Science has been named the “fourth Paradigm of Science,” preceded by the third (the last few decades: the computational branch, modeling and simulating complex phenomena), the second (the last few hundred years: the theoretical branch, using models leading to generalizations), and the first Paradigm (a thousand years ago: empirical description of natural phenomena). These shifts in the architecture of twenty-first-century global personalized medicine Science call for rethinking twenty-first-century bioethics so as to proactively steer the Science and technology innovation trajectory, instead of the more narrowly framed “enabler,” “protector,” or “regulator” roles hitherto assigned to bioethics in the twentieth century. This chapter introduces the emerging innovative approaches to twenty-first-century bioethics for the data-intensive fourth-Paradigm Science that forms the central pillar of postgenomics personalized medicine research and development. Additionally, the chapter goes beyond the classic prescriptive approaches to bioethics, in that it also examines, in a bottom-up manner, the “ethics of bioethics”—as with pharmacogenomics Science, bioethics needs to be examined for a deeper, integrated, and panoptic discourse on genomics innovations and their anticipated trajectory from “lab to global society.”

Nils C. Hanwahr - One of the best experts on this subject based on the ideXlab platform.

  • “Mr. Database”
    NTM Zeitschrift für Geschichte der Wissenschaften Technik und Medizin, 2017
    Co-Authors: Nils C. Hanwahr
    Abstract:

Although the widespread use of the term “Big Data” is comparatively recent, it invokes a phenomenon in the development of database technology with distinct historical contexts. The database engineer Jim Gray, known as “Mr. Database” in Silicon Valley before his disappearance at sea in 2007, was involved in many of the crucial developments since the 1970s that constitute the foundation of exceedingly large and distributed databases. Jim Gray was involved in the development of relational database systems based on the concepts of Edgar F. Codd at IBM in the 1970s before he went on to develop principles of Transaction Processing that enable the parallel and highly distributed performance of databases today. He was also involved in creating forums for discourse between academia and industry, which influenced industry performance standards as well as database research agendas. As a co-founder of the San Francisco branch of Microsoft Research, Gray increasingly turned toward scientific applications of database technologies, e.g. leading the TerraServer project, an online database of satellite images. Inspired by Vannevar Bush’s idea of the memex, Gray laid out his vision of a Personal Memex as well as a World Memex, eventually postulating a new era of data-based scientific discovery termed “Fourth Paradigm Science”. This article gives an overview of Gray’s contributions to the development of database technology as well as his research agendas, and shows that central notions of Big Data have been occupying database engineers for much longer than the actual term has been in use.

Vural Ozdemir - One of the best experts on this subject based on the ideXlab platform.

  • Beyond ELSIs: Where to from Here? From “Regulating” to Anticipating and Shaping the Innovation Trajectory in Personalized Medicine
    Pharmacogenomics, 2013
    Co-Authors: Vural Ozdemir, Yann Joly, Emily Kirby, Denise Avard, Bartha Maria Knoppers
    Abstract:

Postgenomics personalized medicine Science is experiencing a “data deluge” from emerging high-throughput technologies, a myriad of sensors, and exponential growth in electronic records. This new data-intensive Science has been named the “fourth Paradigm of Science,” preceded by the third (the last few decades: the computational branch, modeling and simulating complex phenomena), the second (the last few hundred years: the theoretical branch, using models leading to generalizations), and the first Paradigm (a thousand years ago: empirical description of natural phenomena). These shifts in the architecture of twenty-first-century global personalized medicine Science call for rethinking twenty-first-century bioethics so as to proactively steer the Science and technology innovation trajectory, instead of the more narrowly framed “enabler,” “protector,” or “regulator” roles hitherto assigned to bioethics in the twentieth century. This chapter introduces the emerging innovative approaches to twenty-first-century bioethics for the data-intensive fourth-Paradigm Science that forms the central pillar of postgenomics personalized medicine research and development. Additionally, the chapter goes beyond the classic prescriptive approaches to bioethics, in that it also examines, in a bottom-up manner, the “ethics of bioethics”—as with pharmacogenomics Science, bioethics needs to be examined for a deeper, integrated, and panoptic discourse on genomics innovations and their anticipated trajectory from “lab to global society.”