Analytical Engine

14,000,000 Leading Edge Experts on the ideXlab platform

Scan Science and Technology

Contact Leading Edge Experts & Companies

The Experts below are selected from a list of 13,812 Experts worldwide, ranked by the ideXlab platform

Hao Fen Wang - One of the best experts on this subject based on the ideXlab platform.

  • HadoopRDF: A scalable semantic data Analytical Engine
    Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 2012
    Co-Authors: Jin Hang Du, Hao Fen Wang, Yuan Ni, Yong Yu
    Abstract:

    With the rapid growth of semantic data, analyzing such large-scale data has become a hot topic. Traditional triple stores deployed on a single machine have proved effective for the storage and retrieval of RDF data, but their scalability is limited and they cannot handle billions of ever-growing triples. On the other hand, Hadoop is an open-source project that provides HDFS as a distributed file system and MapReduce as a framework for distributed processing, and it has proved to perform well for large-scale data analysis. In this paper, we propose HadoopRDF, a system that combines both worlds (triple stores and Hadoop) to provide a scalable data analysis service for RDF data. It benefits from the scalability of Hadoop and from the ability of traditional triple stores to support flexible analytical queries such as SPARQL. Experimental evaluation results show the effectiveness and efficiency of the approach.
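
    The map/reduce decomposition sketched in the abstract can be illustrated with a minimal in-memory stand-in (this is not the HadoopRDF implementation; the data, function names, and the single triple pattern evaluated here are invented for illustration):

    ```python
    # MapReduce-style matching of one SPARQL-like triple pattern:
    # the map phase emits bindings for matching triples, the reduce
    # phase groups the matched objects by subject.
    from collections import defaultdict

    # RDF triples as (subject, predicate, object) tuples.
    TRIPLES = [
        ("alice", "knows", "bob"),
        ("alice", "knows", "carol"),
        ("bob",   "knows", "carol"),
        ("carol", "age",   "29"),
    ]

    def map_pattern(triple, pattern):
        """Emit (subject, object) pairs for triples matching the pattern.
        A None in the pattern acts as a variable (wildcard)."""
        s, p, o = triple
        ps, pp, po = pattern
        if (ps is None or ps == s) and (pp is None or pp == p) \
                and (po is None or po == o):
            yield (s, o)

    def reduce_by_subject(pairs):
        """Group matched objects by subject, like a Hadoop reduce step."""
        grouped = defaultdict(list)
        for key, value in pairs:
            grouped[key].append(value)
        return dict(grouped)

    def run_query(triples, pattern):
        pairs = [kv for t in triples for kv in map_pattern(t, pattern)]
        return reduce_by_subject(pairs)

    # Who does each person know?  Pattern: ?s knows ?o
    result = run_query(TRIPLES, (None, "knows", None))
    print(result)  # {'alice': ['bob', 'carol'], 'bob': ['carol']}
    ```

    On a real cluster the map and reduce phases would run distributed over HDFS blocks; the point here is only the shape of the computation.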

  • ICIC (2) - HadoopRDF: a scalable semantic data Analytical Engine
    Lecture Notes in Computer Science, 2012
    Co-Authors: Hao Fen Wang
    Abstract:

    With the rapid growth of semantic data, analyzing such large-scale data has become a hot topic. Traditional triple stores deployed on a single machine have proved effective for the storage and retrieval of RDF data, but their scalability is limited and they cannot handle billions of ever-growing triples. On the other hand, Hadoop is an open-source project that provides HDFS as a distributed file system and MapReduce as a framework for distributed processing, and it has proved to perform well for large-scale data analysis. In this paper, we propose HadoopRDF, a system that combines both worlds (triple stores and Hadoop) to provide a scalable data analysis service for RDF data. It benefits from the scalability of Hadoop and from the ability of traditional triple stores to support flexible analytical queries such as SPARQL. Experimental evaluation results show the effectiveness and efficiency of the approach.

Sundaresan Jayaraman - One of the best experts on this subject based on the ideXlab platform.

  • The wearables revolution and Big Data: the textile lineage
    Journal of The Textile Institute, 2016
    Co-Authors: Sungmee Park, Sundaresan Jayaraman
    Abstract:

    John Kay’s invention of the flying shuttle in 1733 sparked the first Industrial Revolution, which led to the transformation of industry and subsequently of civilization itself. While Basile Bouchon’s work led to the concept of a stored program, it was Joseph-Marie Jacquard’s automated punched card system that made it a viable industrial reality. The Jacquard loom proved to be the inspiration for Charles Babbage’s Analytical Engine and then Hollerith’s punched card. Thus, an invention in the textile industry was instrumental in bringing about one of the most profound technological advancements known to humans, viz., the second Industrial Revolution, also known as the Information Processing Revolution or the Computer Revolution. In the late 1990s, the successful development of the wearable motherboard gave birth to the new paradigm of an interactive fabric-based wearable information infrastructure, which has played a key role in today’s emerging wearables revolution, further illustrating the text...

  • CASES - Textiles and computing: background and opportunities for convergence
    Proceedings of the international conference on Compilers architecture and synthesis for embedded systems - CASES '01, 2001
    Co-Authors: Sungmee Park, Sundaresan Jayaraman
    Abstract:

    John Kay's invention of the flying shuttle in 1733 sparked the first Industrial Revolution, which led to the transformation of industry and subsequently of civilization itself. Yet another invention in the field of textiles - the Jacquard head by Joseph Marie Jacquard (circa 1801) - was the first binary information processor. At any given point, the thread in a woven fabric can be in one of two states or positions: on the face of the fabric or on the back. Pattern cards were punched or cut according to the required fabric design. A hole in the card signified that the thread would appear on the face of the fabric, while a blank meant that the end would be left down and appear on the back of the fabric. The Jacquard head was used on the weaving loom or machine for raising and lowering the warp threads to form desired patterns based on the lifting plan or program embedded in the cards. Thus the Jacquard mechanism set the stage for modern-day binary information processing. Ada Lovelace, a collaborator of Charles Babbage, who worked on the Analytical Engine (the predecessor to the modern-day computer), is said to have remarked, "The Analytical Engine weaves algebraic patterns just as the Jacquard loom weaves flowers and leaves." The Jacquard mechanism that inspired Babbage and spawned the Hollerith punched card has been instrumental in bringing about one of the most profound technological advancements known to humans, viz., the second Industrial Revolution, also known as the Information Processing Revolution [1]. In fact, when Intel introduced its Pentium class of microprocessors, one of the advertisements had a "fabric of chips" emerging from a weaving machine; this picture eloquently captured the essence of chip making - a true blending of art and science - much like the design and production of textiles.
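
    The hole/blank encoding described above amounts to one bit per thread, which a few lines make concrete (a hypothetical sketch; representing a hole as '1' and a blank as '0' is this sketch's own convention, not the card format itself):

    ```python
    # Each punched-card position is one bit of the "lifting plan":
    # a hole (1) lifts the warp thread so it shows on the face of the
    # fabric; a blank (0) leaves it down, showing on the back.
    def weave_row(card_row):
        """Translate one punched-card row into per-thread states."""
        return ["face" if bit == "1" else "back" for bit in card_row]

    row = "1011"
    print(weave_row(row))  # ['face', 'back', 'face', 'face']
    ```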

Krasimir Markov - One of the best experts on this subject based on the ideXlab platform.

Big data mining: In-database Oracle data mining over Hadoop
    2017
    Co-Authors: Zlatinka Kovacheva, Ina Naydenova, Kalinka Kaloyanova, Krasimir Markov
    Abstract:

    Big data challenges different aspects of storing, processing and managing data, as well as analyzing and using data for business purposes. Applying Data Mining methods over Big Data is a further challenge because of the huge data volumes, the variety of information, and the dynamic nature of the sources. Various applications have been developed in this area, but their successful use depends on understanding many specific parameters. In this paper we present several opportunities for using the Data Mining techniques provided by the Analytical Engine of the Oracle RDBMS over data stored in the Hadoop Distributed File System (HDFS). Experimental results are presented and discussed.

  • Big data mining: In-database Oracle data mining over Hadoop
    American Institute of Physics Conference Series, 2017
    Co-Authors: Zlatinka Kovacheva, Ina Naydenova, Kalinka Kaloyanova, Krasimir Markov
    Abstract:

    Big data challenges different aspects of storing, processing and managing data, as well as analyzing and using data for business purposes. Applying Data Mining methods over Big Data is a further challenge because of the huge data volumes, the variety of information, and the dynamic nature of the sources. Various applications have been developed in this area, but their successful use depends on understanding many specific parameters. In this paper we present several opportunities for using the Data Mining techniques provided by the Analytical Engine of the Oracle RDBMS over data stored in the Hadoop Distributed File System (HDFS). Experimental results are presented and discussed.
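
    The workflow the abstract describes — an analytical engine consuming delimited records that live in HDFS — can be sketched roughly as follows (a plain-Python stand-in; it does not use Oracle's actual external-table or data-mining APIs, and the record format and k-means routine here are invented for illustration):

    ```python
    # Parse 'id,value' records as an external table might expose an
    # HDFS file, then run a tiny 1-D k-means as the "mining" step.
    def parse_records(lines):
        """Extract the numeric value from each 'id,value' record."""
        return [float(line.split(",")[1]) for line in lines]

    def kmeans_1d(values, centers, rounds=10):
        """Tiny 1-D k-means: assign each value to its nearest center,
        then recenter each cluster on the mean of its members."""
        for _ in range(rounds):
            clusters = {c: [] for c in centers}
            for v in values:
                nearest = min(centers, key=lambda c: abs(c - v))
                clusters[nearest].append(v)
            centers = [sum(vs) / len(vs) if vs else c
                       for c, vs in clusters.items()]
        return sorted(centers)

    lines = ["r1,1.0", "r2,1.2", "r3,9.8", "r4,10.0"]
    values = parse_records(lines)
    print(kmeans_1d(values, centers=[0.0, 5.0]))  # [1.1, 9.9]
    ```

    In the paper's setting the parsing side is handled by the database's access layer over HDFS and the mining side by the in-database algorithms; the sketch only shows how the two stages fit together.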

Zlatko Trajanoski - One of the best experts on this subject based on the ideXlab platform.

  • CARMAweb: Comprehensive R- and bioconductor-based web service for microarray data analysis
    Nucleic Acids Research, 2006
    Co-Authors: Johannes Rainer, Gernot Stocker, Alexander Sturn, Fátima Sánchez-cabo, Zlatko Trajanoski
    Abstract:

    CARMAweb (Comprehensive R-based Microarray Analysis web service) is a web application designed for the analysis of microarray data. CARMAweb performs data preprocessing (background correction, quality control and normalization), detection of differentially expressed genes, cluster analysis, dimension reduction and visualization, classification, and Gene Ontology-term analysis. This web application accepts raw data from a variety of imaging software tools for the most widely used microarray platforms: Affymetrix GeneChips, spotted two-color microarrays and Applied Biosystems (ABI) microarrays. R and packages from the Bioconductor project are used as an Analytical Engine in combination with the R function Sweave, which allows automatic generation of analysis reports. These report files contain all R commands used to perform the analysis and therefore guarantee maximum transparency and reproducibility for each analysis. The web application is implemented in Java based on the latest J2EE (Java 2 Enterprise Edition) software technology. CARMAweb is freely available at https://carmaweb.genome.tugraz.at.
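
    The preprocessing chain the abstract lists — background correction, normalization, then a differential-expression score — can be sketched in plain Python (a simplified stand-in, not the R/Bioconductor functions CARMAweb actually uses; the intensities and the median-scaling scheme here are invented for illustration):

    ```python
    # Background correction -> normalization -> per-gene log2 fold change.
    import math

    def background_correct(intensities, background):
        """Subtract background signal, flooring at a small positive value."""
        return [max(i - background, 1.0) for i in intensities]

    def median_normalize(intensities):
        """Scale the array so its median intensity is 1.0."""
        ordered = sorted(intensities)
        n = len(ordered)
        median = (ordered[n // 2] if n % 2 else
                  (ordered[n // 2 - 1] + ordered[n // 2]) / 2)
        return [i / median for i in intensities]

    def log2_fold_change(treated, control):
        """Per-gene log2 ratio of treated vs control intensity."""
        return [math.log2(t / c) for t, c in zip(treated, control)]

    control = median_normalize(background_correct([120.0, 80.0, 200.0], 20.0))
    treated = median_normalize(background_correct([60.0, 85.0, 410.0], 20.0))
    print([round(x, 2) for x in log2_fold_change(treated, control)])
    # [-0.7, 0.74, 1.74]
    ```

    In CARMAweb each of these steps is an R/Bioconductor routine recorded verbatim in the Sweave report, which is what makes every analysis reproducible from its report file.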

Sungmee Park - One of the best experts on this subject based on the ideXlab platform.

  • The wearables revolution and Big Data: the textile lineage
    Journal of The Textile Institute, 2016
    Co-Authors: Sungmee Park, Sundaresan Jayaraman
    Abstract:

    John Kay’s invention of the flying shuttle in 1733 sparked the first Industrial Revolution, which led to the transformation of industry and subsequently of civilization itself. While Basile Bouchon’s work led to the concept of a stored program, it was Joseph-Marie Jacquard’s automated punched card system that made it a viable industrial reality. The Jacquard loom proved to be the inspiration for Charles Babbage’s Analytical Engine and then Hollerith’s punched card. Thus, an invention in the textile industry was instrumental in bringing about one of the most profound technological advancements known to humans, viz., the second Industrial Revolution, also known as the Information Processing Revolution or the Computer Revolution. In the late 1990s, the successful development of the wearable motherboard gave birth to the new paradigm of an interactive fabric-based wearable information infrastructure, which has played a key role in today’s emerging wearables revolution, further illustrating the text...

  • CASES - Textiles and computing: background and opportunities for convergence
    Proceedings of the international conference on Compilers architecture and synthesis for embedded systems - CASES '01, 2001
    Co-Authors: Sungmee Park, Sundaresan Jayaraman
    Abstract:

    John Kay's invention of the flying shuttle in 1733 sparked the first Industrial Revolution, which led to the transformation of industry and subsequently of civilization itself. Yet another invention in the field of textiles - the Jacquard head by Joseph Marie Jacquard (circa 1801) - was the first binary information processor. At any given point, the thread in a woven fabric can be in one of two states or positions: on the face of the fabric or on the back. Pattern cards were punched or cut according to the required fabric design. A hole in the card signified that the thread would appear on the face of the fabric, while a blank meant that the end would be left down and appear on the back of the fabric. The Jacquard head was used on the weaving loom or machine for raising and lowering the warp threads to form desired patterns based on the lifting plan or program embedded in the cards. Thus the Jacquard mechanism set the stage for modern-day binary information processing. Ada Lovelace, a collaborator of Charles Babbage, who worked on the Analytical Engine (the predecessor to the modern-day computer), is said to have remarked, "The Analytical Engine weaves algebraic patterns just as the Jacquard loom weaves flowers and leaves." The Jacquard mechanism that inspired Babbage and spawned the Hollerith punched card has been instrumental in bringing about one of the most profound technological advancements known to humans, viz., the second Industrial Revolution, also known as the Information Processing Revolution [1]. In fact, when Intel introduced its Pentium class of microprocessors, one of the advertisements had a "fabric of chips" emerging from a weaving machine; this picture eloquently captured the essence of chip making - a true blending of art and science - much like the design and production of textiles.