Crowdsourcing

The experts below are selected from a list of 360 experts worldwide, ranked by the ideXlab platform.

Puneet Agarwal - One of the best experts on this subject based on the ideXlab platform.

  • Harnessing the Crowdsourcing Power of Social Media for Disaster Relief
    IEEE Intelligent Systems, 2011
    Co-Authors: Huiji Gao, Nathan Morrow, Mojtaba Maghrebi, Ke Tao, Taha Hossein Rashidi, Geert-jan Houben, Nicholas Kocmich, Geoffrey Barbier, S. Travis Waller, Adam Papendieck, Fabian Abel, Claudia Hauff, Richard Stronkman, Rebecca Goolsby, Nancy Mock, Alireza Abbasi, Puneet Agarwal
    Abstract:

    This article briefly describes the advantages and disadvantages of crowdsourcing applications applied to disaster relief coordination. It also discusses several challenges that must be addressed to make crowdsourcing a useful tool that can effectively facilitate the relief progress in coordination, accuracy, and security.

Yue Jia - One of the best experts on this subject based on the ideXlab platform.

  • A Survey of the Use of Crowdsourcing in Software Engineering
    Journal of Systems and Software, 2017
    Co-Authors: Ke Mao, Mark Harman, Licia Capra, Yue Jia
    Abstract:

    The term 'crowdsourcing' was initially introduced in 2006 to describe an emerging distributed problem-solving model carried out by online workers. Since then it has been widely studied and practiced to support software engineering. In this paper we provide a comprehensive survey of the use of crowdsourcing in software engineering, seeking to cover all literature on this topic. We first review the definitions of crowdsourcing and derive our definition of Crowdsourcing Software Engineering together with its taxonomy. Then we summarise industrial crowdsourcing practice in software engineering and corresponding case studies. We further analyse the software engineering domains, tasks and applications for crowdsourcing, and the platforms and stakeholders involved in realising Crowdsourced Software Engineering solutions. We conclude by exposing trends, open issues and opportunities for future research on Crowdsourced Software Engineering.

Meihui Zhang - One of the best experts on this subject based on the ideXlab platform.

  • CrowdOp: Query Optimization for Declarative Crowdsourcing Systems
    IEEE Transactions on Knowledge and Data Engineering, 2015
    Co-Authors: Ju Fan, Meihui Zhang, Stanley Kok, Beng Chin Ooi
    Abstract:

    We study the query optimization problem in declarative crowdsourcing systems. Declarative crowdsourcing is designed to hide the complexities and relieve the user of the burden of dealing with the crowd. The user is only required to submit an SQL-like query, and the system takes responsibility for compiling the query, generating the execution plan, and evaluating it in the crowdsourcing marketplace. A given query can have many alternative execution plans, and the difference in crowdsourcing cost between the best and the worst plans may be several orders of magnitude. Therefore, as in relational database systems, query optimization is important to crowdsourcing systems that provide declarative query interfaces. In this paper, we propose CrowdOp, a cost-based query optimization approach for declarative crowdsourcing systems. CrowdOp considers both cost and latency in its optimization objectives and generates query plans that provide a good balance between the two. We develop efficient algorithms in CrowdOp for optimizing three types of queries: selection queries, join queries, and complex selection-join queries. We validate our approach via extensive experiments, both in simulation and with a real crowd on Amazon Mechanical Turk.
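
The cost model at the heart of such an optimizer can be illustrated with a toy plan enumerator for crowd-filtered selection queries: ordering cheap, selective crowd predicates first shrinks the number of items (and hence paid microtasks) that later predicates must touch. The sketch below is a minimal illustration of that principle only, not CrowdOp's actual algorithm; the predicates, selectivities, and prices are invented.

```python
from itertools import permutations

# Hypothetical crowd predicates: (name, selectivity, price per microtask).
# Selectivity is the fraction of items expected to pass the crowd filter.
PREDICATES = [
    ("is_portrait",   0.5, 0.02),
    ("shows_a_dog",   0.1, 0.05),
    ("taken_outside", 0.3, 0.03),
]

def plan_cost(order, n_items):
    """Expected monetary cost of applying crowd filters in this order."""
    cost, remaining = 0.0, float(n_items)
    for _name, selectivity, price in order:
        cost += remaining * price   # one paid microtask per surviving item
        remaining *= selectivity    # items expected to survive this filter
    return cost

def best_plan(predicates, n_items):
    """Exhaustively pick the cheapest ordering of sequential crowd filters."""
    return min(permutations(predicates), key=lambda p: plan_cost(p, n_items))

plan = best_plan(PREDICATES, n_items=10_000)
print([name for name, _, _ in plan], f"${plan_cost(plan, 10_000):,.2f}")
```

Latency enters the same framework as the number of sequential crowd rounds a plan requires; running filters in parallel lowers latency but forfeits the savings from filtering, which is exactly the cost/latency balance the paper optimizes.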

  • CDAS: A Crowdsourcing Data Analytics System
    Proceedings of the VLDB Endowment, 2012
    Co-Authors: Xuan Liu, Meiyu Lu, Sai Wu, Yanyan Shen, Beng Chin Ooi, Meihui Zhang
    Abstract:

    Some complex problems, such as image tagging and natural language processing, are very challenging for computers; even state-of-the-art technology is not yet able to provide satisfactory accuracy. Therefore, rather than relying solely on developing new and better algorithms to handle such tasks, we look to the crowdsourcing solution -- employing human participation -- to make good the shortfall in current technology. Crowdsourcing is a good supplement to many computer tasks. A complex job may be divided into computer-oriented tasks and human-oriented tasks, which are then assigned to machines and humans respectively. To leverage the power of crowdsourcing, we design and implement a Crowdsourcing Data Analytics System, CDAS. CDAS is a framework designed to support the deployment of various crowdsourcing applications. The core part of CDAS is a quality-sensitive answering model, which guides the crowdsourcing engine to process and monitor the human tasks. In this paper, we introduce the principles of our quality-sensitive model. To satisfy the user-required accuracy, the model guides the crowdsourcing query engine in the design and processing of the corresponding crowdsourcing jobs. It provides an estimated accuracy for each generated result based on the human workers' historical performances. When verifying the quality of the result, the model employs an online strategy to reduce waiting time. To show the effectiveness of the model, we implement and deploy two analytics jobs on CDAS: a Twitter sentiment analytics job and an image tagging job. We use real Twitter and Flickr data as our queries respectively. We compare our approaches with state-of-the-art classification and image annotation techniques. The results show that the human-assisted methods can indeed achieve a much higher accuracy. By embedding the quality-sensitive model into the crowdsourcing query engine, we effectively reduce the processing cost while maintaining the required query answer quality.
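
The accuracy estimate such a quality-sensitive model relies on can be made concrete with a standard majority-voting calculation: if each worker answers a binary task correctly with probability p (estimated from their history), the probability that a majority of n workers is right follows a binomial tail, and the engine can solve for the smallest crowd size meeting the user's accuracy requirement. The sketch below makes the usual simplifying assumptions (binary tasks, independent workers, one shared accuracy p); the paper's actual model is richer than this.

```python
from math import comb

def majority_accuracy(p: float, n: int) -> float:
    """P(majority of n independent workers is correct), binary task, n odd."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n // 2 + 1, n + 1))

def workers_needed(p: float, target: float, max_n: int = 51):
    """Smallest odd crowd size whose majority vote meets the target accuracy."""
    for n in range(1, max_n + 1, 2):
        if majority_accuracy(p, n) >= target:
            return n
    return None  # target unreachable within the budget

# Workers with 75% historical accuracy: how many votes per question
# are needed to promise the user 95% answer accuracy?
print(workers_needed(0.75, 0.95))   # -> 9
```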

Ian P. McCarthy - One of the best experts on this subject based on the ideXlab platform.

  • How to Work a Crowd: Developing Crowd Capital Through Crowdsourcing
    Business Horizons, 2015
    Co-Authors: John Prpic, Prashant Shukla, Jan Kietzmann, Ian P. McCarthy
    Abstract:

    Traditionally, the term 'crowd' was used almost exclusively in the context of people who self-organized around a common purpose, emotion, or experience. Today, however, firms often refer to crowds in discussions of how collections of individuals can be engaged for organizational purposes. Crowdsourcing, defined here as the use of information technologies to outsource business responsibilities to crowds, can now significantly influence a firm's ability to leverage previously unattainable resources to build competitive advantage. Nonetheless, many managers are hesitant to consider crowdsourcing because they do not understand how its various types can add value to the firm. In response, we explain what crowdsourcing is, the advantages it offers, and how firms can pursue it. We begin by formulating a crowdsourcing typology and show how its four categories (crowd voting, micro-task, idea, and solution crowdsourcing) can help firms develop 'crowd capital', an organizational-level resource harnessed from the crowd. We then present a three-step process model for generating crowd capital. Step one includes important considerations that shape how a crowd is to be constructed. Step two outlines the capabilities firms need to develop to acquire and assimilate resources (e.g., knowledge, labor, funds) from the crowd. Step three outlines key decision areas that executives need to address to effectively engage crowds.

Michael Lichtenstern - One of the best experts on this subject based on the ideXlab platform.

  • Providing Real-Time Assistance in Disaster Relief by Leveraging Crowdsourcing Power
    Personal and Ubiquitous Computing, 2014
    Co-Authors: Dingqi Yang, Daqing Zhang, Korbinian Frank, Patrick Robertson, Edel Jennings, Mark Roddy, Michael Lichtenstern
    Abstract:

    Crowdsourcing platforms for disaster management have drawn a lot of attention in recent years due to their efficiency in disaster relief tasks, especially disaster data collection and analysis. Although on-site rescue staff can benefit greatly from crowdsourced data, the rapidly evolving situation at a disaster site means they usually encounter difficulties and have requests that need to be resolved in a short time. In this paper, aiming to efficiently harness crowdsourcing power to provide on-site rescue staff with real-time remote assistance, we design and develop a crowdsourcing disaster support platform with three distinctive features: selecting and notifying relevant off-site users for each request according to their expertise; providing collaborative working functionalities to off-site users; and improving answer credibility via "crowd voting." To evaluate the platform, we conducted a series of experiments with three rounds of user trials, followed by a System Usability Scale survey after each trial. The results show that the platform can effectively support on-site rescue staff by leveraging crowdsourcing power, and that it achieves good usability.
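
Two of the platform features named above, expertise-based notification and crowd voting, are simple to sketch in isolation: match a request's topic tags against volunteers' expertise profiles to decide whom to notify, then rank the answers that come back by vote count. Everything below (names, tags, data) is invented for illustration; the actual platform adds collaborative workspaces and much more.

```python
from collections import Counter

# Hypothetical off-site volunteers with self-declared expertise tags.
VOLUNTEERS = {
    "alice": {"medicine", "triage"},
    "bob":   {"logistics", "mapping"},
    "carol": {"medicine", "mapping"},
}

def route_request(tags, volunteers, k=2):
    """Notify the k volunteers whose expertise overlaps the request most."""
    ranked = sorted(volunteers,
                    key=lambda v: len(volunteers[v] & tags),
                    reverse=True)
    return ranked[:k]

def credible_answer(votes):
    """Return the most up-voted answer and its vote share ('crowd voting')."""
    answer, count = Counter(votes).most_common(1)[0]
    return answer, count / len(votes)

print(route_request({"medicine", "triage"}, VOLUNTEERS))   # ['alice', 'carol']
print(credible_answer(["route A", "route A", "route B"]))  # ('route A', 0.666...)
```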