Protection Regulation

14,000,000 Leading Edge Experts on the ideXlab platform


The Experts below are selected from a list of 147,378 Experts worldwide ranked by the ideXlab platform

Raul Eamets - One of the best experts on this subject based on the ideXlab platform.

  • Labour Market Flexibility and Employment Protection Regulation in the Baltic States
    2004
    Co-Authors: Raul Eamets, Jaan Masso
    Abstract:

    There is increasing pressure for labour market flexibility in both current EU member states and candidate countries. This paper aims to estimate, for the Baltic States, the strictness of employment protection regulation, one of the most relevant aspects of labour market flexibility, and the degree of its actual enforcement. Relative to existing studies of CEE labour markets, the novelty of our approach is that we use information on the applicable legislation as well as on the coverage of labour legislation and the practice of law enforcement. The analysis shows that, although overall EPL strictness is close to the EU average, individual and collective dismissals are relatively heavily regulated while temporary forms of employment are relatively weakly regulated. Still, effective flexibility is increased by the larger share of less protected workers and by problems with law enforcement, which may explain why employers' estimates of flexibility differ somewhat from the flexibility of the formal legislation. Employment protection legislation does not seem to have influenced the level of unemployment in the sample of CEE countries. However, the labour markets of the Baltic States may become more rigid if law enforcement improves, with possible adverse effects on labour market performance.

  • Labour Market Flexibility and Employment Protection Regulation in the Baltic States
    Social Science Research Network, 2004
    Co-Authors: Jaan Masso, Raul Eamets
    Abstract:

    There is increasing pressure for labour market flexibility in both current EU member states and candidate countries. This paper aims to estimate, for the Baltic States, the strictness of employment protection regulation, one of the most relevant aspects of labour market flexibility, and the degree of its actual enforcement. Relative to existing studies of CEE labour markets, the novelty of our approach is that we use information on the applicable legislation as well as on the coverage of labour legislation and the practice of law enforcement. The analysis shows that, although overall EPL strictness is close to the EU average, individual and collective dismissals are relatively heavily regulated while temporary forms of employment are relatively weakly regulated. Still, effective flexibility is increased by the larger share of less protected workers and by problems with law enforcement, which may explain why employers' estimates of flexibility differ somewhat from the flexibility of the formal legislation. Employment protection legislation does not seem to have influenced the level of unemployment in the sample of CEE countries. However, the labour markets of the Baltic States may become more rigid if law enforcement improves, with possible adverse effects on labour market performance.

Luciano Floridi - One of the best experts on this subject based on the ideXlab platform.

  • Soft Ethics and the Governance of the Digital and the General Data Protection Regulation
    2018
    Co-Authors: Luciano Floridi
    Abstract:

    The article discusses the governance of the digital as the new challenge posed by technological innovation. It then introduces a new distinction between soft ethics, which applies after legal compliance with legislation such as the European Union's General Data Protection Regulation, and hard ethics, which precedes and contributes to shaping legislation. It concludes by analysing the role of digital ethics with respect to digital regulation and digital governance.

  • Why a Right to Explanation of Automated Decision-Making Does Not Exist in the General Data Protection Regulation
    International Data Privacy Law, 2017
    Co-Authors: Sandra Wachter, Brent Mittelstadt, Luciano Floridi
    Abstract:

    Since approval of the EU General Data Protection Regulation (GDPR) in 2016, it has been widely and repeatedly claimed that the GDPR will legally mandate a ‘right to explanation’ of all decisions made by automated or artificially intelligent algorithmic systems. This right to explanation is viewed as an ideal mechanism to enhance the accountability and transparency of automated decision-making. However, there are several reasons to doubt both the legal existence and the feasibility of such a right. In contrast to the right to explanation of specific automated decisions claimed elsewhere, the GDPR only mandates that data subjects receive meaningful, but properly limited, information (Articles 13-15) about the logic involved, as well as the significance and the envisaged consequences of automated decision-making systems, what we term a ‘right to be informed’. Further, the ambiguity and limited scope of the ‘right not to be subject to automated decision-making’ contained in Article 22 (from which the alleged ‘right to explanation’ stems) raises questions over the protection actually afforded to data subjects. These problems show that the GDPR lacks precise language as well as explicit and well-defined rights and safeguards against automated decision-making, and therefore runs the risk of being toothless. We propose a number of legislative and policy steps that, if taken, may improve the transparency and accountability of automated decision-making when the GDPR comes into force in 2018.

  • Why a Right to Explanation of Automated Decision-Making Does Not Exist in the General Data Protection Regulation
    Social Science Research Network, 2016
    Co-Authors: Sandra Wachter, Brent Mittelstadt, Luciano Floridi
    Abstract:

    Since approval of the EU General Data Protection Regulation (GDPR) in 2016, it has been widely and repeatedly claimed that a ‘right to explanation’ of decisions made by automated or artificially intelligent algorithmic systems will be legally mandated by the GDPR. This right to explanation is viewed as an ideal mechanism to enhance the accountability and transparency of automated decision-making. However, there are several reasons to doubt both the legal existence and the feasibility of such a right. In contrast to the right to explanation of specific automated decisions claimed elsewhere, the GDPR only mandates that data subjects receive limited information (Articles 13-15) about the logic involved, as well as the significance and the envisaged consequences of automated decision-making systems, what we term a ‘right to be informed’. Further, the ambiguity and limited scope of the ‘right not to be subject to automated decision-making’ contained in Article 22 (from which the alleged ‘right to explanation’ stems) raises questions over the protection actually afforded to data subjects. These problems show that the GDPR lacks precise language as well as explicit and well-defined rights and safeguards against automated decision-making, and therefore runs the risk of being toothless. We propose a number of legislative steps that, if taken, may improve the transparency and accountability of automated decision-making when the GDPR comes into force in 2018.

Heidi Beate Bentzen - One of the best experts on this subject based on the ideXlab platform.

Sandra Wachter - One of the best experts on this subject based on the ideXlab platform.

  • Why a Right to Explanation of Automated Decision-Making Does Not Exist in the General Data Protection Regulation
    International Data Privacy Law, 2017
    Co-Authors: Sandra Wachter, Brent Mittelstadt, Luciano Floridi
    Abstract:

    Since approval of the EU General Data Protection Regulation (GDPR) in 2016, it has been widely and repeatedly claimed that the GDPR will legally mandate a ‘right to explanation’ of all decisions made by automated or artificially intelligent algorithmic systems. This right to explanation is viewed as an ideal mechanism to enhance the accountability and transparency of automated decision-making. However, there are several reasons to doubt both the legal existence and the feasibility of such a right. In contrast to the right to explanation of specific automated decisions claimed elsewhere, the GDPR only mandates that data subjects receive meaningful, but properly limited, information (Articles 13-15) about the logic involved, as well as the significance and the envisaged consequences of automated decision-making systems, what we term a ‘right to be informed’. Further, the ambiguity and limited scope of the ‘right not to be subject to automated decision-making’ contained in Article 22 (from which the alleged ‘right to explanation’ stems) raises questions over the protection actually afforded to data subjects. These problems show that the GDPR lacks precise language as well as explicit and well-defined rights and safeguards against automated decision-making, and therefore runs the risk of being toothless. We propose a number of legislative and policy steps that, if taken, may improve the transparency and accountability of automated decision-making when the GDPR comes into force in 2018.

  • Why a Right to Explanation of Automated Decision-Making Does Not Exist in the General Data Protection Regulation
    Social Science Research Network, 2016
    Co-Authors: Sandra Wachter, Brent Mittelstadt, Luciano Floridi
    Abstract:

    Since approval of the EU General Data Protection Regulation (GDPR) in 2016, it has been widely and repeatedly claimed that a ‘right to explanation’ of decisions made by automated or artificially intelligent algorithmic systems will be legally mandated by the GDPR. This right to explanation is viewed as an ideal mechanism to enhance the accountability and transparency of automated decision-making. However, there are several reasons to doubt both the legal existence and the feasibility of such a right. In contrast to the right to explanation of specific automated decisions claimed elsewhere, the GDPR only mandates that data subjects receive limited information (Articles 13-15) about the logic involved, as well as the significance and the envisaged consequences of automated decision-making systems, what we term a ‘right to be informed’. Further, the ambiguity and limited scope of the ‘right not to be subject to automated decision-making’ contained in Article 22 (from which the alleged ‘right to explanation’ stems) raises questions over the protection actually afforded to data subjects. These problems show that the GDPR lacks precise language as well as explicit and well-defined rights and safeguards against automated decision-making, and therefore runs the risk of being toothless. We propose a number of legislative steps that, if taken, may improve the transparency and accountability of automated decision-making when the GDPR comes into force in 2018.

Njal Hostmaelingen - One of the best experts on this subject based on the ideXlab platform.