Feature Configuration

14,000,000 Leading Edge Experts on the ideXlab platform

Scan Science and Technology

Contact Leading Edge Experts & Companies

The Experts below are selected from a list of 90 Experts worldwide ranked by ideXlab platform

Patrick Heymans - One of the best experts on this subject based on the ideXlab platform.

  • Deriving Configuration Interfaces from Feature Models: A Vision Paper
    2012
    Co-Authors: Quentin Boucher, Gilles Perrouin, Patrick Heymans
    Abstract:

    In software product lines, Feature models are the de facto standard for representing variability as well as for configuring products. Yet, Configuration relying on Feature models faces two issues: i) it assumes knowledge of the underlying formalism, which may not be true for end users, and ii) it does not take advantage of advanced user-interface controls, leading to usability and integration problems with other parts of the user interface. To address these issues, our research focuses on the generation of Configuration interfaces based on variability models, from both the visual and behavioral perspectives. We tackle visual issues by generating abstract user interfaces from Feature models. Regarding Configuration behavior, in particular the Configuration sequence, we plan to use Feature Configuration workflows, variability-aware models that exhibit characteristics similar to those of the task, user, discourse and business models found in the human-computer interaction community. This paper discusses the main challenges and possible solutions to realize our vision.
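
The idea of deriving abstract user-interface controls from feature-model constructs can be illustrated with a minimal sketch. The group types and widget names below are illustrative assumptions, not the actual mapping proposed by the authors:

```python
# Minimal sketch: mapping feature-model group types to abstract UI widgets.
# The group names ("xor", "or", "optional") and widget choices here are
# illustrative assumptions, not the authors' actual derivation rules.

def widget_for_group(group_type, features):
    """Pick an abstract widget for a feature group.

    xor-group  -> exactly one child selectable -> radio buttons
    or-group   -> one or more selectable       -> checkbox list
    optional   -> a single on/off feature      -> single checkbox
    """
    mapping = {
        "xor": "radio_group",
        "or": "checkbox_group",
        "optional": "checkbox",
    }
    return {"widget": mapping[group_type], "options": list(features)}

# An xor-group of alternative database back-ends becomes a radio group.
ui = widget_for_group("xor", ["SQLite", "PostgreSQL", "MySQL"])
```

A concrete UI toolkit would then render each abstract widget natively, which is what lets the generated interface integrate with the rest of the application.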

  • Analysis of Feature Configuration Workflows
    IEEE International Conference on Requirements Engineering, 2009
    Co-Authors: Andreas Classen, Arnaud Hubaux, Patrick Heymans
    Abstract:

    We recently introduced Feature Configuration workflows, a formalism for modelling the complex Configuration processes in software product line engineering. In earlier work we identified obstacles to efficient tool support for which we now outline the main concepts of a solution. These take the form of a set of analysis tasks that can be performed on Feature Configuration workflows.

  • Formal Modelling of Feature Configuration Workflows
    Software Product Lines, 2009
    Co-Authors: Arnaud Hubaux, Andreas Classen, Patrick Heymans
    Abstract:

    In software product line engineering, the Configuration process can be a long and complex undertaking that involves many participants. When Configuration is supported by Feature diagrams, two challenges are to modularise the Feature diagram into related chunks, and to schedule them as part of the Configuration process. Existing work has only focused on the first of these challenges and, for the rest, assumes that Feature diagram modules are configured sequentially. This paper addresses the second challenge. It suggests using YAWL, a state-of-the-art workflow language, to represent the Configuration workflow while Feature diagrams model the available Configuration options. The principal contribution of the paper is a new combined formalism: Feature Configuration workflows. A formal semantics is provided so as to pave the way for unambiguous tool specification and safer reasoning about the Configuration process. The work is motivated and illustrated through a Configuration scenario taken from the space industry.
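
The kind of reasoning such a formal semantics enables can be sketched with a toy validity check for a single Feature diagram module. The constraint encoding below is an illustrative simplification, not the formal semantics defined in the paper:

```python
# Minimal sketch of checking a feature configuration against a feature
# diagram's constraints. The encoding (mandatory set, xor-groups,
# requires pairs) is an illustrative simplification, not the paper's
# formal semantics.

def valid_configuration(selected, mandatory, xor_groups, requires):
    # Every mandatory feature must be selected.
    if not mandatory <= selected:
        return False
    # In each xor-group, exactly one member must be selected.
    for group in xor_groups:
        if len(selected & group) != 1:
            return False
    # Cross-tree constraints: selecting a requires selecting b.
    for a, b in requires:
        if a in selected and b not in selected:
            return False
    return True

ok = valid_configuration(
    selected={"Root", "GUI", "English"},
    mandatory={"Root", "GUI"},
    xor_groups=[{"English", "French"}],
    requires=[("GUI", "Root")],
)
```

A workflow engine scheduling configuration tasks could run a check like this when a participant finishes a module, before unlocking the next task in the workflow.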

Peter Corke - One of the best experts on this subject based on the ideXlab platform.

  • Predicting Target Feature Configuration of Non-Stationary Objects for Grasping with Image-Based Visual Servoing
    arXiv: Robotics, 2020
    Co-Authors: Jesse Haviland, Feras Dayoub, Peter Corke
    Abstract:

    In this paper we consider the problem of the final approach stage of closed-loop grasping, where RGB-D cameras are no longer able to provide valid depth information. This is essential for grasping non-stationary objects, a situation in which current robotic grasping controllers fail. We predict the image-plane coordinates of observed image Features at the final grasp pose and use image-based visual servoing to guide the robot to that pose. Image-based visual servoing is a well-established control technique that moves a camera in 3D space so as to drive the image-plane Feature Configuration to some goal state. In previous work the goal Feature Configuration is assumed to be known, but for some applications this may not be feasible, for example if the motion is being performed for the first time with respect to a scene. Our proposed method provides robustness with respect to scene motion during the final phase of grasping, as well as to errors in the robot's kinematic control. We provide experimental results in the context of dynamic closed-loop grasping.
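
The core IBVS idea of driving an image-plane Feature Configuration to a goal state can be sketched for a heavily simplified case: a single point feature, pure camera translation parallel to the image plane, and known constant depth Z. The predicted goal coordinates would come from the paper's prediction step; here they are just given numbers:

```python
# Minimal IBVS sketch for one point feature under pure camera translation
# parallel to the image plane with known constant depth Z. This is a
# simplification for illustration, not the controller used in the paper.

def ibvs_step(s, s_goal, Z, lam=1.0, dt=0.1):
    """One control step.

    Per axis, command velocity v = lam * Z * (s - s_goal); with the
    simplified feature motion model s_dot = -v / Z, the image-plane
    error then decays exponentially toward zero.
    """
    v = [lam * Z * (si - gi) for si, gi in zip(s, s_goal)]
    # First-order simulation of how the feature moves in the image.
    s_next = [si - vi / Z * dt for si, vi in zip(s, v)]
    return s_next, v

# Servo the feature from (0.4, -0.2) to the goal (0.1, 0.0).
s, s_goal = [0.4, -0.2], [0.1, 0.0]
for _ in range(50):
    s, v = ibvs_step(s, s_goal, Z=0.5)
```

In the full 6-DOF case the scalar gain per axis is replaced by the pseudoinverse of the interaction matrix, which couples all feature coordinates to all camera velocities; the exponential error decay is the same.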

  • Predicting Target Feature Configuration of Non-Stationary Objects for Grasping with Image-Based Visual Servoing
    2020
    Co-Authors: Jesse Haviland, Feras Dayoub, Peter Corke
    Abstract:

    This paper considers the final approach phase of visual closed-loop grasping, where the RGB-D camera is no longer able to provide valid depth information. Many current robotic grasping controllers are not closed-loop and therefore fail for moving objects. Closed-loop grasp controllers based on RGB-D imagery can track a moving object, but fail when the sensor's minimum object distance is violated just before grasping. To overcome this, we propose the use of image-based visual servoing (IBVS) to guide the robot to the object-relative grasp pose using camera RGB information. IBVS robustly moves the camera to a goal pose defined implicitly in terms of an image-plane Feature Configuration. In this work, the goal image Feature coordinates are predicted from RGB-D data to enable RGB-only tracking once depth data becomes unavailable; this enables more reliable grasping of previously unseen moving objects. Experimental results are provided.

Andreas Classen - One of the best experts on this subject based on the ideXlab platform.

  • Analysis of Feature Configuration Workflows
    IEEE International Conference on Requirements Engineering, 2009
    Co-Authors: Andreas Classen, Arnaud Hubaux, Patrick Heymans
    Abstract:

    We recently introduced Feature Configuration workflows, a formalism for modelling the complex Configuration processes in software product line engineering. In earlier work we identified obstacles to efficient tool support for which we now outline the main concepts of a solution. These take the form of a set of analysis tasks that can be performed on Feature Configuration workflows.

  • Formal Modelling of Feature Configuration Workflows
    Software Product Lines, 2009
    Co-Authors: Arnaud Hubaux, Andreas Classen, Patrick Heymans
    Abstract:

    In software product line engineering, the Configuration process can be a long and complex undertaking that involves many participants. When Configuration is supported by Feature diagrams, two challenges are to modularise the Feature diagram into related chunks, and to schedule them as part of the Configuration process. Existing work has only focused on the first of these challenges and, for the rest, assumes that Feature diagram modules are configured sequentially. This paper addresses the second challenge. It suggests using YAWL, a state-of-the-art workflow language, to represent the Configuration workflow while Feature diagrams model the available Configuration options. The principal contribution of the paper is a new combined formalism: Feature Configuration workflows. A formal semantics is provided so as to pave the way for unambiguous tool specification and safer reasoning about the Configuration process. The work is motivated and illustrated through a Configuration scenario taken from the space industry.

Arnaud Hubaux - One of the best experts on this subject based on the ideXlab platform.

  • Analysis of Feature Configuration Workflows
    IEEE International Conference on Requirements Engineering, 2009
    Co-Authors: Andreas Classen, Arnaud Hubaux, Patrick Heymans
    Abstract:

    We recently introduced Feature Configuration workflows, a formalism for modelling the complex Configuration processes in software product line engineering. In earlier work we identified obstacles to efficient tool support for which we now outline the main concepts of a solution. These take the form of a set of analysis tasks that can be performed on Feature Configuration workflows.

  • Formal Modelling of Feature Configuration Workflows
    Software Product Lines, 2009
    Co-Authors: Arnaud Hubaux, Andreas Classen, Patrick Heymans
    Abstract:

    In software product line engineering, the Configuration process can be a long and complex undertaking that involves many participants. When Configuration is supported by Feature diagrams, two challenges are to modularise the Feature diagram into related chunks, and to schedule them as part of the Configuration process. Existing work has only focused on the first of these challenges and, for the rest, assumes that Feature diagram modules are configured sequentially. This paper addresses the second challenge. It suggests using YAWL, a state-of-the-art workflow language, to represent the Configuration workflow while Feature diagrams model the available Configuration options. The principal contribution of the paper is a new combined formalism: Feature Configuration workflows. A formal semantics is provided so as to pave the way for unambiguous tool specification and safer reasoning about the Configuration process. The work is motivated and illustrated through a Configuration scenario taken from the space industry.

Jesse Haviland - One of the best experts on this subject based on the ideXlab platform.

  • Predicting Target Feature Configuration of Non-Stationary Objects for Grasping with Image-Based Visual Servoing
    arXiv: Robotics, 2020
    Co-Authors: Jesse Haviland, Feras Dayoub, Peter Corke
    Abstract:

    In this paper we consider the problem of the final approach stage of closed-loop grasping, where RGB-D cameras are no longer able to provide valid depth information. This is essential for grasping non-stationary objects, a situation in which current robotic grasping controllers fail. We predict the image-plane coordinates of observed image Features at the final grasp pose and use image-based visual servoing to guide the robot to that pose. Image-based visual servoing is a well-established control technique that moves a camera in 3D space so as to drive the image-plane Feature Configuration to some goal state. In previous work the goal Feature Configuration is assumed to be known, but for some applications this may not be feasible, for example if the motion is being performed for the first time with respect to a scene. Our proposed method provides robustness with respect to scene motion during the final phase of grasping, as well as to errors in the robot's kinematic control. We provide experimental results in the context of dynamic closed-loop grasping.

  • Predicting Target Feature Configuration of Non-Stationary Objects for Grasping with Image-Based Visual Servoing
    2020
    Co-Authors: Jesse Haviland, Feras Dayoub, Peter Corke
    Abstract:

    This paper considers the final approach phase of visual closed-loop grasping, where the RGB-D camera is no longer able to provide valid depth information. Many current robotic grasping controllers are not closed-loop and therefore fail for moving objects. Closed-loop grasp controllers based on RGB-D imagery can track a moving object, but fail when the sensor's minimum object distance is violated just before grasping. To overcome this, we propose the use of image-based visual servoing (IBVS) to guide the robot to the object-relative grasp pose using camera RGB information. IBVS robustly moves the camera to a goal pose defined implicitly in terms of an image-plane Feature Configuration. In this work, the goal image Feature coordinates are predicted from RGB-D data to enable RGB-only tracking once depth data becomes unavailable; this enables more reliable grasping of previously unseen moving objects. Experimental results are provided.