Augmented Reality System

14,000,000 Leading Edge Experts on the ideXlab platform

Scan Science and Technology

Contact Leading Edge Experts & Companies

The Experts below are selected from a list of 24,003 Experts worldwide, ranked by the ideXlab platform

Naokazu Yokoya - One of the best experts on this subject based on the ideXlab platform.

  • Wearable Augmented Reality System using an IrDA device and a passometer
    Stereoscopic Displays and Virtual Reality Systems X, 2003
    Co-Authors: Ryuhei Tenmoku, Masayuki Kanbara, Naokazu Yokoya
    Abstract:

    This paper describes a wearable augmented reality system with an IrDA device and a passometer. To realize augmented reality systems, the position and orientation of the user's viewpoint must be obtained in real time so that the real and virtual coordinate systems stay aligned. In the proposed system, the orientation of the user's viewpoint is measured by an inertial sensor attached to the user's glasses, and the position is measured with an IrDA device and a passometer. First, the user's position is fixed exactly whenever the user enters the infrared range of the IrDA markers installed at appointed points. When the user leaves the infrared range, the position is instead estimated with the passometer, which consists of an electronic compass and acceleration sensors: the former detects the user's walking direction, and the latter counts the user's steps. Together with the user's pace, these data make it possible to estimate the user's position in the neighborhood of the IrDA markers. We have developed a navigation system based on these techniques and have demonstrated its feasibility in experiments.
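
The dead-reckoning scheme above (one compass heading per counted step, a known pace, and a reset at each IrDA fix) can be sketched as follows; the function name, step length, and heading convention are illustrative assumptions, not the authors' code.

```python
import math

# Sketch of the passometer dead reckoning: start from the last IrDA fix, and
# advance one assumed step length per counted step along the compass heading.
def dead_reckon(marker_fix, headings, step_length=0.7):
    """Estimate position after each step, starting from an IrDA marker fix."""
    x, y = marker_fix
    track = [(x, y)]
    for heading in headings:                  # one compass reading per step
        x += step_length * math.sin(heading)  # east component
        y += step_length * math.cos(heading)  # north component
        track.append((x, y))
    return track

# Three steps due east (heading pi/2) from a marker at the origin:
path = dead_reckon((0.0, 0.0), [math.pi / 2] * 3, step_length=0.7)
# path[-1] is approximately (2.1, 0.0)
```

Each new IrDA fix would simply restart the track, bounding the accumulated step error, which is why the abstract limits the estimate to the neighborhood of the markers.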

  • A stereoscopic video see-through Augmented Reality System based on real-time vision-based registration
    Proceedings IEEE Virtual Reality 2000 (Cat. No.00CB37048), 2000
    Co-Authors: Masayuki Kanbara, Toshiyuki Okuma, Haruo Takemura, Naokazu Yokoya
    Abstract:

    In an augmented reality system, the position and orientation of the user's viewpoint must be obtained in real time in order to display the composed image while maintaining correct registration between the real and virtual worlds. This paper proposes a method for augmented reality based on a stereo vision sensor and a video see-through head-mounted display (HMD). The method synchronizes display timing between the virtual and real worlds so that alignment error is reduced. It calculates camera parameters from three markers in image sequences captured by a pair of stereo cameras mounted on the HMD. In addition, it estimates real-world depth from the stereo image pair in order to generate a composed image with consistent occlusions between real and virtual objects. The depth estimation region is efficiently limited by computing the position of the virtual object from the camera parameters. Finally, we have developed a video see-through augmented reality system consisting mainly of a pair of stereo cameras mounted on the HMD and a standard graphics workstation. The feasibility of the system has been successfully demonstrated in experiments.
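
The occlusion handling described above reduces, per pixel, to a depth comparison between the estimated real-world depth and the rendered virtual depth. A minimal sketch, where the names and the None-for-no-virtual-pixel convention are assumptions, not the authors' implementation:

```python
# Per-pixel sketch of consistent occlusion: keep the virtual colour only
# where the virtual object is closer than the estimated real-world depth.
def composite_pixel(real_rgb, real_depth, virt_rgb, virt_depth):
    """Depth test between the real scene and the rendered virtual object."""
    if virt_depth is not None and virt_depth < real_depth:
        return virt_rgb          # virtual object occludes the real scene
    return real_rgb              # real scene is closer, or nothing rendered

# A virtual pixel at depth 1.0 in front of a real surface at depth 5.0 wins:
composite_pixel((0, 0, 0), 5.0, (255, 255, 255), 1.0)   # -> (255, 255, 255)
```

Limiting stereo depth estimation to the screen region the virtual object can occupy, as the abstract notes, keeps this per-pixel test affordable in real time.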

  • ISMAR - A wearable Augmented Reality System for navigation using positioning infrastructures and a pedometer
    The Second IEEE and ACM International Symposium on Mixed and Augmented Reality 2003. Proceedings.
    Co-Authors: Ryuhei Tenmoku, Masayuki Kanbara, Naokazu Yokoya
    Abstract:

    This paper describes a wearable augmented reality system using positioning infrastructures and a pedometer. To realize augmented reality systems, the position and orientation of the user's viewpoint should be obtained in real time. The proposed system measures the orientation of the user's viewpoint with an inertial sensor, and the user's position using positioning infrastructures in the environment together with a pedometer. The system specifies the user's position from the position ID received from RFID tags or IrDA markers, which are the components of the positioning infrastructure. When the user moves away from them, the user's position is instead estimated with the pedometer. We have developed a navigation system using the proposed techniques and have demonstrated its feasibility in experiments.

  • A stereo vision-based Augmented Reality System with marker and natural feature tracking
    Proceedings Seventh International Conference on Virtual Systems and Multimedia
    Co-Authors: Masayuki Kanbara, Naokazu Yokoya, Haruo Takemura
    Abstract:

    This paper proposes a method to extend the registration range of a vision-based augmented reality system. We propose to use natural feature points contained in images captured by a pair of stereo cameras in conjunction with pre-defined fixed fiducial markers. The system also incorporates an inertial sensor to achieve a registration method robust to the user's fast head rotation and movement. The system first uses the pre-defined fiducial markers to estimate a projection matrix between the real and virtual coordinate systems. At the same time, it picks up and tracks a set of natural feature points from the initial image. As the user moves around the AR environment, the initial markers fall outside the camera frame, and the natural features are then used to recover the projection matrix. Experiments evaluating the feasibility of the method have been carried out and show its potential benefits.
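
The marker-to-natural-feature handoff above is essentially a switching policy. A control-flow sketch with the pose mathematics elided behind injected callables (all names hypothetical, not the paper's code):

```python
# Control-flow sketch of the marker-to-feature handoff. detect_markers,
# track_features, and solve_pose are hypothetical injected callables so the
# switching logic itself can be exercised without any real vision code.
class HybridTracker:
    def __init__(self, detect_markers, track_features, solve_pose):
        self.detect_markers = detect_markers
        self.track_features = track_features
        self.solve_pose = solve_pose
        self.features = []

    def step(self, frame):
        markers = self.detect_markers(frame)
        if markers:                     # markers visible: use them directly
            self.features = markers     # and reseed the natural-feature set
            return self.solve_pose(markers, source="marker")
        # markers fell outside the frame: fall back on tracked features
        self.features = self.track_features(frame, self.features)
        return self.solve_pose(self.features, source="feature")

# Toy run: markers are visible only in frame 0, then leave the view.
tracker = HybridTracker(
    detect_markers=lambda f: ["m1", "m2", "m3"] if f == 0 else [],
    track_features=lambda f, feats: feats,   # pretend tracking always succeeds
    solve_pose=lambda pts, source: source,   # report which cue produced the pose
)
poses = [tracker.step(f) for f in range(3)]  # -> ['marker', 'feature', 'feature']
```

Reseeding the feature set while markers are visible is what lets the recovered projection stay anchored to the marker-defined coordinate system after the markers disappear.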

Masayuki Kanbara - One of the best experts on this subject based on the ideXlab platform.

  • Wearable Augmented Reality System using an IrDA device and a passometer
    Stereoscopic Displays and Virtual Reality Systems X, 2003
    Co-Authors: Ryuhei Tenmoku, Masayuki Kanbara, Naokazu Yokoya
    Abstract:

    This paper describes a wearable augmented reality system with an IrDA device and a passometer. To realize augmented reality systems, the position and orientation of the user's viewpoint must be obtained in real time so that the real and virtual coordinate systems stay aligned. In the proposed system, the orientation of the user's viewpoint is measured by an inertial sensor attached to the user's glasses, and the position is measured with an IrDA device and a passometer. First, the user's position is fixed exactly whenever the user enters the infrared range of the IrDA markers installed at appointed points. When the user leaves the infrared range, the position is instead estimated with the passometer, which consists of an electronic compass and acceleration sensors: the former detects the user's walking direction, and the latter counts the user's steps. Together with the user's pace, these data make it possible to estimate the user's position in the neighborhood of the IrDA markers. We have developed a navigation system based on these techniques and have demonstrated its feasibility in experiments.

  • A stereoscopic video see-through Augmented Reality System based on real-time vision-based registration
    Proceedings IEEE Virtual Reality 2000 (Cat. No.00CB37048), 2000
    Co-Authors: Masayuki Kanbara, Toshiyuki Okuma, Haruo Takemura, Naokazu Yokoya
    Abstract:

    In an augmented reality system, the position and orientation of the user's viewpoint must be obtained in real time in order to display the composed image while maintaining correct registration between the real and virtual worlds. This paper proposes a method for augmented reality based on a stereo vision sensor and a video see-through head-mounted display (HMD). The method synchronizes display timing between the virtual and real worlds so that alignment error is reduced. It calculates camera parameters from three markers in image sequences captured by a pair of stereo cameras mounted on the HMD. In addition, it estimates real-world depth from the stereo image pair in order to generate a composed image with consistent occlusions between real and virtual objects. The depth estimation region is efficiently limited by computing the position of the virtual object from the camera parameters. Finally, we have developed a video see-through augmented reality system consisting mainly of a pair of stereo cameras mounted on the HMD and a standard graphics workstation. The feasibility of the system has been successfully demonstrated in experiments.

  • ISMAR - A wearable Augmented Reality System for navigation using positioning infrastructures and a pedometer
    The Second IEEE and ACM International Symposium on Mixed and Augmented Reality 2003. Proceedings.
    Co-Authors: Ryuhei Tenmoku, Masayuki Kanbara, Naokazu Yokoya
    Abstract:

    This paper describes a wearable augmented reality system using positioning infrastructures and a pedometer. To realize augmented reality systems, the position and orientation of the user's viewpoint should be obtained in real time. The proposed system measures the orientation of the user's viewpoint with an inertial sensor, and the user's position using positioning infrastructures in the environment together with a pedometer. The system specifies the user's position from the position ID received from RFID tags or IrDA markers, which are the components of the positioning infrastructure. When the user moves away from them, the user's position is instead estimated with the pedometer. We have developed a navigation system using the proposed techniques and have demonstrated its feasibility in experiments.

  • A stereo vision-based Augmented Reality System with marker and natural feature tracking
    Proceedings Seventh International Conference on Virtual Systems and Multimedia
    Co-Authors: Masayuki Kanbara, Naokazu Yokoya, Haruo Takemura
    Abstract:

    This paper proposes a method to extend the registration range of a vision-based augmented reality system. We propose to use natural feature points contained in images captured by a pair of stereo cameras in conjunction with pre-defined fixed fiducial markers. The system also incorporates an inertial sensor to achieve a registration method robust to the user's fast head rotation and movement. The system first uses the pre-defined fiducial markers to estimate a projection matrix between the real and virtual coordinate systems. At the same time, it picks up and tracks a set of natural feature points from the initial image. As the user moves around the AR environment, the initial markers fall outside the camera frame, and the natural features are then used to recover the projection matrix. Experiments evaluating the feasibility of the method have been carried out and show its potential benefits.

Haruo Takemura - One of the best experts on this subject based on the ideXlab platform.

  • A stereoscopic video see-through Augmented Reality System based on real-time vision-based registration
    Proceedings IEEE Virtual Reality 2000 (Cat. No.00CB37048), 2000
    Co-Authors: Masayuki Kanbara, Toshiyuki Okuma, Haruo Takemura, Naokazu Yokoya
    Abstract:

    In an augmented reality system, the position and orientation of the user's viewpoint must be obtained in real time in order to display the composed image while maintaining correct registration between the real and virtual worlds. This paper proposes a method for augmented reality based on a stereo vision sensor and a video see-through head-mounted display (HMD). The method synchronizes display timing between the virtual and real worlds so that alignment error is reduced. It calculates camera parameters from three markers in image sequences captured by a pair of stereo cameras mounted on the HMD. In addition, it estimates real-world depth from the stereo image pair in order to generate a composed image with consistent occlusions between real and virtual objects. The depth estimation region is efficiently limited by computing the position of the virtual object from the camera parameters. Finally, we have developed a video see-through augmented reality system consisting mainly of a pair of stereo cameras mounted on the HMD and a standard graphics workstation. The feasibility of the system has been successfully demonstrated in experiments.

  • A stereo vision-based Augmented Reality System with marker and natural feature tracking
    Proceedings Seventh International Conference on Virtual Systems and Multimedia
    Co-Authors: Masayuki Kanbara, Naokazu Yokoya, Haruo Takemura
    Abstract:

    This paper proposes a method to extend the registration range of a vision-based augmented reality system. We propose to use natural feature points contained in images captured by a pair of stereo cameras in conjunction with pre-defined fixed fiducial markers. The system also incorporates an inertial sensor to achieve a registration method robust to the user's fast head rotation and movement. The system first uses the pre-defined fiducial markers to estimate a projection matrix between the real and virtual coordinate systems. At the same time, it picks up and tracks a set of natural feature points from the initial image. As the user moves around the AR environment, the initial markers fall outside the camera frame, and the natural features are then used to recover the projection matrix. Experiments evaluating the feasibility of the method have been carried out and show its potential benefits.

Jong-soo Choi - One of the best experts on this subject based on the ideXlab platform.

  • ISMAR - Wearable Augmented Reality System using gaze interaction
    2008 7th IEEE ACM International Symposium on Mixed and Augmented Reality, 2008
    Co-Authors: Hyung-min Park, Seok Han Lee, Jong-soo Choi
    Abstract:

    Undisturbed interaction is essential for immersive AR environments. Many approaches to interacting with VEs (virtual environments) have been explored so far, especially hand-based metaphors. When the user's hands are occupied by hand-based work such as maintenance and repair, the need for an alternative interaction technique arises. In recent research, hands-free gaze information has been adopted in AR so that original actions can be performed concurrently with interaction [3, 4]. There has been little progress in that research, which is still at the pilot-study stage in laboratory settings. In this paper, we introduce a simple WARS (wearable augmented reality system) equipped with an HMD, a scene camera, and an eye tracker. We propose an 'Aging' technique that improves traditional dwell-time selection, and demonstrate an AR gallery, a dynamic exhibition space, with the wearable system.

Waikeung Fung - One of the best experts on this subject based on the ideXlab platform.

  • development of Augmented Reality System for afm based nanomanipulation
    IEEE-ASME Transactions on Mechatronics, 2004
    Co-Authors: Waikeung Fung
    Abstract:

    Using atomic force microscopy (AFM) as a nanomanipulation tool has been discussed for more than a decade, but its lack of real-time visual feedback during manipulation has hindered wide application. This problem has been overcome by our recently developed augmented reality system. By locally updating the AFM image based on real-time force information during manipulation, the new system provides not only real-time force feedback but also real-time visual feedback. The real-time visual display combined with real-time force feedback creates an augmented reality environment in which the operator can both feel the interaction forces and observe real-time changes in the nano-environment. This augmented reality system, capable of nanolithography and of manipulating nanoparticles, lets the operator perform several operations without a new image scan, which makes AFM-based nano-assembly feasible and applicable.
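
The local-update idea can be illustrated as follows: after a manipulation, only the cells the tip passed through are redrawn from real-time force/position data, while the rest of the scanned image is kept. The grid, path format, and update rule are illustrative assumptions, not the system's actual model:

```python
# After a manipulation, redraw only the height-map cells along the tip's path
# from real-time force/position data; everything else keeps its scanned value.
def update_along_path(image, tip_path, new_height):
    """Locally overwrite the AFM height map instead of rescanning it."""
    for row, col in tip_path:
        image[row][col] = new_height
    return image

scan = [[0.0] * 4 for _ in range(4)]     # stand-in for a 4x4 AFM height map
update_along_path(scan, [(1, 1), (1, 2)], 5.0)
# only cells (1,1) and (1,2) change; the other 14 keep their scanned heights
```

Avoiding the full rescan is what makes the visual feedback real-time: an AFM scan takes seconds to minutes, while a local patch update is effectively instantaneous.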

  • Augmented Reality System for real-time nanomanipulation
    2003 Third IEEE Conference on Nanotechnology, IEEE-NANO 2003.
    Co-Authors: Waikeung Fung
    Abstract:

    Introducing haptic force feedback makes nanomanipulation with the AFM easier. However, the technique has been impeded by tip displacement due to the softness of the AFM cantilever. In this paper, a novel nanomanipulation system assisted by augmented reality is presented. By analyzing the cantilever-tip interaction with the environment, the normal force and the lateral force are obtained. These forces are then fed to a haptic device to provide real-time force feedback to the operator. The visual images are also updated based on real-time force and position information, which provides the operator with real-time visual feedback. New calibration methods and a compensation algorithm are introduced to compensate for the tip displacement, so the visual display and the force felt through the haptic device become reliable. Together, the real-time force feedback and the real-time visual display provide the operator with an augmented reality environment. Experimental results have verified the effectiveness of the proposed augmented reality system.
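
Under a simple linear-cantilever assumption, the force recovery and tip-displacement compensation described above can be sketched as below; the spring constants and the compensation rule are illustrative, not the paper's calibrated model:

```python
# Linear-cantilever sketch: normal force from vertical bending, lateral force
# from torsion, and an estimate of where the tip really is once the lateral
# force has bent the soft cantilever. All constants are illustrative.
def cantilever_forces(vert_deflection, torsion, k_normal=0.1, k_lateral=50.0):
    """Recover normal and lateral forces from measured deflections."""
    return k_normal * vert_deflection, k_lateral * torsion

def actual_tip_position(commanded_x, lateral_force, k_tip=200.0):
    """Compensate tip displacement: the tip lags the commanded position by
    the cantilever's deflection under the lateral force."""
    return commanded_x - lateral_force / k_tip

f_normal, f_lateral = cantilever_forces(vert_deflection=2.0, torsion=0.01)
x_tip = actual_tip_position(commanded_x=10.0, lateral_force=f_lateral)
# f_lateral is about 0.5, so the displayed tip position is about 9.9975
```

Feeding the compensated tip position (rather than the commanded one) into the visual display is what keeps the augmented view consistent with where the tip actually is.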