Underwater Environment


The Experts below are selected from a list of 22,617 Experts worldwide, ranked by the ideXlab platform.

Hajime Asama - One of the best experts on this subject based on the ideXlab platform.

  • ACMarker: Acoustic Camera-Based Fiducial Marker System in Underwater Environment
    International Conference on Robotics and Automation, 2020
    Co-Authors: Yusheng Wang, Dingyu Liu, Yusuke Tamura, Hiroshi Tsuchiya, Atsushi Yamashita, Hajime Asama
    Abstract:

    ACMarker is an acoustic camera-based fiducial marker system designed for underwater environments. Optical camera-based fiducial marker systems have been widely used in computer vision and robotics applications such as augmented reality (AR), camera calibration, and robot navigation. However, in underwater environments, the performance of optical cameras is limited owing to water turbidity and illumination conditions. Acoustic cameras, which are forward-looking sonars, have been gradually applied in underwater situations. They can acquire high-resolution images even in turbid water with poor illumination. We propose methods to recognize a simply designed marker and to estimate the relative pose between the acoustic camera and the marker. The proposed system can be applied to various underwater tasks such as object tracking and localization of unmanned underwater vehicles. Simulation and real experiments were conducted to test the recognition of such markers and pose estimation based on the markers.
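
For illustration, here is a minimal sketch of how a marker pose might be recovered from acoustic-camera detections, assuming the marker lies near the zero-elevation plane so that each detected corner reduces to a 2D point; the function names, the 0.4 m marker size, and the measurement values are hypothetical and are not taken from the paper.

```python
# Illustrative sketch only; the marker layout, detections, and alignment method are assumed.
import numpy as np

def polar_to_cartesian(ranges, azimuths):
    """Convert acoustic-image measurements (range, azimuth) to 2D Cartesian points."""
    return np.stack([ranges * np.cos(azimuths), ranges * np.sin(azimuths)], axis=1)

def estimate_marker_pose(marker_pts, detected_pts):
    """Rigid 2D transform (R, t) that maps marker-frame corners onto the detections."""
    mu_m, mu_d = marker_pts.mean(axis=0), detected_pts.mean(axis=0)
    H = (marker_pts - mu_m).T @ (detected_pts - mu_d)   # 2x2 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                            # enforce a proper rotation
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = mu_d - R @ mu_m
    return R, t

# Hypothetical 0.4 m square marker and noisy (range, azimuth) corner detections.
marker = np.array([[0.0, 0.0], [0.4, 0.0], [0.4, 0.4], [0.0, 0.4]])
detections = polar_to_cartesian(np.array([2.10, 2.43, 2.71, 2.41]),
                                np.deg2rad([10.8, 8.2, 16.5, 19.3]))
R, t = estimate_marker_pose(marker, detections)
print("yaw [deg]:", np.degrees(np.arctan2(R[1, 0], R[0, 0])).round(1), "t:", t.round(2))
```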

  • Rotation Estimation of Acoustic Camera Based on Illuminated Area in Acoustic Image
    IFAC-PapersOnLine, 2019
    Co-Authors: Yusheng Wang, Yusuke Tamura, Hiroshi Tsuchiya, Atsushi Yamashita, Yonghoon Ji, Hajime Asama
    Abstract:

    In this paper, the concept of the illuminated area, an important characteristic of acoustic images, is formalized; it can be applied to tasks such as 3D mapping of the underwater environment. Unmanned exploration using underwater robots is gaining attention in the scientific community. One way to sense the underwater environment is to employ an acoustic camera, a next-generation forward-looking sonar that provides high resolution even in turbid water. It is more flexible than common underwater sonars; however, studies on acoustic cameras are still at an early stage. An acoustic camera has a fixed vertical beam width that produces a limited bright area in the acoustic image, which we name the illuminated area. We propose a method to detect the illuminated area under a flat-ground assumption, and we show how knowledge of the illuminated area can be fused with depth information to estimate the roll and pitch angles of the acoustic camera. The estimated quantities can then be employed in a 3D mapping process. Experiments show the validity and effectiveness of the proposed method.
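
A sketch of the underlying flat-floor geometry, under simplifying assumptions of my own (known sensor height h above the floor and a fixed vertical aperture beta); this is not the paper's formulation, and the numbers are hypothetical.

```python
# Illustrative geometry only; sensor height, aperture, and pitch values are assumed.
import numpy as np

def boundaries_from_pitch(pitch, h, beta):
    """Slant ranges of the near/far edges of the illuminated area on a flat floor."""
    return h / np.sin(pitch + beta / 2.0), h / np.sin(pitch - beta / 2.0)

def pitch_from_boundaries(r_near, r_far, h, beta):
    """Average the pitch implied by each boundary, fusing depth with the acoustic image."""
    pitch_near = np.arcsin(h / r_near) - beta / 2.0
    pitch_far = np.arcsin(h / r_far) + beta / 2.0
    return 0.5 * (pitch_near + pitch_far)

h, beta = 2.0, np.deg2rad(14.0)          # sensor height above the floor, vertical aperture
r_near, r_far = boundaries_from_pitch(np.deg2rad(25.0), h, beta)   # synthetic "truth"
print("estimated pitch [deg]:", np.degrees(pitch_from_boundaries(r_near, r_far, h, beta)))
```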

  • Three-dimensional Underwater Environment Reconstruction with Graph Optimization Using Acoustic Camera
    2019 IEEE SICE International Symposium on System Integration (SII), 2019
    Co-Authors: Yusheng Wang, Yusuke Tamura, Atsushi Yamashita, Yonghoon Ji, Hajime Asama
    Abstract:

    In this paper, a three-dimensional (3D) environment reconstruction framework based on graph optimization is proposed that uses acoustic images captured in an underwater environment. Underwater tasks such as unmanned construction using robots are becoming increasingly important. In recent years, acoustic cameras, which are forward-looking imaging sonars, have become common in underwater inspection. However, the loss of elevation-angle information makes it difficult to fully understand underwater environments. To cope with this, we apply a 3D occupancy mapping method in which the acoustic camera rotates around its acoustic axis to generate 3D local maps. From the local maps and a graph optimization scheme, we then minimize the error of the camera poses and build a global map. Experimental results demonstrate that our 3D mapping framework for the acoustic camera can reconstruct dense 3D models of underwater targets robustly and precisely.
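
As a rough illustration of the graph-optimization step, the toy example below builds an SE(2) pose graph from drifting sequential edges and one loop closure and refines it by nonlinear least squares; the reduction to 2D, the edge values, and the solver choice are assumptions for brevity, not the paper's pipeline.

```python
# Toy pose graph; edge measurements and the SE(2) simplification are assumed for illustration.
import numpy as np
from scipy.optimize import least_squares

def wrap(a):
    """Wrap an angle to (-pi, pi]."""
    return (a + np.pi) % (2 * np.pi) - np.pi

def relative_pose(pa, pb):
    """Pose of b expressed in the frame of a, for SE(2) poses (x, y, yaw)."""
    dx, dy = pb[0] - pa[0], pb[1] - pa[1]
    c, s = np.cos(pa[2]), np.sin(pa[2])
    return np.array([c * dx + s * dy, -s * dx + c * dy, wrap(pb[2] - pa[2])])

def compose(p, z):
    """Apply a relative motion z to pose p."""
    c, s = np.cos(p[2]), np.sin(p[2])
    return np.array([p[0] + c * z[0] - s * z[1], p[1] + s * z[0] + c * z[1], wrap(p[2] + z[2])])

# (i, j, measured pose of j in frame i): drifting "odometry" plus one loop closure.
edges = [(0, 1, np.array([1.03,  0.02, np.deg2rad(92.0)])),
         (1, 2, np.array([0.98, -0.01, np.deg2rad(88.5)])),
         (2, 3, np.array([1.02,  0.03, np.deg2rad(91.0)])),
         (3, 0, np.array([1.00,  0.00, np.deg2rad(90.0)]))]   # loop closure

def residuals(x):
    poses = x.reshape(-1, 3)
    res = []
    for i, j, z in edges:
        r = relative_pose(poses[i], poses[j]) - z
        r[2] = wrap(r[2])
        res.append(r)
    res.append(poses[0])              # gauge constraint: keep pose 0 at the origin
    return np.concatenate(res)

# Initialize by chaining the odometry edges, then refine with the loop closure.
init = [np.zeros(3)]
for _, _, z in edges[:3]:
    init.append(compose(init[-1], z))
solution = least_squares(residuals, np.concatenate(init)).x.reshape(-1, 3)
print(solution.round(3))
```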

  • Acoustic Image Simulator Based on Active Sonar Model in Underwater Environment
    International Conference on Ubiquitous Robots and Ambient Intelligence, 2018
    Co-Authors: Yonghoon Ji, Yusuke Tamura, Atsushi Yamashita, Hajime Asama
    Abstract:

    Underwater tasks such as maintenance, inspection, target recognition, and simultaneous localization and mapping (SLAM) require accurate underwater information. Acoustic cameras are outstanding sensors for acquiring underwater information because, even in turbid water, they can provide acoustic images with more accurate detail than other sensors. In this paper, we propose a novel acoustic image simulator based on an active sonar model that analyzes the relationship between signal processing and the image display mechanism, which has not yet been clarified. The results demonstrate that our proposed simulator can successfully generate realistic virtual acoustic images from arbitrary viewpoints.
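
A heavily simplified sketch of what such a simulator involves, under assumptions of my own (a point-sampled surface, a Lambertian-like reflection term, and no vertical beam pattern, multipath, or speckle noise); it is not the authors' simulator.

```python
# Bare-bones acoustic image rendering; the scene, reflection model, and binning are assumed.
import numpy as np

def simulate_acoustic_image(points, normals, sensor_pos,
                            n_range=256, n_azi=128,
                            max_range=10.0, fov=np.deg2rad(30.0)):
    """Accumulate per-(range, azimuth) echo intensity from scene points."""
    img = np.zeros((n_range, n_azi))
    rel = points - sensor_pos
    rng = np.linalg.norm(rel, axis=1)
    azi = np.arctan2(rel[:, 1], rel[:, 0])
    view = -rel / rng[:, None]                                        # unit vectors point -> sensor
    refl = np.clip(np.einsum('ij,ij->i', view, normals), 0.0, None)   # Lambertian-like term
    keep = (rng < max_range) & (np.abs(azi) < fov / 2)
    r_bin = (rng[keep] / max_range * (n_range - 1)).astype(int)
    a_bin = ((azi[keep] / fov + 0.5) * (n_azi - 1)).astype(int)
    np.add.at(img, (r_bin, a_bin), refl[keep] / np.maximum(rng[keep] ** 2, 1e-6))
    return img

# Hypothetical flat-floor scene 1.5 m below the sensor, viewed from the origin.
x, y = np.meshgrid(np.linspace(0.5, 8.0, 200), np.linspace(-2.0, 2.0, 100))
pts = np.stack([x.ravel(), y.ravel(), np.full(x.size, -1.5)], axis=1)
nrm = np.tile([0.0, 0.0, 1.0], (pts.shape[0], 1))
image = simulate_acoustic_image(pts, nrm, sensor_pos=np.zeros(3))
print(image.shape, image.max())
```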

  • 3D Reconstruction of Line Features Using Multi-View Acoustic Images in Underwater Environment
    International Conference on Multisensor Fusion and Integration for Intelligent Systems, 2017
    Co-Authors: Ngoc Trung Mai, Yusuke Tamura, Atsushi Yamashita, Hanwool Woo, Hajime Asama
    Abstract:

    In order to understand the underwater environment, it is essential to use sensing methodologies that can perceive the three-dimensional (3D) structure of the explored site. Sonar sensors are commonly employed in underwater exploration. This paper presents a novel methodology for retrieving 3D information about underwater objects. The proposed solution employs an acoustic camera, which represents the next generation of sonar sensors, to extract and track line segments of underwater objects, which serve as visual features for the image processing algorithm. In this work, we concentrate on artificial underwater environments, such as dams and bridges. In these structured environments, line segments are preferred over point features, as they represent structural information more effectively. We also developed a method for automatic extraction and correspondence matching of line features. Our approach enables 3D measurement of underwater objects from arbitrary viewpoints based on an extended Kalman filter (EKF). The probabilistic method allows computing the 3D reconstruction of underwater objects even in the presence of uncertainty in the control input of the camera's movements. Experiments were performed in real environments, and the results show the effectiveness and accuracy of the proposed solution.
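
To illustrate the EKF idea, the sketch below estimates a single 3D feature point (rather than a line segment) from range-azimuth measurements taken at several viewpoints: one acoustic view cannot fix the elevation, but fusing several views does. The camera poses, noise values, and helper functions are hypothetical, not the paper's implementation.

```python
# Illustrative EKF on a point feature; poses, priors, and noise levels are assumed.
import numpy as np

def h(point, cam_pos, cam_yaw):
    """Predicted (range, azimuth) of a 3D point seen from cam_pos with heading cam_yaw."""
    d = point - cam_pos
    c, s = np.cos(cam_yaw), np.sin(cam_yaw)
    local = np.array([c * d[0] + s * d[1], -s * d[0] + c * d[1], d[2]])   # world -> camera
    return np.array([np.linalg.norm(local), np.arctan2(local[1], local[0])])

def numerical_jacobian(f, x, eps=1e-6):
    """Central-difference Jacobian of f at x."""
    J = np.zeros((f(x).size, x.size))
    for i in range(x.size):
        dx = np.zeros_like(x)
        dx[i] = eps
        J[:, i] = (f(x + dx) - f(x - dx)) / (2 * eps)
    return J

def ekf_update(x, P, z, cam_pos, cam_yaw, R):
    """One EKF measurement update of the feature position x with covariance P."""
    Hx = lambda p: h(p, cam_pos, cam_yaw)
    H = numerical_jacobian(Hx, x)
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - Hx(x))
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

# Hypothetical run: true feature at (4, 1, -2), three viewpoints, noise-free measurements.
true_pt = np.array([4.0, 1.0, -2.0])
views = [(np.array([0.0, 0.0, 0.0]), 0.0),
         (np.array([0.0, 2.0, 1.0]), -0.3),
         (np.array([1.0, -1.0, -0.5]), 0.4)]
x, P = np.array([3.5, 0.5, -0.5]), np.eye(3) * 4.0       # rough prior
R = np.diag([0.02 ** 2, np.deg2rad(0.5) ** 2])
for _ in range(3):                                        # repeat passes to tame linearization error
    for cam_pos, cam_yaw in views:
        x, P = ekf_update(x, P, h(true_pt, cam_pos, cam_yaw), cam_pos, cam_yaw, R)
print("estimate:", x.round(2))
```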

Yusheng Wang - One of the best experts on this subject based on the ideXlab platform.

  • ACMarker: Acoustic Camera-Based Fiducial Marker System in Underwater Environment
    International Conference on Robotics and Automation, 2020
    Co-Authors: Yusheng Wang, Dingyu Liu, Yusuke Tamura, Hiroshi Tsuchiya, Atsushi Yamashita, Hajime Asama
    Abstract:

    ACMarker is an acoustic camera-based fiducial marker system designed for underwater environments. Optical camera-based fiducial marker systems have been widely used in computer vision and robotics applications such as augmented reality (AR), camera calibration, and robot navigation. However, in underwater environments, the performance of optical cameras is limited owing to water turbidity and illumination conditions. Acoustic cameras, which are forward-looking sonars, have been gradually applied in underwater situations. They can acquire high-resolution images even in turbid water with poor illumination. We propose methods to recognize a simply designed marker and to estimate the relative pose between the acoustic camera and the marker. The proposed system can be applied to various underwater tasks such as object tracking and localization of unmanned underwater vehicles. Simulation and real experiments were conducted to test the recognition of such markers and pose estimation based on the markers.

  • Rotation Estimation of Acoustic Camera Based on Illuminated Area in Acoustic Image
    IFAC-PapersOnLine, 2019
    Co-Authors: Yusheng Wang, Yusuke Tamura, Hiroshi Tsuchiya, Atsushi Yamashita, Yonghoon Ji, Hajime Asama
    Abstract:

    In this paper, the concept of the illuminated area, an important characteristic of acoustic images, is formalized; it can be applied to tasks such as 3D mapping of the underwater environment. Unmanned exploration using underwater robots is gaining attention in the scientific community. One way to sense the underwater environment is to employ an acoustic camera, a next-generation forward-looking sonar that provides high resolution even in turbid water. It is more flexible than common underwater sonars; however, studies on acoustic cameras are still at an early stage. An acoustic camera has a fixed vertical beam width that produces a limited bright area in the acoustic image, which we name the illuminated area. We propose a method to detect the illuminated area under a flat-ground assumption, and we show how knowledge of the illuminated area can be fused with depth information to estimate the roll and pitch angles of the acoustic camera. The estimated quantities can then be employed in a 3D mapping process. Experiments show the validity and effectiveness of the proposed method.

  • Three-dimensional Underwater Environment Reconstruction with Graph Optimization Using Acoustic Camera
    2019 IEEE SICE International Symposium on System Integration (SII), 2019
    Co-Authors: Yusheng Wang, Yusuke Tamura, Atsushi Yamashita, Yonghoon Ji, Hajime Asama
    Abstract:

    In this paper, a three-dimensional (3D) environment reconstruction framework based on graph optimization is proposed that uses acoustic images captured in an underwater environment. Underwater tasks such as unmanned construction using robots are becoming increasingly important. In recent years, acoustic cameras, which are forward-looking imaging sonars, have become common in underwater inspection. However, the loss of elevation-angle information makes it difficult to fully understand underwater environments. To cope with this, we apply a 3D occupancy mapping method in which the acoustic camera rotates around its acoustic axis to generate 3D local maps. From the local maps and a graph optimization scheme, we then minimize the error of the camera poses and build a global map. Experimental results demonstrate that our 3D mapping framework for the acoustic camera can reconstruct dense 3D models of underwater targets robustly and precisely.

  • 3D Occupancy Mapping Framework Based on Acoustic Camera in Underwater Environment
    IFAC-PapersOnLine, 2018
    Co-Authors: Yusheng Wang, Yusuke Tamura, Atsushi Yamashita, Yonghoon Ji, Hajime Asama
    Abstract:

    In this paper, we present a novel probabilistic three-dimensional (3D) mapping framework that uses acoustic images captured in an underwater environment. The acoustic camera is a forward-looking imaging sonar that has recently become common in underwater inspection; however, the loss of elevation-angle information makes it difficult to fully understand the underwater environment. To cope with this, we apply a probabilistic occupancy mapping framework with a novel inverse sensor model suitable for the acoustic camera in order to reconstruct the underwater environment in a volumetric representation. Simulations and experimental results demonstrate that our mapping framework for the acoustic camera can successfully reconstruct dense 3D models of underwater targets.
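
A bare-bones sketch of the occupancy update implied by the missing elevation angle: a bright pixel at (range r, azimuth theta) could come from any elevation within the vertical aperture, so every voxel on that constant-range arc is weakly reinforced while nearer voxels on the beam are marked free. The inverse sensor model and the tuning constants below are assumed, not the paper's.

```python
# Illustrative log-odds occupancy update; increments, voxel size, and aperture are assumed.
import numpy as np

L_OCC, L_FREE = 0.4, -0.2           # log-odds increments (assumed tuning values)
VOXEL = 0.05                        # voxel size [m]

def mark(grid, origin, r, theta, phi, delta):
    """Add a log-odds increment to the voxel hit at (r, theta, phi) from origin."""
    p = origin + r * np.array([np.cos(phi) * np.cos(theta),
                               np.cos(phi) * np.sin(theta),
                               np.sin(phi)])
    key = tuple(np.floor(p / VOXEL).astype(int))
    grid[key] = grid.get(key, 0.0) + delta

def update_from_pixel(grid, origin, r, theta, aperture=np.deg2rad(14.0), n_elev=15):
    """Update a dict-based log-odds voxel grid from one bright acoustic pixel."""
    for phi in np.linspace(-aperture / 2, aperture / 2, n_elev):
        for rr in np.arange(VOXEL, r, VOXEL):       # free space along the beam
            mark(grid, origin, rr, theta, phi, L_FREE)
        mark(grid, origin, r, theta, phi, L_OCC)    # possible echo origins on the arc

grid = {}
update_from_pixel(grid, origin=np.zeros(3), r=3.0, theta=np.deg2rad(5.0))
occupied = [k for k, v in grid.items() if v > 0.3]
print(len(grid), "voxels touched,", len(occupied), "likely occupied")
```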

Stefan B Williams - One of the best experts on this subject based on the ideXlab platform.

  • SLAM Using Natural Features in an Underwater Environment
    International Conference on Control Automation Robotics and Vision, 2004
    Co-Authors: Ian Mahon, Stefan B Williams
    Abstract:

    This paper presents techniques developed to apply the simultaneous localisation and mapping (SLAM) algorithm to an unmanned underwater vehicle operating in an unstructured, natural environment. It is shown that information from on-board sonar and vision sensors can be fused to select and track regions of the environment that may be used as features to estimate the vehicle's motion. Results, including vehicle pose estimates and the resulting environment models, are shown for data acquired at the Great Barrier Reef in Australia.
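
One possible fusion step consistent with this description, sketched under assumed details (a pinhole camera model and a sonar range taken along roughly the same direction): the camera fixes the bearing of a salient feature and the sonar fixes its range, which together place a 3D landmark for SLAM. The intrinsics and measurement values are hypothetical.

```python
# Illustrative camera-sonar fusion; intrinsics and measurements are assumed, not from the paper.
import numpy as np

K = np.array([[800.0, 0.0, 320.0],   # assumed pinhole intrinsics
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])

def landmark_from_pixel_and_range(pixel_uv, sonar_range):
    """Back-project a pixel to a unit bearing ray and scale it by the sonar range."""
    ray = np.linalg.inv(K) @ np.array([pixel_uv[0], pixel_uv[1], 1.0])
    ray /= np.linalg.norm(ray)
    return sonar_range * ray          # landmark in the camera/vehicle frame

print(landmark_from_pixel_and_range((352.0, 260.0), sonar_range=4.2).round(3))
```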

Atsushi Yamashita - One of the best experts on this subject based on the ideXlab platform.

  • ACMarker: Acoustic Camera-Based Fiducial Marker System in Underwater Environment
    International Conference on Robotics and Automation, 2020
    Co-Authors: Yusheng Wang, Dingyu Liu, Yusuke Tamura, Hiroshi Tsuchiya, Atsushi Yamashita, Hajime Asama
    Abstract:

    ACMarker is an acoustic camera-based fiducial marker system designed for underwater environments. Optical camera-based fiducial marker systems have been widely used in computer vision and robotics applications such as augmented reality (AR), camera calibration, and robot navigation. However, in underwater environments, the performance of optical cameras is limited owing to water turbidity and illumination conditions. Acoustic cameras, which are forward-looking sonars, have been gradually applied in underwater situations. They can acquire high-resolution images even in turbid water with poor illumination. We propose methods to recognize a simply designed marker and to estimate the relative pose between the acoustic camera and the marker. The proposed system can be applied to various underwater tasks such as object tracking and localization of unmanned underwater vehicles. Simulation and real experiments were conducted to test the recognition of such markers and pose estimation based on the markers.

  • Rotation Estimation of Acoustic Camera Based on Illuminated Area in Acoustic Image
    IFAC-PapersOnLine, 2019
    Co-Authors: Yusheng Wang, Yusuke Tamura, Hiroshi Tsuchiya, Atsushi Yamashita, Yonghoon Ji, Hajime Asama
    Abstract:

    In this paper, the concept of the illuminated area, an important characteristic of acoustic images, is formalized; it can be applied to tasks such as 3D mapping of the underwater environment. Unmanned exploration using underwater robots is gaining attention in the scientific community. One way to sense the underwater environment is to employ an acoustic camera, a next-generation forward-looking sonar that provides high resolution even in turbid water. It is more flexible than common underwater sonars; however, studies on acoustic cameras are still at an early stage. An acoustic camera has a fixed vertical beam width that produces a limited bright area in the acoustic image, which we name the illuminated area. We propose a method to detect the illuminated area under a flat-ground assumption, and we show how knowledge of the illuminated area can be fused with depth information to estimate the roll and pitch angles of the acoustic camera. The estimated quantities can then be employed in a 3D mapping process. Experiments show the validity and effectiveness of the proposed method.

  • Three-dimensional Underwater Environment Reconstruction with Graph Optimization Using Acoustic Camera
    2019 IEEE SICE International Symposium on System Integration (SII), 2019
    Co-Authors: Yusheng Wang, Yusuke Tamura, Atsushi Yamashita, Yonghoon Ji, Hajime Asama
    Abstract:

    In this paper, a three-dimensional (3D) environment reconstruction framework based on graph optimization is proposed that uses acoustic images captured in an underwater environment. Underwater tasks such as unmanned construction using robots are becoming increasingly important. In recent years, acoustic cameras, which are forward-looking imaging sonars, have become common in underwater inspection. However, the loss of elevation-angle information makes it difficult to fully understand underwater environments. To cope with this, we apply a 3D occupancy mapping method in which the acoustic camera rotates around its acoustic axis to generate 3D local maps. From the local maps and a graph optimization scheme, we then minimize the error of the camera poses and build a global map. Experimental results demonstrate that our 3D mapping framework for the acoustic camera can reconstruct dense 3D models of underwater targets robustly and precisely.

  • Acoustic Image Simulator Based on Active Sonar Model in Underwater Environment
    International Conference on Ubiquitous Robots and Ambient Intelligence, 2018
    Co-Authors: Yonghoon Ji, Yusuke Tamura, Atsushi Yamashita, Hajime Asama
    Abstract:

    Underwater tasks such as maintenance, inspection, target recognition, and simultaneous localization and mapping (SLAM) require accurate underwater information. Acoustic cameras are outstanding sensors for acquiring underwater information because, even in turbid water, they can provide acoustic images with more accurate detail than other sensors. In this paper, we propose a novel acoustic image simulator based on an active sonar model that analyzes the relationship between signal processing and the image display mechanism, which has not yet been clarified. The results demonstrate that our proposed simulator can successfully generate realistic virtual acoustic images from arbitrary viewpoints.

  • 3D Occupancy Mapping Framework Based on Acoustic Camera in Underwater Environment
    IFAC-PapersOnLine, 2018
    Co-Authors: Yusheng Wang, Yusuke Tamura, Atsushi Yamashita, Yonghoon Ji, Hajime Asama
    Abstract:

    In this paper, we present a novel probabilistic three-dimensional (3D) mapping framework that uses acoustic images captured in an underwater environment. The acoustic camera is a forward-looking imaging sonar that has recently become common in underwater inspection; however, the loss of elevation-angle information makes it difficult to fully understand the underwater environment. To cope with this, we apply a probabilistic occupancy mapping framework with a novel inverse sensor model suitable for the acoustic camera in order to reconstruct the underwater environment in a volumetric representation. Simulations and experimental results demonstrate that our mapping framework for the acoustic camera can successfully reconstruct dense 3D models of underwater targets.

Yusuke Tamura - One of the best experts on this subject based on the ideXlab platform.

  • ACMarker: Acoustic Camera-Based Fiducial Marker System in Underwater Environment
    International Conference on Robotics and Automation, 2020
    Co-Authors: Yusheng Wang, Dingyu Liu, Yusuke Tamura, Hiroshi Tsuchiya, Atsushi Yamashita, Hajime Asama
    Abstract:

    ACMarker is an acoustic camera-based fiducial marker system designed for underwater environments. Optical camera-based fiducial marker systems have been widely used in computer vision and robotics applications such as augmented reality (AR), camera calibration, and robot navigation. However, in underwater environments, the performance of optical cameras is limited owing to water turbidity and illumination conditions. Acoustic cameras, which are forward-looking sonars, have been gradually applied in underwater situations. They can acquire high-resolution images even in turbid water with poor illumination. We propose methods to recognize a simply designed marker and to estimate the relative pose between the acoustic camera and the marker. The proposed system can be applied to various underwater tasks such as object tracking and localization of unmanned underwater vehicles. Simulation and real experiments were conducted to test the recognition of such markers and pose estimation based on the markers.

  • Rotation Estimation of Acoustic Camera Based on Illuminated Area in Acoustic Image
    IFAC-PapersOnLine, 2019
    Co-Authors: Yusheng Wang, Yusuke Tamura, Hiroshi Tsuchiya, Atsushi Yamashita, Yonghoon Ji, Hajime Asama
    Abstract:

    In this paper, the concept of the illuminated area, an important characteristic of acoustic images, is formalized; it can be applied to tasks such as 3D mapping of the underwater environment. Unmanned exploration using underwater robots is gaining attention in the scientific community. One way to sense the underwater environment is to employ an acoustic camera, a next-generation forward-looking sonar that provides high resolution even in turbid water. It is more flexible than common underwater sonars; however, studies on acoustic cameras are still at an early stage. An acoustic camera has a fixed vertical beam width that produces a limited bright area in the acoustic image, which we name the illuminated area. We propose a method to detect the illuminated area under a flat-ground assumption, and we show how knowledge of the illuminated area can be fused with depth information to estimate the roll and pitch angles of the acoustic camera. The estimated quantities can then be employed in a 3D mapping process. Experiments show the validity and effectiveness of the proposed method.

  • Three-dimensional Underwater Environment Reconstruction with Graph Optimization Using Acoustic Camera
    2019 IEEE SICE International Symposium on System Integration (SII), 2019
    Co-Authors: Yusheng Wang, Yusuke Tamura, Atsushi Yamashita, Yonghoon Ji, Hajime Asama
    Abstract:

    In this paper, a three-dimensional (3D) environment reconstruction framework based on graph optimization is proposed that uses acoustic images captured in an underwater environment. Underwater tasks such as unmanned construction using robots are becoming increasingly important. In recent years, acoustic cameras, which are forward-looking imaging sonars, have become common in underwater inspection. However, the loss of elevation-angle information makes it difficult to fully understand underwater environments. To cope with this, we apply a 3D occupancy mapping method in which the acoustic camera rotates around its acoustic axis to generate 3D local maps. From the local maps and a graph optimization scheme, we then minimize the error of the camera poses and build a global map. Experimental results demonstrate that our 3D mapping framework for the acoustic camera can reconstruct dense 3D models of underwater targets robustly and precisely.

  • Acoustic Image Simulator Based on Active Sonar Model in Underwater Environment
    International Conference on Ubiquitous Robots and Ambient Intelligence, 2018
    Co-Authors: Yonghoon Ji, Yusuke Tamura, Atsushi Yamashita, Hajime Asama
    Abstract:

    Underwater tasks such as maintenance, inspection, target recognition, and simultaneous localization and mapping (SLAM) require accurate underwater information. Acoustic cameras are outstanding sensors for acquiring underwater information because, even in turbid water, they can provide acoustic images with more accurate detail than other sensors. In this paper, we propose a novel acoustic image simulator based on an active sonar model that analyzes the relationship between signal processing and the image display mechanism, which has not yet been clarified. The results demonstrate that our proposed simulator can successfully generate realistic virtual acoustic images from arbitrary viewpoints.

  • 3D Occupancy Mapping Framework Based on Acoustic Camera in Underwater Environment
    IFAC-PapersOnLine, 2018
    Co-Authors: Yusheng Wang, Yusuke Tamura, Atsushi Yamashita, Yonghoon Ji, Hajime Asama
    Abstract:

    In this paper, we present a novel probabilistic three-dimensional (3D) mapping framework that uses acoustic images captured in an underwater environment. The acoustic camera is a forward-looking imaging sonar that has recently become common in underwater inspection; however, the loss of elevation-angle information makes it difficult to fully understand the underwater environment. To cope with this, we apply a probabilistic occupancy mapping framework with a novel inverse sensor model suitable for the acoustic camera in order to reconstruct the underwater environment in a volumetric representation. Simulations and experimental results demonstrate that our mapping framework for the acoustic camera can successfully reconstruct dense 3D models of underwater targets.