The experts below are selected from a list of 6,468 experts worldwide, ranked by the ideXlab platform.
Hongyan Liao - One of the best experts on this subject based on the ideXlab platform.
- Improving the Performance of GIS Polygon Overlay Computation with MapReduce for Spatial Big Data Processing
Cluster Computing, 2015. Co-authors: Yong Wang, Zhenling Liu, Hongyan Liao, Chengjun Li.
Abstract: Polygon overlay, one of the core operations in Geographic Information System (GIS) spatial analysis, is time-consuming in many big data cases. This paper proposes a purpose-built MapReduce algorithm with a spatial grid index to reduce running time: the grid index cuts the number of intersection computations that need to be invoked. Experiments on a self-built Hadoop cloud framework show that the algorithm with the grid index takes less time than the same algorithm without a spatial index, and that its speed-up ratio rises as more Hadoop nodes are used, although the rise slows as the number of nodes grows.
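The abstract above states the key idea only at a high level: a grid index lets the job skip intersection tests between polygons whose grid cells never coincide. The single-process sketch below illustrates that filtering step under stated assumptions; the cell size, the data layout, and the shapely-based geometry calls are illustrative choices, not the paper's actual Hadoop implementation.

# Minimal single-process sketch of the grid-index idea: a map phase
# bins each polygon's bounding box into grid cells, and a reduce phase
# runs pairwise intersection tests only inside each cell.
# Requires shapely (pip install shapely). CELL is a tuning parameter
# assumed here for illustration; the paper does not specify one.
from collections import defaultdict
from itertools import combinations
from shapely.geometry import Polygon

CELL = 10.0  # grid cell size (illustrative)

def map_to_cells(pid, poly):
    """Emit (cell, (pid, poly)) for every grid cell the bbox overlaps."""
    minx, miny, maxx, maxy = poly.bounds
    for gx in range(int(minx // CELL), int(maxx // CELL) + 1):
        for gy in range(int(miny // CELL), int(maxy // CELL) + 1):
            yield (gx, gy), (pid, poly)

def reduce_cell(entries, seen):
    """Intersect only polygon pairs that share a cell; 'seen' skips
    pairs already reported from another shared cell."""
    for (ida, a), (idb, b) in combinations(entries, 2):
        pair = tuple(sorted((ida, idb)))
        if pair in seen or not a.intersects(b):
            continue
        seen.add(pair)
        yield pair, a.intersection(b)

if __name__ == "__main__":
    layer = {
        "p1": Polygon([(0, 0), (8, 0), (8, 8), (0, 8)]),
        "p2": Polygon([(5, 5), (14, 5), (14, 14), (5, 14)]),
        "p3": Polygon([(30, 30), (35, 30), (35, 35), (30, 35)]),
    }
    cells = defaultdict(list)                 # stands in for the shuffle stage
    for pid, poly in layer.items():
        for cell, rec in map_to_cells(pid, poly):
            cells[cell].append(rec)
    seen = set()
    for entries in cells.values():
        for pair, geom in reduce_cell(entries, seen):
            print(pair, round(geom.area, 2))  # only p1/p2 are ever tested

One caveat the sketch glosses over: real reducers share no state, so the shared seen set is a single-process shortcut. Distributed implementations typically deduplicate per cell instead, for example by reporting a pair only from the one cell that contains a canonical reference point of the two bounding boxes' overlap.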
Yong Wang - One of the best experts on this subject based on the ideXlab platform.
- Improving the Performance of GIS Polygon Overlay Computation with MapReduce for Spatial Big Data Processing
Cluster Computing, 2015. Co-authors: Yong Wang, Zhenling Liu, Hongyan Liao, Chengjun Li. (Abstract as listed under Hongyan Liao above.)
Chengjun Li - One of the best experts on this subject based on the ideXlab platform.
- Improving the Performance of GIS Polygon Overlay Computation with MapReduce for Spatial Big Data Processing
Cluster Computing, 2015. Co-authors: Yong Wang, Zhenling Liu, Hongyan Liao, Chengjun Li. (Abstract as listed under Hongyan Liao above.)
Zhenling Liu - One of the best experts on this subject based on the ideXlab platform.
- Improving the Performance of GIS Polygon Overlay Computation with MapReduce for Spatial Big Data Processing
Cluster Computing, 2015. Co-authors: Yong Wang, Zhenling Liu, Hongyan Liao, Chengjun Li. (Abstract as listed under Hongyan Liao above.)
Shanti Swarup Medasani - One of the best experts on this subject based on the ideXlab platform.
- High Resolution Satellite Image Processing Using the Hadoop Framework
International Conference on Cloud Computing, 2015. Co-authors: Roshan Rajak, Deepu Raveendran, Shanti Swarup Medasani.
Abstract: Complex image processing algorithms that demand high computational power over large-scale inputs can be run efficiently on the parallel, distributed Hadoop MapReduce framework. Hadoop MapReduce is a scalable model capable of processing petabytes (on the order of 10^15 bytes) of data with built-in fault tolerance and data parallelism. This paper presents a MapReduce framework for parallel processing of remote sensing satellite data on Hadoop, with the output stored in HBase. Speed-up and performance results show that Hadoop can distribute the workload across clusters and exploit the combined processing power of commodity hardware.
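As a rough illustration of the per-tile parallelism this abstract describes, the sketch below processes synthetic tiles independently in map/reduce style and shows where an HBase write would go. The NDVI computation, tile format, table name, and column family are assumptions for illustration only; the paper does not specify its algorithms, and the happybase call is commented out so the sketch runs without a cluster.

# Minimal sketch of per-tile satellite processing in MapReduce style,
# assuming each record is (tile_id, red_band, nir_band). Requires numpy.
# import happybase  # only needed for the real HBase write below
import numpy as np

def map_tile(tile_id, red, nir):
    """Mapper: compute NDVI for one tile independently of all others,
    which is what makes the job embarrassingly parallel on Hadoop."""
    ndvi = (nir - red) / np.clip(nir + red, 1e-6, None)
    return tile_id, ndvi

def reduce_store(tile_id, ndvi, table=None):
    """Reducer: persist one processed tile. With an HBase table handle,
    write via happybase; otherwise just report a summary statistic."""
    if table is not None:
        # assumes an HBase table with a 'd' column family (illustrative)
        table.put(tile_id.encode(), {b"d:ndvi": ndvi.astype(np.float32).tobytes()})
    return tile_id, float(ndvi.mean())

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    tiles = {f"tile-{i}": (rng.random((256, 256)), rng.random((256, 256)))
             for i in range(4)}                       # synthetic stand-in tiles
    # table = happybase.Connection("hbase-host").table("ndvi")  # real cluster
    for tid, (red, nir) in tiles.items():
        tid, ndvi = map_tile(tid, red, nir)
        print(reduce_store(tid, ndvi))                # (tile_id, mean NDVI)

Because each tile is processed with no cross-tile dependencies, the map phase scales with the number of Hadoop nodes, which is consistent with the speed-up behavior the abstract reports.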