Abstract
House buildings are the main venue of human activity, so extracting them rapidly and accurately from high-resolution remote sensing imagery is of great significance for applying remote sensing information to disaster prevention and mitigation, town management, and related fields. On the basis of deep learning, this paper proposes a pixel-level accurate extraction method for house buildings in high-resolution remote sensing images. First, to address the lack of pixel features at the edges of sample images, an IEU-Net model is proposed on the basis of the U-Net model: a new ignore-edges cross-entropy function, IELoss, is designed and used as the loss function, and Dropout and BN layers are added to avoid overfitting while improving the speed and robustness of model training. Second, to mitigate the limited feature richness of the model, the Morphological Building Index (MBI) is introduced and fed into the classification process together with the RGB bands of the imagery. Finally, an ignore-edges prediction strategy corresponding to IELoss is adopted at prediction time to obtain the best building extraction results. Comparative experiments show that the proposed method effectively overcomes the insufficiency of edge pixel features, suppresses the influence of roads and building shadows on the results, and improves the extraction accuracy of house buildings in high-resolution remote sensing imagery.
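The paper's source code is not given in the abstract. As a rough illustration of the architecture described above, the following is a minimal sketch of one encoder block of a U-Net-style network with the added Dropout and BN layers, written against the Keras API; the filter count, kernel size, and dropout rate are assumptions for illustration, not values taken from the paper.

```python
# Minimal sketch of one IEU-Net-style encoder block (hypothetical
# hyperparameters): two 3x3 convolutions, each followed by Batch
# Normalization and ReLU, plus Dropout, as described in the abstract.
from tensorflow.keras import layers

def encoder_block(x, filters, dropout_rate=0.5):
    """Conv -> BN -> ReLU, twice, then Dropout; returns (skip, downsampled)."""
    x = layers.Conv2D(filters, 3, padding="same")(x)
    x = layers.BatchNormalization()(x)
    x = layers.Activation("relu")(x)
    x = layers.Conv2D(filters, 3, padding="same")(x)
    x = layers.BatchNormalization()(x)
    x = layers.Activation("relu")(x)
    x = layers.Dropout(dropout_rate)(x)
    skip = x                          # skip connection to the decoder side
    x = layers.MaxPooling2D(2)(x)     # downsample for the next block
    return skip, x
```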
House buildings are the main venue of human activity, so extracting them rapidly and accurately from high-resolution remote sensing images is of great significance for applying remote sensing information to disaster prevention and mitigation and to town management. On the basis of deep learning, this paper proposes a pixel-level accurate extraction method for house buildings from high-resolution remote sensing images. First, to address the lack of pixel features at the edges of sample images, an IEU-Net model is proposed on the basis of the U-Net model: a new ignore-edges categorical cross entropy function, IELoss, is designed as the loss function, and Dropout and BN layers are added to avoid overfitting while improving the speed and robustness of model training. Second, to solve the problem of the limited feature richness of the model, the Morphological Building Index (MBI) is introduced into the classification process together with the RGB bands of the imagery. Lastly, an ignore-edges prediction strategy corresponding to IELoss is adopted during model prediction to obtain the best building extraction results.

The remote sensing data used in this study are 0.8 m true-color and infrared-band images obtained by multispectral and panchromatic fusion from the GF-2 satellite over the Yushu Tibetan Autonomous Prefecture of Qinghai Province. Label images are produced by visually interpreting the imagery in ArcGIS and converting the resulting vector file to raster, yielding a ground-truth map of house building locations. The MBI and RGB bands are fused, and the images and corresponding labels are randomly cropped into 500 training tiles and 100 validation tiles of 256×256 pixels.

With the r value of the IEU-Net model set to 0.5, the training data are fed into the model, and the Adam optimization algorithm is used for backpropagation over 100 iterations to obtain the optimal parameters. The resulting Overall Accuracy (OA) is 91.86%, and the kappa value is 0.802. To verify the effectiveness of IELoss and the ignore-edges prediction method relative to CELoss and ordinary prediction in solving the problem of insufficient edge pixel features, comparative experiments are conducted with r values of 1.0, 0.9, 0.8, 0.7, and 0.6 (when r is 1.0, IELoss is equivalent to CELoss, and ignore-edges prediction is equivalent to ordinary prediction). The results show that the OA of the model using IELoss and ignore-edges prediction is 5.03% higher than that of the model using CELoss and ordinary prediction, and the kappa value is 0.165 higher. To verify the effectiveness of adding MBI to the training data, a comparative experiment is performed with RGB three-band data alone as the training set: the OA obtained with the MBI-augmented data set is 1.55% higher than that of the RGB-only data set, and the kappa value increases by 0.009.

The experimental results show that the IEU-Net model effectively solves the problem of insufficient edge pixel features and achieves high OA and kappa values for house building extraction. The addition of MBI data can overcome the influence of roads and house building shadows to a certain extent and accurately extract the edge information of house buildings.
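The exact formulation of IELoss appears in the paper body rather than the abstract. The sketch below shows one plausible reading, assuming r is the side-length fraction of the central window of each tile whose pixels contribute to the loss; with r = 1.0 this reduces to ordinary categorical cross entropy, consistent with the abstract's statement that IELoss is then equivalent to CELoss.

```python
# A minimal sketch of an ignore-edges categorical cross entropy (IELoss),
# under the assumption stated above. Only the central r*tile x r*tile
# window of each 256x256 tile contributes to the loss; border pixels,
# whose receptive fields fall partly outside the tile, are masked out.
import numpy as np
import tensorflow as tf

def make_ieloss(r=0.5, tile_size=256):
    keep = int(tile_size * r)              # side length of the kept window
    lo = (tile_size - keep) // 2
    mask = np.zeros((tile_size, tile_size), dtype="float32")
    mask[lo:lo + keep, lo:lo + keep] = 1.0  # 1 inside the window, 0 on edges
    mask = tf.constant(mask)

    def ieloss(y_true, y_pred):
        # per-pixel categorical cross entropy, shape (batch, H, W)
        ce = tf.keras.losses.categorical_crossentropy(y_true, y_pred)
        weights = tf.broadcast_to(mask, tf.shape(ce))
        # average only over central pixels; edge pixels are ignored
        return tf.reduce_sum(ce * weights) / tf.reduce_sum(weights)

    return ieloss

# Usage: model.compile(optimizer="adam", loss=make_ieloss(r=0.5))
```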
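MBI itself is a published index (the differential morphological profile of white top-hats of the brightness image over multiple directions and scales). The sketch below is a deliberately simplified version, using plain grey-scale openings instead of opening by reconstruction, with assumed scale and direction parameters, purely to show how the extra band fed to the network could be produced; it is not the paper's exact computation.

```python
# Simplified Morphological Building Index (MBI) sketch. Assumptions:
# plain grey_opening approximates opening by reconstruction, and the
# line lengths and four directions below are illustrative choices.
import numpy as np
from scipy.ndimage import grey_opening

def line_footprint(length, direction):
    """Line structuring element at 0, 45, 90, or 135 degrees."""
    if direction == 0:
        return np.ones((1, length), dtype=bool)    # horizontal
    if direction == 90:
        return np.ones((length, 1), dtype=bool)    # vertical
    eye = np.eye(length, dtype=bool)
    return np.fliplr(eye) if direction == 45 else eye  # diagonals

def mbi(rgb, lengths=(5, 9, 13, 17, 21), directions=(0, 45, 90, 135)):
    """Mean differential white top-hat profile of the brightness image."""
    brightness = rgb.max(axis=2).astype("float64")  # max over the bands
    dmp = []
    for d in directions:
        wth = [brightness - grey_opening(brightness,
                                         footprint=line_footprint(s, d))
               for s in lengths]
        # differential profile between successive scales
        dmp += [np.abs(wth[i + 1] - wth[i]) for i in range(len(wth) - 1)]
    return np.mean(dmp, axis=0)
```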
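The ignore-edges prediction strategy is likewise only named in the abstract. A natural reading, mirroring IELoss, is to slide a tile-sized window over the full image with a stride equal to the central window size and keep only the central block of each tile's prediction; the sketch below implements that assumed strategy, where model is any Keras-style segmentation model.

```python
# Assumed ignore-edges prediction: overlapping tiles, keeping only the
# central r*tile x r*tile block of each prediction (the region that
# IELoss was trained on). Stride equals the kept-window size so the
# central blocks exactly tile the image.
import numpy as np

def ignore_edge_predict(model, image, tile=256, r=0.5):
    keep = int(tile * r)
    lo = (tile - keep) // 2
    h, w, _ = image.shape
    # pad so every central window lies inside the padded image
    padded = np.pad(image, ((lo, tile), (lo, tile), (0, 0)), mode="reflect")
    out = np.zeros((h + tile, w + tile), dtype="uint8")
    for i in range(0, h, keep):
        for j in range(0, w, keep):
            patch = padded[i:i + tile, j:j + tile]
            prob = model.predict(patch[None])[0]     # (tile, tile, classes)
            label = prob.argmax(axis=-1)
            # keep only the central block of this tile's prediction
            out[i:i + keep, j:j + keep] = label[lo:lo + keep, lo:lo + keep]
    return out[:h, :w]
```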
Therefore, if significant vegetation exists near the house buildings in the area of interest, the Normalized Difference Vegetation Index (NDVI) can similarly be added to the training set to reduce the impact of the vegetation.
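Since the GF-2 data described above include an infrared band alongside the true-color bands, NDVI can be computed directly from the fused imagery; a minimal sketch, assuming nir and red are arrays of the corresponding bands:

```python
# NDVI = (NIR - Red) / (NIR + Red); a small epsilon avoids division
# by zero over dark pixels. Band arrays are assumed to be co-registered.
import numpy as np

def ndvi(nir, red):
    nir = nir.astype("float64")
    red = red.astype("float64")
    return (nir - red) / (nir + red + 1e-8)
```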