Paper
UGCNet: An Unsupervised Semantic Segmentation Network Embedded With Geometry Consistency for Remote-Sensing Images
Release time: 2022-11-06
Impact Factor: 5.343
DOI: 10.1109/LGRS.2021.3129776
Affiliation of Author(s): Beihang University, Image Processing Center, School of Astronautics
Journal: IEEE Geoscience and Remote Sensing Letters
Key Words: Image segmentation; Semantics; Training; Adaptation models; Remote sensing; Geometry; Decoding; Generative-adversarial learning; geometry consistency (GC); remote-sensing images (RSIs); semantic segmentation; unsupervised
Abstract: In remote-sensing image (RSI) semantic segmentation, dependence on large-scale, pixel-level annotated data has been a critical factor restricting progress. In this letter, we propose an unsupervised semantic segmentation network embedded with geometry consistency (UGCNet) for RSIs, which incorporates a generative-adversarial learning strategy into a semantic segmentation network. The proposed UGCNet can be trained on a source-domain dataset and achieve accurate segmentation results on a different target-domain dataset. Furthermore, to refine the geometric representation of remote-sensing targets such as densely distributed buildings, we propose a geometry-consistency (GC) constraint that can be embedded in both the image-domain adaptation process and the semantic segmentation network. Our model can therefore perform cross-domain semantic segmentation while preserving the geometric properties of targets. Experimental results on the Massachusetts and Inria buildings datasets show that the unsupervised UGCNet achieves segmentation accuracy comparable to that of a fully supervised model, which validates the effectiveness of the proposed method.
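The geometry-consistency idea in the abstract can be illustrated with a minimal sketch: a segmentation applied after an invertible geometric transform should agree with the same transform applied to the segmentation of the original image. All names below are illustrative, and the toy threshold "segmenter" stands in for the paper's learned network; this is not the authors' implementation.

```python
import numpy as np

def segment(image):
    # Stand-in segmenter (hypothetical): thresholds intensity into a
    # binary building mask. The paper uses a trained network instead.
    return (image > image.mean()).astype(np.float32)

def rotate90(x):
    # A simple invertible geometric transform T: 90-degree rotation.
    return np.rot90(x)

def geometry_consistency_loss(image):
    # GC-style constraint: penalize disagreement between
    # segment(T(image)) and T(segment(image)).
    pred_of_transformed = segment(rotate90(image))
    transformed_pred = rotate90(segment(image))
    return float(np.mean((pred_of_transformed - transformed_pred) ** 2))
```

For this pointwise threshold segmenter the loss is exactly zero, since thresholding commutes with rotation; for a learned network it generally is not, and minimizing such a term encourages geometrically stable predictions, e.g. for densely distributed buildings.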
Indexed by: Journal paper
First-Level Discipline: Control Science and Engineering
Volume: 19
ISSN: 1545-598X
Translation or Not: no
Date of Publication: 2022-01-21
Included Journals: SCI
Link to published journal: https://ieeexplore.ieee.org/document/9623453/