DomainDesc: Learning Local Descriptors with Domain Adaptation
Rongtao Xu; Changwei Wang; Bin Fan; Yuyang Zhang; Shibiao Xu; Weiliang Meng; Xiaopeng Zhang
2022-05
Conference Date | May 22-27, 2022
Conference Venue | Virtual
Abstract | Robust and efficient local descriptors are crucial in a wide range of applications. In this paper, we propose DomainDesc, a novel descriptor made as domain-invariant as possible by learning local Descriptors with Domain adaptation. We design a feature-level domain adaptation loss that improves the robustness of DomainDesc by penalizing inconsistent high-level feature distributions across different images, and a pixel-level cross-domain consistency loss that compensates for inconsistency between the descriptors of corresponding keypoints at the pixel level. In addition, we adopt a new architecture so that the descriptor retains as much information as possible, and combine a triplet loss with the cross-domain consistency loss for descriptor supervision to ensure the discriminative ability of our descriptor. Finally, we present a cross-domain dataset generation strategy to quickly construct training datasets for diverse domains, adapting to complex application scenarios. Experiments validate that DomainDesc achieves state-of-the-art performance on the HPatches image matching benchmark and the Aachen Day-Night localization benchmark.
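The descriptor supervision described in the abstract combines a triplet loss with a cross-domain consistency term. A minimal sketch of such a combined objective is below; the paper's exact formulation is not given here, so the specific distance measures, the mean-squared-error consistency term, and the weighting factor `lam` are assumptions for illustration:

```python
import numpy as np

def triplet_loss(anchor, positive, negative, margin=1.0):
    """Standard margin-based triplet loss on descriptor vectors:
    pull the positive closer to the anchor than the negative by `margin`."""
    d_pos = np.linalg.norm(anchor - positive)
    d_neg = np.linalg.norm(anchor - negative)
    return float(max(d_pos - d_neg + margin, 0.0))

def cross_domain_consistency_loss(desc_a, desc_b):
    """Penalize mismatch between descriptors of the same keypoint observed
    in two domains (a simplified stand-in for the pixel-level term)."""
    return float(np.mean((desc_a - desc_b) ** 2))

def descriptor_loss(anchor, positive, negative, desc_a, desc_b, lam=0.5):
    """Combined supervision: triplet term plus a weighted consistency term.
    The weight `lam` is a hypothetical hyperparameter."""
    return (triplet_loss(anchor, positive, negative)
            + lam * cross_domain_consistency_loss(desc_a, desc_b))
```

For example, with an anchor-positive pair that already matches and a distant negative, the triplet term vanishes and only the consistency penalty remains.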
Content Type | Conference Paper
Source URL | [http://ir.ia.ac.cn/handle/173211/47436]
Subject Area | National Laboratory of Pattern Recognition (NLPR), 3D Visual Computing
Corresponding Authors | Shibiao Xu; Weiliang Meng
Author Affiliations | 1. NLPR, Institute of Automation, Chinese Academy of Sciences; 2. Zhejiang Lab; 3. School of Automation and Electrical Engineering, University of Science and Technology Beijing; 4. School of Artificial Intelligence, University of Chinese Academy of Sciences; 5. School of Artificial Intelligence, Beijing University of Posts and Telecommunications
Recommended Citation (GB/T 7714) | Rongtao Xu, Changwei Wang, Bin Fan, et al. DomainDesc: Learning Local Descriptors with Domain Adaptation[C]. Virtual, May 22-27, 2022.