A Subject-Sensitive Perceptual Hash Based on MUM-Net for the Integrity Authentication of High Resolution Remote Sensing Images
Ding, Kaimeng [1,2]; Liu, Yueming [2]; Xu, Qin [1]; Lu, Fuqiang [3]
Journal | ISPRS INTERNATIONAL JOURNAL OF GEO-INFORMATION
Publication Date | 2020-08-01
Volume | 9
Issue | 8
Pages | 28
Keywords | perceptual hash ; integrity authentication ; subject-sensitive ; HRRS image ; deep learning
DOI | 10.3390/ijgi9080485 |
Corresponding Author | Ding, Kaimeng (dkm@jit.edu.cn) ; Xu, Qin (missxuqin@jit.edu.cn)
Abstract | Data security technology is of great significance to the application of high resolution remote sensing (HRRS) images. As an important data security technology, perceptual hashing overcomes the lack of robustness of cryptographic hashing and can achieve integrity authentication of HRRS images based on their perceptual content. However, existing perceptual hash algorithms do not take into account whether the user focuses on certain types of information in the HRRS image. In this paper, we introduce the concept of a subject-sensitive perceptual hash, which can be seen as a special case of the conventional perceptual hash, for the integrity authentication of HRRS images. To achieve subject-sensitive perceptual hashing, we propose a new deep convolutional neural network architecture, named MUM-Net, for extracting robust features from HRRS images. MUM-Net is the core of the perceptual hash algorithm, and it uses focal loss as its loss function to overcome the imbalance between positive and negative training samples. The robust features extracted by MUM-Net are further compressed and encoded to obtain the perceptual hash sequence of the HRRS image. Experiments show that our algorithm has higher tamper sensitivity to subject-related malicious tampering, and its robustness is improved by about 10% compared to the existing U-Net-based algorithm; compared to other deep learning-based algorithms, it achieves a better balance between robustness and tampering sensitivity and has better overall performance.
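The abstract outlines a three-stage pipeline: MUM-Net extracts robust features from the image, the features are compressed and encoded into a compact hash sequence, and integrity is verified by comparing the hash of a received image against the stored one. The Python sketch below illustrates only this general pattern; `extract_features` is a hypothetical stand-in for MUM-Net inference (the record does not describe the architecture), and the thresholding, bit-packing, and Hamming-distance check are assumed details rather than the authors' actual encoding scheme.

```python
import numpy as np

def extract_features(image: np.ndarray) -> np.ndarray:
    """Hypothetical stand-in for MUM-Net inference. The real model is a
    trained deep CNN; here we just return 16x16-block means so the
    sketch stays runnable without the network."""
    gray = image.mean(axis=2) if image.ndim == 3 else image
    bh, bw = gray.shape[0] // 16, gray.shape[1] // 16
    return gray[: bh * 16, : bw * 16].reshape(bh, 16, bw, 16).mean(axis=(1, 3))

def perceptual_hash(image: np.ndarray) -> bytes:
    """Compress and encode: binarize each feature against the global
    mean, then pack the bits into a byte string (the hash sequence)."""
    features = extract_features(image)
    bits = features > features.mean()
    return np.packbits(bits.flatten()).tobytes()

def authenticate(stored: bytes, received: bytes, threshold: int = 8) -> bool:
    """Integrity authentication: accept when the Hamming distance between
    the two hash sequences stays within a robustness threshold, so
    content-preserving operations pass while malicious tampering is flagged."""
    a = np.frombuffer(stored, dtype=np.uint8)
    b = np.frombuffer(received, dtype=np.uint8)
    return int(np.unpackbits(a ^ b).sum()) <= threshold
```

The abstract also credits focal loss with countering the imbalance between positive and negative training samples. The standard formulation (Lin et al.) is FL(p_t) = -alpha_t * (1 - p_t)^gamma * log(p_t); below is a minimal NumPy version using the common defaults alpha = 0.25 and gamma = 2, which are assumptions, since the record does not state the authors' settings.

```python
import numpy as np

def binary_focal_loss(p: np.ndarray, y: np.ndarray,
                      alpha: float = 0.25, gamma: float = 2.0) -> float:
    """Binary focal loss: -alpha_t * (1 - p_t)^gamma * log(p_t).
    The (1 - p_t)^gamma factor down-weights easy examples so abundant
    negative pixels do not dominate the training signal."""
    eps = 1e-7
    p_t = np.where(y == 1, p, 1.0 - p)              # probability of the true class
    alpha_t = np.where(y == 1, alpha, 1.0 - alpha)  # class-balancing weight
    return float(np.mean(-alpha_t * (1.0 - p_t) ** gamma * np.log(p_t + eps)))
```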
Funding Project | National Natural Science Foundation of China[41801303] ; National Natural Science Foundation of China[41901323] ; Jiangsu Province Science and Technology Support Program[BK20170116] ; Scientific Research Hatch Fund of Jinling Institute of Technology[jit-fhxm-201604] ; Scientific Research Hatch Fund of Jinling Institute of Technology[jit-b-201520] ; Scientific Research Hatch Fund of Jinling Institute of Technology[jit-b-201645] ; Qing Lan Project
WOS Keywords | PERFORMANCE EVALUATION ; DIGITAL SIGNATURE ; U-NET ; CLASSIFICATION ; SEGMENTATION ; SECURE
WOS Research Area | Physical Geography ; Remote Sensing
Language | English
Publisher | MDPI
WOS Record No. | WOS:000565070900001
Funding Organization | National Natural Science Foundation of China ; Jiangsu Province Science and Technology Support Program ; Scientific Research Hatch Fund of Jinling Institute of Technology ; Qing Lan Project
Content Type | Journal article
Source URL | http://ir.igsnrr.ac.cn/handle/311030/157916
Collection | Institute of Geographic Sciences and Natural Resources Research, Chinese Academy of Sciences
Author Affiliations |
1. Jinling Inst Technol, Sch Networks & Telecommun Engn, Nanjing 211169, Peoples R China
2. Chinese Acad Sci, Inst Geog Sci & Nat Resources Res, State Key Lab Resource & Environm Informat Syst, Beijing 100101, Peoples R China
3. Changzhou Inst Technol, Sch Comp Sci & Informat Engn, Changzhou 213022, Peoples R China
Recommended Citation (GB/T 7714) | Ding, Kaimeng, Liu, Yueming, Xu, Qin, et al. A Subject-Sensitive Perceptual Hash Based on MUM-Net for the Integrity Authentication of High Resolution Remote Sensing Images[J]. ISPRS INTERNATIONAL JOURNAL OF GEO-INFORMATION, 2020, 9(8): 28.
APA | Ding, Kaimeng, Liu, Yueming, Xu, Qin, & Lu, Fuqiang. (2020). A Subject-Sensitive Perceptual Hash Based on MUM-Net for the Integrity Authentication of High Resolution Remote Sensing Images. ISPRS INTERNATIONAL JOURNAL OF GEO-INFORMATION, 9(8), 28.
MLA | Ding, Kaimeng, et al. "A Subject-Sensitive Perceptual Hash Based on MUM-Net for the Integrity Authentication of High Resolution Remote Sensing Images". ISPRS INTERNATIONAL JOURNAL OF GEO-INFORMATION 9.8 (2020): 28.