Second-Order Global Attention Networks for Graph Classification and Regression
Hu Fenyu [2,3]; Cui Zeyu [1]; Wu Shu [2,3]; Liu Qiang [2,3]; Wu Jinlin [2,3]; Wang Liang [2,3]; Tan Tieniu [2,3]
2022-08
Conference date: August 27-28, 2022
Conference location: Beijing, China
Abstract

Graph Neural Networks (GNNs) are powerful for learning representations of graph-structured data that fuse both attributive and topological information. Prior research has investigated the expressive power of GNNs by comparing them with the Weisfeiler-Lehman algorithm. Despite achieving promising performance on the isomorphism test, existing methods impose an overly restrictive requirement, which may hinder performance on other graph-level tasks, e.g., graph classification and graph regression. In this paper, we argue for adaptively emphasizing important information. We propose a novel global attention module that operates at two levels: the channel level and the node level. Specifically, we exploit second-order channel correlation to extract more discriminative representations. We validate the effectiveness of the proposed approach through extensive experiments on eight benchmark datasets. The proposed method outperforms other state-of-the-art methods on graph classification and graph regression tasks. Notably, it achieves a 2.7% improvement on the DD dataset for graph classification and a 7.1% absolute improvement on the ZINC dataset for graph regression.
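The abstract only sketches the module. Below is a minimal, illustrative sketch (not the authors' implementation) of a two-level global attention readout, assuming the channel branch derives per-channel gates from a second-order (covariance-style) correlation of node features and the node branch applies a softmax attention over nodes. The gating weights W_ch, the scoring vector w_node, and the mean-based covariance summary are assumptions made for illustration only.

```python
# Illustrative sketch of a two-level (channel + node) global attention readout.
# NOT the authors' code; shapes and the readout formula are assumptions.
import numpy as np


def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))


def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()


def second_order_global_attention_readout(H, W_ch, w_node):
    """Graph-level readout from node features H (num_nodes x d).

    W_ch   : (d, d) weights mapping channel statistics to channel gates (assumed).
    w_node : (d,)   scoring vector for node-level attention (assumed).
    """
    # Channel level: second-order correlation between feature channels.
    H_centered = H - H.mean(axis=0, keepdims=True)
    cov = H_centered.T @ H_centered / H.shape[0]   # (d, d) channel covariance
    channel_stat = cov.mean(axis=1)                # summarize each channel's correlations
    channel_gate = sigmoid(channel_stat @ W_ch)    # (d,) per-channel weights
    H_ch = H * channel_gate                        # re-weight channels

    # Node level: softmax attention over nodes, weighted sum as graph embedding.
    node_scores = softmax(H_ch @ w_node)           # (num_nodes,)
    return node_scores @ H_ch                      # (d,) graph representation


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    H = rng.normal(size=(5, 8))                    # 5 nodes, 8 feature channels
    g = second_order_global_attention_readout(
        H, rng.normal(size=(8, 8)) * 0.1, rng.normal(size=8) * 0.1
    )
    print(g.shape)                                 # (8,)
```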

Content type: Conference paper
Source URL: http://ir.ia.ac.cn/handle/173211/52324
Collection: Institute of Automation, Center for Research on Intelligent Perception and Computing
Corresponding author: Wu Shu
Author affiliations:
1. DAMO Academy, Alibaba Group, Hangzhou, China
2. University of Chinese Academy of Sciences, Beijing, China
3. National Laboratory of Pattern Recognition, Institute of Automation, Chinese Academy of Sciences, Beijing, China
Recommended citation (GB/T 7714):
Hu Fenyu, Cui Zeyu, Wu Shu, et al. Second-Order Global Attention Networks for Graph Classification and Regression[C]. Beijing, China, August 27-28, 2022.