TLDR: This paper introduces GraDE, a diffusion-guided search framework for discovering frequent subgraph patterns in neural architectures. By using graph diffusion models to score how typical a subgraph is under the learned distribution, GraDE balances computational efficiency with discovery capability, outperforming sampling-based methods in ranking accuracy and enabling the identification of larger-scale frequent patterns.
Feb 03, 2026
GraDE: A Graph Diffusion Estimator for Frequent Subgraph Discovery in Neural Architectures
Authors:
Yikang Yang, Zhengxin Yang*, Minghao Luo, Luzhou Peng, Hongxiao Li, Wanling Gao, Lei Wang, Jianfeng Zhan
Corresponding:
Zhengxin Yang*
Published at:
Preprint
Abstract:
Finding frequently occurring subgraph patterns, or network motifs, in neural architectures is crucial for optimizing efficiency, accelerating design, and uncovering structural insights. However, as subgraph size increases, enumeration-based methods remain exact but become computationally prohibitive, while sampling-based methods stay tractable but suffer a severe decline in discovery capability. To address these challenges, this paper proposes GraDE, a diffusion-guided search framework that ensures both computational feasibility and discovery capability. The key innovation is the Graph Diffusion Estimator (GraDE), which is the first to introduce graph diffusion models for identifying frequent subgraphs by scoring their typicality within the learned distribution. Comprehensive experiments demonstrate that the estimator achieves superior ranking accuracy, with up to 114% improvement over sampling-based baselines. Benefiting from this, the proposed framework discovers large-scale frequent patterns, achieving up to 30 times higher median frequency than sampling-based methods.
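To make the scoring-and-ranking idea from the abstract concrete, below is a minimal, hypothetical sketch. It is not the paper's implementation: `typicality_score` is a toy edge-density stand-in for the learned Graph Diffusion Estimator, the function names are invented for illustration, the `networkx` library is assumed, and candidate enumeration is used only because the example graphs are tiny (the paper's framework exists precisely to avoid exhaustive enumeration at larger subgraph sizes).

```python
# Minimal, hypothetical sketch: score candidate subgraphs by "typicality"
# and keep the top-ranked ones as frequent-pattern candidates.
# NOT the paper's implementation: typicality_score is a toy stand-in
# for the learned Graph Diffusion Estimator.
from itertools import combinations
import networkx as nx


def typicality_score(subgraph: nx.DiGraph) -> float:
    """Placeholder for GraDE's diffusion-based typicality estimate.

    The real estimator scores how typical a subgraph is under a graph
    diffusion model trained on an architecture corpus; here we use a
    simple edge-density proxy purely so the example runs.
    """
    n = subgraph.number_of_nodes()
    m = subgraph.number_of_edges()
    return m / max(n * (n - 1), 1)


def connected_subgraphs(arch: nx.DiGraph, size: int):
    """Yield weakly connected node-induced subgraphs of a given size.

    Enumeration is used only because the example graphs are tiny; the
    paper's framework is designed to avoid exhaustive enumeration.
    """
    for nodes in combinations(arch.nodes, size):
        candidate = arch.subgraph(nodes)
        if nx.is_weakly_connected(candidate):
            yield candidate


def rank_by_typicality(architectures, size: int, top_k: int = 5):
    """Score candidates from all architectures and return the top_k."""
    scored = [
        (typicality_score(sg), sg)
        for arch in architectures
        for sg in connected_subgraphs(arch, size)
    ]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return scored[:top_k]


if __name__ == "__main__":
    # Two toy "architectures" as directed graphs (nodes stand for layers).
    g1 = nx.DiGraph([(0, 1), (1, 2), (0, 2), (2, 3)])
    g2 = nx.DiGraph([(0, 1), (1, 2), (0, 2), (1, 3)])
    for score, sg in rank_by_typicality([g1, g2], size=3):
        print(round(score, 3), sorted(sg.edges))
```

In the full framework described by the abstract, the top-ranked candidates would then have their actual frequency verified, which is the step where the diffusion-based ranking pays off by avoiding exhaustive search over large subgraphs.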