Official implementation of the paper "Self-Supervised Hypergraph Training Framework via Structure-Aware Learning" published in IEEE Transactions on Pattern Analysis and Machine Intelligence (TPAMI) 2025.
Authors: Yifan Feng, Shiquan Liu, Shihui Ying, Shaoyi Du, Zongze Wu, Yue Gao.
Overview
Hypergraphs excel at modeling complex, beyond-pairwise correlations. However, integrating hypergraphs into self-supervised learning (SSL) is challenging due to high-order structural variations. SS-HT introduces:
- "Masking and ReMasking" Strategy: Enhances feature reconstruction in Hypergraph Neural Networks (HGNNs).
- Structure-Aware Learning: A metric strategy for local high-order correlation changes using Wasserstein distance.
- Strong Performance: Significant improvements in low-label settings (e.g., 32% gain on Cora-CC with 1% labels).
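The paper's exact algorithm is not reproduced here, but the two mechanisms above can be illustrated with a minimal NumPy sketch. The function names, the zero-filling of masked features, and the remask-within-mask scheme are all illustrative assumptions, not the authors' implementation; the 1-D Wasserstein distance below uses the standard sorted-samples formula for equal-sized empirical distributions.

```python
import numpy as np

def mask_and_remask(X, mask_rate=0.7, remask_rate=0.5, seed=0):
    """Illustrative masking/remasking (hypothetical sketch, not SS-HT's exact
    scheme): hide a fraction of node features before encoding, then re-hide a
    subset of the masked nodes before decoding, so the reconstruction target
    cannot be trivially copied from the input."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    mask = rng.random(n) < mask_rate          # nodes whose features are hidden
    X_masked = X.copy()
    X_masked[mask] = 0.0                      # masked nodes see zeroed features
    remask = mask & (rng.random(n) < remask_rate)  # re-hidden before decoding
    return X_masked, mask, remask

def wasserstein_1d(p, q):
    """1-D Wasserstein-1 distance between two equal-sized empirical samples:
    the mean absolute difference of their sorted values. A stand-in for the
    structure-aware distance between local high-order correlation profiles."""
    p, q = np.sort(np.asarray(p, float)), np.sort(np.asarray(q, float))
    return float(np.abs(p - q).mean())
```

For example, comparing a node's hyperedge-degree profile before and after augmentation with `wasserstein_1d` gives a scalar measure of local structural change, which is the role the structure-aware metric plays in the framework.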
Abstract
Hypergraphs, with their ability to model complex, beyond-pairwise correlations, present a significant advancement over traditional graphs for capturing intricate relational data across diverse domains. However, the integration of hypergraphs into self-supervised learning (SSL) frameworks has been hindered by the intricate nature of high-order structural variations. This paper introduces the Self-Supervised Hypergraph Training Framework via Structure-Aware Learning (SS-HT), designed to enhance the perception and measurement of these variations within hypergraphs. The SS-HT framework employs a "Masking and ReMasking" strategy to bolster feature reconstruction in Hypergraph Neural Networks (HGNNs), addressing the limitations of traditional SSL methods. It also introduces a metric strategy for local high-order correlation changes, streamlining the computational efficiency of structural distance calculations. Extensive experiments on 11 datasets demonstrate SS-HT's superior performance over existing SSL methods for both low-order and high-order data. Notably, the framework significantly reduces data labeling dependency, achieving a 32% improvement over HGNN in the downstream task fine-tuning phase under the 1% labeled data setting on the Cora-CC dataset. Ablation studies further validate SS-HT's scalability and its capacity to augment the performance of various HGNN methods, underscoring its robustness and applicability in real-world scenarios.
Project Structure
SS-HT/
├── config/            # Configuration management (YAML files)
├── data/              # Data loading, augmentation, and splitting logic
├── doc/               # Documentation assets (images)
├── models/            # Core model architectures (HGNN, losses, Wasserstein distance)
├── train/             # Training and evaluation loops
├── utils/             # Utility functions (logging, seeding)
├── main.py            # Main entry point for training and evaluation
├── requirements.txt   # Project dependencies
└── readme.md          # This file
Installation
1. Clone the Repository
git clone https://github.com/iMoonLab/SS-HT.git
cd SS-HT
2. Create Environment (Conda Recommended)
conda create -n SS-HT python=3.10
conda activate SS-HT
pip install -r requirements.txt
Usage
Training & Evaluation
To train the SS-HT model and evaluate it on node classification tasks with default settings:
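The repository's exact command-line flags are not shown here; assuming `main.py` reads `config/config.yaml` by default (as the project structure suggests), a typical run might look like:

```shell
python main.py
```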
Configuration
You can customize the experiments by modifying config/config.yaml. Key parameters include:
- `data_name`: Dataset choice (e.g., `CC-Cora`, `CC-Citeseer`, `DBLP-Paper`).
- `encoder_type`: GNN/HGNN architecture (`hgnn`, `hgnnp`, `gat`, `gcn`).
- `mask_rate`: Attribute masking ratio (default: `0.7`).
- `cl` & `attr`: Weighting factors for contrastive and reconstruction losses.
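Putting those keys together, a `config/config.yaml` might look like the following sketch; the key names come from the list above, while the loss weights shown are assumed placeholder values, not the repository's defaults:

```yaml
# Illustrative config/config.yaml fragment (values for cl/attr are assumptions)
data_name: CC-Cora      # dataset choice
encoder_type: hgnn      # encoder architecture
mask_rate: 0.7          # attribute masking ratio (documented default)
cl: 1.0                 # contrastive loss weight
attr: 1.0               # reconstruction loss weight
```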
Supported Datasets
The framework supports various hypergraph datasets including:
- Citation Networks: Cora, Citeseer, CA-Cora, CC-Cora, CC-Citeseer.
- Academic Databases: DBLP-Paper, DBLP-Conf, DBLP-Term.
- Movie Networks: IMDB-Actor, IMDB-Director.
Citation
If you find this work useful, please consider citing our paper:
@article{feng2025hypersupervised,
  title={Self-Supervised Hypergraph Training Framework via Structure-Aware Learning},
  author={Yifan Feng and Shiquan Liu and Shihui Ying and Shaoyi Du and Zongze Wu and Yue Gao},
  journal={IEEE Transactions on Pattern Analysis and Machine Intelligence},
  year={2025},
  publisher={IEEE}
}
Contact
SS-HT is maintained by iMoon-Lab, Tsinghua University. For questions, please contact Yifan Feng.
