Adaptive Residual Connection in Graph Neural Networks

Ahad N. Zehmakan

27 Jun 2025

Background

Graph Neural Networks (GNNs) have emerged as a powerful framework for extending deep learning to graph-structured data, capable of simultaneously learning from both node features and topological connections (edges). Unlike conventional neural networks that process isolated data points, GNNs leverage message passing mechanisms to aggregate information from local neighbourhoods, enabling them to capture structural dependencies within the graph. This unique capability has made GNNs particularly effective for fundamental graph problems, including node classification, link prediction, and graph-level classification.

A critical advancement in GNN architectures has been the incorporation of residual connections. These connections typically preserve information by combining (through summation or concatenation operations) either the initial or current layer’s node representations with the aggregated neighbourhood information.
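To make this concrete, a single message-passing layer with a static residual connection can combine the symmetrically normalised neighbourhood aggregate with the layer's own input via a fixed weight. The sketch below is a minimal NumPy illustration, not any specific published architecture; the graph, features, and the choice α = 0.5 are arbitrary assumptions for demonstration.

```python
import numpy as np

def normalized_adjacency(A):
    """Symmetrically normalised adjacency with self-loops: D^{-1/2}(A+I)D^{-1/2}."""
    A_hat = A + np.eye(A.shape[0])
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A_hat @ D_inv_sqrt

def residual_layer(H, A_norm, W, alpha=0.5):
    """One GCN-style layer with a static residual connection.

    alpha weights the layer's input (the residual branch); (1 - alpha)
    weights the aggregated neighbourhood information.
    """
    aggregated = np.maximum(A_norm @ H @ W, 0.0)  # ReLU(A_hat H W)
    return alpha * H + (1.0 - alpha) * aggregated

# A 4-node path graph with 2-dimensional node features.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
H = np.random.default_rng(0).normal(size=(4, 2))
W = np.eye(2)  # identity weights keep the example easy to inspect

H_next = residual_layer(H, normalized_adjacency(A), W)
print(H_next.shape)  # (4, 2)
```

Note that summation-style residuals require the input and output feature dimensions to match (here enforced by the identity `W`); concatenation-style residuals avoid this constraint at the cost of a growing feature dimension.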

Questions

While effective, current implementations primarily utilise static residual connections that maintain uniform connection strengths across all nodes and network layers. This project investigates adaptive residual connections, a novel approach that enables node-specific residual strength modulation based on local graph properties and structural characteristics. We will evaluate these adaptive connections by comparing their performance against standard GNN baselines, using accuracy improvements as our key metric. Hence, two key research questions arise:

  • Graph-Structured Residual Adaptation: How can we adaptively determine node-specific residual strengths based on the underlying graph topology?
  • Centrality-Driven Residual Analysis: Does a measurable correlation exist between a node’s structural centrality (e.g., degree or betweenness) and its optimal residual strength?

This project aims to investigate the aforementioned questions. We will generalise fundamental graph neural network models, such as graph convolutional networks, to incorporate adaptive residual connections. Through extensive experiments and theoretical analysis, we will determine optimal residual strengths, with the goal of consistently outperforming well-studied GNN baselines.
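One simple way the adaptive variant could be instantiated, purely as a sketch, is to let each node's residual strength be a function of a centrality measure. Below, a hypothetical degree-based rule maps normalised degree through a sigmoid so that high-degree nodes retain more of their own representation; the mapping and its parameters are illustrative assumptions, not results or methods from this project.

```python
import numpy as np

def degree_based_alpha(A, a=1.0, b=0.0):
    """Hypothetical rule: per-node residual strength from degree centrality.

    Normalised degree is passed through a sigmoid; the slope `a` and
    offset `b` would be tuned (or learned) in practice.
    """
    deg = A.sum(axis=1)
    deg_norm = deg / deg.max()
    return 1.0 / (1.0 + np.exp(-(a * deg_norm + b)))  # values in (0, 1)

def adaptive_residual_layer(H, A_norm, W, alpha):
    """GCN-style layer where alpha is a per-node vector, not a scalar."""
    aggregated = np.maximum(A_norm @ H @ W, 0.0)  # ReLU(A_hat H W)
    alpha = alpha[:, None]  # broadcast over the feature dimension
    return alpha * H + (1.0 - alpha) * aggregated

# Star graph: node 0 is the hub, nodes 1-3 are leaves.
A = np.array([[0, 1, 1, 1],
              [1, 0, 0, 0],
              [1, 0, 0, 0],
              [1, 0, 0, 0]], dtype=float)
A_hat = A + np.eye(4)
d = A_hat.sum(axis=1)
A_norm = np.diag(d ** -0.5) @ A_hat @ np.diag(d ** -0.5)

alpha = degree_based_alpha(A)
H = np.random.default_rng(1).normal(size=(4, 2))
H_next = adaptive_residual_layer(H, A_norm, np.eye(2), alpha)
print(alpha[0] > alpha[1])  # hub keeps a stronger residual than a leaf: True
```

Whether such a heuristic rule, or a learned per-node strength, correlates with the empirically optimal residual weights is exactly the second research question above.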

Requirements

The student should have a strong background in network optimisation, neural networks, Python, and deep learning frameworks (e.g., PyTorch, TensorFlow).

Related References

  • Semi-supervised classification with graph convolutional networks
  • Predict then propagate: Graph neural networks meet personalized pagerank
  • Simple and deep graph convolutional networks
  • Residual connections and normalization can provably prevent oversmoothing in GNNs

Contact

Supervisor: Ahad N. Zehmakan

Email: ahadn.zehmakan@anu.edu.au

If you are interested, please send me an email including (1) which aspects of this project interest you most, (2) what type of research project you are looking for (6-unit, 12-unit, or 24-unit), (3) a copy of your transcript and/or CV, and (4) any questions you may have.
