Graph and hypergraph learning: theory and applications in computational network analysis

Date
2025
Publisher
University of Delaware
Abstract
Neural networks, computational models inspired by the human brain, have played a central role in artificial intelligence since the 1980s. As universal approximators, they have achieved remarkable performance on a wide range of tasks, excelling across applications such as computer vision, natural language processing, and graph-structured data analysis. In particular, modern neural architectures for graphs and hypergraphs, combined with scalable optimization techniques and the availability of large-scale data, have further amplified their practical effectiveness across diverse domains. However, understanding their generalization capability, i.e., a model's ability to perform accurately on previously unseen data, remains a fundamental challenge, particularly when modeling the complex higher-order structures and competitive dynamics inherent in many real-world scenarios such as social network analysis.

This dissertation addresses these challenges by providing a rigorous generalization and learnability analysis of neural networks explicitly designed for higher-order relational data, with a specific application to computational network analysis. We combine the theoretical analysis with the conditions and mechanisms that enable efficient learning, complemented by extensive empirical studies that validate the theoretical insights. Our work first studies the learnability of the competitive threshold model from a theoretical perspective. We demonstrate how competitive threshold models can be seamlessly simulated by artificial neural networks with finite VC dimension, which enables analytical sample complexity and generalization bounds. Based on the proposed hypothesis space, we design efficient algorithms under the empirical risk minimization scheme.
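To make the setting concrete, the following is a minimal illustrative sketch (not the dissertation's construction) of how one synchronous round of a two-party competitive linear threshold model can be written as a thresholded linear layer; the weight matrix `W`, the tie-breaking rule, and the toy path graph are all assumptions for illustration.

```python
import numpy as np

# Hypothetical sketch: one synchronous round of a two-party competitive
# linear threshold model, expressed as a thresholded linear (neural) layer.
# W[i, j] is the influence weight of node j on node i; theta_a, theta_b
# are per-node adoption thresholds for the two competing cascades.

def competitive_threshold_step(W, state_a, state_b, theta_a, theta_b):
    """One update: an inactive node adopts the cascade whose incoming
    activated weight exceeds its threshold; ties favor cascade A."""
    influence_a = W @ state_a          # total weight from A-active neighbors
    influence_b = W @ state_b          # total weight from B-active neighbors
    inactive = (state_a == 0) & (state_b == 0)
    adopt_a = inactive & (influence_a >= theta_a) & (influence_a >= influence_b)
    adopt_b = inactive & (influence_b >= theta_b) & (influence_b > influence_a)
    return state_a | adopt_a, state_b | adopt_b

# Toy 4-node path graph: node 0 seeds cascade A, node 3 seeds cascade B.
W = np.array([[0, .6, 0, 0],
              [.6, 0, .6, 0],
              [0, .6, 0, .6],
              [0, 0, .6, 0]], dtype=float)
a = np.array([1, 0, 0, 0], dtype=bool)
b = np.array([0, 0, 0, 1], dtype=bool)
theta = np.full(4, 0.5)
for _ in range(3):
    a, b = competitive_threshold_step(W, a, b, theta, theta)
```

On this toy graph the two cascades meet in the middle and the dynamics reach a fixed point, which is the kind of state a learner observing diffusion outcomes would be trained on.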
For worm dissemination across wireless sensor networks, which are vulnerable to attacks by worms and their variants, we design a communication model that accounts for various worm types. We train the proposed model to analytically derive the dynamics of competitive worm propagation, and we develop a new hypothesis space combined with complex neural network models. Finally, we develop margin-based generalization bounds for four representative classes of hypergraph neural networks: convolution-based methods (UniGCN), set-based aggregation (AllDeepSets), invariant and equivariant transformations (M-IGN), and tensor-based approaches (T-MPHN). Through the PAC-Bayes framework, our results reveal how the hypergraph structure and the spectral norms of the learned weights affect the generalization bounds. The key technical challenge lies in developing a new perturbation analysis for hypergraph neural networks, which offers a rigorous understanding of how variations in the model's weights and the hypergraph structure impact its generalization behavior.

By utilizing the PAC learnability and PAC-Bayesian frameworks, this work offers theoretical foundations for analyzing the relationship between neural network architectures and their generalization performance, and it shows the potential of using the theoretical results for model optimization. Empirical validation on both synthetic and real-world datasets reinforces the applicability of the theoretical findings and demonstrates the models' ability to handle complex higher-order data and competitive dynamics. Altogether, these contributions advance the foundational understanding of neural networks and offer practical methodologies for improving their generalization in structured, competitive environments.
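As a rough illustration of how spectral norms of learned weights enter margin-based PAC-Bayes bounds, the sketch below computes a Neyshabur-style capacity term for a stack of weight matrices. The exact bounds derived in the dissertation for UniGCN, AllDeepSets, M-IGN, and T-MPHN differ; this only shows the general dependence of such capacity terms on the weights.

```python
import numpy as np

# Hypothetical sketch: the spectral-norm product and scaled Frobenius-norm
# sum that typically appear in margin-based PAC-Bayes generalization bounds
# for deep networks. This is NOT the dissertation's bound, only the shape
# of the weight-dependent capacity term in such analyses.

def spectral_capacity(weights):
    """Return (product of per-layer spectral norms, capacity term).
    The capacity term is prod^2 * sum_i (||W_i||_F / ||W_i||_2)^2."""
    spec = [np.linalg.norm(W, 2) for W in weights]   # largest singular values
    prod = float(np.prod(spec))
    frob_ratio = sum((np.linalg.norm(W, 'fro') / s) ** 2
                     for W, s in zip(weights, spec))
    return prod, prod ** 2 * frob_ratio

# Random 3-layer example; in practice these are the trained model's weights.
rng = np.random.default_rng(0)
layers = [rng.standard_normal((16, 16)) / 4 for _ in range(3)]
prod, capacity = spectral_capacity(layers)
```

Because each Frobenius-to-spectral ratio is at least 1, the capacity term is at least the number of layers times the squared spectral-norm product, which is one way to see why controlling spectral norms tightens such bounds.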
Keywords
Neural networks, Machine learning, Hypergraphs, Social network analysis