
TopoTune

Make any GNN Go Topological

Mathilde Papillon1, Guillermo Bernárdez1, Claudio Battiloro*2, Nina Miolane*1
*Indicates Equal Contribution
1University of California, Santa Barbara, 2Harvard University
Accepted to ICML 2025

Have you ever been curious to try higher-order relationships in your graph neural network? Do the theoretical and practical burdens of topological deep learning seem daunting? We built TopoTune for you.

In short, TopoTune takes any (graph) neural network as input, whether message-passing or otherwise, and builds a general and flexible topological neural network that can operate on any topological domain, such as a simplicial complex or hypergraph. We call this network a Generalized Combinatorial Complex Neural Network (GCCN). Simply put, a GCCN breaks a higher-order complex down into an ensemble of graphs, which are then processed by an ensemble of synchronized GNNs. In the paper, we show this model to be the most general TDL model to date, complete with rank-level permutation equivariance and unparalleled expressivity.
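The expand-then-aggregate idea can be sketched in a few lines of plain NumPy. This is a toy illustration, not the TopoBench implementation: the choice of neighborhood matrices, the per-neighborhood "GNN" (a single neighbor-mean aggregation step here), and the final sum aggregation are all simplifying assumptions made for the sketch.

```python
import numpy as np

def mean_message_passing(neigh, feats):
    """One round of neighbor-mean aggregation: a toy stand-in for any GNN layer."""
    deg = neigh.sum(axis=1, keepdims=True)
    deg[deg == 0] = 1.0  # avoid division by zero for isolated cells
    return (neigh @ feats) / deg

def gccn_layer(neighborhoods, features):
    """Toy sketch of one GCCN layer.

    neighborhoods: list of (src_rank, dst_rank, matrix) triples, where `matrix`
        (shape: n_dst x n_src) encodes one neighborhood of the complex, e.g. an
        adjacency or incidence relation. Each triple induces one expanded graph.
    features: dict mapping rank -> (n_cells, d) feature array.

    Each expanded graph is processed independently (here by the same toy
    mean-aggregation GNN); per-rank outputs are then aggregated by summation.
    """
    out = {rank: np.zeros_like(f) for rank, f in features.items()}
    for src_rank, dst_rank, matrix in neighborhoods:
        out[dst_rank] += mean_message_passing(matrix, features[src_rank])
    return out
```

For example, a path with nodes {0, 1, 2} and edges {0,1}, {1,2} can be fed in with two neighborhoods, node-to-node adjacency and node-to-edge incidence, yielding updated features for both ranks in one pass.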

[Figure: the GCCN architecture]

Abstract

Graph Neural Networks (GNNs) excel in learning from relational datasets, processing node and edge features in a way that preserves the symmetries of the graph domain. However, many complex systems, such as biological or social networks, involve multiway complex interactions that are more naturally represented by higher-order topological domains. The emerging field of Topological Deep Learning (TDL) aims to accommodate and leverage these higher-order structures. Combinatorial Complex Neural Networks (CCNNs), fairly general TDL models, have been shown to be more expressive and better performing than GNNs. However, unlike the GNN ecosystem, TDL lacks a principled and standardized framework for easily defining new architectures, restricting its accessibility and applicability. To address this issue, we introduce Generalized CCNNs (GCCNs), a novel simple yet powerful family of TDL models that can be used to systematically transform any (graph) neural network into its TDL counterpart. We prove that GCCNs generalize and subsume CCNNs, while extensive experiments on a diverse class of GCCNs show that these architectures consistently match or outperform CCNNs, often with less model complexity. In an effort to accelerate and democratize TDL, we introduce TopoTune, a lightweight software for defining, building, and training GCCNs with unprecedented flexibility and ease.

Getting Started

Where is TopoTune hosted?

TopoTune is hosted inside TopoBench, a fully integrated benchmarking platform for topological deep learning. Specifically, the GCCN is implemented as a backbone model, one of many available on this platform. As a backbone model, it can be paired with any dataset, dataset lifting, encoder, decoder, sub-model GNN, readout, and so on.

How can I use TopoTune?

As a user, you can either:

  1. Make your own GNN go topological directly in your own infrastructure by importing TopoTune from TopoBench, or
  2. Leverage TopoBench's large-scale automated process to rapidly define and test many GCCNs.

We cover both of these options in our comprehensive tutorial.

Quick Start Guide

  • Install the tb environment of TopoBench as per the instructions.
  • Check out our comprehensive tutorial to get started with TopoTune. Select tb as your kernel when running the notebook.

Recorded talk: Introduction to TopoTune

BibTeX

If TopoTune was useful for your work, please consider citing the paper:

@misc{papillon2025topotuneframeworkgeneralized,
  title={TopoTune : A Framework for Generalized Combinatorial Complex Neural Networks},
  author={Mathilde Papillon and Guillermo Bernárdez and Claudio Battiloro and Nina Miolane},
  year={2025},
  eprint={2410.06530},
  archivePrefix={arXiv},
  primaryClass={cs.LG},
  url={https://arxiv.org/abs/2410.06530},
  note={Accepted to ICML 2025}
}

Get in Touch

We would be happy to hear from you! You can reach us at papillon@ucsb.edu or by opening an Issue on TopoBench.