topobench.nn.backbones.graph package#

Submodules#

topobench.nn.backbones.graph.graph_mlp module#

Graph MLP backbone from yanghu819/Graph-MLP.

class topobench.nn.backbones.graph.graph_mlp.GraphMLP(in_channels, hidden_channels, order=1, dropout=0.0, **kwargs)[source]#

Bases: Module

Graph MLP backbone.

Parameters:
in_channels : int

Number of input features.

hidden_channels : int

Number of hidden units.

order : int, optional

Order of the adjacency-matrix power to compute (default: 1).

dropout : float, optional

Dropout rate (default: 0.0).

**kwargs

Additional arguments.

forward(x)[source]#

Forward pass.

Parameters:
x : torch.Tensor

Input tensor.

Returns:
torch.Tensor

Output tensor.

class topobench.nn.backbones.graph.graph_mlp.Mlp(input_dim, hid_dim, dropout)[source]#

Bases: Module

MLP module.

Parameters:
input_dim : int

Input dimension.

hid_dim : int

Hidden dimension.

dropout : float

Dropout rate.

forward(x)[source]#

Forward pass.

Parameters:
x : torch.Tensor

Input tensor.

Returns:
torch.Tensor

Output tensor.

topobench.nn.backbones.graph.graph_mlp.get_feature_dis(x)[source]#

Get feature distance matrix.

Parameters:
x : torch.Tensor

Input tensor.

Returns:
torch.Tensor

Feature distance matrix.
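A feature "distance" matrix of this kind is typically a pairwise similarity between node feature rows with the diagonal masked out. The following pure-Python sketch computes a masked cosine-similarity matrix; it is a hedged illustration of the kind of quantity such a helper returns, not the library code itself:

```python
import math

def feature_dis(x):
    """Pairwise cosine similarity between rows of x, diagonal zeroed.

    Hedged sketch of what a feature-distance helper like
    get_feature_dis might compute (not TopoBench's implementation).
    """
    norms = [math.sqrt(sum(v * v for v in row)) for row in x]
    n = len(x)
    sim = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i != j:  # mask self-similarity on the diagonal
                dot = sum(a * b for a, b in zip(x[i], x[j]))
                sim[i][j] = dot / (norms[i] * norms[j])
    return sim

x = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
print(feature_dis(x))
```

Orthogonal feature rows get similarity 0, while each axis-aligned row has similarity 1/sqrt(2) with the diagonal row.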

topobench.nn.backbones.graph.identity_gnn module#

This module contains the implementation of identity GNNs.

class topobench.nn.backbones.graph.identity_gnn.IdentityGAT(in_channels, hidden_channels, out_channels, num_layers, norm, heads=1, dropout=0.0)[source]#

Bases: Module

Graph Attention Network (GAT) with identity activation function.

Parameters:
in_channels : int

Number of input features.

hidden_channels : int

Number of hidden units.

out_channels : int

Number of output features.

num_layers : int

Number of layers.

norm : torch.nn.Module

Normalization layer.

heads : int, optional

Number of attention heads. Defaults to 1.

dropout : float, optional

Dropout rate. Defaults to 0.0.

forward(x, edge_index)[source]#

Forward pass.

Parameters:
x : torch.Tensor

Input node features.

edge_index : torch.Tensor

Edge indices.

Returns:
torch.Tensor

Output node features.

class topobench.nn.backbones.graph.identity_gnn.IdentityGCN(in_channels, hidden_channels, out_channels, num_layers, norm, dropout=0.0)[source]#

Bases: Module

Graph Convolutional Network (GCN) with identity activation function.

Parameters:
in_channels : int

Number of input features.

hidden_channels : int

Number of hidden units.

out_channels : int

Number of output features.

num_layers : int

Number of layers.

norm : torch.nn.Module

Normalization layer.

dropout : float, optional

Dropout rate. Defaults to 0.0.

forward(x, edge_index)[source]#

Forward pass.

Parameters:
x : torch.Tensor

Input node features.

edge_index : torch.Tensor

Edge indices.

Returns:
torch.Tensor

Output node features.
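"Identity activation" means each layer applies its message passing and linear transform with no nonlinearity in between. A pure-Python sketch of one such GCN-style layer (simplified to mean aggregation with self-loops; real GCNs use symmetric degree normalisation, and this is not TopoBench's implementation):

```python
def gcn_layer_identity(x, edges, weight):
    """One GCN-style layer with an identity activation.

    Averages each node's neighbourhood (including itself), then applies
    a linear map, with no nonlinearity afterwards.  Hedged sketch only.
    """
    n = len(x)
    cols = len(x[0])
    neighbours = [[i] for i in range(n)]  # self-loops
    for u, v in edges:
        neighbours[u].append(v)
        neighbours[v].append(u)
    # Mean-aggregate neighbourhood features.
    agg = [[sum(x[j][c] for j in neighbours[i]) / len(neighbours[i])
            for c in range(cols)] for i in range(n)]
    # Linear map; the "activation" is the identity, so we stop here.
    return [[sum(row[k] * weight[k][c] for k in range(len(row)))
             for c in range(len(weight[0]))] for row in agg]

x = [[1.0], [3.0], [5.0]]      # one feature per node
edges = [(0, 1), (1, 2)]       # path graph 0 - 1 - 2
weight = [[2.0]]               # 1 -> 1 feature linear map
print(gcn_layer_identity(x, edges, weight))
```

Because the activation is the identity, stacking such layers composes linear maps; the variants here keep the layer structure (and normalisation) while removing the nonlinearity.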

class topobench.nn.backbones.graph.identity_gnn.IdentityGIN(in_channels, hidden_channels, out_channels, num_layers, norm, dropout=0.0)[source]#

Bases: Module

Graph Isomorphism Network (GIN) with identity activation function.

Parameters:
in_channels : int

Number of input features.

hidden_channels : int

Number of hidden units.

out_channels : int

Number of output features.

num_layers : int

Number of layers.

norm : torch.nn.Module

Normalization layer.

dropout : float, optional

Dropout rate. Defaults to 0.0.

forward(x, edge_index)[source]#

Forward pass.

Parameters:
x : torch.Tensor

Input node features.

edge_index : torch.Tensor

Edge indices.

Returns:
torch.Tensor

Output node features.
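GIN layers differ from GCN layers in their aggregation: each node combines (1 + eps) times its own features with the sum (not mean) of its neighbours' features before a per-node MLP. A pure-Python sketch of the standard GIN aggregation step (not TopoBench's code):

```python
def gin_aggregate(x, edges, eps=0.0):
    """GIN aggregation: (1 + eps) * x_i + sum of neighbour features.

    This is the standard GIN update before the per-node MLP; hedged
    sketch of the rule, not TopoBench's implementation.
    """
    n = len(x)
    cols = len(x[0])
    neighbours = [[] for _ in range(n)]
    for u, v in edges:
        neighbours[u].append(v)
        neighbours[v].append(u)
    # Sum aggregation keeps neighbourhood multiset information that
    # mean aggregation would discard.
    return [[(1 + eps) * x[i][c] + sum(x[j][c] for j in neighbours[i])
             for c in range(cols)] for i in range(n)]

x = [[1.0], [2.0], [4.0]]      # one feature per node
edges = [(0, 1), (1, 2)]       # path graph 0 - 1 - 2
print(gin_aggregate(x, edges))
```

In the identity variant, the MLP applied after this aggregation omits its usual nonlinearity.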

class topobench.nn.backbones.graph.identity_gnn.IdentitySAGE(in_channels, hidden_channels, out_channels, num_layers, norm, dropout=0.0)[source]#

Bases: Module

GraphSAGE with identity activation function.

Parameters:
in_channels : int

Number of input features.

hidden_channels : int

Number of hidden units.

out_channels : int

Number of output features.

num_layers : int

Number of layers.

norm : torch.nn.Module

Normalization layer.

dropout : float, optional

Dropout rate. Defaults to 0.0.

forward(x, edge_index)[source]#

Forward pass.

Parameters:
x : torch.Tensor

Input node features.

edge_index : torch.Tensor

Edge indices.

Returns:
torch.Tensor

Output node features.

Module contents#

Graph backbones with automated exports.

class topobench.nn.backbones.graph.GraphMLP(in_channels, hidden_channels, order=1, dropout=0.0, **kwargs)#

Bases: Module

Graph MLP backbone.

Parameters:
in_channels : int

Number of input features.

hidden_channels : int

Number of hidden units.

order : int, optional

Order of the adjacency-matrix power to compute (default: 1).

dropout : float, optional

Dropout rate (default: 0.0).

**kwargs

Additional arguments.

forward(x)#

Forward pass.

Parameters:
x : torch.Tensor

Input tensor.

Returns:
torch.Tensor

Output tensor.

class topobench.nn.backbones.graph.IdentityGAT(in_channels, hidden_channels, out_channels, num_layers, norm, heads=1, dropout=0.0)#

Bases: Module

Graph Attention Network (GAT) with identity activation function.

Parameters:
in_channels : int

Number of input features.

hidden_channels : int

Number of hidden units.

out_channels : int

Number of output features.

num_layers : int

Number of layers.

norm : torch.nn.Module

Normalization layer.

heads : int, optional

Number of attention heads. Defaults to 1.

dropout : float, optional

Dropout rate. Defaults to 0.0.

forward(x, edge_index)#

Forward pass.

Parameters:
x : torch.Tensor

Input node features.

edge_index : torch.Tensor

Edge indices.

Returns:
torch.Tensor

Output node features.

class topobench.nn.backbones.graph.IdentityGCN(in_channels, hidden_channels, out_channels, num_layers, norm, dropout=0.0)#

Bases: Module

Graph Convolutional Network (GCN) with identity activation function.

Parameters:
in_channels : int

Number of input features.

hidden_channels : int

Number of hidden units.

out_channels : int

Number of output features.

num_layers : int

Number of layers.

norm : torch.nn.Module

Normalization layer.

dropout : float, optional

Dropout rate. Defaults to 0.0.

forward(x, edge_index)#

Forward pass.

Parameters:
x : torch.Tensor

Input node features.

edge_index : torch.Tensor

Edge indices.

Returns:
torch.Tensor

Output node features.

class topobench.nn.backbones.graph.IdentityGIN(in_channels, hidden_channels, out_channels, num_layers, norm, dropout=0.0)#

Bases: Module

Graph Isomorphism Network (GIN) with identity activation function.

Parameters:
in_channels : int

Number of input features.

hidden_channels : int

Number of hidden units.

out_channels : int

Number of output features.

num_layers : int

Number of layers.

norm : torch.nn.Module

Normalization layer.

dropout : float, optional

Dropout rate. Defaults to 0.0.

forward(x, edge_index)#

Forward pass.

Parameters:
x : torch.Tensor

Input node features.

edge_index : torch.Tensor

Edge indices.

Returns:
torch.Tensor

Output node features.

class topobench.nn.backbones.graph.IdentitySAGE(in_channels, hidden_channels, out_channels, num_layers, norm, dropout=0.0)#

Bases: Module

GraphSAGE with identity activation function.

Parameters:
in_channels : int

Number of input features.

hidden_channels : int

Number of hidden units.

out_channels : int

Number of output features.

num_layers : int

Number of layers.

norm : torch.nn.Module

Normalization layer.

dropout : float, optional

Dropout rate. Defaults to 0.0.

forward(x, edge_index)#

Forward pass.

Parameters:
x : torch.Tensor

Input node features.

edge_index : torch.Tensor

Edge indices.

Returns:
torch.Tensor

Output node features.

class topobench.nn.backbones.graph.Mlp(input_dim, hid_dim, dropout)#

Bases: Module

MLP module.

Parameters:
input_dim : int

Input dimension.

hid_dim : int

Hidden dimension.

dropout : float

Dropout rate.

forward(x)#

Forward pass.

Parameters:
x : torch.Tensor

Input tensor.

Returns:
torch.Tensor

Output tensor.