GAT
- class dhg.models.GAT(*args, **kwargs)
Bases: torch.nn.Module
The GAT model proposed in the Graph Attention Networks paper (ICLR 2018).
- Parameters
  - in_channels (int) – \(C_{in}\) is the number of input channels.
  - hid_channels (int) – \(C_{hid}\) is the number of hidden channels.
  - num_classes (int) – The number of classes in the classification task.
  - num_heads (int) – The number of attention heads in each layer.
  - use_bn (bool) – If set to True, use batch normalization. Defaults to False.
  - drop_rate (float) – The dropout probability. Defaults to 0.5.
  - atten_neg_slope (float) – The negative slope of the LeakyReLU activation used in edge attention. Defaults to 0.2.