Zhiqiang Chen, Ting-Bing Xu, Jinpeng Li, Huiguang He. Sharing Weights in Shallow Layers via Rotation Group Equivariant Convolutions[J]. Machine Intelligence Research, 2022, 19(2): 115-126. DOI: 10.1007/s11633-022-1324-5

Sharing Weights in Shallow Layers via Rotation Group Equivariant Convolutions

  • The convolution operation is equivariant to the translation group. To achieve equivariance to more groups, rotation group equivariant convolutions (RGEC) have been proposed to acquire both translation and rotation group equivariance. However, previous work focused mainly on the number of parameters and usually ignored other resource costs. In this paper, we construct our networks without introducing extra resource costs. Specifically, a convolution kernel is rotated to different orientations to extract features for multiple channels, and far fewer kernels than in previous works are used so that the number of output channels does not increase. To further enhance the orthogonality of kernels in different orientations, we construct a non-maximum-suppression loss along the rotation dimension that suppresses all orientations except the most activated one. Considering that low-level features benefit more from rotational symmetry, we share weights only in the shallow layers (SWSL) via RGEC. Extensive experiments on multiple datasets (i.e., ImageNet, CIFAR, and MNIST) demonstrate that SWSL effectively benefits from the higher-degree weight sharing and improves the performance of various networks, including plain and ResNet architectures. Meanwhile, the shallow layers require far fewer convolutional kernels and parameters (e.g., 75% or 87.5% fewer), and no extra computation costs are introduced.
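The core idea described in the abstract (one kernel shared across rotated orientations, plus a loss that suppresses all but the most activated orientation) can be illustrated with a minimal sketch. This is an assumption-laden toy illustration, not the authors' implementation: it assumes the C4 group (90-degree rotations), a plain-Python "valid" cross-correlation, and illustrative function names (`rot90_cw`, `rgec_outputs`, `nms_rotation_loss`) that do not come from the paper.

```python
# Toy sketch of rotation-group weight sharing (assumed C4 group of
# 90-degree rotations; all names are illustrative, not the paper's code).

def rot90_cw(k):
    """Rotate a square kernel 90 degrees clockwise."""
    return [list(row) for row in zip(*k[::-1])]

def conv2d_valid(img, k):
    """Plain 'valid' cross-correlation of a 2-D image with a kernel."""
    kh, kw = len(k), len(k[0])
    h, w = len(img) - kh + 1, len(img[0]) - kw + 1
    return [[sum(img[i + a][j + b] * k[a][b]
                 for a in range(kh) for b in range(kw))
             for j in range(w)] for i in range(h)]

def rgec_outputs(img, kernel, group_size=4):
    """One kernel rotated to |C4| = 4 orientations yields 4 output
    channels that share a single set of weights -- 75% fewer kernel
    parameters than 4 independent kernels, matching the abstract."""
    outs, k = [], kernel
    for _ in range(group_size):
        outs.append(conv2d_valid(img, k))
        k = rot90_cw(k)
    return outs

def nms_rotation_loss(orientation_activations):
    """Sketch of a non-maximum-suppression loss along the rotation
    dimension: penalize the total activation of every orientation
    except the maximally activated one (one plausible reading of the
    abstract, not the paper's exact formulation)."""
    return sum(orientation_activations) - max(orientation_activations)
```

For example, with a 2x2 kernel whose single nonzero weight sits in the top-left corner, the four shared-weight channels each respond to the input shifted according to that kernel's orientation, while `nms_rotation_loss([0.1, 0.9, 0.2, 0.0])` keeps only the dominant orientation unpenalized.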
