Designing Concise ConvNets with Columnar Stages

0 citations · #2434 of 3827 papers in ICLR 2025 · 2 top authors · 4 data points
Abstract

In the era of vision Transformers, the recent success of VanillaNet shows the huge potential of simple and concise convolutional neural networks (ConvNets). While such models mainly focus on runtime, it is also crucial to simultaneously focus on other aspects, e.g., FLOPs, parameters, etc., to strengthen their utility further. To this end, we introduce a refreshing ConvNet macro design called Columnar Stage Network (CoSNet). CoSNet has a systematically developed simple and concise structure, smaller depth, low parameter count, low FLOPs, and attention-less operations, well suited for resource-constrained deployment. The key novelty of CoSNet is deploying parallel convolutions with fewer kernels fed by input replication, using columnar stacking of these convolutions, and minimizing the use of 1×1 convolution layers. Our comprehensive evaluations show that CoSNet rivals many renowned ConvNets and Transformer designs under resource-constrained scenarios. Pretrained models will be open-sourced.
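The abstract's core idea of parallel, columnar-stacked convolutions fed by input replication can be illustrated with a minimal sketch, assuming PyTorch. The module name, kernel sizes, column/channel counts, and concatenation-based fusion below are illustrative assumptions and not the authors' exact design.

```python
# Minimal sketch of the columnar-stage idea described in the abstract.
# All structural details (names, sizes, fusion) are assumptions for illustration.
import torch
import torch.nn as nn


class ColumnarStage(nn.Module):
    """Replicates the input to several parallel columns of stacked
    convolutions, each with few kernels, then fuses the column outputs."""

    def __init__(self, in_ch: int, col_ch: int, num_columns: int = 4, depth: int = 2):
        super().__init__()
        self.columns = nn.ModuleList()
        for _ in range(num_columns):
            layers = []
            ch = in_ch
            for _ in range(depth):
                # 3x3 convolutions with a small kernel count per column;
                # no 1x1 layers, in line with the abstract's stated goal.
                layers += [
                    nn.Conv2d(ch, col_ch, kernel_size=3, padding=1, bias=False),
                    nn.BatchNorm2d(col_ch),
                    nn.ReLU(inplace=True),
                ]
                ch = col_ch
            self.columns.append(nn.Sequential(*layers))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Input replication: every column receives the same feature map.
        outs = [col(x) for col in self.columns]
        # Fusion by channel concatenation (an assumption for this sketch).
        return torch.cat(outs, dim=1)


if __name__ == "__main__":
    stage = ColumnarStage(in_ch=64, col_ch=16, num_columns=4, depth=2)
    y = stage(torch.randn(1, 64, 56, 56))
    print(y.shape)  # torch.Size([1, 64, 56, 56])
```

In this sketch, keeping each column narrow (few kernels) and attention-less mirrors the abstract's emphasis on low parameter count and FLOPs; the actual CoSNet macro design may differ in how columns are sized and merged.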

Citation History

0 citations recorded across Jan 25–28, 2026.