STEG-AIW: Spatio-Temporal Gating and Adaptive-Timestep Inference for Efficient Spiking Neural Networks
Abstract
Spiking neural networks (SNNs) promise energy-efficient computation, yet modern systems still waste compute by propagating redundant activations within each timestep and by using a fixed temporal horizon regardless of input difficulty. We present STEG-AIW, a training and inference framework that addresses both issues. The Spatio-Temporal Efficient Gate (STEG), a lightweight gating module placed at residual stages, suppresses non-salient activations while preserving temporal dynamics. The Adaptive Inference Window (AIW) module accumulates per-timestep evidence and converts it into halting probabilities for sample-wise early termination. We train the model end-to-end with a loss that balances task accuracy, an efficiency term proportional to the expected number of timesteps, and a sparsity term on gate activations; a simple complexity analysis links these choices to a reduced synaptic-operation count. On static image benchmarks, STEG-AIW attains state-of-the-art accuracy with 34-88% fewer timesteps than the strongest baselines. On neuromorphic datasets, it matches or exceeds the best reported accuracy with 43-73% fewer timesteps and a corresponding reduction in synaptic operations. STEG-AIW thus offers a backbone-agnostic path to accurate, low-power inference, moving SNNs closer to practical deployment. Code will be released upon acceptance.
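The halting mechanism and three-term objective described above can be sketched in a few lines. This is a minimal illustration under stated assumptions: the function names, the halting threshold, and the weighting coefficients `lam_time` and `lam_sparse` are hypothetical placeholders, not the paper's exact formulation.

```python
def adaptive_inference_window(halt_scores, threshold=0.9, max_steps=8):
    """Sample-wise early termination: accumulate per-timestep halting
    probabilities (e.g. sigmoid outputs of a small head on the running
    evidence) and stop once the cumulative mass crosses a threshold.
    Returns the number of timesteps actually executed."""
    cumulative = 0.0
    for t, p in enumerate(halt_scores[:max_steps], start=1):
        cumulative += p  # accumulate halting probability
        if cumulative >= threshold:
            return t  # halt early for this sample
    return min(len(halt_scores), max_steps)  # fall back to full window

def total_loss(task_loss, expected_steps, gate_activity,
               lam_time=0.01, lam_sparse=0.001):
    """Combined objective: task accuracy plus an efficiency term
    proportional to the expected timestep count and a sparsity
    term on gate activations."""
    return task_loss + lam_time * expected_steps + lam_sparse * gate_activity
```

For example, with per-timestep halting scores `[0.5, 0.5, 0.5]` and a threshold of 0.9, inference halts after the second timestep; an easier input producing higher scores would halt sooner.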