Construction of Differential-Neural Distinguishers Based on SKNet for Speck and Simon
Abstract
Recent advancements in differential-neural cryptanalysis have highlighted the effectiveness of deep learning-based distinguishers in capturing subtle statistical irregularities in ciphertexts. However, their performance is highly sensitive to architectural parameters, particularly the convolutional kernel size. To address this sensitivity, we propose SKNet, a novel neural architecture that employs soft attention to adaptively integrate features extracted with multiple kernel sizes. This design avoids the need for exhaustive hyperparameter tuning and enhances multiscale feature representation. When applied to the Speck and Simon cipher families, SKNet consistently surpasses existing architectures. For example, on 9-round Speck128/256 and 14-round Simon128/256, SKNet achieves accuracy improvements of 1.03% and 1.00%, respectively, over the current state-of-the-art neural networks. These findings demonstrate SKNet's effectiveness and broad applicability in constructing robust neural distinguishers for a variety of block ciphers.