Linguistic Steganalysis via Fusing Multi-Granularity Attentional Text Features
Graphical Abstract
Abstract
Deep-learning-based language models have greatly improved generation-based linguistic steganography, posing a serious challenge to linguistic steganalysis. Existing neural-network-based linguistic steganalysis methods struggle to handle complicated text because they extract only single-granularity features, such as global or local text features. To fuse multi-granularity text features, we present a novel linguistic steganalysis method based on an attentional bidirectional long short-term memory (BiLSTM) network and a short-cut dense convolutional neural network (CNN). The BiLSTM, equipped with a scaled dot-product attention mechanism, captures long-range dependency representations of the input sentence. The CNN, with short-cut and dense connections, extracts rich local semantic features from the word embedding matrix. We connect the two structures in parallel, concatenate the long-range dependency representations with the local semantic features, and classify texts as stego or cover. The results of comparative experiments demonstrate that the proposed method outperforms state-of-the-art linguistic steganalysis methods.
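To make the two-branch architecture described above concrete, the following is a minimal PyTorch sketch, not the authors' implementation: the class name, layer counts, and hyperparameters (vocabulary size, embedding and hidden dimensions, filter counts) are illustrative assumptions, and only the overall structure (attentional BiLSTM branch, dense short-cut CNN branch, concatenation, binary classification) follows the text.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class AttentionalBiLSTMDenseCNN(nn.Module):
    """Illustrative two-branch steganalysis classifier: an attentional BiLSTM
    branch for long-range dependencies and a dense short-cut CNN branch for
    local semantic features, fused by concatenation (hyperparameters assumed)."""

    def __init__(self, vocab_size=20000, embed_dim=128, hidden_dim=128,
                 num_filters=64, num_classes=2):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)

        # Branch 1: BiLSTM followed by scaled dot-product self-attention.
        self.bilstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True,
                              bidirectional=True)

        # Branch 2: stacked convolutions over the word embedding matrix with
        # dense (concatenative) short-cut connections.
        self.conv1 = nn.Conv1d(embed_dim, num_filters, kernel_size=3, padding=1)
        self.conv2 = nn.Conv1d(embed_dim + num_filters, num_filters,
                               kernel_size=3, padding=1)

        fused_dim = 2 * hidden_dim + (embed_dim + 2 * num_filters)
        self.classifier = nn.Linear(fused_dim, num_classes)

    def forward(self, token_ids):
        x = self.embedding(token_ids)                      # (B, L, E)

        # BiLSTM branch with scaled dot-product attention over its states.
        h, _ = self.bilstm(x)                              # (B, L, 2H)
        scores = h @ h.transpose(1, 2) / (h.size(-1) ** 0.5)
        attn = torch.softmax(scores, dim=-1)
        global_feat = (attn @ h).mean(dim=1)               # (B, 2H)

        # Dense CNN branch: each layer sees all earlier feature maps.
        c0 = x.transpose(1, 2)                             # (B, E, L)
        c1 = F.relu(self.conv1(c0))
        c2 = F.relu(self.conv2(torch.cat([c0, c1], dim=1)))
        dense = torch.cat([c0, c1, c2], dim=1)             # feature reuse
        local_feat = F.adaptive_max_pool1d(dense, 1).squeeze(-1)

        # Fuse global and local features, then classify cover vs. stego.
        fused = torch.cat([global_feat, local_feat], dim=1)
        return self.classifier(fused)


if __name__ == "__main__":
    model = AttentionalBiLSTMDenseCNN()
    logits = model(torch.randint(0, 20000, (4, 50)))  # 4 sentences of 50 tokens
    print(logits.shape)                               # torch.Size([4, 2])
```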