Abstract: With the evolution of existing modeling languages and the continual emergence of new ones, it is necessary to rapidly build corresponding software modeling tools of good quality. However, modeling tools for larger modeling languages are usually diverse in function and complex in implementation technology. Treating the construction of modeling tools as a domain, this paper presents an approach to building software modeling tools based on metamodeling and product line technologies. Through in-depth domain analysis, the paper provides the concept system of the approach and a feature model covering the diverse functions of modeling tools, in order to specify the commonality and variability of the tools; it then discusses the design and implementation of a general tool framework that facilitates component reuse and component code generation, and specifies the mapping between the feature model and the components of modeling tools.
Abstract: As ontologies grow through migration, extension, and merging, and applications become more complex, it is inevitable that we will encounter inconsistent ontologies. Measuring ontology inconsistency is the basis of inconsistency handling, as it can help us decide how to act on an inconsistency. Measuring ontology inconsistency with weighted formulae arises in numerous applications but is a difficult task, since most existing efforts deal only with flat ontologies. This paper improves inconsistency measurement based on evidence theory and proposes a novel Dempster-Shafer ontology inconsistency measure. The logical properties of these measures are studied; they show how to look inside the formulae and how to indicate the contribution of each formula to the overall inconsistency of the ontology set. This approach provides a stronger basis for inconsistency handling.
Abstract: To address data leaks caused by misoperation and malicious insiders, we propose a multilevel security model based on the Bell-LaPadula (BLP) model. In our model each subject is assigned a security level. Subjects can read objects only when their security levels are not less than the objects' security levels, and subjects can write objects only when their security levels are not greater than the objects' security levels. The current security level in our model can be changed dynamically when users read sensitive data, since users can access data with different security levels in a private cloud. Our model uses mandatory access control to govern users' operations and guarantees that users cannot leak sensitive data after reading it. The model is proved secure by mathematical methods; we also implemented a prototype system, and the experimental results show that it is secure.
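The read/write rules described above can be sketched in a few lines of mandatory access control logic. This is a minimal illustration only: the level names and ordering are hypothetical, and the paper's dynamic adjustment of the current security level is not modeled.

```python
# Illustrative BLP-style check; the lattice below is an assumption,
# not the paper's actual set of security levels.
LEVELS = {"public": 0, "internal": 1, "confidential": 2, "secret": 3}

def can_read(subject: str, obj: str) -> bool:
    # Read permitted only when the subject's level is not less
    # than the object's level ("no read up").
    return LEVELS[subject] >= LEVELS[obj]

def can_write(subject: str, obj: str) -> bool:
    # Write permitted only when the subject's level is not greater
    # than the object's level ("no write down"), so data read at a
    # high level cannot be leaked to a lower level.
    return LEVELS[subject] <= LEVELS[obj]
```

Together the two rules give the leak-prevention property the abstract claims: a subject that has read "secret" data can only write to objects at level "secret" or above.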
Abstract: With the development of the Internet of things (IOT), massive numbers of sensors have been deployed as public infrastructure. As in-depth IOT applications develop, service discovery and composition pose challenges to end users. To handle this challenge, we develop a semantics-based service mining scheme for IOT that provides users with interesting composite services. In this scheme, services can be combined and recommended to users proactively according to the calculation of service similarity and an updatable semantic database. Based on the service similarity results, useless compositions can be filtered out, so the energy consumed by service flooding is reduced. An update strategy for the semantic database is also given, by which composite services can keep up with the times and remain applicable. The benefit of the proposed method is that all operations, such as calculating, filtering, and updating, are simple enough to be performed within sensor networks.
Abstract: A document format translator enables information sharing and interoperability between the national office document standard UOF (Uniform office format) and the international standard ISO 29500 (Office open XML, OOXML). Manual testing of a document format translator faces several difficult issues: manual test case design is complicated by the complexity of the document standards, test case coverage is hard to evaluate, and test blind spots are hard to avoid. This paper presents methods to solve these problems through automated test case generation: analyzing the XML Schema of the document standard at the logical level, building a UML object model at the conceptual level, designing object-oriented algorithms on the .NET platform, operating on the underlying XML code of spreadsheets at the physical level, and then compressing the underlying XML code to generate new spreadsheet documents as test cases. A formal definition of the UOF document standard is presented and statistics on functional test points are collected in order to measure test case coverage. Experiments have shown that this method can automate the generation of test cases, improve testing efficiency, decrease test case redundancy, and avoid test blind spots.
Abstract: To improve the performance of threshold proxy re-signatures, the notion of on-line/off-line threshold proxy re-signatures is introduced. The bulk of the re-signature computation can be done in the off-line phase, before the message arrives. The results of this pre-computation are saved and then used in the on-line phase once a message to be re-signed is known. Based on any threshold proxy re-signature scheme and a threshold version of the chameleon hash function, we present a generic on-line/off-line threshold proxy re-signature scheme that can convert any existing secure threshold proxy re-signature scheme into an on-line/off-line one. The on-line phase of our scheme is efficient: computing a re-signature share requires one round of communication, two modular additions, and one modular multiplication. Our scheme is provably secure under the discrete logarithm assumption without random oracles. Moreover, it achieves robustness in the presence of ⌊n/3⌋ malicious proxies.
Abstract: Mapping application tasks is one of the key issues in 3D Network-on-chip (3D NoC) design. A novel Logistic-function-based adaptive genetic algorithm (LFAGA) is proposed for energy-aware mapping onto homogeneous 3D NoCs. We formulate the mapping problem and review the Standard genetic algorithm (SGA). The LFAGA is presented in detail, with the goal of achieving higher convergence speed while preventing premature convergence. Experimental results indicate that the proposed LFAGA is more efficient than the previously proposed Chaos-genetic mapping algorithm (CGMAP). In the experiments, a randomly generated task graph of size 27 is mapped onto a 3D NoC of size 3×3×3, and the convergence speed of LFAGA is 2.55 times that of CGMAP in the best case. When the task size increases to 64 and the 3D NoC extends to 4×4×4, LFAGA is 2.31 times faster than CGMAP. For NoC sizes ranging from 3×3×2 to 4×4×4, solutions obtained by LFAGA are consistently better than those of CGMAP; for example, in the 4×4×4 experiment, the final result improves by 30.0% in terms of energy consumption. For a real application of size 3×4×2, 18.6% energy saving is achieved and convergence is 1.58 times faster than with CGMAP.
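The abstract does not give the exact logistic schedule used by LFAGA. As an illustrative sketch only, an adaptive GA of this kind can steer the crossover/mutation probability with a logistic function of normalized fitness, keeping high rates for weak individuals (diversity) while protecting good ones from disruption; the functional form and every constant below are hypothetical.

```python
import math

def adaptive_rate(f, f_avg, f_max, p_max=0.9, p_min=0.4, k=8.0):
    """Logistic-function-based crossover/mutation probability
    (illustrative form, not the paper's exact formula)."""
    if f <= f_avg or f_max <= f_avg:
        return p_max                      # weak individuals: explore hard
    x = (f - f_avg) / (f_max - f_avg)     # normalize fitness to (0, 1]
    # smooth logistic decay from ~p_max toward p_min for fit individuals
    return p_min + (p_max - p_min) / (1.0 + math.exp(k * (x - 0.5)))
```

Compared with the step-shaped rules of classical adaptive GAs, the smooth logistic curve avoids abrupt rate changes around the average fitness, which is one plausible reason such schedules help against premature convergence.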
Abstract: Conventional bilingual word alignment is conducted on sentence pairs with a single word segmentation for languages such as Chinese, viz. Single-segmentation-based word alignment (SSWA). However, SSWA runs the risk of losing optimal word segmentation granularities or causing data sparseness in word alignment. This paper proposes Multiple-segmentation-based word alignment (MSWA). In MSWA, the diverse and complementary knowledge in multiple word segmentations is employed to lower the above risks. Given k word segmentations of a Chinese sentence, a skeleton segmentation is first constructed. The alignment between the skeleton segmentation and the parallel English sentence is modeled log-linearly, incorporating various features defined over the multiple word segmentations. The Viterbi alignment, i.e., the alignment with the highest score, is then mapped back to k word alignments based on the k segmentations, respectively. Experimentally, MSWA outperformed SSWA on all k segmentations in both alignment quality and translation performance.
Abstract: Automatic test case generation from a pre/post-style formal specification must deal with the issue of how to generate test cases from a conjunction of atomic predicate expressions, but this problem has not been effectively solved due to its intrinsic difficulty. We describe a practical approach that tackles this problem using model checking. An algorithm that converts test case generation from a conjunction of atomic predicate expressions into a model checking problem is proposed. We discuss how the algorithm handles atomic predicate expressions involving only variables of numeric types, and extend the discussion to variables of compound types such as set, sequence, and composite types. Case studies are presented to assess the feasibility and effectiveness of our approach.
Abstract: Boolean and relational operations, which are defined for solving mathematical logic problems, are required in virtually all computing models. Membrane computing is a kind of distributed parallel computing model. In this paper, we design membranes that implement the primary Boolean and relational operations, respectively. Based on these membranes, a membrane system for evaluating a logical expression can be constructed by the presented algorithm. Examples are given to illustrate how these membrane systems perform Boolean and relational operations and evaluate logical expressions correctly.
Abstract: With the rapid development of Mobile ad hoc networks (MANETs), secure and practical authentication is becoming increasingly important, yet several problems remain unsolved: authentication services may suffer from cheating and Denial of service (DoS) attacks, and most existing schemes lack satisfactory efficiency due to the exponential arithmetic complexity of Shamir's scheme. We explore the properties of Verifiable secret sharing (VSS) schemes based on the Chinese remainder theorem (CRT), and then propose a secret key distributed storage scheme based on CRT-VSS and trusted computing techniques. We discuss the homomorphism property of CRT-VSS and design a secure distributed Elliptic curve-digital signature standard (ECCDSS) authentication scheme. Finally, we provide formal guarantees for the scheme proposed in this paper.
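To make the CRT-based sharing idea concrete, here is an Asmuth-Bloom-style threshold sharing sketch. It is a simplification, not the paper's CRT-VSS: the toy moduli and threshold are chosen for illustration, and the verifiability layer is omitted entirely. Note that reconstruction costs only one CRT computation over small moduli, which is the efficiency argument for CRT schemes over Shamir's polynomial interpolation.

```python
import random
from math import prod
from itertools import combinations

def crt(residues, moduli):
    # Chinese remainder theorem: unique solution modulo prod(moduli).
    M = prod(moduli)
    return sum(r * (M // m) * pow(M // m, -1, m)
               for r, m in zip(residues, moduli)) % M

def make_shares(secret, m0, moduli, t):
    # Lift the secret so that any t residues determine it, then share.
    assert 0 <= secret < m0
    M_t = prod(sorted(moduli)[:t])       # product of the t smallest moduli
    y = secret + m0 * random.randrange((M_t - secret) // m0)
    return [(y % m, m) for m in moduli]  # one (residue, modulus) per party

def reconstruct(shares, m0):
    residues, moduli = zip(*shares)
    return crt(residues, moduli) % m0

# toy parameters satisfying the Asmuth-Bloom condition for t = 3:
# 53*59*61 > 11 * 67*71, so any 3 shares recover y exactly
m0, moduli, t = 11, [53, 59, 61, 67, 71], 3
shares = make_shares(7, m0, moduli, t)
assert all(reconstruct(list(c), m0) == 7 for c in combinations(shares, t))
```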
Abstract: Test data generation, the premise of software testing, has attracted much attention in the software engineering community in recent years. Influenced by task partitioning, process scheduling, and network delays, parallel programs execute non-deterministically, which makes test data generation for parallel programs essentially different from that for serial programs. This paper investigates the problem of generating test data covering multiple paths of message-passing parallel programs. A mathematical model of the problem is built based on each given path and its equivalent ones, and solved with a genetic algorithm that generates all the desired data in a single run. The proposed method was applied to five benchmark programs and compared with existing methods. The experimental results show that it greatly reduces the number of iterations and the time consumption without reducing the coverage rate.
Abstract: Many schemes have been presented to address data integrity and retrievability in cloud storage, but few of them support data dynamics, public verification, and data privacy protection simultaneously. We propose a public auditing scheme that enables privacy preservation, data dynamics, and batch auditing. A data-updating information table is designed to record the status information of the data blocks and to facilitate data dynamics. Homomorphic authenticators and random masking are exploited to protect data owners' privacy. The scheme employs a Trusted third party auditor (TTPA) to verify data integrity without learning anything about the data content during the auditing process. It also allows batch auditing, so that the TTPA can process multiple auditing requests simultaneously, which greatly accelerates the auditing process. Security and performance analysis shows that our scheme is secure and feasible.
Abstract: To address the programmability of membrane systems, this paper presents an automatic design method for a cell-like P system framework that performs the five basic arithmetic operations. The method shows that different P systems can be designed within the same framework by programming. A technique is introduced to remove redundant rules during the design of a P system. The effectiveness and feasibility of the method are verified by experimental results.
Abstract: We consider the scenario in which two variables must be optimized simultaneously, where minimization over one variable has an analytical solution while minimization over the other is intractable. Under the Lagrangian dual framework, we propose two iterative optimization algorithms that alternate partial minimization and gradient descent over the two variables. The first algorithm guarantees that the iterates converge to a KKT point under proper stepsize rules, requiring only that the augmented Lagrangian function be convex in one of the variables. The second algorithm provides a local attraction property around the KKT point. Our algorithms offer a general solution to parallel and distributed optimization with summable objective functions. Simulation results on parallel and distributed logistic regression classification are presented, showing a faster convergence rate with lower computational complexity compared with other methods.
Abstract: In a distributed data stream processing system, assigning tasks to an arbitrary number of nodes is an NP-complete scheduling problem, and even substantially reducing the scheduling scale does not avoid it. This paper applies ant colony optimization, a classical heuristic method, to the global task scheduling problem of distributed systems. A rational improvement to the ant colony path-finding, which accounts for the memory and CPU usage of each node, achieves load balancing in a short time and yields a suboptimal solution to the global task scheduling problem. Experiments show that the proposed data stream processing system has good real-time characteristics and stability.
Abstract: With the rapid development of information technology, short texts arising from socialized human interaction have gradually become predominant in network information streams, and growing demand requires industry to classify these brief texts more effectively. However, traditional document classification models run into difficulty when faced with short text documents, each of which contains only a few words. Aggressive document expansion works remarkably well in many cases but suffers from the assumption of independent, identically distributed observations. We formalize a view of classification using Bayesian decision theory, treat each short text as observations from a probabilistic model, called a statistical language model, and encode classification preferences with a loss function defined by the language models and an external reference document. Following Vapnik's method of Structural risk minimization (SRM), the optimal classification action is the one that minimizes the structural risk, which allows one to trade off errors on the training sample against improved generalization performance. We conduct experiments on several corpora of microblog-like data and analyze the results. With respect to established baselines, the experiments show that our proposed document expansion method offers a better chance of achieving improved classification performance.
Abstract: Text is very important for video retrieval, indexing, and understanding. However, its detection and extraction are challenging due to varying backgrounds, low contrast between text and non-text regions, and perspective distortion. In this paper, we propose a novel two-phase approach that tackles this problem with discriminative features and edge density. The first phase defines and extracts a novel feature, the edge distribution entropy, and uses it to remove most non-text regions. The second phase employs a Support vector machine (SVM) to further distinguish real text regions from non-text ones. To generate inputs for the SVM, three additional novel features are extracted from each region: the foreground pixel distribution entropy, the skeleton/size ratio, and the edge density. After text regions have been detected, texts are extracted from those regions that are surrounded by sufficient edge pixels. A comparative study on two publicly accessible datasets shows that the proposed method significantly outperforms four selected state-of-the-art methods in accurate text detection and extraction.
Abstract: This paper extends Block-matching and 3D transform-domain collaborative filtering (BM3D) to noise reduction in Interferometric synthetic aperture radar (InSAR) phase imagery, and proposes a denoising algorithm that effectively removes noise while preserving fringes. Since the noise level estimated by a median estimator is not always optimal for the proposed algorithm in the wavelet domain, a method for calculating the optimal noise standard deviation is also developed. The proposed algorithm is efficient and robust. Experimental results show that the visual quality and evaluation indexes of the proposed algorithm outperform other filters on both simulated and real InSAR images.
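The median estimator the abstract refers to is commonly Donoho's MAD rule applied to the finest-scale wavelet detail coefficients; a minimal sketch follows. The 0.6745 constant is the standard Gaussian normalizer from the wavelet-denoising literature, not a value taken from this paper, and the choice of which detail coefficients to feed in is up to the caller.

```python
import numpy as np

def mad_sigma(detail_coeffs):
    # Donoho's robust noise estimate: sigma ~ median(|d|) / 0.6745,
    # where d are (ideally noise-dominated) wavelet detail coefficients.
    return np.median(np.abs(detail_coeffs)) / 0.6745
```

Because it uses the median rather than the sample standard deviation, the estimate is insensitive to the sparse large coefficients that carry the fringe signal; the paper's point is that even this robust estimate is not always optimal for BM3D-style filtering of InSAR phase.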
Abstract: An intelligent frequency fitting algorithm is presented for continuous-wave radar to track human movers through walls. With a proper fitting dimension, usually determined by the practical detection requirements, this technique improves localization accuracy and tracking performance without introducing too much computational burden, and identifies different targets more precisely even in frequency-ambiguous areas. It also helps match the estimation results to their corresponding targets automatically, which is of great importance to certain urban sensing applications, such as surveillance and tracking of specific targets. To further improve detection performance and better identify weak targets in the presence of strong noise, the CLEAN algorithm and adaptive filtering are also incorporated. Experimental results are provided to illustrate the algorithm's performance.
Abstract: Developing algorithms based on lip contour estimation is a distinctive trend in lip segmentation, the first step of visual speech recognition. To establish an optimized estimate of the lip contour that is complex enough to describe the principal features of the lip yet simple enough to implement, we optimize the choice of lip model and estimator as well as the parameters of the lip features, including the horizontal length of the snake of feature points and the horizontal distances between those feature points. Experimental results demonstrate that the optimized lip contour estimation method yields more accurate and more stable lip segmentation.
Abstract: The number of identified integer overflow vulnerabilities has been increasing rapidly in recent years. In this paper, a smart software vulnerability detection technique for identifying integer overflow vulnerabilities in binary executables is presented. The proposed algorithm combines Target filtering and dynamic taint tracing (TFDTT): dynamic taint tracing reduces the mutation space, and the target filtering function filters test cases during test case generation. Theoretical analysis indicates that TFDTT is more efficient than NonTF-DTT and random fuzzing, and the experimental results indicate that detection based on TFDTT can identify possible integer vulnerabilities in binary programs while being more efficient than the other two techniques.
Abstract: To strengthen a chaotic system, a state feedback controller is applied to generate hyperchaotic behaviors based on the simplified Lorenz system, and a coordinate transformation is used to convert the topological structure of the hyperchaotic system from two wings to four wings. The complex dynamics of the hyperchaotic four-wing attractor system are analyzed and verified by Lyapunov exponent spectra, bifurcation diagrams, phase portraits, Poincaré sections, and circuit realization. The circuit experiment results agree well with the simulation results, laying a good foundation for chaotic secure communication.
Abstract: Impossible differential cryptanalysis is a powerful tool for evaluating the strength of a block cipher structure, and its key step is finding the longest impossible differential. Recently a series of generalized Feistel structures named New-structures Ⅰ, Ⅱ, Ⅲ, and Ⅳ were proposed, designed with full consideration of security against differential and linear cryptanalysis. In this paper, we investigate the impossible differential properties of the New-structure series, and show that there always exist impossible differentials of 14, ∞, 19, and 15 rounds for New-structures Ⅰ, Ⅱ, Ⅲ, and Ⅳ, respectively.
Abstract: The Discrete cosine transform (DCT) is an effective method for extracting features for face recognition, but it only maps the source data to another domain rather than compressing it, so selecting the DCT coefficients that are most effective for classification is an important problem. This paper proposes a novel method to search for the most discriminant combination of DCT coefficients. A feature selection algorithm based on a separability criterion pre-selects the DCT coefficients, followed by a search algorithm based on binary particle swarm optimization and a support vector machine that finds an optimal combination of the DCT coefficients. The performance of the algorithm is assessed by the recognition rate and the number of selected features on the ORL and Cropped Yale databases.
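As a sketch of the feature extraction step only (the separability criterion and the BPSO+SVM search are not reproduced here), an orthonormal 2D DCT can be written directly with NumPy:

```python
import numpy as np

def dct_matrix(n):
    # Orthonormal DCT-II basis: rows are cosine basis vectors.
    i = np.arange(n)[:, None]            # frequency index
    j = np.arange(n)[None, :]            # sample index
    C = np.cos(np.pi * (2 * j + 1) * i / (2 * n))
    C[0] *= np.sqrt(1.0 / n)
    C[1:] *= np.sqrt(2.0 / n)
    return C

def dct2(img):
    # Separable 2D DCT of a face image (or image block).
    C = dct_matrix(img.shape[0])
    D = dct_matrix(img.shape[1])
    return C @ img @ D.T

# A selection step would then keep only a subset of these coefficients
# (e.g. those passing a separability criterion) as the feature vector.
```

Because the transform is orthonormal it preserves energy and is invertible, which is exactly the abstract's point: the DCT itself discards nothing, so classification performance hinges on which coefficients the search keeps.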
Abstract: Most existing noise reduction methods in side-channel cryptanalysis treat all noise as a whole instead of identifying and dealing with each component individually. Motivated by this, this paper investigates the feasibility and implications of separating trend noise from other noise in side-channel acquisitions and dealing with it accordingly. We discuss the effectiveness of applying the least squares method to remove inherent trend noise in side-channel leakages, and clarify the limited ability of existing noise reduction methods to deal with trend noise. We performed a series of correlation power analysis attacks against real power traces, publicly available from DPA contest v2, of an unprotected FPGA implementation of Advanced encryption standard (AES) encryption. The experimental results firmly confirm the soundness and validity of our analysis and observations.
Abstract: The queuing performance of a cross-layer system with Space-time block codes (STBC) over a Multi-input multi-output (MIMO) channel is analyzed. Adaptive modulation and coding (AMC) in the physical layer and a finite-length buffer in the data-link layer are combined in this model. A simple cross-layer design and the queue state process are proposed based on a finite-state Markov channel model, from which the service process is derived. The proposed cross-layer design also incorporates STBC to further improve system performance. Simulation results illustrate the dependence of system performance on various parameters and quantify the performance gain due to cross-layer optimization. Adjusting the target packet loss rate of adaptive modulation and coding in the physical layer maximizes the average throughput of the system and brings a significant improvement in system performance.
Abstract: This paper proposes an enhanced Interference rejection combining (IRC) algorithm for Long term evolution (LTE) downlink receivers in multi-cell communication systems. In this algorithm, a proper Multiple input multiple output (MIMO) receive method is adopted according to Generalized likelihood ratio test (GLRT) intercell interference detection, and iteration between channel estimation and data detection is carried out to improve the performance of the IRC algorithm. Simulation results show that the proposed algorithm can effectively detect intercell interference and improve both the Block error rate (BLER) performance and the channel estimation Mean squared error (MSE) compared to the non-iterative IRC algorithm, making it suitable for LTE downlink receivers in multi-cell cellular systems.
Abstract: Virtual network embedding (VNE) is a crucial technology for allocating network resources in network virtualization. Virtual network requests in which node and link resources differ greatly lead to unbalanced resource distribution and a lower acceptance ratio of virtual network requests. In this paper, we provide a virtual network embedding algorithm that balances load across various requests. It maps virtual nodes to the substrate nodes whose node-link residual resource ratio is closest to that of the virtual nodes, and then maps the virtual links to physical paths using a shortest-path-first algorithm under the link-node residual resource ratio constraint. Simulations show that the provided algorithm achieves a higher acceptance ratio and better network resource utilization through load balancing.
Abstract: This paper proposes a novel method to reduce multi-keyword query traffic in Kademlia-like Peer-to-peer (P2P) networks by optimizing the Bloom filter settings. We build models to estimate the communication cost, the union set size, and the loss rate of performing union and intersection operations. We implement a Kademlia-like system and generate a group of datasets, using one part of the datasets to derive the functions for computing the optimal parameters and another part to verify our method. Each query can determine the optimal Bloom filter settings with no extra configuration. Our simulation results show that with optimal Bloom filter settings, the communication cost is greatly reduced at an acceptable loss rate.
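For context on the kind of trade-off being optimized, the textbook Bloom filter result (standard theory, not the paper's specific cost and loss-rate models): for n inserted keys and m bits, the number of hash functions minimizing the false-positive rate is k = (m/n) ln 2.

```python
import math

def optimal_bloom(n_items, m_bits):
    # Classical result: k = (m/n) * ln 2 minimizes the false-positive
    # rate p = (1 - e^(-k*n/m))^k; k is rounded to an integer in practice.
    k = max(1, round(m_bits / n_items * math.log(2)))
    fpr = (1.0 - math.exp(-k * n_items / m_bits)) ** k
    return k, fpr
```

At the optimum p is roughly 0.6185^(m/n), so about 9.6 bits per key yield a ~1% false-positive rate; a per-query optimizer like the one described above trades this rate against the bytes shipped over the network.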
Abstract: A novel cooperative spectrum sensing order that efficiently utilizes inactive Secondary users (SUs) based on maximum throughput is proposed for Cognitive radio networks (CRNs). To predict the states of Primary users (PUs), we model each PU's traffic pattern as a Continuous-time Markov chain (CTMC). The CRN obtains the maximum throughput when SUs sense the licensed channels in the optimal order. Numerical simulation results show that the proposed order-based spectrum sensing scheme achieves higher channel utilization and lower sensing overhead than spectrum sensing without the optimal sensing order. After taking the reporting overhead of SUs into account, the optimal number of inactive SUs for maximum throughput can be found.
Abstract: Cooperative spectrum sensing has been shown to be an effective method to mitigate the impact of hidden terminals and shadow fading in cognitive radio networks. Most existing works focus on either cooperative sensing or sensing scheduling as a means to improve detection performance, without studying their interactions. This paper proposes a dynamic, variable time-division multiple-access scheduling mechanism incorporated into a Dual-stage collaborative spectrum sensing (DCSS) model in a less-than-ideal radio environment. Moreover, it derives closed-form expressions of the sensing time for DCSS and addresses the critical range of the timeslot length. An optimized algorithm to minimize sensing time is deduced and verified. The simulation results indicate that the average sensing time with DCSS is 11.5% shorter than that of Single collaborative spectrum sensing (SCSS) while guaranteeing a detection error rate below 1%.
Abstract: Image registration is widely used in image processing. Researchers have introduced registration techniques based on the log-polar transform for its rotation- and scale-invariant properties, but the transform suffers from nonuniform sampling, which makes the registration results susceptible to interference. To address the problems of the traditional log-polar transform, a Complete polar transform (CPT) is proposed, which samples the image evenly to preserve all the information of the original image. An innovative projection transform is applied after CPT to obtain rotation invariance. We pre-match feature points using the Scale invariant feature transform (SIFT) algorithm and re-match them based on CPT to improve matching speed and accuracy. Experimental results show that the proposed method is accurate and robust to noise and alteration.
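For reference, the traditional log-polar resampling that CPT improves upon can be sketched as follows (nearest-neighbour sampling about the image centre; the grid sizes are arbitrary choices, not values from the paper). A rotation of the input appears as a circular shift along the θ axis, which is what makes the representation useful for registration; the exponential radius spacing is also what causes the dense-centre/sparse-periphery sampling the paper criticizes.

```python
import numpy as np

def log_polar(img, n_rho=64, n_theta=64):
    # Sample img on a log-polar grid: radii grow exponentially,
    # so the centre is oversampled and the periphery undersampled.
    h, w = img.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    rho = np.exp(np.linspace(0.0, np.log(min(cy, cx)), n_rho))
    theta = np.linspace(0.0, 2.0 * np.pi, n_theta, endpoint=False)
    ys = np.round(cy + rho[:, None] * np.sin(theta)).astype(int)
    xs = np.round(cx + rho[:, None] * np.cos(theta)).astype(int)
    return img[np.clip(ys, 0, h - 1), np.clip(xs, 0, w - 1)]
```

A complete polar transform in the paper's sense would instead sample radii densely enough that every source pixel is covered, trading the compact grid for full information preservation.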
Abstract: Modern societies increasingly rely on automatic control systems. These systems are hardly purely technical; rather, they are complex socio-technical systems consisting of technical elements and social components. A systematic approach to analyzing these systems is necessary, because there is growing evidence that accidents in such systems usually have complex causal factors forming an interconnected network of events rather than a simple cause-effect chain. We take railway Train control systems (TCS) as an example to demonstrate the importance of the socio-technical approach to system analysis. The paper presents an investigation of a recent high-speed railway accident using STAMP, one of the most notable socio-technical system analysis techniques, and outlines improvements to the system that could avoid similar accidents in the future. We also provide feedback on the use of STAMP.
Abstract: The resonance of the Modified dumbbell-shaped defected ground structure (MD-DGS) resonator on a coplanar waveguide is investigated systematically based on the phase coherence of electromagnetic waves. Through the analysis of two presented interference modes, the resonant frequencies can be fully predicted, and the physics behind these resonances is clearly revealed. The proposed analysis is verified by simulation and measurement results for the MD-DGS resonator. Both the electric field distribution and the Poynting vector distribution of the MD-DGS resonator at the resonant frequencies further demonstrate the validity of the theoretical analysis.
Abstract: In kernel-based radar target recognition, Support vector data description (SVDD) has been applied to the recognition of High resolution range profiles (HRRPs). In this paper, three distribution models, i.e., the membership model, the cloud model, and the Gaussian mixture model, are first developed to describe the distribution characteristics of HRRPs in the extended space. Second, test HRRPs in multi-target hypersphere spaces are classified, according to their multi-space distribution characteristics, into two types: shrink samples and slack samples. The class of a test sample is determined using the minimum relative distance for shrink samples and the three distribution models for slack samples, respectively. Three HRRP recognition methods based on dual-space SVDD are thus formed. Extensive experimental results on the HRRPs of four planes show that the proposed methods achieve better recognition performance than recognition based on single-space SVDD.
Abstract: To fully exploit the limited flight time of a flying robot and ensure the visibility of the target, viewpoint optimization is proposed in this paper for the inspection of electricity transmission tower equipment, with an optimization function that determines the best viewpoints in a local viewpoint region. The local viewpoint regions are generated from local objective regions, which are determined by the geometric structure of an a priori 3D model of the electricity transmission tower equipment. The optimization function is based on three factors: visibility, viewing quality, and observation distance. In addition, the fitness function of a genetic algorithm is used to find the optimal viewpoint. The experimental results demonstrate the effectiveness and efficiency of the proposed viewpoint selection algorithm.
Abstract: A novel surface treatment method, plating a Cu+PPS film/coating on a mobile phone's stainless steel frame, is proposed to improve the efficiency of the antenna system. The mobile phone was measured in free space, in a silicone cover, in the hand, and in the hand and cover simultaneously. It is found that with this surface treatment, the total efficiency of the antenna system improves in the four cases by 14.22%, 1.38%, 15.19%, and 1.72% at 940 MHz (GSM900: 880-960 MHz); 2.59%, 3.21%, 4.81%, and 1.43% at 1720 MHz (DCS: 1710-1880 MHz); and 6.34%, 2.85%, 9.83%, and 2.32% at 2100 MHz (WCDMA: 1920-2170 MHz). This low-cost surface treatment method is an important breakthrough for improving the antenna system performance of mobile phones, especially those with a stainless steel frame, and it is suitable for mass production.