2019. 4th Issue
Volume XI, Number 4
Full issue (9.6 MB)
MESSAGE FROM THE GUEST EDITORS
Vaclav (Vashek) Matyas, Pavol Zajac, Jan Hajny and Marek Sys
Special Issue on Cryptology – Guest Editorial
This special issue brings selected papers from the 2019 Central European Conference on Cryptology, held in Telč, June 12-14, 2019.
Michal Andrzejczak and Wladyslaw Dudzic
SAT Attacks on ARX Ciphers with Automated Equations Generation
We propose a novel and simple approach to algebraic attacks on block ciphers with SAT solvers. As opposed to the standard approach, the equations for the key expansion algorithm are not included in the formulas that are converted to a satisfiability problem. Omitting these equations leads to finding the solution much faster. The method was used to attack the lightweight block ciphers SIMON and SPECK. We report the timings for round-reduced versions of the selected ciphers and discuss the potential factors affecting the execution time of our attack.
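As an illustration of the ARX structure such attacks target, the sketch below implements one round of SPECK32/64 in Python. This is a textbook description of the public cipher, not the authors' attack code; in a SAT attack, updates like these are translated into CNF clauses, with the modular addition as the only non-linear step.

```python
# One round of SPECK32/64 (ARX: Add-Rotate-XOR) on 16-bit words.
MASK = 0xFFFF

def ror(v, r):
    """Rotate a 16-bit word right by r bits."""
    return ((v >> r) | (v << (16 - r))) & MASK

def rol(v, r):
    """Rotate a 16-bit word left by r bits."""
    return ((v << r) | (v >> (16 - r))) & MASK

def speck_round(x, y, k):
    """Forward round: modular addition is the only non-linear step."""
    x = ((ror(x, 7) + y) & MASK) ^ k
    y = rol(y, 2) ^ x
    return x, y

def speck_round_inv(x, y, k):
    """Inverse round, undoing the operations in reverse order."""
    y = ror(y ^ x, 2)
    x = rol(((x ^ k) - y) & MASK, 7)
    return x, y
```

The inverse round exists only to check the sketch; the SAT attack itself works on the forward equations.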
Mithilesh Kumar, Havard Raddum, and Srimathi Varadharajan
Reducing Lattice Enumeration Search Trees
We revisit the standard enumeration algorithm for finding the shortest vectors in a lattice, and study how the number of nodes in the associated search tree can be reduced. Two approaches for reducing the number of nodes are suggested. First we show that different permutations of the basis vectors have a large effect on the running time of standard enumeration, and identify a class of permutations that yields relatively few nodes in the search tree. This leads to an algorithm called hybrid enumeration that has a better running time than standard enumeration when the lattice dimension is large. Next we show that it is possible to estimate the signs of the coefficients yielding a shortest vector, and that a pruning strategy can be based on this fact. Sign-based pruning gives fewer nodes in the search tree, and never missed the shortest vector in our experiments.
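To make concrete why reducing search nodes matters, here is a toy Python enumeration over a bounded coefficient box. This is a drastic simplification: the paper's hybrid enumeration and sign-based pruning operate on the Gram-Schmidt search tree, which this sketch does not model.

```python
import itertools
import math

def shortest_vector(basis, bound=3):
    """Enumerate integer combinations c of the basis rows with
    |c_i| <= bound and return a shortest nonzero lattice vector
    with its squared norm. The box holds (2*bound + 1)**n points,
    which is why pruning (e.g. by predicted coefficient signs)
    pays off quickly as the dimension n grows."""
    dim = len(basis[0])
    best, best_norm = None, math.inf
    for coeffs in itertools.product(range(-bound, bound + 1), repeat=len(basis)):
        if not any(coeffs):
            continue  # skip the zero vector
        v = [sum(c * b[i] for c, b in zip(coeffs, basis)) for i in range(dim)]
        norm = sum(t * t for t in v)
        if norm < best_norm:
            best, best_norm = v, norm
    return best, best_norm
```

For the basis [[3, 1], [1, 2]] the search finds a shortest vector of squared norm 5.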
Pawel Augustynowicz and Krzysztof Kanciak
The search of square m-sequences with maximum period via GPU and CPU
This paper deals with the efficient parallel search of square m-sequences with maximum period on both modern CPUs and GPUs. The key idea is to apply particular vector processor instructions so as to maximize the advantage of the Single Instruction Multiple Data (SIMD) and Single Instruction Multiple Threads (SIMT) execution patterns. The developed implementation was adjusted to testing for the maximum period of m-sequences of some particular forms. Furthermore, an early-abort sieving strategy based on the application of SAT solvers is presented. With this solution, it is possible to search m-sequences up to degree 32 exhaustively.
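As a plain illustration of the period test at the core of such a search (not the vectorized SIMD/SIMT implementation described in the paper), a Fibonacci LFSR can simply count steps until the seed state recurs:

```python
def lfsr_period(taps, degree, seed=1):
    """Step a Fibonacci LFSR until the seed state recurs and return
    the step count. A maximum-period sequence of degree n has period
    2**n - 1, attained exactly when the feedback polynomial is
    primitive; exhaustive search tests each candidate this way."""
    state, steps = seed, 0
    while True:
        bit = 0
        for t in taps:
            bit ^= (state >> t) & 1     # XOR the tapped bits
        state = (state >> 1) | (bit << (degree - 1))
        steps += 1
        if state == seed:
            return steps
```

With the primitive polynomial x^4 + x + 1 (taps 0 and 1 in this right-shift convention) the period is the maximal 2^4 - 1 = 15, while the non-primitive x^4 + x^2 + 1 falls short.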
Pavol Zajac and Peter Spacek
A New Type of Signature Scheme Derived from a MRHS Representation of a Symmetric Cipher
We propose a new concept of a (post-quantum) digital signature algorithm derived from a symmetric cipher. Key derivation is based on a system of Multiple Right-Hand Sides (MRHS) equations whose source is the encryption algorithm. Our trapdoor is based on the difficulty of creating a valid transcript of the encryption algorithm for a given plaintext (derived from the signed message): the signer can use the encryption algorithm because he knows the secret key, while the verifier can only check that the solution of the equation system is correct. To further facilitate the verification, we use techniques from coding theory. The security of the system is based on the difficulty of solving MRHS equations, or equivalently on the difficulty of the decoding problem (both are NP-hard).
PAPERS FROM OPEN CALL
Ádám Vécsi, Attila Bagossy, and Attila Pethő
Cross-platform Identity-based Cryptography using WebAssembly
The explosive spread of devices connected to the Internet has increased the need for efficient and portable cryptographic routines. Despite this fact, truly platform-independent implementations are still hard to find. In this paper, an Identity-based Cryptography (IBC) library, called CryptID, is introduced. The main goal of this library is to provide an efficient and open-source IBC implementation for the desktop, the mobile, and the IoT platforms. Powered by WebAssembly, a specification aiming to securely speed up code execution in various embedding environments, CryptID can be utilized on both the client and the server side. The second novelty of CryptID is the use of structured public keys, opening up a wide range of domain-specific use cases via arbitrary metadata embedded into the public key. Embedded metadata can include, for example, a geolocation value when working with geolocation-based Identity-based Cryptography, or a timestamp, enabling simple and efficient generation of single-use keypairs. Thanks to these characteristics, we think that CryptID could serve as a real alternative to current Identity-based Cryptography implementations.
Zakir Hussain, Asim ur Rehman Khan, Haider Mehdi and Aamir Ali
Performance Analysis of Communication System with Fluctuating Beckmann Fading
In this paper, the performance of a device-to-device (D2D) communication system over Fluctuating Beckmann (FB) fading channels is analyzed. The FB fading model is a novel generalized fading model that unifies various fading models such as Rayleigh, Nakagami, one-sided Gaussian, Rician, Rician shadowed, κ-μ, κ-μ shadowed, η-μ and Beckmann. The considered D2D system is assumed to be affected by various FB faded co-channel interferers. Using the characteristic function (CF) approach, outage probability and success probability expressions are given. These expressions are functions of the D2D and interference path-loss exponents, the distance between the D2D devices, the distances between the interferers and the D2D receiver, and the interference and D2D fading channel conditions. Maximum ratio combining (MRC) and selection combining (SC) based diversity schemes are considered to mitigate channel fading effects. The D2D communication system is numerically analyzed and discussed under various conditions of channel fading and interference.
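The outage probability in the simplest special case unified by the FB model, Rayleigh fading, where the instantaneous SNR is exponentially distributed, can be checked by Monte Carlo simulation. This sketch covers only that special case, not the paper's CF-based FB expressions, and the parameter names are illustrative:

```python
import math
import random

def outage_prob_rayleigh(mean_snr_db, threshold_db, trials=200_000, seed=42):
    """Monte Carlo outage probability under Rayleigh fading: the
    fraction of channel realizations whose instantaneous SNR falls
    below the outage threshold."""
    rng = random.Random(seed)
    mean = 10 ** (mean_snr_db / 10)    # linear mean SNR
    th = 10 ** (threshold_db / 10)     # linear outage threshold
    outages = sum(rng.expovariate(1 / mean) < th for _ in range(trials))
    return outages / trials

def outage_prob_exact(mean_snr_db, threshold_db):
    """Closed form for Rayleigh fading: P_out = 1 - exp(-th/mean)."""
    mean = 10 ** (mean_snr_db / 10)
    th = 10 ** (threshold_db / 10)
    return 1 - math.exp(-th / mean)
```

For a 10 dB mean SNR and a 0 dB threshold, both routes agree on roughly a 9.5% outage probability.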
Sara El Gaily and Sándor Imre
Quantum Optimization of Resource Distribution Management for Multi-Task, Multi-Subtasks
This paper proposes a new optimization strategy for resource distribution management based on a quantum algorithm, as a way to reduce the computational complexity of finding the optimum deployment scenario, taking into consideration the required conditions and constraints of the resource distribution system. We show that the quantum method computes the results in minimum time and outperforms classical algorithms in terms of computational complexity.
CALL FOR PAPERS
43rd International Conference on Telecommunications and Signal Processing (TSP)
IEEE TSP 2020, Milan, Italy
International Conference on Sensors and Sensing Technologies
IEEE SENSORS 2020, Rotterdam, The Netherlands
28th European Signal Processing Conference
EUSIPCO 2020, Amsterdam, The Netherlands
2019. 3rd Issue
Volume XI, Number 3
Full issue (7.5 MB)
MESSAGE FROM THE EDITOR-IN-CHIEF
Indexing current advances with DOI – at the Infocommunications Journal
The vast domain of Infocommunications reaches from the physics of wireless and wired communication channels, through the secure transport of information to its destination(s), to the analysis of the characteristics of that transmission.
Since the area is huge, categorizing advances is hard. We operate with keywords – index terms –, text-mining of research papers, and clusters built from the similar sets of areas involved in these papers. The survey papers that keep appearing in our journal are useful in this sense as well: they connect and summarize the current knowledge of a field – even if it has just emerged. To help the indexing of our journal papers and of the ones cited inside them, we encourage our authors to reference the DOI – Digital Object Identifier – of their cited articles, and we make sure these DOIs point to the source of the document, making it easier for readers to reach it directly. This activity is supported by DOI commissioners such as the Hungarian Academy of Sciences, which helps us assign DOIs through the original DOI provider, CrossRef.
INVITED SURVEY PAPER
Gábor Fodor, László Pap and Miklós Telek
Recent Advances in Acquiring Channel State Information in Cellular MIMO Systems
In cellular multi-user multiple input multiple output (MU-MIMO) systems the quality of the available channel state information (CSI) has a large impact on the system performance. Specifically, reliable CSI at the transmitter is required to determine the appropriate modulation and coding scheme, transmit power and precoder vector, while CSI at the receiver is needed to decode the received data symbols. Therefore, cellular MU-MIMO systems employ predefined pilot sequences and configure associated time, frequency, code and power resources to facilitate the acquisition of high-quality CSI for data transmission and reception. Although the trade-off between the resources used for pilot and for user data transmission has long been known, the near-optimal configuration of the available system resources for pilot and data transmission is a topic of current research efforts. Indeed, since the fifth generation of cellular systems utilizes heterogeneous networks in which base stations are equipped with a large number of transmit and receive antennas, the appropriate configuration of pilot-data resources becomes a critical design aspect. In this article, we review recent advances in system design approaches for the acquisition of CSI and discuss some of the recent results that help to dimension the pilot and data resources specifically in cellular MU-MIMO systems.
PAPERS FROM OPEN CALL
Cebrail Çiftlikli, Musaab Al-Obaidi, Mohammed Fadhil and Wael Al-Obaidi
Cooperative OSIC System to Exploit the Leakage Power of MU-MIMO Beamforming based on Maximum SLR for 5G
This study investigated the crucial – but not well-discussed – issues involved in designing beamforming for all receivers, subject to leakage power constraints. Our assumption is that all users use ordered successive interference cancellation (OSIC) detection when the channel state information (CSI) is available. The problem of interest is to find a beamformer that can improve the OSIC performance of a multi-user scheme without significantly increasing the complexity. This study considers the transceiver design for multi-user MIMO (MU-MIMO) communications, in which a single transmitter adopts beamforming to simultaneously transmit information in the first time-slot. During the second time-slot, the receivers cooperate to share specific results of the OSIC detection of each user. We propose maximum-likelihood (ML) estimation of the received symbols; the estimated symbols are then used in OSIC detection to detect interference symbols. Promising results show that our cooperative OSIC scheme for the MU-MIMO beamforming system based on the maximum signal-to-leakage ratio (SLR) realizes the diversity order of OSIC. Also, by utilizing leakage power as a useful power and not just as an interference power, the performance of the proposed scheme over Rayleigh and Rician channels is significantly better than that of the classical MU-MIMO beamforming system based on SLR at a high signal-to-noise ratio (SNR).
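To make the successive-cancellation idea concrete, here is a minimal noise-free sketch of ordered SIC for a 2x2 real channel. This is a toy illustration only: the paper's scheme adds ML symbol estimation, receiver cooperation and SLR-based beamforming, none of which are modeled here, and the ordering rule by column energy is an assumption of the sketch.

```python
def osic_detect(H, y, constellation=(-1, 1)):
    """Ordered successive interference cancellation for a 2x2 real
    channel y = H s (noise-free toy). The stream with the larger
    column energy is detected first, its contribution is subtracted
    from the residual, then the remaining stream is detected."""
    energy = [H[0][k] ** 2 + H[1][k] ** 2 for k in (0, 1)]
    order = sorted((0, 1), key=lambda k: -energy[k])
    s_hat, r = [0, 0], list(y)
    for k in order:
        # matched-filter estimate of stream k, then hard decision
        z = (H[0][k] * r[0] + H[1][k] * r[1]) / energy[k]
        s_hat[k] = min(constellation, key=lambda c: abs(c - z))
        # cancel the detected stream from the residual
        r = [r[i] - H[i][k] * s_hat[k] for i in (0, 1)]
    return s_hat
```

For H = [[2, 0.1], [0.1, 1]] and transmitted symbols [1, -1], the received vector is [1.9, -0.9] and the detector recovers [1, -1].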
Roman N. Ipanov
Polyphase Radar Signals with ZACZ Based on p-Pairs D-Code Sequences and Their Compression Algorithm
In modern synthetic-aperture radars, signals with linear frequency modulation (LFM) have found practical application as probing signals. The use of LFM signals evolved historically: they were the first wideband signals to find application in radar technology, and their properties have long been studied in detail. However, LFM signals have the “splay” ambiguity function, which results in range ambiguity. The choice of probing signal is also relevant in connection with the problem of detecting weak echoes that are masked by the autocorrelation function (ACF) side lobes of strong echoes. In this paper, a polyphase (p-phase, where p is a prime number) radar signal with a zero autocorrelation zone (ZACZ) in the vicinity of the central peak of its autocorrelation function is synthesized. It is shown that this signal represents a train of p coherent phase-code-shift-keyed pulses coded by complementary sequences of the p-ary D-code. A method of forming the ensemble set of the p-ary D-code for signal synthesis is suggested. Correlation characteristics of the synthesized signal are discussed. A compression algorithm for this signal is considered, which includes in its structure a combined Vilenkin-Chrestenson and fast Fourier transform algorithm.
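The zero-sidelobe property of complementary sequences that underlies such a ZACZ can be illustrated with the simplest binary (p = 2) case, a Golay complementary pair, whose aperiodic autocorrelations sum to zero at every nonzero shift. This is a toy stand-in: the paper works with p-ary D-code sequences, not this specific pair.

```python
def acf(seq):
    """Aperiodic autocorrelation of a real-valued sequence."""
    n = len(seq)
    return [sum(seq[i] * seq[i + k] for i in range(n - k)) for k in range(n)]

a = [1, 1, 1, -1]   # a Golay complementary pair of length 4
b = [1, 1, -1, 1]
combined = [x + y for x, y in zip(acf(a), acf(b))]
# individual ACFs have side lobes, but they cancel in the sum:
# acf(a) = [4, 1, 0, -1], acf(b) = [4, -1, 0, 1], combined = [8, 0, 0, 0]
```

A receiver that correlates each pulse against its own code and sums the outputs therefore sees no side lobes near the main peak.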
Dmitrii I. Popov and Sergey M. Smolskiy
Synthesis and Analysis of Non-recursive Rejection Filters in Transient Mode
The non-recursive rejection filter (RF), improved with the purpose of accelerating the transient at the arrival of the passive interference edge caused by disturbing reflections from fixed or slow-moving objects, is synthesized by the state-variables method. A structural diagram of the RF tunable during the transient is offered, with the purpose of improving the effectiveness of signal extraction from moving targets against the background of the passive interference edge. A comparative analysis of RF effectiveness for the fixed and the tunable structure during the transient is performed according to the criteria of the normalized interference suppression coefficient and the improvement coefficient of the signal-to-interference ratio. Modifying the RF structure yields an essential increase in the effectiveness of signal extraction from moving objects against the background of the interference edge for a wide class of spectral-correlation characteristics.
Aymen Hasan Alawadi, Maiass Zaher and Sándor Molnár
Methods for Predicting Behavior of Elephant Flows in Data Center Networks
Several Traffic Engineering (TE) techniques based on SDN (Software-Defined Networking) have been proposed to resolve flow competition for network resources. However, there is no comprehensive study on the probability distribution of their throughput, and no study on predicting the future of elephant flows. To address these issues, we propose a new stochastic performance evaluation model to estimate the loss rate of two state-of-the-art flow scheduling algorithms, Equal-cost multi-path routing (ECMP) and Hedera, besides a flow congestion control algorithm, Data Center TCP (DCTCP). Although these algorithms have theoretical and practical benefits, their effectiveness in conserving elephant flows has not been statistically investigated and analyzed. Therefore, we conducted extensive experiments on a fat-tree data center network to examine the efficiency of the algorithms under different network circumstances based on Monte Carlo risk analysis. The results show that Hedera is still risky to use for handling elephant flows due to the unstable throughput it achieves under stochastic network congestion. On the other hand, DCTCP was found to suffer under high-load scenarios. These outcomes might apply to all data center applications, in particular those that demand high stability and productivity.
Yuancheng Li, Guixian Wu and Xiaohan Wang
Deep Web Data Source Classification Based on Text Feature Extension and Extraction
With the growth of the volume of high-quality information in the Deep Web, Deep Web data source classification, as the key to utilizing this information, has become a topic of great research value. In this paper, we propose a Deep Web data source classification method based on text feature extension and extraction. Firstly, because a data source contains little text – some data sources contain fewer than 10 words – the original text must be extended before the data source can be classified based on its text content. In the text feature extension stage, we use the N-gram model to select extension words. Secondly, we propose a feature extraction and classification method based on Attention-based Bi-LSTM. By combining LSTM and the Attention mechanism, we can obtain a contextual semantic representation and focus on words that are closer to the theme of the text, so that a more accurate text vector representation is obtained. To evaluate the performance of our classification model, experiments are executed on the UIUC TEL-8 dataset. The experimental results show that the proposed method improves on the performance of some existing methods.
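A minimal sketch of the extension step, assuming a character-level n-gram frequency criterion; the paper uses an N-gram model to select extension words, so the granularity and scoring used here are illustrative assumptions, not the authors' method.

```python
from collections import Counter

def char_ngrams(text, n=3):
    """Character n-grams of a text, lowercased."""
    text = text.lower()
    return [text[i:i + n] for i in range(len(text) - n + 1)]

def extension_features(snippets, n=3, top_k=5):
    """Rank n-grams by frequency across the short form texts of a
    data source and keep the top_k as extension features, padding
    out the otherwise sparse representation."""
    counts = Counter()
    for s in snippets:
        counts.update(char_ngrams(s, n))
    return [g for g, _ in counts.most_common(top_k)]
```

For the two snippets "book search" and "book title", the shared prefix trigram "boo" ranks among the top features.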
CALL FOR PAPERS
IEEE 20th Mediterranean Electrotechnical Conference
IEEE MELECON 2020, Palermo, Italy
International Federation for Information Processing Networking 2020
IFIP Networking 2020, Paris, France