Section 4 - Track 2

This page lists the talks presented in poster format. They can be viewed throughout the Conference. To send a question to the authors, go to Miro and write it in the comments; you can also use the feedback form on our website. The authors will answer questions in the Miro comments or by email. For a manual, go to Instructions (eng / rus).

Sergey Vostokin and Irina Bobyleva
Implementation of frequency analysis of Twitter microblogging in a hybrid cloud based on the Binder, Everest platform and the Samara University virtual desktop service
The paper proposes an architecture for a distributed data processing application in a hybrid cloud environment. The application was studied using a distributed algorithm that determines word frequencies in English-language microblog messages published on Twitter. The possibility of aggregating computing resources with many-task computing technology for data processing in hybrid cloud environments is shown. The architecture proved to be technically simple and efficient in terms of performance.
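
A minimal sketch of the word-frequency step in a many-task style: each independent task counts words in its batch of tweets and the partial counts are merged. The function and variable names are illustrative and do not come from the paper's implementation.

```python
# Each worker task counts words in its own batch; partial counts are then merged.
import re
from collections import Counter
from functools import reduce

def count_words(tweets):
    """Count word occurrences in a batch of tweet texts (one worker task)."""
    counts = Counter()
    for text in tweets:
        counts.update(re.findall(r"[a-z']+", text.lower()))
    return counts

# Partial results from independent tasks (e.g. submitted to cloud workers)
# are merged into the global frequency table.
batches = [["Hello world", "hello again"], ["world news today"]]
partials = [count_words(b) for b in batches]
total = reduce(lambda a, b: a + b, partials, Counter())
print(total.most_common(3))
```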

Alnajjar Khaled and Igor Anikin
Secure gamma generation for stream cipher based on fuzzy logic
In this paper, we propose a new approach to secure bit-stream gamma generation based on the concepts of fuzzy logic and linear feedback shift registers (LFSRs), combined in a fuzzy-logic-based pseudorandom number generator (FPRNG). The original idea for securing gamma generation relies on the initial states (the seeds) of the LFSRs used in the generator. In this paper we suggest an updated version of the FPRNG. We also improve gamma security by accompanying the seed with additional secret information. We found that the configuration of the membership functions (MFs) in the FPRNG can significantly affect the randomness of the generator's output stream. The gamma security, and hence the stream cipher security, can be increased by keeping this configuration secret in addition to the seeds.
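
A minimal sketch of gamma (keystream) generation with a single LFSR; the FPRNG described above combines several LFSRs through a fuzzy-logic block, which is not shown here. The register size, taps and seed are illustrative.

```python
# Fibonacci-style LFSR: output the low bit, feed back the XOR of tapped bits.
def lfsr_stream(seed, taps, nbits, length):
    """Generate `length` keystream bits from an LFSR with the given seed and taps."""
    state = seed & ((1 << nbits) - 1)
    out = []
    for _ in range(length):
        out.append(state & 1)                       # output the low bit
        fb = 0
        for t in taps:                              # XOR of tapped bits = feedback
            fb ^= (state >> t) & 1
        state = (state >> 1) | (fb << (nbits - 1))  # shift and insert feedback bit
    return out

# Example: maximal-length 16-bit LFSR; keeping the seed (and, per the paper,
# the MF configuration) secret is what protects the gamma.
gamma = lfsr_stream(seed=0xACE1, taps=(0, 2, 3, 5), nbits=16, length=32)
print(gamma)
```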

Anna Grevtseva and Vadim Davydov
A method for increasing the speed of processing the results of measurements of the parameters of a quantum frequency standard to increase the speed of information transfer in satellite communication systems
The article considers a new method for calculating the frequency characteristics of a quantum frequency standard based on rubidium-87 atoms. The implementation of this method in the developed software is presented, and the experimental results were processed using this software. The proposed method allows the frequency of the standard to be corrected after a deviation in a shorter time than before, so the information transfer rate does not deteriorate during this interval.

Oleg Golovnin and Ekaterina Sidorova
Operational forecasting of road traffic accidents via neural network analysis of Big Data
The paper proposes an approach to the operational forecasting of traffic accidents, with separation of accident types, based on a multilayer Rumelhart perceptron. The approach is applied to analyze Big Data collected from external heterogeneous data sources; weather, road, and organizational factors as well as traffic flow parameters are taken into account. The software implementation of the approach uses the TensorFlow framework and the Keras library. Experiments showed that the approach recognizes situations with 90% accuracy. The software implementation is intended to function as part of accident prevention systems.
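
A minimal sketch of a multilayer perceptron classifier of accident types in TensorFlow/Keras, the framework named in the abstract. The layer sizes, feature count and class count are illustrative assumptions, not the authors' settings.

```python
import numpy as np
import tensorflow as tf

# Assumed feature vector (weather, road, organizational, traffic-flow parameters)
# and number of accident types; both are placeholders.
n_features, n_classes = 20, 4

model = tf.keras.Sequential([
    tf.keras.Input(shape=(n_features,)),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(n_classes, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Synthetic placeholder data just to show the training call.
X = np.random.rand(256, n_features).astype("float32")
y = np.random.randint(0, n_classes, size=256)
model.fit(X, y, epochs=5, batch_size=32, verbose=0)
```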

Victor Tsvetov
Wireless channel noises and data protection
The paper proposes an application of a quadrature amplitude modulation (QAM) scheme with orthogonal frequency division multiplexing (OFDM) transmission for data protection. Our approach is based on symmetric encryption and uses natural or induced noise in the wireless channel as the random part of a secret key. The main idea is the sensitivity of the signal decoding model to variations of the measured values. This kind of protection requires no additional computational costs and does not affect the channel bandwidth.
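
A highly simplified sketch of the general idea: bits extracted from measured channel noise form the random part of a symmetric key, which is combined with pre-shared key material for stream (XOR) encryption. The bit-extraction rule and all names are illustrative assumptions, not the author's scheme.

```python
import numpy as np

def noise_to_bits(noise_samples):
    """Quantize noise samples to bits by their sign (illustrative rule only)."""
    return (np.asarray(noise_samples) > 0).astype(np.uint8)

def xor_encrypt(data_bits, key_bits):
    """XOR (stream) encryption; applying it twice restores the data."""
    key = np.resize(key_bits, data_bits.shape)
    return data_bits ^ key

noise = np.random.randn(64)                                  # stands in for measured channel noise
random_part = noise_to_bits(noise)                           # noise-derived key bits
static_part = np.random.randint(0, 2, 64, dtype=np.uint8)    # pre-shared key material
key = random_part ^ static_part

plaintext = np.random.randint(0, 2, 64, dtype=np.uint8)
ciphertext = xor_encrypt(plaintext, key)
assert np.array_equal(xor_encrypt(ciphertext, key), plaintext)
```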

Anastasiya Alekseeva, Irina Karpunina and Vladimir Klyachkin
Analysis of the stability of the hydraulic unit according to the results of vibration monitoring
During stationary operation of a hydraulic unit, the stability of its operation must be ensured. The analysis of vibration monitoring data with correlated indicators is carried out using methods of multivariate statistical process control: the average level of the process is monitored with the Hotelling algorithm, and multivariate scatter is monitored with the generalized variance algorithm. The paper investigates the effectiveness of the generalized variance algorithm: how quickly the generalized variance control chart responds to possible disturbances in the vibration stability of the hydraulic unit. The study showed that the stability of the hydraulic unit's operation by the criterion of multivariate scatter is not always adequately estimated by the standard generalized variance algorithm. To increase the sensitivity of the control to possible process disturbances, it is advisable to modify this algorithm by searching for non-random structures on the corresponding chart, applying a warning boundary, or using exponentially weighted moving averages of the generalized variance.
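
A minimal sketch of the two multivariate control statistics mentioned above: Hotelling's T² for the mean level and the generalized variance (the determinant of the subgroup covariance matrix) for multivariate scatter. The reference mean and covariance would normally be estimated from an in-control training period; the data here are placeholders.

```python
import numpy as np

def hotelling_t2(sample, mean0, cov0):
    """Hotelling T^2 statistic of a subgroup mean against reference mean/covariance."""
    xbar = sample.mean(axis=0)
    n = sample.shape[0]
    diff = xbar - mean0
    return n * diff @ np.linalg.inv(cov0) @ diff

def generalized_variance(sample):
    """Determinant of the subgroup sample covariance matrix."""
    return np.linalg.det(np.cov(sample, rowvar=False))

rng = np.random.default_rng(0)
reference = rng.normal(size=(200, 3))                 # in-control vibration indicators
mean0 = reference.mean(axis=0)
cov0 = np.cov(reference, rowvar=False)

subgroup = rng.normal(size=(10, 3))                   # current monitoring subgroup
print("T^2:", hotelling_t2(subgroup, mean0, cov0))
print("generalized variance:", generalized_variance(subgroup))
```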

Yuliya Kuvayskova, Victor Krasheninnikov, Vladimir Klyachkin and Anastasia Alekseeva
Fuzzy models for predicting the technical state of objects
In order to ensure reliable operation of a facility, it is advisable to carry out diagnostics and prediction of its technical state. Often, obtaining information about the state of an object is difficult. The article proposes the use of fuzzy logic models to recognize and predict the technical state of an object under conditions of limited information. To assess the quality of prediction by the fuzzy models, criteria such as the percentage of correct predictions, the AUC criterion, and the F-measure are used. The proposed models, algorithms and criteria are implemented in software as an information-mathematical system that can be used in the production and scientific activities of enterprises to increase the efficiency of various technical objects. Experimental studies were conducted to test and analyse the effectiveness of the proposed models, algorithms and the information-mathematical system on real technical objects (a drinking water treatment system and a hydraulic unit control system).
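
A minimal sketch of the quality criteria named in the abstract (percentage of correct predictions, AUC, and F-measure), computed with scikit-learn for a binary technical-state prediction. The labels and model scores are placeholders, not the authors' data.

```python
import numpy as np
from sklearn.metrics import accuracy_score, roc_auc_score, f1_score

y_true = np.array([1, 0, 1, 1, 0, 1, 0, 0])                      # actual object states
y_score = np.array([0.9, 0.2, 0.7, 0.6, 0.4, 0.8, 0.3, 0.55])    # fuzzy model output
y_pred = (y_score >= 0.5).astype(int)                            # defuzzified decision

print("accuracy:", accuracy_score(y_true, y_pred))
print("AUC:", roc_auc_score(y_true, y_score))
print("F-measure:", f1_score(y_true, y_pred))
```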

Dmitry Rodin, Marina Rodina, Alexey Telegin and Igor Piyakov
Simulation of a dust impact time-of-flight dust particle sensor
One-dimensional and two-dimensional axisymmetric models of a dust-impact time-of-flight sensor for analyzing the chemical composition of micrometeoroids and space debris particles are considered. The results of calculating the design parameters and functional characteristics of the sensor are presented. The algorithm of the program for modeling mass spectra is described. Model mass spectra obtained for the axial case in the one-dimensional approximation, as well as for the two-dimensional case with various coordinates of the impact interaction, are presented. The model spectra are compared with experimental data obtained on an electrodynamic dust particle accelerator.

Aleksandr Kolpakov, Yuriy Kropotov and Alexey Belov
Investigation of the RAM access model in a heterogeneous computing system
The issue of creating high-performance computing systems based on heterogeneous computer systems is topical, since the volumes of processed information, calculations and studies with large data sets are constantly increasing. The aim of the work is an experimental study of previously developed models for predicting the performance of heterogeneous computer systems in telecommunications. The study showed that the developed models provide an adequate estimate of the expected execution time of an algorithm for various GPU parameters, with some limitations.

Valery Zasov
Algorithm for verifying the stability of signal separation for objects with changing characteristics
This paper proposes an algorithm for verifying the stability of a solution to the inverse problem of extracting individual signals from an additive mixture of several signals, intended for objects whose characteristics change depending on a certain vector of parameters. A variant of the algorithm is considered for objects whose changes in characteristics are described by deterministic functions. A feature of the proposed algorithm is its preliminary training, which significantly reduces the computational complexity and stability control time by constructing a singularity boundary separating the spaces of stable and unstable solutions. The results of computer simulation of the proposed algorithm are presented.

Diera Pirova, Borislav Zaberzhinskiy and Andrey Mashkov
Detecting Heart Disease Symptoms Using Machine Learning Methods
This paper explores the possibility of using machine learning methods for detecting cardiovascular diseases. 187 electrocardiography (ECG) recordings were taken for analysis: 80 recordings are from healthy people, 90 correspond to patients with myocardial infarction, and 17 to patients with cardiomyopathy. The signal of each recording was pre-processed, resulting in a common segment of 600 samples. The random forest algorithm, classical logistic regression, the support vector machine, and a three-layer neural network were used for detecting heart disease symptoms.
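
A minimal sketch of one of the listed classifiers (random forest) applied to fixed-length ECG segments of 600 samples. The feature matrix here is synthetic; the class sizes follow the abstract (healthy / infarction / cardiomyopathy), and the hyperparameters are illustrative.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(187, 600))                   # 187 pre-processed ECG segments (placeholder values)
y = np.array([0] * 80 + [1] * 90 + [2] * 17)      # 0=healthy, 1=infarction, 2=cardiomyopathy

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                          stratify=y, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("test accuracy:", clf.score(X_te, y_te))
```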

Sergey Smirnov and Alexander Samoilov
Properties Existence Constraints in Fuzzy Formal Concept Analysis
The field of research is the construction of fuzzy concept lattices from fuzzy “objects - properties” data. Our contribution is the accounting of existential relations on the set of observed and/or measured properties, i.e. “properties existence constraints”. The two best-known approaches to constructing fuzzy concept lattices are considered: the one-sided threshold and fuzzy closure methods. It is shown that for the more popular one-sided threshold method, potential violations of properties existence constraints in the concept lattice are countered by the rational threshold cut method, previously developed for eliciting crisp formal concepts from fuzzy initial data. However, this approach is fundamentally unacceptable for the fuzzy closure method. For this case, the idea of special preliminary processing of the initial data is put forward: the “normalization” of the fuzzy set of properties for each object in the training sample. The practical importance of the study is to increase the adequacy of Fuzzy Formal Concept Analysis.
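
A minimal sketch of the one-sided threshold step mentioned above: a fuzzy “objects x properties” context is cut at a threshold to obtain a crisp context, from which formal concepts can then be derived. The membership matrix and threshold are illustrative.

```python
import numpy as np

# Fuzzy memberships of three objects in three properties (placeholder values).
fuzzy_context = np.array([
    [0.9, 0.2, 0.7],
    [0.4, 0.8, 0.6],
    [0.1, 0.9, 0.3],
])
alpha = 0.5                                       # threshold of the one-sided cut
crisp_context = fuzzy_context >= alpha            # crisp "object has property" relation
print(crisp_context.astype(int))
```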

Egor Karlin, Vladimir Fursov and Valentin But
Information technology for measuring velocity and visualizing the structure of fluid and gas flows
The work is devoted to the problem of reconstructing three-dimensional models of flow profiles in the channels of hydrodynamic systems. A multichannel ultrasonic flowmeter is investigated, built on the time-pulse method of measuring velocity and visualizing the structure of liquid and gas flows. The apparatus determines the time of flight of the ultrasonic pulse with the flow and against the flow in three channels and converts it first to flow velocity and then to flow rate. The system is implemented on a microcontroller. To organize stable operation of the microcontroller, a technology is used that allows parallel tasks to be synchronized. The main problems in implementing this technology are the large amount of occupied memory and the low "transparency" of the processes taking place. The problems of optimizing the memory used and minimizing power consumption are solved.
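
A minimal sketch of the standard transit-time (time-pulse) velocity estimate assumed here: the upstream and downstream times of flight along one measurement channel give the flow velocity. The path length and angle are illustrative, not the device's actual geometry.

```python
import math

def transit_time_velocity(t_down, t_up, path_len, angle_deg):
    """Flow velocity from downstream/upstream times of flight along one channel:
    v = L * (t_up - t_down) / (2 * cos(theta) * t_up * t_down)."""
    return path_len * (t_up - t_down) / (
        2.0 * math.cos(math.radians(angle_deg)) * t_up * t_down)

# Example: a 0.1 m acoustic path at 45 degrees to the flow axis.
v = transit_time_velocity(t_down=66.2e-6, t_up=66.8e-6, path_len=0.1, angle_deg=45.0)
print(f"velocity along channel: {v:.2f} m/s")
```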

Anton Valov, Nikita Lukashev and Vadim Davydov
On the need to use the median signal filtering method to improve the metrological characteristics of the rubidium frequency standard when processing and transmitting large data arrays
The article describes the construction of a rubidium-87 quantum frequency standard for satellite infocommunication and navigation systems for various purposes. The necessity of improving the metrological characteristics of the quantum frequency standard for long-term transmission of large amounts of data is substantiated. A new algorithm based on median filtering is proposed for processing large amounts of error-signal data in order to improve the short-term and long-term stability of the frequency standard. The results of experimental studies of the metrological characteristics of the quantum frequency standard are presented. The improvement in long-term frequency stability was found to be 7%, which reduces the number of bit errors during long-term transmission of information by at least 3%.
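
A minimal sketch of median filtering applied to error-signal samples, as a standard way to suppress outliers before the frequency correction; the window size and data are illustrative, not the authors' parameters.

```python
import numpy as np
from scipy.signal import medfilt

rng = np.random.default_rng(1)
error_signal = rng.normal(0.0, 1e-12, size=1000)   # error-signal samples (arbitrary units)
error_signal[::100] += 5e-12                        # occasional outliers
smoothed = medfilt(error_signal, kernel_size=5)     # 5-point sliding median

print("std before:", np.std(error_signal))
print("std after: ", np.std(smoothed))
```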

Maxim Bobyr and Valentin Bulatnikov
Modeling of a fuzzy filter and Kalman filter for processing the input signal
This article presents a fuzzy filter model based on fuzzy logic and a Kalman filter model. The models are tested on signals whose laws of change are unknown to the filters. A comparative analysis of the output signals of the models after filtering is performed. The error between the ideal generated signal before noise and the output signal of each model is calculated using the RMSE and MAPE metrics.
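
A minimal sketch of a scalar Kalman filter (random-walk state model) together with the RMSE and MAPE error measures used in the comparison; the noise parameters and test signal are illustrative assumptions.

```python
import numpy as np

def kalman_1d(z, q=1e-4, r=0.25, x0=0.0, p0=1.0):
    """Filter a 1-D measurement sequence z with a random-walk state model."""
    x, p, out = x0, p0, []
    for zk in z:
        p = p + q                       # predict: state variance grows by process noise q
        k = p / (p + r)                 # Kalman gain (r = measurement noise variance)
        x = x + k * (zk - x)            # update with the new measurement
        p = (1.0 - k) * p
        out.append(x)
    return np.array(out)

t = np.linspace(0, 1, 200)
ideal = np.sin(2 * np.pi * 2 * t) + 2.0             # noise-free reference signal
noisy = ideal + np.random.normal(0, 0.5, t.size)    # measured (noisy) signal
est = kalman_1d(noisy)

rmse = np.sqrt(np.mean((est - ideal) ** 2))
mape = np.mean(np.abs((est - ideal) / ideal)) * 100.0
print(f"RMSE={rmse:.3f}, MAPE={mape:.2f}%")
```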

Sergey Demin, Oleg Panischev, Natalya Demina and Ruslan Latypov
Application of statistical memory functions formalism in search of pathological brain activity diagnostic criteria
Applying time series analysis to physiological and biomedical data allows extracting key information about the state of the human organism and its individual systems. In this paper we study biomedical signals from the human cerebral cortex and neuromuscular system by means of the memory function formalism. We use localized parameters (the window-time representation of the memory function power spectra and the statistical measures of memory) to derive the periodic properties of the studied dynamics. The “generalized” and localized parameters are very useful for creating diagnostic criteria in cardiology and neurophysiology.

Sergey Demin, Oleg Panischev, Ruslan Latypov and Sergey Timashev
Flicker-noise spectroscopy analysis of magnetoencephalogram signals in diagnosis of photosensitive epilepsy
In this paper we use Flicker Noise Spectroscopy (FNS) to study induced human neuromagnetic signals from healthy subjects and a patient with photosensitive epilepsy. This approach allows deriving unique information about epilepsy-related pathological abnormalities. In the authors' earlier works, diagnostic criteria for photosensitive epilepsy were established, associated with abnormalities of the frequency-phase synchronization between cerebral cortex areas. Our further FNS studies have revealed pathological changes in the frequency-phase synchronization manifested in the high-frequency dynamics and in the structure of the 3D cross-correlations. The revealed regularities make it possible to create new diagnostic criteria and to assess the treatment efficiency of this pathology.

Vitali Kuzmin and Dmitrii Elenev
Monitoring and forecasting the operations of the transport complex of the enterprise
The transport complex of an enterprise is modeled using the example of the automated transport complex system of a nuclear power plant. The complex provides transportation of containers with fuel and radioactive elements through the sealed enclosure of the reactor building. The parameters of the transport complex are determined by its constituent objects, namely gates, roads, transportation devices and controllers. A web-based intranet application was developed for monitoring and control purposes. The application works on the basis of the values of the current conditions. The system processes analog and discrete signals, as well as system signals from the programmable controller. The operator panel notifies the operator if the current state requires staff intervention. Based on the changes of the main parameters, it is possible to predict the service lifetime of individual elements.

Valeriy Tutatchikov
Implementation of a parallel version of the algorithm for calculating a two-dimensional FFT using an analog of the Cooley-Tukey algorithm
Currently, the two-dimensional fast Fourier transform (FFT) is widely used in digital signal processing. It is usually calculated using a combination of one-dimensional FFTs, first for each row and then for each column; in this case, the algorithm is easily adapted for parallel computing. The paper presents a method of parallelizing the algorithm for calculating a two-dimensional FFT using an analog of the Cooley-Tukey algorithm, which requires fewer complex additions and multiplications than the standard method. The main difficulty in parallelizing this algorithm is that the two-dimensional analog of the Cooley-Tukey algorithm is performed iteratively in s passes for an initial signal of size 2^s × 2^s, and in each of the s iterations the entire data set is recalculated. Methods for solving this problem during parallelization are described for an implementation in the C++ programming language using the OpenMP and MPI parallel computing libraries. The results of a numerical experiment are presented, showing that parallelization achieves a speedup of up to 4 times in comparison with the standard method for calculating the two-dimensional FFT.
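
A minimal sketch of the standard row-column 2D FFT decomposition that the parallel Cooley-Tukey analog is compared against: 1-D FFTs over rows, then over columns, each set of 1-D transforms being independent and therefore easy to parallelize. The authors' actual implementation is in C++ with OpenMP/MPI; this NumPy version only illustrates the decomposition itself.

```python
import numpy as np

def fft2_row_column(signal):
    """2-D FFT via 1-D FFTs over rows and then over columns."""
    rows = np.fft.fft(signal, axis=1)   # independent per row  -> parallelizable
    return np.fft.fft(rows, axis=0)     # independent per column -> parallelizable

s = 8
x = np.random.rand(2 ** s, 2 ** s)      # 2^s x 2^s input, as in the abstract
assert np.allclose(fft2_row_column(x), np.fft.fft2(x))
```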

Vladimir Kostin and Aleksandr Borovsky
Definition of basic violators for critically important objects using the information probability method and cluster analysis
One approach to analyzing the relationships between the characteristics of critical objects and typical intruders using the information-probabilistic method is considered. Categories of objects and typical intruders are described by many common heterogeneous characteristics. The information-probabilistic method ensured the homogeneity, according to the Pearson chi-square criterion, of the entropy potential of the intruder training characteristics and of the characteristics of the consequences of the intruder's actions; on this basis, the characteristics of the intruders and critical objects are brought into a common information field with uniform six-point measurement scales. Using the information-probabilistic method and the cluster analysis method on the common information field, we obtained the basic type of intruder for each category of objects. The results can be used to determine the requirements for physical protection systems of critical facilities.

Il'ya Katanov
Application of a perceptron to solve the problem of analyzing the fluorescence spectrum of a DBMBF2 sensor in a mixture of aromatic hydrocarbons
The article deals with the problem of choosing a neural network architecture for analyzing spectral data obtained by recording the spectra of fluorescent sensors, which are based on the formation of exciplexes between the boron dibenzoylmethanate fluorophore (DBMBF2) and aromatic compounds. Attention is paid to the problem of selecting the structural features and parameters of the network during training and testing on the available data.
