The paper presents the results of a study of how the characteristics of the convolution and subsampling (pooling) layers at the input of a deep convolutional neural network affect the quality of pattern recognition. For the convolution layer, the varied parameter was the size of the convolution kernel; for the subsampling layer, it was the size of the receptive field, which determines the region of the input feature map processed to form the layer's output. All of the listed parameters defining the architecture of the input convolution and subsampling layers must be selected by neural network developers based on their experience and known good practices. This choice is informed by a preliminary analysis of the processed images: their size, the number of color channels, the features that assign recognizable objects to different classes (silhouette, texture), and so on. To take these factors into account when designing the input convolution and subsampling layers, it is proposed to use numerical characteristics derived from histograms of the input images and from the variances of pixel color intensity. Histograms are constructed both for the entire image and for its fragments, the total variance and the local variances of the fragments are computed, and the local variances are compared with the total one. Based on these comparisons, recommendations were developed for choosing the convolution kernel size, which reduces the time needed to find a suitable neural network architecture. The influence of the above parameters on the quality of image recognition by a convolutional neural network was studied experimentally, using a network implemented in Python with the Keras and TensorFlow libraries.
To visualize and control the training process of the neural network, the cross-platform TensorBoard tool was used. The network was trained on an Nvidia GeForce GTX 1060 GPU, which supports the CUDA architecture for hardware and software parallel computing.
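The comparison of fragment-level variances with the total variance can be sketched as follows. This is a minimal illustration, assuming grayscale input; the fragment size, the 0.5 variance-ratio threshold, and the mapping of texture share to kernel sizes 3 and 5 are illustrative assumptions, not values taken from the study.

```python
import numpy as np

def suggest_kernel_size(image, fragment=8, ratio_threshold=0.5):
    """Suggest a convolution kernel size from local vs. total variance.

    image: 2D array of grayscale pixel intensities.
    fragment: side length of the square fragments analysed.
    ratio_threshold and the size mapping below are illustrative
    assumptions, not the recommendations derived in the study.
    """
    total_var = image.var()
    h, w = image.shape
    local_vars = [
        image[i:i + fragment, j:j + fragment].var()
        for i in range(0, h - fragment + 1, fragment)
        for j in range(0, w - fragment + 1, fragment)
    ]
    # Share of fragments whose variance is comparable to the total one:
    # many such fragments suggest fine texture, i.e. a smaller kernel.
    share = np.mean([v >= ratio_threshold * total_var for v in local_vars])
    return 3 if share > 0.5 else 5

rng = np.random.default_rng(0)
noisy = rng.integers(0, 256, size=(64, 64)).astype(float)  # fine texture
print(suggest_kernel_size(noisy))
```

For a uniformly noisy image, most fragments carry variance comparable to the whole image, so a small kernel is suggested; a smooth gradient concentrates its variance at the image scale and yields the larger kernel.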
The paper presents the results of a study of the feasibility and efficiency of computational procedures for constructing autoregressive statistical models and their close derivatives, as well as of their ability to solve the practical problem of forecasting electricity prices. Detailed results of the numerical construction of ARIMA models are presented, together with options for preprocessing the initial data that take into account the regularities governing the operation of the energy complex. The adequacy of the forecast models was verified against historical field data in the form of time series by numerically estimating the standard error. The accuracy achieved by the designed predictive models for the electricity day-ahead market, fitted on data from the Russian Belgorod region for 2016-2018, matches results already published for international energy markets in Europe, America and Australia. A comparative analysis and interpretation of the accuracy and adequacy of the forecast models on field data, both published and obtained in this work, leads to the conclusion that increasing the complexity of statistical autoregressive forecast models (more elaborate structures, more unknown parameters, combinations of heterogeneous components, the introduction of correction coefficients) improves prediction accuracy only in individual cases, and only slightly. It is concluded that it is expedient to introduce into the forecast models additional information about significant factors affecting the observed time series of the predicted variable. Accounting for such influencing factors through appropriate changes to the computational algorithms, and the use of combined forecast model structures, are suggested as possible directions for further research.
The 1C:ERP application solution consists of functional blocks, each of which is a subsystem comprising a set of specific tools and settings that provide a certain set of functions. Thanks to the flexibility of these settings, the standard 1C:ERP configuration can be adapted to the particularities of different enterprises. One of the interesting mechanisms implemented in this applied solution is the "Intercompany" mechanism for intercompany sales. This mechanism is highly relevant in today's trade. For companies represented by several legal entities, intercompany trade makes it possible to sell goods belonging to another organization. One possible example of such a sales scheme: the purchase of goods into the enterprise's warehouses is handled by one holding company (organization); the goods can be sold by another holding company (organization) that is allowed to sell the goods of the purchasing one; at the time of sale to the final buyer, the system checks whether the goods of the other organization may be sold under the configured sales scheme between organizations; sales documents between the purchasing firm and the selling firm are drawn up based on the results of sales to the end customer. Selling a product on behalf of another organization is no different from selling one's own product. If the selling organization does not have the required quantity, the system automatically writes off the missing items from the organization for which intercompany operations are allowed in the transfer settings. Despite the elegance of this mechanism as described, users face a number of problems when using it. The purpose of this work is therefore to describe the Intercompany mechanism and to present a way to solve some of the problems that arise when using it.
An important element of the Intercompany subsystem is the process of forming reserves (creating records in the accumulation register "Reserves of goods of organizations"). A special feature of this register is that it does not store a history of data. This is unusual for an accumulation register and should be taken into account when configuring the system. At the same time, the "Reserves of goods of organizations" register may contain entries related to unposted documents. This can occur after a goods receipt document is unposted, if the unposting results in negative balances for the organization. As a result, the records that form a reserve covering these negative balances remain linked to the unposted goods receipt document. Therefore, one way to solve this problem is to forcibly delete such entries from the register.
The article is devoted to the possibility of sharpening images intended for electronic publications by means of mathematical filtering algorithms. It is commonly believed that processing images with special filters can increase their sharpness, but under the conditions of visual perception from the screen of an electronic device, some filters do not always yield a significant increase in sharpness. This is due to the physiological structure of the human visual system, which works as a low-pass filter. If the strength of the filter's effect is insufficient, or if the mathematical algorithm underlying the filter is ineffective under these viewing conditions, the sharpening of image details can be completely lost. On the other hand, what matters to the end consumer is the reproduction of precisely those image details that belong to informative areas: the ones perceived by the human eye that determine the plot content of the visual information. Thus, the effectiveness of sharpening filters is determined by the effectiveness of their impact on the informative areas of the image. For the purposes of this article, the effectiveness of sharpening filters in terms of the consumer's visual perception of the final images is evaluated by the method of spectral analysis. A comparative analysis of the spectra obtained for images processed with various sharpening algorithms makes it possible to judge the real change in sharpness, as well as the effectiveness of these algorithms in terms of visual perception of the image. The results of the study can be used to develop recommendations for choosing a suitable sharpening algorithm depending on the nature of the images.
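The spectral-analysis evaluation can be sketched as follows: sharpen an image, then compare the share of spectral energy in the high-frequency band before and after. The unsharp mask with a 3x3 box blur, the 0.25 cutoff, and the test image are illustrative assumptions, not the specific filters or metrics compared in the article.

```python
import numpy as np

def unsharp_mask(image, amount=1.0):
    """Sharpen by adding back the difference from a 3x3 box blur.

    A minimal sketch of one sharpening algorithm; the box blur and
    'amount' are illustrative choices.
    """
    padded = np.pad(image, 1, mode="edge")
    blurred = sum(
        padded[i:i + image.shape[0], j:j + image.shape[1]]
        for i in range(3) for j in range(3)
    ) / 9.0
    return image + amount * (image - blurred)

def high_frequency_share(image, cutoff=0.25):
    """Share of spectral energy above `cutoff` of the Nyquist frequency,
    used here as a crude spectral proxy for perceived sharpness."""
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(image))) ** 2
    h, w = image.shape
    yy, xx = np.meshgrid(np.arange(h) - h // 2, np.arange(w) - w // 2,
                         indexing="ij")
    radius = np.sqrt((yy / (h / 2)) ** 2 + (xx / (w / 2)) ** 2)
    return spectrum[radius > cutoff].sum() / spectrum.sum()

# A vertical step edge: sharpening should raise its high-frequency share.
image = np.zeros((64, 64))
image[:, 32:] = 255.0
sharpened = unsharp_mask(image, amount=1.5)
print(high_frequency_share(image), high_frequency_share(sharpened))
```

Comparing the high-frequency shares of images produced by different algorithms gives a numerical counterpart to the visual judgment of sharpness described in the article.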
The article examines the relevance of developing a scientific social environment in the form of the ConfID Web application. One of its important aspects is a mechanism for the automated accumulation of achievements, which may include patents, personal electronic certificates, publication metadata, and participation in conferences. The ConfID project plays the role of a portfolio, on the basis of which a profile is formed for each participant. Any participant can look for fellow scientists in order to conduct joint research, help with reviews of dissertations, or act as dissertation opponents. The proposed configuration model of electronic certificates will optimize the resources needed to store and process the data. The ecosystem under development consists of two main projects, ConfID and ConfLab. Interserver interaction between the projects supports the full life cycle of scientific events. An important aspect of the problem being solved is the documentary support of scientific workflows at all stages of the life cycle. Particular attention is paid to the algorithm for generating and storing graphic documents such as letters and other related materials. It is such documents that form the basis for filling the scientific portfolio of both a young scientist and a senior researcher. The project presented in this work demonstrates a comprehensive technical solution in the form of an architecture that links disparate scientific communities by aggregating heterogeneous data on the achievements of practicing scientists. The architecture of the system is also of scientific interest in its own right, as a model of best practices for building systems of this kind.
The Terms of Reference is an important document that accompanies any development in all areas of activity, including IT. High-quality Terms of Reference determine the success of the future development and customer satisfaction with the results obtained, which establishes the relevance of the research topic. The purpose of this article is to determine the features of the Terms of Reference for the design of a blockchain management system, so that the document accurately specifies the functionality of the future system and the required degree of information security. The authors set out to review the structure of the Terms of Reference and identify the sections that should take into account specific features of a blockchain system, as well as to develop recommendations on the choice of development platform and on the actions needed at different stages of development to improve the quality of the Terms of Reference. As a result of the study, specific features of the Terms of Reference for the designed system were determined with respect to the choice of configuration and architecture, user roles, and the creation of user groups for exchanging confidential information. The proposed approach to developing high-quality Terms of Reference for the design of a blockchain management system makes it possible to avoid errors at the very beginning of the design stage and to create an effective blockchain management solution. This approach can be used by developers of distributed ledger systems for the effective launch and implementation of projects.