№ 4(88)
1 September 2020
Rubric: Resource management. Authors: Keyno P., Khoroshko L., Rudko I. D.
Any activity of an organization is reflected in documents, and improving the quality of business processes requires continuous improvement of information processing. In this regard, it is relevant to examine the development of an electronic document management module based on Directum, an integration platform for business solutions. The purpose of the research is to analyze the possibilities for building business solutions in the Directum system, which allows constructing a corporate content management system as well as a full-fledged electronic document management system [2]. A step-by-step analysis of the development of the electronic document management module «Supplier selection protocol» using the tools and capabilities of the Directum platform was carried out. For this purpose a structural method was used: the problem was split into a set of independent stages that are accessible to understanding and solution, and these stages were ordered hierarchically. The authors developed a platform-based solution for transferring information from an electronic document card into a Word document template, since the standard Directum solution has certain shortcomings. Automation of routine user actions for initializing and starting a typical route is demonstrated. The conclusion is that the Directum system, using its built-in components, allows automating any business processes of an organization, as well as developing any additional functionality using the platform's tools.
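The central step described above, transferring card fields into a Word template, can be illustrated with a short sketch. This is a minimal Python illustration of the idea, not Directum's ISBL implementation; the placeholder syntax, field names, and file names are assumptions, with python-docx standing in for the platform API.

```python
# A minimal sketch (not Directum's implementation): values from an electronic
# document card are substituted into named placeholders of a Word template.
# Placeholder syntax ("{{Field}}") and field names are illustrative assumptions.
from docx import Document

def fill_template(template_path: str, output_path: str, card: dict) -> None:
    doc = Document(template_path)
    for paragraph in doc.paragraphs:
        for field, value in card.items():
            token = "{{%s}}" % field  # e.g. "{{Supplier}}" in the template
            if token in paragraph.text:
                # Replacing at run level preserves the template's formatting;
                # a token split across runs would need extra handling.
                for run in paragraph.runs:
                    if token in run.text:
                        run.text = run.text.replace(token, str(value))
    doc.save(output_path)

# Usage: hypothetical card fields of the "Supplier selection protocol" type.
fill_template("protocol_template.docx", "protocol.docx",
              {"Supplier": "LLC Vendor", "Date": "2020-09-01"})
```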
Rubric: Algorithmic efficiency. Authors: Kuznetsova A. A., Maleva T. V., Soloviev V. I.
The development of robotic harvesting can help reduce the share of heavy manual labor in horticulture, which now reaches 40%, as well as crop losses, which reach up to 50%. Fruit-picking robots have been under development since the late 1960s, yet no existing prototype is used in practice because of the low speed of harvesting and the large proportion of unrecognized fruits left on trees. The paper aims to develop an algorithm for detecting apples in images that works quickly and finds as many apples as possible. For this purpose, the YOLOv3 convolutional neural network is proposed, accompanied by special pre- and post-processing procedures. The procedures aim to improve the quality of apple recognition, including in the presence of shadows, glare, various damage to apples, empty gaps between leaves that could be mistaken for apples, and apples overlapped by branches, leaves, and other apples. The algorithm recognizes both red and green apples. It can work with close-up photographs of single apples as well as with general pictures containing many apples. The algorithm's quality was evaluated on a test set of 818 images of red and green apples (5142 apples in total). The average apple detection time was 19 ms, the percentage of objects mistaken for apples was 7.8%, and the share of undetected apples was 9.2%. Both the average detection time and the error rates are noticeably lower than in all known comparable systems.
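As a rough illustration of the detection stage only, the following sketch runs a YOLOv3 network with OpenCV's DNN module and applies non-maximum suppression. The weight and config file names and the thresholds are assumptions, and the authors' special pre- and post-processing procedures are not reproduced here.

```python
# A minimal sketch (not the authors' code) of single-class YOLOv3 inference
# for apple detection using OpenCV's DNN module.
import cv2
import numpy as np

net = cv2.dnn.readNetFromDarknet("yolov3-apples.cfg", "yolov3-apples.weights")
layer_names = net.getUnconnectedOutLayersNames()

def detect_apples(image, conf_thr=0.5, nms_thr=0.4):
    h, w = image.shape[:2]
    # YOLOv3 expects a square, normalized blob; 416x416 is the usual input size
    blob = cv2.dnn.blobFromImage(image, 1 / 255.0, (416, 416),
                                 swapRB=True, crop=False)
    net.setInput(blob)
    boxes, scores = [], []
    for output in net.forward(layer_names):
        for det in output:
            score = float(det[5:].max())  # single class: "apple"
            if score < conf_thr:
                continue
            cx, cy, bw, bh = det[:4] * np.array([w, h, w, h])
            boxes.append([int(cx - bw / 2), int(cy - bh / 2), int(bw), int(bh)])
            scores.append(score)
    # Non-maximum suppression merges overlapping detections of the same apple
    keep = cv2.dnn.NMSBoxes(boxes, scores, conf_thr, nms_thr)
    return [boxes[i] for i in np.array(keep).flatten()]
```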
This study discusses the design of a software module for modifying mesh models by forming baffles (internal partitions) and the integration of the module's code into the source code of the OpenFOAM software environment. Existing graphical shells for OpenFOAM, such as Salome, Helyx-OS, and Visual-CFD, implement all the capabilities needed for pre-processing, solving, and post-processing of a numerical solution. However, they have drawbacks: incomplete documentation, an English-only interface, the need to pay for consulting services, and in some cases the need to pay for a license. Thus, the problem of creating a graphical shell for OpenFOAM remains relevant, especially for domestic specialists. The subject of the study is the process of preparing computational meshes as part of the pre-processing stage of numerical simulation of continuum mechanics problems in the OpenFOAM software environment. The object of the study is the mechanism for preparing computational mesh models using the basic utilities included in OpenFOAM, as well as the utilities responsible for modifying computational meshes. The work aims to implement a graphical interface for the createBaffles utility, which forms the baffles, as part of setting up numerical experiments for problems of continuum mechanics (CM). A chart describing the algorithm for working with the module is given, and a tool stack for writing the module's program code is defined. The results of the study, their practical significance, and the results of testing the module are presented using one of the CM tasks as an example.
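At its core, such a module ultimately has to write a createBafflesDict into the case's system/ directory and invoke the standard createBaffles utility. The sketch below shows this step in Python under stated assumptions: the zone and patch names are placeholders, this is not the module's actual code, and the dictionary skeleton should be checked against the installed OpenFOAM version.

```python
# A minimal sketch (assumptions, not the module's code): generate a
# createBafflesDict and run the createBaffles utility on an OpenFOAM case.
import subprocess
from pathlib import Path

DICT_TEMPLATE = """\
FoamFile {{ version 2.0; format ascii; class dictionary; object createBafflesDict; }}

internalFacesOnly true;

baffles
{{
    {zone}
    {{
        type        faceZone;
        zoneName    {zone};
        patches
        {{
            master {{ name {zone}_master; type wall; }}
            slave  {{ name {zone}_slave;  type wall; }}
        }}
    }}
}}
"""

def create_baffles(case_dir: str, zone: str = "baffleZone") -> None:
    case = Path(case_dir)
    dict_path = case / "system" / "createBafflesDict"
    dict_path.write_text(DICT_TEMPLATE.format(zone=zone))
    # -overwrite modifies the mesh in place instead of writing a new time step
    subprocess.run(["createBaffles", "-case", str(case), "-overwrite"],
                   check=True)
```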
A procedure for synthesizing a neural network based on a complex structure of paired oscillator neurons that operate within a certain topological map is discussed. Using oscillator neurons with a variable activation threshold makes it possible to create reconfigurable neural memory circuits that act as a memory structure able to recreate the solution of a problem from individual environmental signals. The use of mirror differential neurons implementing this principle of memory operation is proposed. This way of organizing a neural network enables a training approach that involves the reconfiguration of all variable parameters of the neurons. Setting up individual cluster groups and their further interaction leads to the formation of a set of patterns that correspond to the training sample. The use of interneuron switches based on an acoustic metamaterial, whose properties can be changed by means of electrocapillary phenomena, is reviewed. The switches are able to simultaneously accumulate multiple neural signals and then process them through an intermediate conversion into acoustic waves that propagate over the surface and through the volume of the metamaterial. Setting the parameters of the switching elements using optical diffusion tomography makes it possible to create artificial neuristor lines and arrange signal processing in the interneuron space. A procedure for configuring and adapting the neural network architecture to the problem of increasing the reliability of transmitted information using multiple transmission of duplicate messages is considered. Control of the medium access method, as well as determination of the optimal number of frequency channels to use, is carried out by the developed neural network of paired oscillators based on an analysis of the noise and signal environment. The effectiveness of the proposed neural network control is justified, and the effectiveness of solving the stated task is evaluated.
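As a loose toy illustration of the notion of paired oscillator neurons with a variable activation threshold (not the authors' model, which relies on mirror differential neurons and metamaterial switches), one can simulate two coupled phase oscillators that "fire" when their phase coherence crosses an adjustable threshold; all parameter values below are assumptions.

```python
# A toy illustration (not the authors' model) of a pair of coupled phase
# oscillators with an adjustable firing threshold.
import numpy as np

def simulate_pair(w1=1.0, w2=1.2, k=0.5, threshold=0.9, dt=0.01, steps=5000):
    """Integrate two Kuramoto-coupled oscillators and record the times at
    which their phase coherence exceeds the variable threshold."""
    p1, p2 = 0.0, np.pi / 2
    fired_at = []
    for step in range(steps):
        p1 += dt * (w1 + k * np.sin(p2 - p1))
        p2 += dt * (w2 + k * np.sin(p1 - p2))
        # Coherence is 1 when the phases align, 0 when they are opposed
        coherence = abs(np.exp(1j * p1) + np.exp(1j * p2)) / 2
        if coherence > threshold:
            fired_at.append(step * dt)
    return fired_at
```

Lowering the threshold makes the pair fire on weaker synchrony, which is the reconfigurable-memory intuition the abstract appeals to.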
As large amounts of data, along with methods and tools for their analysis, become available in the public domain, data analysis is increasingly used to solve problems in all areas of human activity. However, the prevalence and ease of use of analysis tools have certain negative aspects: treating an analytical problem as a trivial procedure, ignoring important theoretical limitations of mathematical methods, and insufficiently thorough verification of assumptions about the data. Hence there is a legitimate need to return practical analysis to its theoretical framework, where possible inscribing it into the approach to solving larger and more complex problems and into the methodology of scientific research in general. The article proposes modeling the data analysis process as a multi-level system of interconnected procedures and data manipulations that differ in complexity, requirements, and assumptions. The aim of the work is to structure the data analysis process regardless of the specific task and the software tool used to solve it. Thus, the object of research is the data analysis process within the framework of an analytical problem, and the subject is the generalized structure of this process. In addition, arguments justifying the usefulness and content of this model are given. Each highlighted level of analysis is illustrated by examples of practical problems solved on a specific dataset about completed scientific works, their contents, and their teams of authors. The model can be practically useful for planning scientific research, estimating its complexity, determining the composition of the creative team, and developing curricula.
The traditional method of data quality improvement assumes a number of data characteristics, or dimensions, that are defined statically and then measured and used to improve the corporate data architecture. To use this approach, one first needs to define the data objects of principal importance for the organization, i.e., to align the improvement activity with the organization's business strategy. Obviously, this is a very complicated and error-prone task, and the resulting errors may prove very hard to fix later on. A new approach to data quality management is proposed, which builds upon the view of data quality improvement as an IT service provided by the IT department to business users. The principal difference from the traditional approach is that ours does not use statically defined properties of data, such as consistency or completeness, but applies particular context-related requirements that stem from specific situations of data use. The focus of the approach is therefore a particular user-oriented SLA that defines the quality of data from the user's viewpoint. The well-known ITSM processes, along with a configuration database, are then used to improve data quality. In conclusion, some ideas concerning a data architecture maturity model based on the proposed approach are presented.
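A minimal sketch of the core idea, a context-bound, user-oriented data-quality SLA checked against the data it applies to, might look as follows. The class, field names, and target value are illustrative assumptions, not the paper's implementation.

```python
# An illustration (not the paper's implementation) of a context-specific
# data-quality SLA: each rule captures one user requirement and is evaluated
# against the data in the situation where that requirement matters.
from dataclasses import dataclass
from typing import Callable
import pandas as pd

@dataclass
class DataQualitySLA:
    name: str                               # e.g. "billing addresses complete"
    context: str                            # the usage situation the rule serves
    check: Callable[[pd.DataFrame], float]  # returns the share of rows that pass
    target: float                           # agreed service level, e.g. 0.99

    def evaluate(self, df: pd.DataFrame) -> bool:
        return self.check(df) >= self.target

# Usage: a completeness requirement that only matters for the invoicing run,
# rather than a static, organization-wide "completeness" dimension.
sla = DataQualitySLA(
    name="billing addresses complete",
    context="monthly invoicing run",
    check=lambda df: df["billing_address"].notna().mean(),
    target=0.99,
)
customers = pd.DataFrame({"billing_address": ["10 Main St", None, "5 Oak Ave"]})
print(sla.evaluate(customers))  # False: only 2/3 of rows meet the 0.99 target
```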