№ 5(59)
27 October 2015
Rubric: Information systems. Authors: Volkova V., Efremov A., Paklin N., Vasiliev A., Yuriev V. N.
Information technology (IT) is a set of methods, processes, tools and hardware that provides decision makers in various fields of activity with access to any required information. Currently there is no established classification of IT, and the various technologies are studied within different disciplines. At the same time, to help navigate this variety of information technologies, they need to be ordered in some way. This paper offers a multi-level classification of IT based on allocating, to strata from the bottom up, the techniques, tools and equipment for working with information according to their degree of complexity: from the means of human–computer communication and the collection, storage, retrieval and various kinds of processing of information, up to IT for extracting knowledge and producing new information as a result of complex methods. The paper uses the information models of F. E. Temnikov and A. A. Denisov to describe the principles of the proposed classification. The hierarchical structure of IT is presented at two levels of itemization: a general one, which covers the whole spectrum of existing IT, and a detailed one, which consists of the second-level technologies specific to each particular general technology branch.
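As a purely illustrative sketch, not material from the paper, the two-level structure described above can be pictured as static data: each general branch carries a list of second-level technologies. The branch and item names below are assumptions loosely paraphrasing the abstract.

```c
/* Illustrative sketch only: a two-level IT classification as static data.
 * Branch and sub-branch names are assumptions paraphrasing the abstract;
 * the paper's actual strata and second-level items may differ. */
#include <stdio.h>

typedef struct {
    const char *general_branch;      /* first (general) level of itemization */
    const char *const *detailed;     /* second (detailed) level, NULL-terminated */
} ItBranch;

static const char *const comm_items[]    = {"command interfaces", "graphical interfaces", NULL};
static const char *const storage_items[] = {"databases", "data warehouses", NULL};
static const char *const mining_items[]  = {"data mining", "knowledge discovery", NULL};

static const ItBranch classification[] = {
    {"human-computer communication",           comm_items},
    {"collection, storage and retrieval",      storage_items},
    {"knowledge extraction / new information", mining_items},
};

int main(void) {
    for (size_t i = 0; i < sizeof classification / sizeof classification[0]; ++i) {
        printf("%s:\n", classification[i].general_branch);
        for (const char *const *p = classification[i].detailed; *p; ++p)
            printf("  - %s\n", *p);
    }
    return 0;
}
```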
Continue...
A company’s ability to change increasingly depends on its ability to change its IT, referred to here as «IT agility». High IT agility can contribute to increased business agility and thus create a competitive advantage. In this paper we look at which factors influence IT agility and how IT agility can be increased. The main body of the paper, however, is devoted to the research question of how IT agility can be measured and actively managed. Here the focus is on the IT application systems landscape, a resource of significant importance for the IT agility and competitiveness of a company.
Continue...
The study explores characteristics of the German academic discipline «Wirtschaftsinformatik». It is based on a literature review of fourteen publications on the history of the discipline and on a comparison of the research approaches of «Wirtschaftsinformatik» and its North American sister discipline «Information systems». The study identifies four characteristics of the academic discipline «Wirtschaftsinformatik» and derives six challenges and opportunities for the business informatics community worldwide.
Continue...
A method of two-factor authentication of users of computer systems on a remote authentication server using personal biometric data is proposed. The method is based on error-correcting coding and other transformations of the biometric data. It builds on «fuzzy extractors» and allows storing only fragments of the biometric template on the server, so that the template cannot be restored if these fragments are stolen. As the biometric features of a person, it is proposed to use keystroke dynamics: the key hold durations and the time intervals between keystrokes as a person types the passphrase on the keypad. An original way of using information about the stability of the biometric features is proposed: this information is used to select the best features for deriving a cryptographic key and to decrease key generation errors, and it also forms part of the secret information that is stored on the server side and used in the key recovery procedure. As part of future research, it is planned to use the fuzzy implication operation, adapting one of the fuzzy inference algorithms (Tsukamoto, Sugeno, Mamdani, Larsen et al.), for «combining» and «subtracting» the bit sequences of the PRN code and the biometric data during cryptographic key generation.
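A minimal sketch of the kind of processing the abstract describes, not the authors' actual fuzzy-extractor construction: keystroke-dynamics features are collected over several typings of the passphrase, ranked by stability, and the most stable ones are quantized into key bits. The sample size, stability limit and quantization rule below are assumptions.

```c
/* Minimal sketch (not the paper's exact construction): extract keystroke-dynamics
 * features from several enrollment samples, rank them by stability (low variation
 * relative to the mean), and quantize the most stable ones into key bits. */
#include <math.h>
#include <stdio.h>

#define SAMPLES  5      /* enrollment typings of the passphrase (assumed) */
#define FEATURES 8      /* hold times and inter-key intervals, in milliseconds */

/* Coefficient of variation of one feature across enrollment samples. */
static double stability(double t[SAMPLES][FEATURES], int f, double *mean_out) {
    double mean = 0.0, var = 0.0;
    for (int s = 0; s < SAMPLES; ++s) mean += t[s][f];
    mean /= SAMPLES;
    for (int s = 0; s < SAMPLES; ++s) var += (t[s][f] - mean) * (t[s][f] - mean);
    *mean_out = mean;
    return sqrt(var / SAMPLES) / mean;
}

int main(void) {
    /* Hypothetical timing matrix: rows are typings, columns are features (ms). */
    double t[SAMPLES][FEATURES] = {
        {95, 110, 180, 210, 88, 130, 175, 205},
        {97, 108, 178, 215, 90, 128, 174, 207},
        {94, 111, 183, 208, 87, 131, 177, 204},
        {96, 109, 179, 212, 89, 129, 176, 206},
        {95, 112, 181, 211, 88, 132, 175, 205},
    };
    const double cv_limit = 0.05;        /* keep features with variation below 5% (assumed) */
    const double bit_threshold = 150.0;  /* 1 if mean exceeds threshold, else 0 (assumed) */

    printf("key bits from stable features: ");
    for (int f = 0; f < FEATURES; ++f) {
        double mean;
        if (stability(t, f, &mean) < cv_limit)
            printf("%d", mean > bit_threshold ? 1 : 0);   /* crude quantization */
    }
    printf("\n");
    return 0;
}
```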
Continue...
This article describes the work performed by the author on modernizing the software of the time and frequency standards complex. New software was developed, and improvements were made to the algorithm for forming analytical time scales. The work was performed with the aim of developing technical means for the fundamental support of the GLONASS system, as part of the modernization of the systems that keep the national time scale on the basis of hydrogen masers, in order to achieve the tactical and technical characteristics of GLONASS concerning the harmonization of the national time scale with Coordinated Universal Time. The time and frequency standards complex under development must become part of the modernized complexes for keeping the national time scale and is intended to provide the time scales of those complexes. In the process of upgrading the software of the standards complex, work was done on improving the methods of forming the group hydrogen frequency keeper and of calculating the national time scale. Weight coefficients were applied to reduce the error of the mean relative changes in the frequency of the hydrogen standard over a monthly measurement interval. Sliding-interval estimation of the frequency model parameters of the hydrogen masers was used to reduce the instability of the analytical frequency of the group hydrogen keeper. During use of the program in the State metrological centre «State Service of Time, Frequency and the Earth Rotation Parameters Determination», it was concluded that applying weight coefficients when determining the parameters of the regression model of the frequency changes of the reference standard slightly reduces the average relative errors over an annual observation interval, on average by 1%. It was found that using sliding-interval estimation of the frequency model parameters of the hydrogen masers at each day of measurements can considerably reduce the instability of the analytical frequency of the group hydrogen keeper. Using the new method, the relative decrease in frequency instability was approximately 10% over a three-month observation interval.
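A minimal numerical sketch of the idea of weighted, sliding-interval estimation, not the complex's actual software: a weighted least-squares fit of a linear frequency model over a sliding window of daily measurements. The window length, weights and sample data are assumptions.

```c
/* Minimal sketch (not the complex's actual software): weighted least-squares fit of
 * a linear frequency model y = a + b*t over a sliding window of daily relative
 * frequency measurements. Window length, weights and sample data are assumptions. */
#include <stdio.h>

#define DAYS   10
#define WINDOW 5

/* Weighted least squares for y = a + b*t over window [start, start+WINDOW). */
static void wls_fit(const double t[], const double y[], const double w[],
                    int start, double *a, double *b) {
    double sw = 0, swt = 0, swy = 0, swtt = 0, swty = 0;
    for (int i = start; i < start + WINDOW; ++i) {
        sw += w[i]; swt += w[i] * t[i]; swy += w[i] * y[i];
        swtt += w[i] * t[i] * t[i]; swty += w[i] * t[i] * y[i];
    }
    double det = sw * swtt - swt * swt;
    *b = (sw * swty - swt * swy) / det;
    *a = (swy - *b * swt) / sw;
}

int main(void) {
    /* Hypothetical daily relative frequency offsets of one maser and their weights. */
    double t[DAYS], y[DAYS], w[DAYS];
    for (int i = 0; i < DAYS; ++i) {
        t[i] = i;
        y[i] = 2.0 + 0.03 * i + ((i % 3) - 1) * 0.01;   /* drift plus small noise */
        w[i] = 1.0;                                     /* equal weights as a placeholder */
    }
    for (int start = 0; start + WINDOW <= DAYS; ++start) {
        double a, b;
        wls_fit(t, y, w, start, &a, &b);
        printf("window from day %d: offset a = %.4f, drift b = %.4f per day\n", start, a, b);
    }
    return 0;
}
```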
Continue...
As the title implies, the paper considers the 64‑bit extension of x86 (x86-64) and features of 64‑bit operating systems. The x86-64 architecture has a number of new characteristics, such as 64‑bit integer capability, additional general-purpose registers, a larger physical address space, RIP-relative addressing, additional XMM registers, etc. Some of these issues are reflected in our article. We focus on the main features of this specification from the point of view of programming. Much attention is given to such concepts as the «red zone» and «shadow space», which are associated with calling conventions. The paper discusses in detail the differences in the approaches to calling conventions adopted in the Windows and UNIX families of 64‑bit operating systems. We compare the low-level program architecture for different operating systems supporting the x86-64 specification. The paper gives examples of analysis of executable code compiled for operating systems with different calling conventions. For the purity of the experiment, the programs for the different operating systems are translated by C compilers from the GNU Compiler Collection (GCC). For the study of executable code for 64‑bit Windows and UNIX operating systems, we used the disassembler IDA Pro version 5.5.
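As a hedged illustration of the kind of experiment the paper describes, the small C function below takes more integer arguments than either convention passes in registers; compiling it with GCC for a System V target and for the Microsoft x64 target (for example with x86_64-w64-mingw32-gcc) and comparing the disassembly, as the authors do with IDA Pro, makes the different register assignments, the red zone and the shadow space visible. The comments state the conventions as commonly documented, not as findings of the paper.

```c
/* Illustrative function for comparing x86-64 calling conventions (not code from the
 * paper). In the disassembly for each target one can observe:
 *   - System V (Linux/UNIX): integer args in RDI, RSI, RDX, RCX, R8, R9; the 7th goes
 *     on the stack; a leaf function may use the 128-byte "red zone" below RSP.
 *   - Microsoft x64 (Windows): integer args in RCX, RDX, R8, R9; the 5th onward go on
 *     the stack, and the caller reserves 32 bytes of "shadow space" for the register args. */
#include <stdio.h>

long sum7(long a, long b, long c, long d, long e, long f, long g) {
    return a + b + c + d + e + f + g;   /* forces both register and stack argument passing */
}

int main(void) {
    printf("%ld\n", sum7(1, 2, 3, 4, 5, 6, 7));
    return 0;
}
```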
Continue...