Prof. Boris Ryabko
Scientific Biography
My main scientific fields of interest are Information Theory, Cryptography and Steganography, Mathematical Statistics and Prediction, Complexity of Algorithms, and Mathematical Biology.
The main results are as follows.
INFORMATION THEORY and PREDICTION
1979 ---
I proved that the minimax redundancy of a universal code equals the capacity of an appropriately defined channel. This is very important for source coding theory, because channel capacity is the central concept of Information Theory. The result was independently discovered by R. Gallager, but was not published (see my note in IEEE Trans. Inform. Theory, 1981, v. 27, n. 6, pp. 780-781, and the editor's comment after that note). Nowadays this result is called the "Gallager-Ryabko Theorem" in many papers.
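In modern notation the statement can be sketched as follows (a paraphrase in notation of my own choosing, not a quotation from the original paper): for a parametric family of sources $\{P_\theta\}_{\theta \in \Theta}$,
\[
\min_{Q}\ \max_{\theta \in \Theta}\ D\!\left(P_\theta^{(n)} \,\|\, Q\right) \;=\; C_n ,
\]
where $P_\theta^{(n)}$ is the distribution of $n$-blocks under parameter $\theta$, $Q$ ranges over coding distributions, $D(\cdot\,\|\,\cdot)$ is the Kullback-Leibler divergence (the redundancy of coding with $Q$), and $C_n$ is the capacity of the "channel" whose input is $\theta$ and whose output is the generated $n$-block.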
1980 ---
The "book stack" code was suggested and published. (This code was later rediscovered by J. L. Bentley, D. D. Sleator, R. E. Tarjan and V. K. Wei, and is often called the "move-to-front" code; see my letter in Comm. ACM, v. 30, n. 9, p. 792, 1987.)
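A minimal sketch of the idea in Python (my own illustration for this page, not pseudocode from the original publication): each symbol is encoded by its current position in a "stack" of symbols and is then moved to the top, so recently used symbols receive small indices, which a subsequent entropy coder can exploit.

def book_stack_encode(data, alphabet):
    """Encode a sequence by current stack position (move-to-front)."""
    stack = list(alphabet)             # initial ordering of the alphabet
    out = []
    for sym in data:
        i = stack.index(sym)           # position of the symbol in the stack
        out.append(i)                  # emit the position (small if seen recently)
        stack.insert(0, stack.pop(i))  # move the symbol to the top of the stack
    return out

def book_stack_decode(codes, alphabet):
    """Invert the transform."""
    stack = list(alphabet)
    out = []
    for i in codes:
        sym = stack.pop(i)
        out.append(sym)
        stack.insert(0, sym)
    return out

# Repeated symbols turn into runs of small numbers.
assert book_stack_decode(book_stack_encode("abbbac", "abc"), "abc") == list("abbbac")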
1984 ---
The "twice-universal code" was suggested. This code can compress sources with unknown probabilities and unknown memory (or connectivity). Nowadays this construction is widely used in source coding.
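A hedged sketch of the construction in modern notation (details differ between papers): mix universal measures for Markov sources of every memory $m$,
\[
R(x_1 \dots x_n) \;=\; \sum_{m \ge 0} \omega_m \, P_m(x_1 \dots x_n), \qquad \omega_m > 0,\quad \sum_{m \ge 0} \omega_m = 1,
\]
where $P_m$ is a universal measure for Markov sources of memory $m$; encoding $x_1 \dots x_n$ with about $-\log_2 R(x_1 \dots x_n)$ bits (e.g., by arithmetic coding) is then universal over all memories simultaneously.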
1986 ---
The theory of coding of combinatorial sources was suggested. (A combinatorial source is defined simply as a set of infinite sequences, so this is a non-probabilistic definition; for example, the set of all infinite texts in English (or C++) is a combinatorial source.) It was shown that the Hausdorff dimension plays the role of the Shannon entropy for combinatorial sources (when the entropy does not exist).
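In outline (my paraphrase, with notation not taken from the paper): for a combinatorial source $A$ over a $k$-letter alphabet, a natural "entropy" is
\[
h(A) \;=\; \lim_{n \to \infty} \frac{\log_k N_A(n)}{n},
\]
where $N_A(n)$ is the number of distinct $n$-prefixes of sequences in $A$. This limit need not exist, whereas the Hausdorff dimension of $A$ (viewed as a subset of $[0,1]$ via $k$-ary expansions) is always defined and takes over the role of entropy as the benchmark for achievable compression.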
1988 ---
It was shown that the problems of universal coding and (universal) prediction are very close from a mathematical point of view; in fact, this was the first paper in which universal prediction was considered. Besides, the following fact was proven: it is impossible to predict the output of a stationary ergodic source with a precision that goes to 0 monotonically. (Later this was posed as an open problem by T. Cover in his book.)
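The link can be sketched as follows (a standard reduction, stated in notation of my own choosing): a code whose length on $x_1 \dots x_t$ is close to $-\log_2 \mu(x_1 \dots x_t)$ for some measure $\mu$ induces the predictor
\[
\mu(x_{t+1} = a \mid x_1 \dots x_t) \;=\; \frac{\mu(x_1 \dots x_t\,a)}{\mu(x_1 \dots x_t)},
\]
and small coding redundancy translates into small prediction error; conversely, a predictor defines a measure by the chain rule, which arithmetic coding turns into a code.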
1989 ---
A new data structure for maintaining cumulative probabilities was suggested in my papers [12-14]. Later this structure and algorithm were applied to arithmetic coding by Fenwick, Moffat and others, and the 'standard' arithmetic code is now based on that structure. (See Moffat, Neal and Witten, "Arithmetic coding revisited", ACM Trans. Information Systems, 16(3), pp. 256-294.)
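The structure is essentially what is now usually called a Fenwick tree (binary indexed tree). A minimal sketch in Python, written for this page rather than taken from [12-14]: it maintains symbol frequencies so that both a cumulative-frequency query and a frequency update take O(log n) time, which is exactly what adaptive arithmetic coding needs.

class FenwickTree:
    """Maintains cumulative frequencies with O(log n) query and update."""
    def __init__(self, size):
        self.tree = [0] * (size + 1)   # 1-based internal array

    def update(self, i, delta):
        """Add delta to the frequency of symbol i (0-based)."""
        i += 1
        while i < len(self.tree):
            self.tree[i] += delta
            i += i & (-i)              # move to the next responsible node

    def cumulative(self, i):
        """Return the total frequency of symbols 0..i-1."""
        total = 0
        while i > 0:
            total += self.tree[i]
            i -= i & (-i)              # strip the lowest set bit
        return total

# Example: adaptive model over an alphabet of 256 symbols.
model = FenwickTree(256)
model.update(ord('a'), 1)                   # symbol 'a' observed once
model.update(ord('b'), 3)                   # symbol 'b' observed three times
assert model.cumulative(ord('b')) == 1      # total frequency below 'b'
assert model.cumulative(ord('b') + 1) == 4  # total up to and including 'b'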
1990-2002 ---
New fast and efficient algorithms were suggested for homophonic codes, low-entropy sources, and many other codes and problems (see the list of publications). A new approach to prediction and source coding, based on Hausdorff dimension and complexity theory, was suggested and developed.
MATHEMATICAL STATISTICS
2005 ---
It was shown in [55,56,60,61,63] how universal codes can be used to solve some of the most important statistical problems for time series. By definition, a universal code (or universal lossless data compressor) can compress any sequence generated by a stationary ergodic source asymptotically down to the Shannon entropy, which, in turn, is the best achievable rate for lossless data compressors.
We considered finite-alphabet and real-valued time series and the following problems: estimation of the limiting probabilities for finite-alphabet time series and estimation of the density for real-valued ones; on-line prediction, regression and classification (or problems with side information) for both types of time series; and the following problems of hypothesis testing: goodness-of-fit (or identity) testing and testing of serial independence. It is important to note that all the problems are considered within the framework of classical mathematical statistics while, on the other hand, everyday data compression methods (archivers) can be used as a tool for the estimation and testing. It turns out that, quite often, the suggested methods and tests are more powerful in practice than previously known ones.
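A hedged sketch of the flavour of such a test (a simplified illustration written for this page, not the exact procedure of [55,56,60,61,63]): for identity testing against a fully specified hypothesis H0, compare the length achieved by an off-the-shelf compressor with the length that, under H0, no lossless code can beat except with small probability.

import bz2
import math

def identity_test(sample: bytes, h0_probs: dict, alpha: float = 0.05) -> bool:
    """Reject H0 if a real compressor codes the sample far below -log2 P_H0(sample).

    By the no-hypercompression inequality, under H0 a lossless code is shorter
    than -log2 P_H0(sample) - log2(1/alpha) bits with probability at most alpha.
    """
    compressed_bits = 8 * len(bz2.compress(sample))
    h0_bits = -sum(math.log2(h0_probs[b]) for b in sample)
    return compressed_bits < h0_bits - math.log2(1.0 / alpha)

# Hypothetical usage: are the bytes i.i.d. fair coin flips over {0, 1}?
data = bytes([0, 1, 1, 0, 1] * 200)           # illustrative input only
print(identity_test(data, {0: 0.5, 1: 0.5}))  # expected to reject: the pattern is compressible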
COMPLEXITY OF ALGORITHMS
1998 ---
Fast algorithms were discovered for ranking (enumeration) of combinatorial objects (permutations, combinations, etc.).
The suggested algorithms are exponentially faster than previously known ones (see [29,30]).
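For illustration, here is the classical ranking of a combination, i.e., computing its index in lexicographic order (the textbook method, not the fast algorithms of [29,30], which reduce the cost of the long-number arithmetic involved):

from math import comb

def rank_combination(c, n):
    """Lexicographic index of a sorted k-combination c drawn from {0, ..., n-1}."""
    k = len(c)
    rank = 0
    prev = -1
    for i, ci in enumerate(c):
        # Count combinations that agree with c before position i
        # and have a smaller element at position i.
        for x in range(prev + 1, ci):
            rank += comb(n - x - 1, k - i - 1)
        prev = ci
    return rank

# Example: among the C(5, 2) = 10 pairs from {0, ..., 4},
# [0, 1] is the first and [3, 4] is the last.
assert rank_combination([0, 1], 5) == 0
assert rank_combination([3, 4], 5) == 9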
MATHEMATICAL BIOLOGY (see [10,16,17,24,25,27,28,35,38,42])
The main difficulties in the analysis of animal 'languages' appear to be methodological. Many workers have tried to decipher animal language directly, by looking for 'letters' and 'words' and by compiling 'dictionaries'. The fact that investigators have managed to compile such dictionaries for only a few species seems to indicate not that other animals lack 'languages', but that adequate methods are lacking.
Prof. Zh. Reznikova and I have suggested a fundamentally new approach to studying the quantitative characteristics of communicative systems and important properties of animal intelligence. The main point of this approach is not to decipher signals but to investigate the very process of information transmission, by measuring the time that the animals spend on transmitting messages of definite length and complexity. This makes it possible to estimate intellectual potential by observing the communicative process. Our experiments, based on ideas of Information Theory, have shown that ants probably have an even more intricate form of communication than the honeybee. We also succeeded in studying some properties of insect cognitive capacities, namely their ability to perform limited counting and to memorize simple regularities, thus compressing the available information. The analysis of the duration of ants' contacts enabled us to form a hypothesis about how they use numbers and coordinates in their communication. Finally, the ants seem to be able to add and subtract small numbers, and this result is proved using ideas of Information Theory.
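The quantitative core of the method can be illustrated by a small sketch (all numbers below are hypothetical placeholders, not experimental data): if the duration of a contact grows roughly linearly with the amount of information in the message, the slope of a least-squares fit estimates the rate of information transmission in bits per second.

import numpy as np

# Hypothetical measurements: information content of each message (bits)
# and the observed duration of the corresponding contact (seconds).
bits = np.array([2.0, 3.0, 4.0, 5.0, 6.0])
seconds = np.array([25.0, 38.0, 51.0, 60.0, 74.0])

# Least-squares line: seconds ~= a * bits + b.
a, b = np.polyfit(bits, seconds, 1)
rate = 1.0 / a   # estimated transmission rate, bits per second
print(f"estimated rate: {rate:.3f} bits/s")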