Abstract:
The word "martingale" has related, but different, meanings in
probability theory and theoretical computer science. In computational
complexity and algorithmic information theory, a martingale is
typically a function d on strings such that E(d(wb) | w) = d(w) for
all strings w, where the conditional expectation is computed over all
possible values of the next symbol b. In modern probability theory a
martingale is typically a sequence ξ_0, ξ_1, ξ_2, ... of random
variables such that E(ξ_{n+1} | ξ_0, ..., ξ_n) = ξ_n for all n.
This paper elucidates the relationship between these two notions and proves that the latter notion is too weak for many purposes in computational complexity: under the latter definition, every computable martingale can be simulated by a polynomial-time computable martingale.
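The string-martingale condition E(d(wb) | w) = d(w) says that, when the next symbol is drawn uniformly, the expected capital after one more bet equals the current capital; over the binary alphabet this is d(w) = (d(w0) + d(w1))/2. A minimal sketch, using a hypothetical betting strategy chosen only for illustration (wager 3/4 of the capital on the next bit being 1), and checking the fairness condition exhaustively on short strings:

```python
from itertools import product

def d(w: str) -> float:
    """Capital of a martingale that always bets 3/4 on the next bit
    being 1 (illustrative strategy, not from the paper):
    d(wb) = 2 * d(w) * p_b with p_1 = 3/4, p_0 = 1/4, and d("") = 1."""
    cap = 1.0
    for b in w:
        cap *= 2 * (0.75 if b == "1" else 0.25)
    return cap

# Check the averaging (fairness) condition d(w) = (d(w0) + d(w1)) / 2
# for all binary strings of length < 4.
for n in range(4):
    for bits in product("01", repeat=n):
        w = "".join(bits)
        assert abs(d(w) - (d(w + "0") + d(w + "1")) / 2) < 1e-9
```

Any strategy of the form d(wb) = 2 · d(w) · p_b with p_0 + p_1 = 1 satisfies the condition; the probability-theory notion replaces this exact per-string equation with a conditional expectation over a filtration, which is the gap the paper examines.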