Let $X_1, X_2, \dots, X_n$ be independent random variables and denote the partial sums by
$$S_j = \sum_{i=1}^{j} X_i, \qquad j = 1, \dots, n.$$
The well-known Kolmogorov inequality can be stated as follows: for all $\varepsilon > 0$,
$$P\Big(\max_{1\le j\le n}|S_j|\ge\varepsilon\Big)\le\frac{\operatorname{Var}(S_n)}{\varepsilon^2}.$$
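As a quick numerical sanity check (my own sketch, not part of the original post; the normal increments, sample size, and threshold are arbitrary choices), the following Python snippet estimates the left-hand side by simulation and compares it with the bound:

```python
import numpy as np

rng = np.random.default_rng(0)
n, trials, eps = 20, 200_000, 10.0

# i.i.d. standard normal increments, so Var(S_n) = n (an arbitrary illustrative choice)
X = rng.standard_normal((trials, n))
S = np.cumsum(X, axis=1)                      # partial sums S_1, ..., S_n for each trial

lhs = np.mean(np.abs(S).max(axis=1) >= eps)   # Monte Carlo estimate of P(max_j |S_j| >= eps)
bound = n / eps**2                            # Var(S_n) / eps^2
print(f"P(max|S_j| >= {eps}) ~= {lhs:.4f} <= {bound:.4f}")
```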
The one-sided Kolmogorov-type inequality is stated as follows: for all $\varepsilon\ge 0$,
$$P\Big(\max_{1\le j\le n}S_j\ge\varepsilon\Big)\le\frac{\operatorname{Var}(S_n)}{\varepsilon^2+\operatorname{Var}(S_n)}.$$
We will prove this inequality below.
Proposition. Let $X$ be a random variable with $\operatorname{Var}(X)<\infty$. Then for all $\varepsilon\ge 0$,
$$P\big(X-E(X)\ge\varepsilon\big)\le\frac{\operatorname{Var}(X)}{\varepsilon^2+\operatorname{Var}(X)}.$$
Proof. Without loss of generality, we may assume that $E(X)=0$. Then
$$\varepsilon=E(\varepsilon-X)=E\{(\varepsilon-X)I_{X<\varepsilon}\}+E\{(\varepsilon-X)I_{X\ge\varepsilon}\}\le E\{(\varepsilon-X)I_{X<\varepsilon}\},$$
since $\varepsilon-X\le 0$ on the event $\{X\ge\varepsilon\}$.
By the Cauchy–Schwarz inequality, we have
$$\varepsilon^2\le\big[E\{(\varepsilon-X)I_{X<\varepsilon}\}\big]^2\le E(\varepsilon-X)^2\,P(X<\varepsilon)=\big[\varepsilon^2+\operatorname{Var}(X)\big]\big[1-P(X\ge\varepsilon)\big],$$
where $E(\varepsilon-X)^2=\varepsilon^2+\operatorname{Var}(X)$ because $E(X)=0$.
Therefore,
$$P(X\ge\varepsilon)\le\frac{\operatorname{Var}(X)}{\varepsilon^2+\operatorname{Var}(X)},$$
which proves the proposition.
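This proposition is Cantelli's (one-sided Chebyshev) inequality. For a concrete feel (my illustration, not in the original post), take $X=Y-1$ with $Y$ exponential of rate $1$, so $E(X)=0$, $\operatorname{Var}(X)=1$, and the exact tail is $P(X\ge\varepsilon)=e^{-(1+\varepsilon)}$; the script compares it with the one-sided bound and with the two-sided Chebyshev bound:

```python
import math

var = 1.0                                  # Var(X) for X = Exp(1) - 1
for eps in (1.0, 2.0, 3.0):
    exact = math.exp(-(1.0 + eps))         # P(X >= eps) = P(Exp(1) >= 1 + eps)
    one_sided = var / (eps**2 + var)       # bound from the proposition
    chebyshev = var / eps**2               # two-sided Chebyshev bound, for comparison
    print(f"eps={eps}: exact={exact:.4f}, one-sided={one_sided:.4f}, Chebyshev={chebyshev:.4f}")
```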
Proof of the one-sided Kolmogorov-type inequality. Let $\Lambda=\{\max_{1\le j\le n}S_j\ge\varepsilon\}$ and $\Lambda_k=\{\max_{1\le j<k}S_j<\varepsilon,\ S_k\ge\varepsilon\}$; then $\Lambda=\bigcup_{k=1}^{n}\Lambda_k$ with the $\Lambda_k$ pairwise disjoint. Without loss of generality, we assume that $E(X_j)=0$, $j=1,\dots,n$.
Then, by the independence of the random variables,
$$\begin{aligned}
\varepsilon&=E[\varepsilon-S_n]=E[(\varepsilon-S_n)I_{\Lambda}]+E[(\varepsilon-S_n)I_{\Lambda^{c}}]\\
&=\sum_{k=1}^{n}E[(\varepsilon-S_n)I_{\Lambda_k}]+E[(\varepsilon-S_n)I_{\Lambda^{c}}]
=\sum_{k=1}^{n}E\big[\{(\varepsilon-S_k)-(S_n-S_k)\}I_{\Lambda_k}\big]+E[(\varepsilon-S_n)I_{\Lambda^{c}}]\\
&=\sum_{k=1}^{n}E[(\varepsilon-S_k)I_{\Lambda_k}]-\sum_{k=1}^{n}E[(S_n-S_k)I_{\Lambda_k}]+E[(\varepsilon-S_n)I_{\Lambda^{c}}]\\
&=\sum_{k=1}^{n}E[(\varepsilon-S_k)I_{\Lambda_k}]+E[(\varepsilon-S_n)I_{\Lambda^{c}}]
\le E[(\varepsilon-S_n)I_{\Lambda^{c}}],
\end{aligned}$$
where the middle sum vanishes because $S_n-S_k$ is independent of $I_{\Lambda_k}$ and $E(S_n-S_k)=0$, and the final inequality holds because $\varepsilon-S_k\le 0$ on $\Lambda_k$.
By the Cauchy–Schwarz inequality, we have
$$\varepsilon^2\le\big\{E[(\varepsilon-S_n)I_{\Lambda^{c}}]\big\}^2\le E\big[(\varepsilon-S_n)^2\big]\,P(\Lambda^{c})=\big[\varepsilon^2+\operatorname{Var}(S_n)\big]\big[1-P(\Lambda)\big].$$
Therefore,
$$P\Big(\max_{1\le j\le n}S_j\ge\varepsilon\Big)=P(\Lambda)\le\frac{\operatorname{Var}(S_n)}{\varepsilon^2+\operatorname{Var}(S_n)},$$
as claimed.
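As with the two-sided inequality, a quick Monte Carlo check (again my own sketch, not from the original post; the Rademacher increments and parameters are arbitrary choices) illustrates the bound just proved:

```python
import numpy as np

rng = np.random.default_rng(1)
n, trials, eps = 20, 200_000, 10.0

# i.i.d. Rademacher (+/-1) increments, so E(X_j) = 0 and Var(S_n) = n
X = rng.choice([-1.0, 1.0], size=(trials, n))
S = np.cumsum(X, axis=1)

lhs = np.mean(S.max(axis=1) >= eps)        # Monte Carlo estimate of P(max_j S_j >= eps)
bound = n / (eps**2 + n)                   # Var(S_n) / (eps^2 + Var(S_n))
print(f"P(max S_j >= {eps}) ~= {lhs:.4f} <= {bound:.4f}")
```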
The inequality also holds for martingale difference sequences; the proof is similar.
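For completeness, here is a sketch (mine, not from the original post) of the only step that uses independence: if $(X_j)$ is a martingale difference sequence with respect to a filtration $(\mathcal F_k)$, then $\Lambda_k$ is $\mathcal F_k$-measurable, so
$$E\big[(S_n-S_k)I_{\Lambda_k}\big]=E\big[E(S_n-S_k\mid\mathcal F_k)\,I_{\Lambda_k}\big]=0,$$
and the rest of the argument goes through verbatim, since $E(S_n)=0$ still gives $E(\varepsilon-S_n)^2=\varepsilon^2+\operatorname{Var}(S_n)$.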
On One-Sided Kolmogorov-Type Inequalities
Original post: http://www.cnblogs.com/levin2013/p/3528824.html