Non-stationary Markov chains


NON-STATIONARY MARKOV CHAINS

A THESIS
SUBMITTED TO THE DEPARTMENT OF MATHEMATICS
AND THE INSTITUTE OF ENGINEERING AND SCIENCES OF BILKENT UNIVERSITY
IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF
MASTER OF SCIENCE

By
Saed Mallak
July, 1996

I certify that I have read this thesis and that in my opinion it is fully adequate, in scope and in quality, as a thesis for the degree of Master of Science.

Asst. Prof. Dr. Azer Kerimov (Principal Advisor)

I certify that I have read this thesis and that in my opinion it is fully adequate, in scope and in quality, as a thesis for the degree of Master of Science.

Prof. Dr. Mefharet Kocatepe

I certify that I have read this thesis and that in my opinion it is fully adequate, in scope and in quality, as a thesis for the degree of Master of Science.

Assoc. Prof. Dr. Farhad Hüseyinov

Approved for the Institute of Engineering and Sciences:

Prof. Dr. Mehmet Baray
Director of Institute of Engineering and Sciences

ABSTRACT

NON-STATIONARY MARKOV CHAINS

Saed Mallak
M.S. in Mathematics
Supervisor: Asst. Prof. Dr. Azer Kerimov
July, 1996

In this work, we studied the Ergodicity of Non-Stationary Markov chains. We gave several examples with different cases. We proved that given a sequence of Markov chains such that the limit of this sequence is an Ergodic Markov chain, the limit of the combination of the elements of this sequence is again Ergodic (under some condition if the state space is infinite). We also proved that the limit of the combination of an arbitrary sequence of Markov chains on a finite state space is Weak Ergodic if it satisfies some condition. Under the same condition, the limit of the combination of a doubly stochastic sequence of Markov chains is Ergodic.

Keywords: Markov chain, Stochastic, Doubly stochastic, Irreducible, Aperiodic matrix, Persistent, Transient, Ergodic, Ergodic Theorem.

ÖZET

NON-STATIONARY MARKOV CHAINS

Saed Mallak
M.S. in Mathematics
Supervisor: Asst. Prof. Dr. Azer Kerimov
July, 1996

In this work we studied the Ergodicity of non-stationary Markov chains. We gave examples of several different cases. We proved that, for a sequence of Markov chains whose limit is an Ergodic Markov chain, the limit of the combination of this sequence is again an Ergodic Markov chain (if the state space is infinite, this holds under some conditions). We also showed that, for an arbitrary sequence of Markov chains on a finite state space satisfying certain conditions, the limit of the combination is Weak Ergodic. Again under certain conditions, if the sequence consists of doubly stochastic Markov chains, the limit of the combination is Ergodic.

Keywords: Markov chain, Stochastic, Doubly stochastic, Irreducible, Aperiodic matrix, Persistent, Transient, Ergodic, Ergodic Theorem.

ACKNOWLEDGMENT

As soon as I had started my work on this thesis, I was shocked by the sudden death of my father, to whom I owe everything in my life. To his spirit I pray, and to his loving memory I dedicate this thesis.

It is my pleasure to thank my supervisor Asst. Prof. Dr. Azer Kerimov for his supervision, guidance and encouragement during my research in this thesis.

Words are not enough to express my thanks to my family, who stood by my side in good and bad times.

TABLE OF CONTENTS

1 Introduction 1
2 Stationary Markov Chains 3
2.1 Definitions 3
2.2 Classifications Of The Chains 4
2.3 Convergence Theorems (Ergodic Theorems) 5
2.4 Summary 6
3 Non-Stationary Markov Chains And Examples 7
3.1 Introduction to Non-Stationary Markov Chains 7
3.2 Examples of Stationary Markov Chains 8
3.3 Examples of Non-Stationary Markov Chains 9
3.4 Examples of Sequences of Markov Chains 14
4 Convergence Theorems Related With Non-Stationary Markov Chains 18
5 Conclusion And Comments 28

Chapter 1

Introduction

The basic property characterizing Markov chains is a probabilistic analogue of a familiar property of dynamical systems. If one has a system of particles and the positions and velocities of all particles are given at time t, the equations of motion can be completely solved for the future development of the system. Therefore, any other information given concerning the past of the process up to time t is superfluous as far as future development is concerned. The present state of the system contains all relevant information concerning the future.

Markov chains are stochastic processes, which are ways of quantifying the dynamic relationships of sequences of random variables. Stochastic models play an important role in many areas of the natural and engineering sciences.

Indeed, if we have a sequence of random variables with values in a discrete, countable set, then any such sequence can form a Markov chain, defined through the conditional probabilities relating the elements of this sequence.

The most interesting object of the theory of Markov chains is the asymptotic behavior of these probabilities. The most interesting case is when we have independence of the initial state; that is, starting from any state, the particle reaches the desired state with almost the same probability. A Markov chain satisfying this is called an Ergodic Markov chain. We may characterize Ergodic Markov chains by the saying: "All The Ways Lead To Rome".

In the second chapter, we give a general review of the theory of Stationary Markov chains: definitions, classifications of the chains and main theorems.

In the third chapter, we introduce another situation, namely combinations of transition probabilities of different Markov chains; that is, we have a different transition matrix in each transition (step). We give several examples to illustrate and explain this idea. Let us call such a situation Non-Stationary Markov chains.

In the fourth chapter, we state and prove some facts about Non-Stationary Markov chains.

Chapter 2

Stationary Markov Chains

2.1 Definitions

Definition 1 Let $S$ be a finite or countable set. Suppose that to each pair $i, j \in S$ there is assigned a non-negative number $p_{ij}$ such that these numbers satisfy the constraint:

$$\sum_{j \in S} p_{ij} = 1. \qquad (2.1)$$

Let $X_0, X_1, X_2, \cdots$ be a sequence of random variables whose ranges are contained in $S$. The sequence is a Markov chain if:

$$P[X_{n+1} = j \mid X_0 = i_0, \cdots, X_n = i_n] = P[X_{n+1} = j \mid X_n = i_n] = p_{i_n j} \qquad (2.2)$$

for all $n$ and every sequence $i_0, \cdots, i_n$ in $S$ for which $P[X_0 = i_0, \cdots, X_n = i_n] > 0$. This property is called the Markov Property.

Definition 2 $S$ is called the state space or the phase space of the Markov chain.

Definition 3 A square matrix $P$ is called a stochastic matrix if all its entries are non-negative and the summation of the elements of each row is 1.

Definition 4 Let $P = [p_{ij}]_{i,j \in S}$. $P$ is called the one step transition (probability) matrix of this Markov chain.

$P^2 = [p_{ij}^{(2)}]_{i,j \in S}$; $p_{ij}^{(2)}$ means starting from $i$ reaching $j$ in two steps, and $P^2$ is the second step transition matrix.

And for any positive integer $n$, $P^n = [p_{ij}^{(n)}]_{i,j \in S}$; $p_{ij}^{(n)}$ means starting from $i$ reaching $j$ in $n$ steps, and $P^n$ is the n-th step transition matrix.

Definition 5 A sequence of random variables $(X_n)_{n \geq 1}$ is called a stationary sequence if for each pair of natural numbers $k$ and $n$, $(X_1, \cdots, X_n)$ and $(X_{k+1}, \cdots, X_{k+n})$ have the same distribution.

Remark 1 Since we have the same transition matrix in each step, it is clear that the sequence of random variables which forms a Markov chain is stationary.

2.2 Classifications Of The Chains

Let $f_{ij}^{(n)} = P_i(X_1 \neq j, \cdots, X_{n-1} \neq j, X_n = j)$, the probability of reaching $j$ from $i$ for the first time in $n$ steps, and let $f_{ij} = \sum_{n=1}^{\infty} f_{ij}^{(n)}$.

Definition 6

1. A state $i \in S$ is called persistent if $f_{ii} = 1$, transient if $f_{ii} < 1$.

2. A state $i \in S$ is called null persistent if the mean recurrence time $\mu_i = \sum_{n} n f_{ii}^{(n)} = \infty$.

3. A state $i \in S$ is called periodic if $\exists t > 1$ such that $p_{ii}^{(n)} = 0$ unless $n$ is a multiple of $t$; otherwise it is called aperiodic.

4. A Markov chain is called irreducible if $\forall i, j \in S$ there exists $n$ such that $p_{ij}^{(n)} > 0$; otherwise it is called reducible.

5. A Markov chain is called Ergodic if all its states are persistent, aperiodic and non-null persistent; in this case there exists a stationary distribution.

6. A set of probabilities $(\pi_j)_{j \in S}$ satisfying $\sum_{i} \pi_i p_{ij} = \pi_j$ is called a stationary distribution.

Theorem 1 A state $j$ is persistent if and only if $P_j[X_n = j \text{ i.o.}] = 1$ and $\sum_n p_{jj}^{(n)} = \infty$.

A state $j$ is transient if and only if $P_j[X_n = j \text{ i.o.}] = 0$ and $\sum_n p_{jj}^{(n)} < \infty$.

Here i.o. stands for infinitely often.

Proof:

See, for example, [3], [1].

Lemma 1 $P_j[X_n = j \text{ i.o.}]$ is either 0 or 1.

Proof:

See [3].

Theorem 2 If a Markov chain is irreducible, then either all states are transient, $P_i[\bigcup_j (X_n = j \text{ i.o.})] = 0$, $\forall i, j \in S$ and $\sum_n p_{ij}^{(n)} < \infty$; or all states are persistent, $P_i[\bigcap_j (X_n = j \text{ i.o.})] = 1$, $\forall i, j \in S$ and $\sum_n p_{ij}^{(n)} = \infty$.

Proof:

See [3], [6], [9].

Remark 2 Since $\sum_{j \in S} p_{ij}^{(n)} = 1$, the first alternative above is impossible if $S$ is a finite set; that is, a finite irreducible Markov chain is persistent.

2.3 Convergence Theorems (Ergodic Theorems)

Theorem 3 Suppose of an irreducible, aperiodic Markov chain that there exists a stationary distribution, that is, a solution of $\sum_{i \in S} \pi_i p_{ij} = \pi_j$, $j \in S$, satisfying $\pi_i \geq 0$ and $\sum_{i \in S} \pi_i = 1$. Then the chain is persistent and $\lim_{n \to \infty} p_{ij}^{(n)} = \pi_j$, $\forall i, j \in S$.

Theorem 4 In the previous theorem, if the state space is finite, then a stationary distribution always exists; that is, a finite irreducible aperiodic Markov chain is Ergodic.

Proof:

See [6], [9].

Remark 3 The main point of the conclusion is that, since $p_{ij}^{(n)}$ reaches $\pi_j$ for large $n$, the effect of the initial state wears off; that is, the chain is very stable.

Corollary 1 Let $\xi_1, \xi_2, \cdots$ be a sequence of random variables which forms an Ergodic Markov chain. Let

$$f_j(\xi_n) = \begin{cases} 1 & \text{if } \xi_n = j, \\ 0 & \text{otherwise,} \end{cases} \qquad V_j(n) = \frac{f_j(\xi_1) + \cdots + f_j(\xi_n)}{n}.$$

Then $V_j(n) \to \pi_j$ a.s., where a.s. stands for almost surely.

Proof:

See [5].

2.4 Summary

For an irreducible, aperiodic Markov chain there exist three possibilities:

1. The chain is transient: $\forall i, j \in S$, $\lim_{n \to \infty} p_{ij}^{(n)} = 0$ and $\sum_n p_{ij}^{(n)} < \infty$. If the state space is finite, then this case is impossible.

2. The chain is null persistent: there exists no stationary distribution, and $\forall i, j \in S$, $\lim_{n \to \infty} p_{ij}^{(n)} = 0$ and $\sum_n p_{ij}^{(n)} = \infty$. If the state space is finite, again this case is impossible.

3. The chain is Ergodic: there exists a stationary distribution, the chain is non-null persistent, $\lim_{n \to \infty} p_{ij}^{(n)} = \pi_j > 0$ and $\mu_j = 1/\pi_j < \infty$, $\forall j \in S$.
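The third possibility can be illustrated numerically. The sketch below is not part of the thesis; it uses plain Python and a hypothetical 2-state matrix $P$ to show that the rows of $P^n$ approach the stationary distribution, so the effect of the initial state wears off.

```python
def mat_mul(A, B):
    # Multiply two square matrices given as lists of row lists.
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def mat_pow(P, n):
    # n-th step transition matrix P^n.
    Q = [[float(i == j) for j in range(len(P))] for i in range(len(P))]
    for _ in range(n):
        Q = mat_mul(Q, P)
    return Q

P = [[0.9, 0.1],
     [0.4, 0.6]]          # irreducible and aperiodic, hence Ergodic

P50 = mat_pow(P, 50)
# pi solves pi P = pi with pi_0 + pi_1 = 1; here pi = (0.8, 0.2).
print(P50[0], P50[1])      # both rows are approximately (0.8, 0.2)
```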

Chapter 3

Non-Stationary Markov Chains And Examples

3.1 Introduction to Non-Stationary Markov Chains

Assume we have different Markov chains with different transition matrices; we will consider combinations of the probabilities of these chains. In other words, to get the higher probabilities of these combinations, we will use different transition matrices.

So, in one step, if we denote the probability of starting from state $i$ and reaching state $j$ in one step by $q_{ij}^{(1)}$, then $q_{ij}^{(1)}$ is the same as the one step transition probability of the first chain; denote it by $p_{ij}^{(1)}$ and the transition matrix by $P_1$.

In two steps, denote it by $q_{ij}^{(2)}$; then $q_{ij}^{(2)} = \sum_{k \in S} p_{ik}^{(1)} p_{kj}^{(2)}$, where $[p_{ij}^{(1)}]_{i,j \in S}$, denote it by $P_1$, is the one step transition matrix of the first chain and $[p_{ij}^{(2)}]_{i,j \in S}$, denote it by $P_2$, is the one step transition matrix of the second chain.

And in general, for any positive integer $n$, the n-th step probability of the combination is $q_{ij}^{(n)} = \sum_{k \in S} q_{ik}^{(n-1)} p_{kj}^{(n)}$, where $q_{ik}^{(n-1)}$ is the (n-1)-th step probability of the combination and $p_{kj}^{(n)}$ is the one step probability of the n-th chain; denote its matrix by $P_n$. In matrix form, $Q_n = P_1 P_2 \cdots P_n$.
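The recursion $q_{ij}^{(n)} = \sum_k q_{ik}^{(n-1)} p_{kj}^{(n)}$ is just repeated matrix multiplication. A minimal sketch in plain Python (the matrices $P_1$, $P_2$ below are hypothetical, not taken from the text):

```python
def mat_mul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def combination(chains):
    # Q_n = P_1 P_2 ... P_n for a list of one-step transition matrices.
    Q = chains[0]
    for P in chains[1:]:
        Q = mat_mul(Q, P)
    return Q

P1 = [[0.5, 0.5], [0.2, 0.8]]
P2 = [[0.7, 0.3], [0.6, 0.4]]
Q2 = combination([P1, P2])
# q_00^(2) = p_00^(1) p_00^(2) + p_01^(1) p_10^(2) = 0.5*0.7 + 0.5*0.6
print(Q2[0][0])   # approximately 0.65
```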

Obviously, since we consider combinations to find the higher order transition probability of a particular state, this means that we are in the same state space; that is, all the chains have the same state space.

We will use the same definitions of the original case for irreducible, reducible, periodic, aperiodic, transient, persistent, null persistent and Ergodic state (chain).

The main question will be about the Ergodicity of such combinations, that is, whether the limit of $q_{ij}^{(n)}$ exists or not, and whether the effect of the initial state wears off or not for large $n$. In particular, we will consider a sequence of Markov chains which tends to some Markov chain.

3.2 Examples of Stationary Markov Chains

Example 1 Consider a Markov chain whose transition matrix is:

$$P = \begin{pmatrix} p_{00} & p_{01} \\ p_{10} & p_{11} \end{pmatrix}, \qquad p_{ij} > 0, \ \forall i, j = 0, 1.$$

Such a chain is Ergodic with the unique stationary distribution:

$$\pi_0 = \frac{1 - p_{11}}{2 - p_{00} - p_{11}}, \qquad \pi_1 = \frac{1 - p_{00}}{2 - p_{00} - p_{11}}.$$
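A quick numerical check of this formula (the values $p_{00} = 0.3$, $p_{11} = 0.6$ are hypothetical):

```python
p00, p11 = 0.3, 0.6
p10 = 1 - p11
pi0 = (1 - p11) / (2 - p00 - p11)
pi1 = (1 - p00) / (2 - p00 - p11)

# Stationarity: (pi P)_0 = pi_0 * p00 + pi_1 * p10 should equal pi_0.
lhs = pi0 * p00 + pi1 * p10
print(lhs, pi0, pi0 + pi1)
```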

Example 2 A stochastic matrix is called doubly stochastic if all its columns also sum to 1, i.e., $\sum_{i} p_{ij} = 1$, $\forall j \in S$. It is clear that the n-th power of a doubly stochastic matrix is again doubly stochastic.

If an irreducible, aperiodic finite Markov chain has a doubly stochastic transition matrix, then this chain is Ergodic with the unique stationary distribution $\pi_j = 1/N$, $\forall j \in S$, where $N$ is the cardinality of $S$.

If the state space is infinite, then either all states are null persistent or all of them are transient.

Example 3 Assume we have an irreducible, aperiodic Markov chain on an infinite state space. Assume $\exists j_0 \in S$ such that $p_{i j_0} \geq \delta > 0$, $\forall i \in S$; then such a chain is Ergodic. Indeed:

$$p_{i j_0}^{(2)} = \sum_{k \in S} p_{ik} p_{k j_0} \geq \sum_{k \in S} p_{ik} \delta = \delta > 0,$$

and by induction $p_{i j_0}^{(n)} \geq \delta$, $\forall n = 1, 2, \cdots$.

Since the chain is aperiodic the limit exists, thus $j_0$ is Ergodic. Hence, since the chain is irreducible, all states are Ergodic; that is, this Markov chain is Ergodic.

3.3 Examples of Non-Stationary Markov Chains

Example 4 Let $A$ and $B$ be two transition matrices of two different Markov chains, where:

$$A = \begin{pmatrix} 0 & 1 \\ p & q \end{pmatrix}, \qquad B = \begin{pmatrix} p_1 & q_1 \\ 1 & 0 \end{pmatrix}.$$

Notice that both $A^2$ and $B^2$ are with non-zero entries, so both of them are transition matrices of Ergodic chains.

Now, if we consider the trivial combination of $A$ and $B$, that is $ABAB \cdots$, then this combination is not Ergodic; indeed:

$$AB = \begin{pmatrix} 0 & 1 \\ p & q \end{pmatrix} \begin{pmatrix} p_1 & q_1 \\ 1 & 0 \end{pmatrix} = \begin{pmatrix} 1 & 0 \\ p p_1 + q & p q_1 \end{pmatrix}.$$

According to this combination, the first state is absorbing; that is, once the particle is in the first state it does not leave it.

Indeed, the limit of $ABAB \cdots$ does not exist: after a product ending with $B$ every row is close to $(1, 0)$, while one extra factor $A$ makes every row close to $(0, 1)$, so $\lim ABAB \cdots B$ and $\lim ABAB \cdots A$ differ.

This is an example of a combination of two Ergodic chains such that the combination is not Ergodic.
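Example 4 can be checked numerically. With the hypothetical values $p = q = 1/2$ and $p_1 = q_1 = 1/2$ (chosen for illustration), products ending with $B$ settle near rows $(1, 0)$ while one extra factor $A$ flips the rows to $(0, 1)$, so the limit of $ABAB\cdots$ cannot exist:

```python
def mat_mul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

A = [[0.0, 1.0], [0.5, 0.5]]   # p = q = 1/2
B = [[0.5, 0.5], [1.0, 0.0]]   # p1 = q1 = 1/2

Q = A
for n in range(2, 101):         # factors A, B, A, B, ..., B (100 factors)
    Q = mat_mul(Q, B if n % 2 == 0 else A)
after_B = Q                     # ends with B: rows near (1, 0)
after_A = mat_mul(Q, A)         # one more A: rows near (0, 1)
print(after_B[0], after_A[0])
```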

Example 5 Let $A$ and $B$ be two transition matrices of two Markov chains, where:

$$A = \begin{pmatrix} 1/2 & 0 & 0 & 1/2 \\ 0 & 1/2 & 1/2 & 0 \\ 0 & 1/2 & 1/2 & 0 \\ 1/2 & 0 & 0 & 1/2 \end{pmatrix}, \qquad B = \begin{pmatrix} 1/2 & 1/2 & 0 & 0 \\ 1/2 & 1/2 & 0 & 0 \\ 0 & 0 & 1/2 & 1/2 \\ 0 & 0 & 1/2 & 1/2 \end{pmatrix}.$$

Notice that $A^2 = A$ and in general, for any $n$, $A^n = A$; the same holds for $B$. So, both $A$ and $B$ are transition matrices of Non-Ergodic chains. Now:

$$AB = BA = \begin{pmatrix} 1/4 & 1/4 & 1/4 & 1/4 \\ 1/4 & 1/4 & 1/4 & 1/4 \\ 1/4 & 1/4 & 1/4 & 1/4 \\ 1/4 & 1/4 & 1/4 & 1/4 \end{pmatrix}.$$

Moreover, $ABA = ABB = AB = BA = ABAB = ABBA$, so any combination of these chains (providing that we use both $A$ and $B$ at least one time) is Ergodic; indeed:

$$\lim(\text{combination}) = \begin{pmatrix} 1/4 & 1/4 & 1/4 & 1/4 \\ 1/4 & 1/4 & 1/4 & 1/4 \\ 1/4 & 1/4 & 1/4 & 1/4 \\ 1/4 & 1/4 & 1/4 & 1/4 \end{pmatrix}.$$

This is an example such that the chains used are not Ergodic while any combination is Ergodic.
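A direct check of Example 5, with the matrices copied from above and exact rational arithmetic:

```python
from fractions import Fraction as F

def mat_mul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

h = F(1, 2)
A = [[h, 0, 0, h], [0, h, h, 0], [0, h, h, 0], [h, 0, 0, h]]
B = [[h, h, 0, 0], [h, h, 0, 0], [0, 0, h, h], [0, 0, h, h]]

assert mat_mul(A, A) == A          # A^n = A: each chain alone never mixes fully
assert mat_mul(B, B) == B
AB = mat_mul(A, B)
assert AB == mat_mul(B, A)         # AB = BA
print(AB[0])                        # every entry equals 1/4
```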

Example 6 (a) Let $A$ be the identity matrix, the transition matrix of a Markov chain all of whose states are absorbing.

Let $B$ be any transition matrix of any Markov chain which is not Ergodic. It is obvious that any combination of these chains is not Ergodic.

(b) Again let $A$ be the identity matrix.

Let $B$ be the transition matrix of any Markov chain which is Ergodic. Consider any combination of these chains such that $B$ is used infinitely often. Again, it is obvious that any such combination is Ergodic. Moreover, the limit of any such combination is the same as the limit of $B^n$.

This is an example of a combination of an Ergodic chain with a Non-Ergodic one such that the combination is Ergodic.

Example 7 Let $A$ and $B$ be two transition matrices of two Markov chains, where:

$$A = \begin{pmatrix} 1/2 & 1/2 \\ 1/2 & 1/2 \end{pmatrix}, \qquad B = \begin{pmatrix} 1/4 & 3/4 \\ 3/4 & 1/4 \end{pmatrix}.$$

Both $A$ and $B$ are with non-zero entries, so both $A$ and $B$ are transition matrices of Ergodic Markov chains.

Consider any combination of $A$ and $B$; then it is Ergodic, and the limit of any combination is the same as the limit of $A^n$, which is the same as the limit of $B^n$.

Indeed, this is not strange, since both $A$ and $B$ are doubly stochastic, so they have the same unique stationary distribution $(1/2, 1/2)$.

This is an example of a combination of two Ergodic chains such that the combination is Ergodic and has the same stationary distribution as the first chain, which is the same as the second chain.
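Here the combination can be computed exactly: since every row of $A$ is $(1/2, 1/2)$ and $B$ is doubly stochastic, any product containing $A$ at least once equals the matrix with all entries $1/2$. A short check (the particular combination $ABBAB$ is an arbitrary choice):

```python
from fractions import Fraction as F

def mat_mul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

A = [[F(1, 2), F(1, 2)], [F(1, 2), F(1, 2)]]
B = [[F(1, 4), F(3, 4)], [F(3, 4), F(1, 4)]]

Q = A
for M in [B, B, A, B]:          # the combination A B B A B
    Q = mat_mul(Q, M)
print(Q)                         # all entries are exactly 1/2
```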

Example 8 Let $A$ and $B$ be two transition matrices of two Markov chains, where:

$$A = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}, \qquad B = \begin{pmatrix} 0 & 1 \\ p & q \end{pmatrix}.$$

Notice that $A$ is a transition matrix of a Non-Ergodic Markov chain; it is periodic with period 2. While $B$ is a transition matrix of an Ergodic Markov chain (since $B^2$ is with non-zero entries).

Consider the trivial combination $ABAB \cdots$; then it is not Ergodic; indeed, the limit of $ABAB \cdots$ does not exist.

This is an example of a combination of a Non-Ergodic chain with an Ergodic one such that the combination is not Ergodic.

Example 9 Let $A$ and $B$ be two transition matrices of two Markov chains on the countable state space $S = \{1, 2, 3, \cdots\}$, where:

$$A = \begin{pmatrix} 1/2 & 1/2 & 0 & 0 & \cdots \\ 1/2 & 0 & 1/2 & 0 & \cdots \\ 1/2 & 0 & 0 & 1/2 & \cdots \\ \vdots & & & & \ddots \end{pmatrix}, \qquad B = \begin{pmatrix} 1/4 & 3/4 & 0 & 0 & \cdots \\ 1/9 & 0 & 8/9 & 0 & \cdots \\ 1/16 & 0 & 0 & 15/16 & \cdots \\ \vdots & & & & \ddots \end{pmatrix};$$

that is, if $i$ is the number of the row, $b_{i1} = \frac{1}{(i+1)^2}$ and $b_{i, i+1} = \frac{(i+1)^2 - 1}{(i+1)^2}$.

$A$ is a transition matrix of an Ergodic Markov chain (the first column is bounded away from zero; see Example 3). $B$ is a transition matrix of a Non-Ergodic Markov chain: since

$$\lim_m \prod_{n=1}^{m} \left(1 - \frac{1}{(n+1)^2}\right) = \frac{1}{2} > 0,$$

the probability of never returning to the first state is positive, so the first state is transient. Since the chain is irreducible, all states are transient.

Now, any combination of $A$ and $B$ is irreducible. This is obvious, since $a_{ij} > 0$ exactly when $b_{ij} > 0$, $\forall i, j \in S$, and both $A$ and $B$ are irreducible.

Next, any combination of $A$ and $B$ such that $A$ is used infinitely often is non-null persistent. Since we are using $A$ infinitely often, each time we use $A$ we get $q_{i1}^{(n)} = 1/2$ (where $q_{ij}^{(n)}$ is the n-th step probability of the combination), so:

$$\sum_{n=1}^{\infty} q_{11}^{(n)} \geq \sum 1/2 = \infty.$$

So, the first state is non-null persistent. The combination is irreducible, so all states are non-null persistent.

Next, any combination of $A$ and $B$ such that both $A$ and $B$ are used infinitely often is not Ergodic; indeed, the limit does not exist. Let $C_1$ be the class of steps $n$ at which we use $A$ and $C_2$ the class of steps $n$ at which we use $B$; both $C_1$ and $C_2$ occur infinitely often. Now, for $n \in C_1$, $q_{i1}^{(n)} = 1/2$, while for $n \in C_2$, $q_{i1}^{(n)} \leq 1/4$, $\forall i \in S$. Hence the limit of any such combination does not exist.

This is an example of a combination of an Ergodic chain with a Non-Ergodic one such that the combination is not Ergodic; moreover, the limits of both $A^n$ and $B^n$ exist while the limit of the combination does not exist.

Example 10 Let $A$ and $B$ be two transition matrices of two Markov chains on a countable state space, where:

$$A = \begin{pmatrix} 1/2 & 1/2 & 0 & 0 & \cdots \\ 1/2 & 1/2 & 0 & 0 & \cdots \\ 0 & 0 & 1/2 & 1/2 & \cdots \\ 0 & 0 & 1/2 & 1/2 & \cdots \\ \vdots & & & & \ddots \end{pmatrix}, \qquad B = \begin{pmatrix} 1 & 0 & 0 & 0 & \cdots \\ 0 & 1/2 & 1/2 & 0 & \cdots \\ 0 & 1/2 & 1/2 & 0 & \cdots \\ 0 & 0 & 0 & 1/2 & \cdots \\ \vdots & & & & \ddots \end{pmatrix};$$

that is, $A$ averages over the pairs of states $\{1, 2\}, \{3, 4\}, \cdots$, while in $B$ the first state is absorbing and $B$ averages over the pairs $\{2, 3\}, \{4, 5\}, \cdots$.

Both $A$ and $B$ are transition matrices of reducible chains, so they are not Ergodic ($A^n = A$, $B^n = B$, $\forall n = 1, 2, \cdots$). Indeed, any combination of $A$ and $B$ is not Ergodic. If we use one of them just finitely often, then the combination will be reducible; that is, it is not Ergodic. If we use both of them infinitely often, then the combination will be irreducible, but the limit of the combination will tend to the zero matrix (both $A$ and $B$ are doubly stochastic); that is, it is not Ergodic.

This is an example of a combination of Non-Ergodic chains such that the combination is not Ergodic.

Remark 4 From the previous examples, if we consider arbitrary transition matrices of arbitrary Markov chains and we consider combinations of these chains, then we have all the possibilities. We can find combinations of Ergodic chains which are not Ergodic, and other combinations which are Ergodic. We can find combinations of Non-Ergodic chains which are Ergodic, and others which are not Ergodic. We can find combinations of Non-Ergodic chains with Ergodic ones which are Ergodic, and other combinations which are not Ergodic. So, for such a case we did not reach any conclusion about the limit of the combination.

3.4 Examples of Sequences of Markov Chains

Example 11 Let $(A_n)_{n=1}^{\infty}$ be a sequence of transition matrices of Markov chains, where:

$$A_n = \begin{pmatrix} 1/2 - 1/n & 1/2 + 1/n & 0 \\ 0 & 0 & 1 \\ 0 & 1 & 0 \end{pmatrix}.$$

Such a sequence tends to a Markov chain whose transition matrix is:

$$A = \begin{pmatrix} 1/2 & 1/2 & 0 \\ 0 & 0 & 1 \\ 0 & 1 & 0 \end{pmatrix}.$$

The limit of this sequence is not Ergodic; the second and third states are periodic with period 2. Thus $\lim_n A^n$ does not exist. Moreover, $\lim_n A_1 A_2 \cdots A_n$ does not exist.

Example 12 Let $(A_n)_{n=1}^{\infty}$ be a sequence of transition matrices of Markov chains, where:

$$A_n = \begin{pmatrix} 1/n & 1 - 1/n \\ 1 - 1/n & 1/n \end{pmatrix}.$$

This sequence tends to a Markov chain whose transition matrix is:

$$A = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}.$$

The limit of this sequence is not Ergodic; it is periodic with period 2, thus $\lim_n A^n$ does not exist, while $\lim_n A_1 A_2 \cdots A_n$ exists and is Ergodic. Indeed:

$$\lim_n A_1 A_2 \cdots A_n = \begin{pmatrix} 1/2 & 1/2 \\ 1/2 & 1/2 \end{pmatrix}.$$
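Example 12 can be illustrated numerically: the factors $A_n$ tend to the periodic matrix $A$, yet the partial products $A_1 \cdots A_n$ settle at the matrix with all entries $1/2$ (a sketch; the cutoff $n = 200$ is arbitrary):

```python
def mat_mul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def A(n):
    # A_n = [[1/n, 1 - 1/n], [1 - 1/n, 1/n]]
    return [[1.0 / n, 1 - 1.0 / n], [1 - 1.0 / n, 1.0 / n]]

Q = A(1)
for n in range(2, 201):
    Q = mat_mul(Q, A(n))
print(Q[0][0], Q[1][0])   # both approach 1/2
```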

Example 13 Let $(A_n)_{n=1}^{\infty}$ be a sequence of transition matrices of Markov chains, where:

$$A_1 = \begin{pmatrix} 1/2 & 0 & 0 & 1/2 \\ 0 & 1/2 & 1/2 & 0 \\ 0 & 1/2 & 1/2 & 0 \\ 1/2 & 0 & 0 & 1/2 \end{pmatrix}, \qquad A_2 = \begin{pmatrix} 1/2 & 1/2 & 0 & 0 \\ 1/2 & 1/2 & 0 & 0 \\ 0 & 0 & 1/2 & 1/2 \\ 0 & 0 & 1/2 & 1/2 \end{pmatrix},$$

and for $n \geq 3$:

$$A_n = \begin{pmatrix} 1/n & 0 & 0 & 1 - 1/n \\ 0 & 1/2 & 1/2 & 0 \\ 0 & 1/2 & 1/2 & 0 \\ 1 - 1/n & 0 & 0 & 1/n \end{pmatrix}.$$

Such a sequence tends to a Markov chain whose transition matrix is:

$$A = \begin{pmatrix} 0 & 0 & 0 & 1 \\ 0 & 1/2 & 1/2 & 0 \\ 0 & 1/2 & 1/2 & 0 \\ 1 & 0 & 0 & 0 \end{pmatrix}.$$

The matrices of the sequence are not Ergodic and $\lim_n A^n$ does not exist, while $\lim_n A_1 A_2 \cdots A_n$ exists and is Ergodic. Indeed, $A_1 A_2$ already has all entries equal to $1/4$, and since every $A_n$ is doubly stochastic:

$$A_1 A_2 \cdots A_n = \begin{pmatrix} 1/4 & 1/4 & 1/4 & 1/4 \\ 1/4 & 1/4 & 1/4 & 1/4 \\ 1/4 & 1/4 & 1/4 & 1/4 \\ 1/4 & 1/4 & 1/4 & 1/4 \end{pmatrix}, \qquad \forall n = 3, 4, \cdots.$$

Example 14 Let $(A_n)_{n=1}^{\infty}$ be a sequence of transition matrices of Markov chains, where:

$$A_n = \begin{pmatrix} 1/3 + 1/n & 2/3 - 1/n \\ 2/3 - 1/n & 1/3 + 1/n \end{pmatrix}.$$

Such a sequence tends to a Markov chain whose transition matrix is:

$$A = \begin{pmatrix} 1/3 & 2/3 \\ 2/3 & 1/3 \end{pmatrix}.$$

All the chains used are Ergodic, the limit is Ergodic, and moreover:

$$\lim_n A^n = \lim_n A_1 A_2 \cdots A_n = \begin{pmatrix} 1/2 & 1/2 \\ 1/2 & 1/2 \end{pmatrix}.$$

Example 15 Let $(A_n)_{n=1}^{\infty}$ be a sequence of transition matrices of Markov chains, where:

$$A_n = \begin{pmatrix} 1 - 1/n^2 & 1/n^2 \\ 1 - 1/n^2 & 1/n^2 \end{pmatrix}.$$

Such a sequence tends to a Markov chain whose transition matrix is:

$$A = \begin{pmatrix} 1 & 0 \\ 1 & 0 \end{pmatrix}.$$

For each fixed $n$, $A_n$ is a transition matrix of an Ergodic chain, while the limit of the sequence is not Ergodic, and the limit of the combination is not Ergodic; it is the same as $A$, that is:

$$\lim_n A_1 A_2 \cdots A_n = \begin{pmatrix} 1 & 0 \\ 1 & 0 \end{pmatrix}.$$

Example 16 Let $(A_n)_{n=1}^{\infty}$ be a sequence of transition matrices of Markov chains on a countable state space, where all the rows of $A_n$ coincide: each row is obtained from the probability vector $(1/2, (1/2)^2, (1/2)^3, \cdots)$ by setting its n-th entry to 0 (and adjusting the remaining entries so that the row still sums to 1). That is, in the n-th chain, the n-th state is isolated: it is not reached from any state. For each fixed $n$, the n-th chain is not Ergodic, while the limit of the sequence is Ergodic. The limit of the sequence has the transition matrix $A$, where:

$$A = \begin{pmatrix} 1/2 & (1/2)^2 & (1/2)^3 & \cdots \\ 1/2 & (1/2)^2 & (1/2)^3 & \cdots \\ \vdots & & & \end{pmatrix}.$$

Moreover, since all the rows of each $A_k$ coincide, $A_1 A_2 = A_2$ and in general $A_1 A_2 \cdots A_n = A_n$; thus:

$$\lim_n A_1 A_2 \cdots A_n = \lim_n A_n = A, \text{ which is Ergodic.}$$

Example 17 Let $(A_n)_{n=1}^{\infty}$ be a sequence of transition matrices of Markov chains on a countable state space, where all the rows of $A_n$ coincide: each row carries the geometric weights $1/2, (1/2)^2, \cdots, (1/2)^n$ in reverse order on the first $n$ states, followed by the tail $(1/2)^{n+1}, (1/2)^{n+2}, \cdots$. That is:

$$A_1 = \begin{pmatrix} 1/2 & (1/2)^2 & (1/2)^3 & \cdots \\ 1/2 & (1/2)^2 & (1/2)^3 & \cdots \\ \vdots & & & \end{pmatrix}, \qquad A_2 = \begin{pmatrix} (1/2)^2 & 1/2 & (1/2)^3 & \cdots \\ (1/2)^2 & 1/2 & (1/2)^3 & \cdots \\ \vdots & & & \end{pmatrix}, \cdots,$$

and in general each row of $A_n$ is $((1/2)^n, (1/2)^{n-1}, \cdots, (1/2)^2, 1/2, (1/2)^{n+1}, (1/2)^{n+2}, \cdots)$.

For each fixed $n$, the n-th chain is Ergodic, while the limit of the sequence is not Ergodic: the sequence tends to the zero matrix, since each fixed entry tends to 0 as $n \to \infty$. Moreover, since all the rows of each $A_k$ coincide, $A_1 A_2 = A_2$ and $A_1 A_2 \cdots A_n = A_n$.

Thus, $\lim_n A_1 A_2 \cdots A_n = \lim_n A_n = 0$; the combination is not Ergodic.

Remark 5 From the examples of this section, if the limit of the sequence is not Ergodic, then the limit of the combination may not exist, may exist and not be Ergodic, or may exist and be Ergodic.

Chapter 4

Convergence Theorems Related With Non-Stationary Markov Chains

Theorem 5 Assume we have a finite state space $S$. Assume we have a sequence of Markov chains such that the limit of this sequence is an Ergodic Markov chain. Then the limit of the combination of the elements of this sequence exists and is Ergodic.

Proof:

Let $(A_n)_{n=1}^{\infty}$ be the transition matrices of this sequence. Denote the transition matrix of the n-th chain by $A_n$ and its entries by $[a_{ij}^{(n)}]_{i,j \in S}$.

Let $\lim_n A_n = A$ and denote its entries by $[a_{ij}]_{i,j \in S}$. Let $\delta$ be the minimum over all the entries of $A$, let $N$ be the cardinality of $S$, and let $\delta_n = \min_{i,j} a_{ij}^{(n)}$.

Let $Q_n = A_1 A_2 \cdots A_n$ and denote its entries by $[q_{ij}^{(n)}]_{i,j \in S}$.

Assume without loss of generality that $A$ is with non-zero entries (otherwise $\exists n_0$ such that $A^n$ is with non-zero entries $\forall n \geq n_0$). Assume also without loss of generality that $\forall n$, $A_n$ is with non-zero entries (otherwise $\exists k_0$ such that $A_n$ is with non-zero entries $\forall n \geq k_0$).

Now, for any stochastic matrix $P$ with entries $[p_{ij}]_{i,j \in S}$ and minimum over all its entries $\delta$, the following relations are valid. Denote the summation over $j$ in $S$ satisfying $p_{uj} \geq p_{vj}$ by $\sum^+$ and the summation over $j$ in $S$ satisfying $p_{uj} < p_{vj}$ by $\sum^-$, for arbitrary states $u$ and $v$. Then:

$$\sum^+ (p_{uj} - p_{vj}) + \sum^- (p_{uj} - p_{vj}) = 1 - 1 = 0, \qquad (4.1)$$

and since $\sum^- p_{uj} + \sum^+ p_{vj} \geq N\delta$, we have:

$$\sum^+ (p_{uj} - p_{vj}) \leq 1 - \sum^- p_{uj} - \sum^+ p_{vj} \leq 1 - N\delta. \qquad (4.2)$$

Next, we will use induction on $n$ to prove that:

$$\max_i q_{ij}^{(n)} - \min_i q_{ij}^{(n)} \leq \prod_{l=1}^{n} (1 - N\delta_l). \qquad (4.3)$$

For $n = 1$: since $Q_1 = A_1$ and $\min_{i,j} a_{ij}^{(1)} = \delta_1$,

$$\max_i q_{ij}^{(1)} - \min_i q_{ij}^{(1)} \leq 1 - N\delta_1.$$

For $n = 2$: for arbitrary states $u, v$,

$$q_{uj}^{(2)} - q_{vj}^{(2)} = \sum_{k \in S} (a_{uk}^{(1)} - a_{vk}^{(1)}) a_{kj}^{(2)} \leq \sum{}^+ (a_{uk}^{(1)} - a_{vk}^{(1)}) (\bar{a}_j^{(2)} - \underline{a}_j^{(2)}),$$

where $\bar{a}_j^{(2)} = \max_i a_{ij}^{(2)}$ and $\underline{a}_j^{(2)} = \min_i a_{ij}^{(2)}$. Hence:

$$q_{uj}^{(2)} - q_{vj}^{(2)} \leq (1 - N\delta_1)(1 - N\delta_2). \qquad (4.4)$$

Next, assume that:

$$\max_i b_{ij}^{(n-1)} - \min_i b_{ij}^{(n-1)} \leq \prod_{l=2}^{n} (1 - N\delta_l),$$

where $[b_{ij}^{(n-1)}]_{i,j \in S}$ are the entries of $A_2 A_3 \cdots A_n$. Then:

$$q_{uj}^{(n)} - q_{vj}^{(n)} = \sum_{k \in S} (a_{uk}^{(1)} - a_{vk}^{(1)}) b_{kj}^{(n-1)} \leq \sum{}^+ (a_{uk}^{(1)} - a_{vk}^{(1)}) (\bar{b}_j^{(n-1)} - \underline{b}_j^{(n-1)}),$$

where $\bar{b}_j^{(n-1)} = \max_i b_{ij}^{(n-1)}$ and $\underline{b}_j^{(n-1)} = \min_i b_{ij}^{(n-1)}$. Hence:

$$q_{uj}^{(n)} - q_{vj}^{(n)} \leq (1 - N\delta_1) \prod_{l=2}^{n} (1 - N\delta_l) = \prod_{l=1}^{n} (1 - N\delta_l). \qquad (4.5)$$

Thus, for all $n$, $\bar{q}_j^{(n)} - \underline{q}_j^{(n)} \leq \prod_{l=1}^{n} (1 - N\delta_l)$, where $\bar{q}_j^{(n)} = \max_i q_{ij}^{(n)}$ and $\underline{q}_j^{(n)} = \min_i q_{ij}^{(n)}$.

Hence, since $\delta_n \to \delta > 0$, $\prod_{l=1}^{n} (1 - N\delta_l) \to 0$ as $n \to \infty$; in fact $\prod_{l=1}^{n} (1 - N\delta_l) \leq C\alpha^n$, where $C$ is a constant and $0 < \alpha < 1$.

Thus $q_{ij}^{(n)} \to \pi_j$; indeed:

$$|q_{ij}^{(n)} - q_{kj}^{(n)}| \leq \bar{q}_j^{(n)} - \underline{q}_j^{(n)} \leq C\alpha^n, \quad \forall i, k \in S. \qquad (4.6)$$

That is, $|q_{ij}^{(n)} - \pi_j| \leq C\alpha^n$, for each $j \in S$ and for any $i \in S$.

Notes:

If $A$ has zero entries, then $\exists n_0$ such that $A^n$ is with non-zero entries $\forall n \geq n_0$, so $\exists l$ such that $A_1 A_2 \cdots A_l$ is with non-zero entries; so we may consider $A_1 A_2 \cdots A_{l+k}$ and take the limit as $k \to \infty$.

The sequences $\bar{q}_j^{(n)}$ and $\underline{q}_j^{(n)}$ guarantee that the limit exists: if we have the same transition matrix in each step, then $\bar{q}_j^{(n)}$ is non-increasing (with respect to $n$) and $\underline{q}_j^{(n)}$ is non-decreasing. If we do not have the same transition matrix, then they are almost monotonic (since we have a convergent sequence of transition matrices).
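Theorem 5 can be illustrated numerically. In the sketch below (the 2-state sequence $A_n$ is a hypothetical example, not from the text), $A_n \to A$ entrywise, every $A_n$ has strictly positive entries, and the rows of $Q_n = A_1 \cdots A_n$ coalesce, so the limit of the combination forgets the initial state:

```python
def mat_mul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def A(n):
    # A_n -> A = [[1/2, 1/2], [1/3, 2/3]] as n -> infinity
    e = 0.1 / n
    return [[0.5 + e, 0.5 - e], [1.0 / 3, 2.0 / 3]]

Q = A(1)
for n in range(2, 301):
    Q = mat_mul(Q, A(n))

spread = abs(Q[0][0] - Q[1][0])    # max_i q_i0^(n) - min_i q_i0^(n)
print(Q[0][0], Q[1][0], spread)     # the two rows agree
```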

Corollary 2 Assume we have a sequence of Markov chains on a finite state space $S$. Assume that the limit of this sequence is Ergodic. Let $V_j(n)$ be the average number of visits to state $j$ up to time $n$. Let $(\pi_j)_{j \in S}$ be the stationary distribution of the combination of this sequence. Then:

$$P[|V_j(n) - \pi_j| > \epsilon \mid \xi_0 = i] \to 0 \text{ as } n \to \infty, \quad \forall \epsilon > 0.$$

Proof:

Let $\xi_0, \xi_1, \cdots, \xi_n, \cdots$ be the random variables which form these Markov chains (their values are in $S$). Let

$$f_j(\xi_k) = \begin{cases} 1 & \text{if } \xi_k = j, \\ 0 & \text{otherwise,} \end{cases} \qquad V_j(n) = \frac{1}{n+1} \sum_{k=0}^{n} f_j(\xi_k).$$

Let $q_{ij}^{(n)}$ be the n-th step probability of the combination, and let $i, j$ be two states in $S$.

We want to prove that $P[|V_j(n) - \pi_j| > \epsilon \mid \xi_0 = i] \to 0$ as $n \to \infty$.

Notice that $E[V_j(n) \mid \xi_0 = i] = \frac{1}{n+1} \sum_{m=0}^{n} q_{ij}^{(m)}$, which tends to $\pi_j$ as $n$ tends to $\infty$.

By Chebyshev's inequality:

$$P[|V_j(n) - \pi_j| > \epsilon \mid \xi_0 = i] \leq \frac{E[(V_j(n) - \pi_j)^2 \mid \xi_0 = i]}{\epsilon^2}.$$

Thus, we have to prove that $E[(V_j(n) - \pi_j)^2 \mid \xi_0 = i] \to 0$:

$$E[(V_j(n) - \pi_j)^2 \mid \xi_0 = i] = \frac{1}{(n+1)^2} E\Big[\Big(\sum_{k=0}^{n} (f_j(\xi_k) - \pi_j)\Big)^2 \,\Big|\, \xi_0 = i\Big]$$
$$= \frac{1}{(n+1)^2} \sum_{k=0}^{n} \sum_{l=0}^{n} \Big( E[f_j(\xi_k) f_j(\xi_l) \mid \xi_0 = i] - \pi_j E[f_j(\xi_k) \mid \xi_0 = i] - \pi_j E[f_j(\xi_l) \mid \xi_0 = i] + \pi_j^2 \Big).$$

Now $E[f_j(\xi_k) f_j(\xi_l) \mid \xi_0 = i] = q_{ij}^{(s)} q_{jj}^{(s,t)}$, where $s = \min\{k, l\}$, $t = |k - l|$, and, since we do not have the same transition matrix in each step, $q_{jj}^{(s,t)}$ stands for the probability of the combination from step $s + 1$ to step $\max\{k, l\}$.

But we have $q_{ij}^{(k)} = \pi_j + \epsilon_{ij}^{(k)}$ with $|\epsilon_{ij}^{(k)}| \leq C\alpha^k$, and similarly for $q_{jj}^{(s,t)}$. Hence each term of the double sum is at most $M[\alpha^s + \alpha^t + \alpha^k + \alpha^l]$, where $M$ is a constant. Therefore:

$$\frac{1}{(n+1)^2} \sum_{k=0}^{n} \sum_{l=0}^{n} M[\alpha^s + \alpha^t + \alpha^k + \alpha^l] \leq \frac{4M}{(n+1)^2} \cdot \frac{2(n+1)}{1 - \alpha} = \frac{8M}{1 - \alpha} \cdot \frac{1}{n+1} \to 0 \text{ as } n \to \infty.$$

Hence $P[|V_j(n) - \pi_j| > \epsilon \mid \xi_0 = i] \to 0$ as $n \to \infty$.

Theorem 6 Assume we have an arbitrary sequence of Markov chains on a finite state space $S$ with corresponding transition matrices $(A_n)_{n=1}^{\infty}$.

Denote the entries of $A_n$ by $[a_{ij}^{(n)}]_{i,j \in S}$.

Assume that $\min_{i,j} a_{ij}^{(n)} = \delta_n \geq \epsilon > 0$ for infinitely many $n$'s.

Then $\lim_n A_1 A_2 \cdots A_n$ is Weak Ergodic.

That is: $\forall j \in S$, $\lim_n |q_{ij}^{(n)} - q_{kj}^{(n)}| = 0$, $\forall i, k \in S$, where $[q_{ij}^{(n)}]_{i,j \in S}$ are the entries of $Q_n = A_1 A_2 \cdots A_n$.

Proof:

Let lY be the cardinality of the state space S.

Recall that for any stochastic matrix P with entries [PiyjtPG.?· if we denote the sum mation over j G S satisfying Puj > puj by and the sum mation over

j G S satisfying Puj < Puj by X]“ , for arbitrary states u and v G .S', then the following relations are valid:

- Pvj) + Y^iVuj - Puj) = 0

Y,iPuj - Pvj) < (1 - NS)

where 6 = m inij p,j, i , j G S.

Next, we will use induction on

7

? to prove that:

29

(4.7)

(31)

[maXi qlj'* - mirii q j f ) < n r= i(^ “ For ?i = 1:

(maXi qj}’ - min,: qjj’ ) = (maxi ajj’ - min, a·'·') < (1 - (-V - l.)6'i - ¿

1

)

(/nax; qjp - mini qjj^) < (1 - N S,). For /

7

. = 2:

(«■fi’.’ - £ ') =

- i!l')N

0)i (V - ,(1) .

7

. «g». Applying the first two equations (4.7.4.S) we get:

( q f f - ¿ f ) < E + (f/il’ - «1? - '»''»A· « h )

< (1 - N S i)(l - N S.)

(max,· qjp - mz??, qjj ) < (1. - 'VcS)( 1 - N S.)·o./

Now, assume that for n = m − 1 it is correct that:

(max_i q_ij^(m−1) − min_i q_ij^(m−1)) ≤ Π_{l=1}^{m−1} (1 − Nδ_l).    (4.9)

Next, we want to prove it for n = m.

(q_ij^(m) − q_kj^(m)) = Σ_{l∈S} (a_il^(1) − a_kl^(1)) q̄_lj^(m−1), where [q̄_lj^(m−1)]_{l,j∈S} are the entries of A_2 A_3 ⋯ A_m.

Again, applying the two relations (4.7), (4.8), we get:

(q_ij^(m) − q_kj^(m)) ≤ Σ⁺ (a_il^(1) − a_kl^(1)) (max_l q̄_lj^(m−1) − min_l q̄_lj^(m−1))

≤ (1 − Nδ_1) Π_{l=2}^{m} (1 − Nδ_l) = Π_{l=1}^{m} (1 − Nδ_l).

Thus,

(max_i q_ij^(m) − min_i q_ij^(m)) ≤ Π_{l=1}^{m} (1 − Nδ_l).

Hence, for each natural number n, we have:

(max_i q_ij^(n) − min_i q_ij^(n)) ≤ Π_{l=1}^{n} (1 − Nδ_l).


Next, notice that:

|q_ij^(n) − q_kj^(n)| ≤ (max_i q_ij^(n) − min_i q_ij^(n)) ≤ Π_{l=1}^{n} (1 − Nδ_l).

Since δ_l ≥ ε > 0 for infinitely many l's, passing to the limit as n tends to ∞, the product tends to zero.

Hence, for each j ∈ S and arbitrary i, k ∈ S, we have:

|q_ij^(n) − q_kj^(n)| → 0.

That is, lim_n Q_n is Weak Ergodic.
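As a numerical sketch of the theorem (not part of the thesis; the matrix size, the lower bound 0.05, and the number of steps are arbitrary choices), the following multiplies random stochastic matrices whose entries are bounded away from zero and checks that the row spread of Q_n is dominated by the product Π(1 − Nδ_l):

```python
import numpy as np

rng = np.random.default_rng(1)

N, n_steps = 4, 50
Q = np.eye(N)
bound = 1.0
for _ in range(n_steps):
    A = 0.05 + rng.random((N, N))         # entries bounded away from zero
    A = A / A.sum(axis=1, keepdims=True)  # stochastic: rows sum to 1
    bound *= 1.0 - N * A.min()            # factor (1 - N*delta_n) from the proof
    Q = Q @ A

# Weak Ergodicity: the rows of Q_n agree in the limit, and the spread
# max_i q_ij - min_i q_ij is dominated by the product of the (1 - N*delta_n).
spread = (Q.max(axis=0) - Q.min(axis=0)).max()
print(spread <= bound, spread < 1e-6)
```

In practice the actual contraction per step is much stronger than the worst-case factor (1 − Nδ_n), so the spread is far smaller than the bound.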

Corollary 3 In the previous theorem, if the sequence of the transition matrices is doubly stochastic, that is:

Σ_{i∈S} a_ij^(n) = 1, ∀j ∈ S and ∀n = 1, 2, ⋯,

then the limit of the combination exists and is Ergodic.

Indeed, lim_{n→∞} q_ij^(n) = 1/N, ∀i, j ∈ S, where N is the cardinality of S.

Proof:

By the previous theorem we have Weak Ergodicity, so:

∀i, j ∈ S, q_ij^(n) = q_j^(n) + ε_ij^(n), where ε_ij^(n) → 0 as n → ∞.

Next, Σ_{i∈S} q_ij^(n) = Σ_{i∈S} (q_j^(n) + ε_ij^(n)) = 1

⇒ N q_j^(n) + Σ_{i∈S} ε_ij^(n) = 1

⇒ q_j^(n) = 1/N − ε_j^(n), where ε_j^(n) = (1/N) Σ_{i∈S} ε_ij^(n) → 0 as n → ∞.

Thus, ∀i, j ∈ S, q_ij^(n) = 1/N + e_ij^(n), where e_ij^(n) = ε_ij^(n) − ε_j^(n) → 0 as n → ∞.

Now, q_ij^(n+m) = Σ_{k∈S} q_ik^(n) q̄_kj^(m), where [q̄_kj^(m)]_{k,j∈S} are the entries of A_{n+1} A_{n+2} ⋯ A_{n+m}.

⇒ q_ij^(n+m) = Σ_{k∈S} (1/N + e_ik^(n)) q̄_kj^(m)

≤ (1/N + max_k |e_ik^(n)|) Σ_{k∈S} q̄_kj^(m)

= 1/N + max_k |e_ik^(n)|,

since a product of doubly stochastic matrices is again doubly stochastic, so the column sum Σ_{k∈S} q̄_kj^(m) = 1.

On the other hand,

q_ij^(n+m) ≥ (1/N − max_k |e_ik^(n)|) Σ_{k∈S} q̄_kj^(m) = 1/N − max_k |e_ik^(n)|.

Thus, |q_ij^(n+m) − 1/N| ≤ max_k |e_ik^(n)| → 0 as n → ∞, ∀j ∈ S.

Hence, lim_{n→∞} q_ij^(n) = 1/N, ∀i, j ∈ S.
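Corollary 3 can be illustrated numerically (a sketch, not from the thesis; the Birkhoff-style construction of random doubly stochastic factors, the mixing weight alpha, and all other constants are my own choices). Each factor is a convex combination of permutation matrices, mixed with the uniform matrix so that every entry is at least alpha/N and the theorem's condition holds:

```python
import numpy as np

rng = np.random.default_rng(2)

def random_doubly_stochastic(N, k=4, alpha=0.1):
    # Convex combination of k random permutation matrices (Birkhoff-style),
    # mixed with the uniform matrix so every entry is >= alpha/N.
    w = rng.dirichlet(np.ones(k))
    M = sum(wi * np.eye(N)[rng.permutation(N)] for wi in w)
    return (1 - alpha) * M + alpha * np.full((N, N), 1.0 / N)

N = 5
Q = np.eye(N)
for _ in range(100):
    Q = Q @ random_doubly_stochastic(N)

# Every entry of Q_n approaches 1/N, as the corollary asserts.
print(np.abs(Q - 1.0 / N).max() < 1e-3)
```

With the uniform mixture, each factor has δ_n ≥ alpha/N, so the spread contracts by at least (1 − alpha) per step and the convergence to the uniform matrix is geometric.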

Theorem 7 Assume we have a sequence of Markov chains on a countable state space S, whose limit is Ergodic. Assume that:

∃j₀ ∈ S such that ∀i ∈ S, a_ij₀ ≥ δ > 0, where [a_ij]_{i,j∈S} are the entries of the transition matrix of the limit of the sequence, say A.

Then the limit of the combination of this sequence exists and it is Ergodic.

Proof:

Let (A_n)_{n=1}^∞ be the transition matrices of the sequence.

Denote the entries of Q_n = A_1 A_2 ⋯ A_n by [q_ij^(n)]_{i,j∈S}.

Denote the entries of A_{m+1} A_{m+2} ⋯ A_n by [q̄_ij^(m,n)]_{i,j∈S}.

Denote the entries of A_n by [a_ij^(n)]_{i,j∈S}.

Assume without loss of generality that A has non-zero entries and that all the A_n have non-zero entries; otherwise, the same argument as in the finite case applies.

Define a coupled chain on the state space (S, S) with transition probabilities:

P[(X_{n+1}, Y_{n+1}) = (k, l) | (X_n, Y_n) = (i, j)] = a_ik^(n+1) a_jl^(n+1);

that is, the two coordinates move independently, each according to the kernels A_n.

Notice that this coupled chain is irreducible (all its transition probabilities are non-zero).

Since a_ik^(n) a_jl^(n) → a_ik a_jl as n → ∞,

in particular a_ij₀ a_kj₀ ≥ δ² > 0. So, if we consider the state (j₀, j₀), then:

q_ik[(X_n, Y_n) = (j₀, j₀) i.o.] = 1,

and by irreducibility of the coupled chain it is correct for any state (i₀, i₀).

Thus, if T is the first time such that X_T = Y_T = j₀, then T is finite with probability 1.

Next,

q_ij[(X_n, Y_n) = (k, l), T = m] = q_ij[(X_m, Y_m) = (j₀, j₀), T = m] · q_{j₀j₀}[(X_{n−m}, Y_{n−m}) = (k, l)].

Adding out l gives q_ij[X_n = k, T = m] = q_ij[T = m] q̄_{j₀k}^(m,n);

adding out k gives q_ij[Y_n = l, T = m] = q_ij[T = m] q̄_{j₀l}^(m,n).

Take k = l and add over m = 1, 2, ⋯, n:

q_ij[X_n = k, T ≤ n] = q_ij[Y_n = k, T ≤ n]

⇒ q_ij[X_n = k] ≤ q_ij[X_n = k, T ≤ n] + q_ij[T > n] = q_ij[Y_n = k, T ≤ n] + q_ij[T > n]

⇒ q_ij[X_n = k] ≤ q_ij[Y_n = k] + q_ij[T > n].    (4.13)

Similarly,

q_ij[Y_n = k] ≤ q_ij[X_n = k] + q_ij[T > n].    (4.14)

The above two inequalities give:

|q_ik^(n) − q_jk^(n)| ≤ q_ij[T > n].

Since T is finite with probability 1, lim_{n→∞} |q_ik^(n) − q_jk^(n)| = 0.

This means that the combination is independent of the initial state.

Next, since a_ij₀^(n) → a_ij₀ ≥ δ > 0, the probabilities q_ij₀^(n) are bounded away from 0 for large n; thus, by the irreducibility of the combination, lim_n q_ik^(n) > 0, ∀k ∈ S.

Hence, the limit of the combination exists and is Ergodic (by the same argument as in the finite case, replacing minimum by infimum and maximum by supremum; the limit is independent of the initial state).
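The coupling inequality |q_ik^(n) − q_jk^(n)| ≤ q_ij[T > n] can be checked exactly on a small finite state space (a numerical sketch, not part of the thesis; the kernels, the meeting state j₀ = 0, and all constants are arbitrary choices). The pair law is iterated with the independent-coupling kernel A ⊗ A, made absorbing at (j₀, j₀) so that the absorbed mass equals q_ij[T ≤ n]:

```python
import numpy as np

rng = np.random.default_rng(3)

N, j0, steps = 4, 0, 20
i, j = 1, 2

qi = np.eye(N)[i]                      # law of X_n, started at i
qj = np.eye(N)[j]                      # law of Y_n, started at j
pair = np.zeros(N * N)                 # law of the coupled pair (X_n, Y_n)
pair[i * N + j] = 1.0

ok = True
for _ in range(steps):
    A = 0.1 + rng.random((N, N))
    A = A / A.sum(axis=1, keepdims=True)
    B = np.kron(A, A)                  # independent coupling on pairs (u,v)
    d = j0 * N + j0                    # index of the state (j0, j0)
    B[d] = 0.0
    B[d, d] = 1.0                      # absorb at (j0, j0) to track T
    qi, qj, pair = qi @ A, qj @ A, pair @ B
    tail = 1.0 - pair[d]               # q_ij[T > n]: mass not yet absorbed
    ok = ok and (np.abs(qi - qj).max() <= tail + 1e-12)

print(ok)
```

The marginal laws qi and qj are computed with the original (non-absorbed) kernels, so the check compares the true row difference of Q_n against the exact tail probability of the meeting time.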

Corollary 4 In theorems (5) and (7), if the limit A of the sequence of the transition matrices is stable from the first step; that is:

lim_n A^n = A and ∀i, j ∈ S, a_ij = a_j,

then lim_n Q_n = lim_n A_1 A_2 ⋯ A_n = lim_n A^n = lim_n A_n = A.

Proof:

Write a_ij^(n) = a_j + e_ij^(n), where e_ij^(n) → 0 as n → ∞ (since A_n → A). Then:

q_ij^(n) = Σ_{k∈S} q_ik^(n−1) a_kj^(n) = Σ_{k∈S} q_ik^(n−1) (a_j + e_kj^(n)) ≤ a_j + max_k |e_kj^(n)|.

On the other hand,

q_ij^(n) = Σ_{k∈S} q_ik^(n−1) (a_j + e_kj^(n)) ≥ a_j − max_k |e_kj^(n)|.

Thus, |q_ij^(n) − a_j| ≤ max_k |e_kj^(n)| → 0 as n → ∞.

That is, lim_n Q_n = lim_n A_n = A.
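Corollary 4 can be illustrated as follows (a sketch, not from the thesis; the mixture construction A_n = (1 − 1/n²)A + (1/n²)B_n with random stochastic B_n, and all constants, are my own choices). The limit kernel A is stable from the first step because all its rows are equal, so A^n = A:

```python
import numpy as np

rng = np.random.default_rng(4)

N = 4
a = rng.dirichlet(np.ones(N))          # the common row a_j of the limit kernel
A = np.tile(a, (N, 1))                 # stable: identical rows, so A @ A = A

Q = np.eye(N)
for n in range(1, 80):
    B = rng.random((N, N))
    B = B / B.sum(axis=1, keepdims=True)
    An = (1 - 1 / n**2) * A + (1 / n**2) * B   # stochastic, and A_n -> A
    Q = Q @ An

print(np.abs(Q - A).max() < 1e-3)      # Q_n = A_1 ... A_n -> A
```

Per the corollary's proof, the error after n steps is at most max_k |e_kj^(n)|, which here is bounded by 1/n², so the product tracks A closely.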


Chapter 5

Conclusion And Comments

In this work, we classified all the possibilities of Non-Stationary Markov chains on a finite state space S.

In theorem (6), we have the condition:

min_{i,j} a_ij^(n) = δ_n ≥ ε > 0 for infinitely many n's.

This condition is essential for our proof. We can restate this condition in an equivalent form; that is:

∃ a sequence of integers (n_i)_{i=1}^∞ such that each product A_{n_i+1} A_{n_i+2} ⋯ A_{n_{i+1}} has non-zero entries and the minimum over all the entries is bounded from below (for infinitely many i's).

When the state space S is countable, we gave theorem (7) with the condition:

∃j₀ ∈ S such that ∀i ∈ S, a_ij₀ ≥ δ > 0.

This condition probably can be weakened.


REFERENCES

[1] M.A. Berger, An Introduction to Probability and Stochastic Processes, 1993, Springer-Verlag, New York.

[2] R.N. Bhattacharya, E.C. Waymire, Stochastic Processes with Applications, 1990, Wiley, New York.

[3] P. Billingsley, Probability and Measure, 1986, Wiley, New York.

[4] L. Breiman, Probability, 1992, Society for Industrial and Applied Mathematics, Philadelphia.

[5] R. Durrett, Probability: Theory and Examples, 1991, Wadsworth and Brooks, Pacific Grove, Calif.

[6] W. Feller, An Introduction to Probability Theory and Its Applications, Vol. 1, 1968, Wiley, New York.

[7] R.G. Gallager, Discrete Stochastic Processes, 1996, Kluwer Academic Publishers, Boston.

[8] S.I. Resnick, Adventures in Stochastic Processes, 1992, Birkhäuser, Boston.

[9] A.N. Shiryayev, Probability, 1984, Springer-Verlag, New York.
