Markov state sequence with grand()

dmontezano at hotmail.com
Thu Apr 15 17:45:04 CEST 2010


Hello fellow Scilab Users,

I am using the grand() function to generate a state sequence for a 2-state Markov chain.
It generates the sequence just fine, but I don't quite understand how the initial state parameter 'x0' works. Here is the code I use:

//-------------------
N=50;                       //length of sequence
P=[0.6 0.4;0.1 0.9];        //state transition matrix
q=grand(N,'markov',P,[1]);  //initial state is '1'
q=-(q*2.0-3.0);             //map state 1 -> +1 and state 2 -> -1

clf();
plot2d3([1:N],[q],[0],rect=[0,-1.5,N,1.5],leg='2-state Markov chain');
//------------------

If you run this piece of code a few times, the sequence sometimes starts in state '1', but not always.
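
For example, here is a rough check I put together to see how often the first element of the output is state 1 when x0=[1] (just a sketch; the number of runs M and the helper variable names are arbitrary):

//-------------------
M=10000;                      //number of independent runs (arbitrary)
P=[0.6 0.4;0.1 0.9];          //same transition matrix as above
firsts=zeros(1,M);
for k=1:M
  s=grand(10,'markov',P,[1]); //short chain; only the first state matters here
  firsts(k)=s(1);             //record the first state of each run
end
disp(sum(firsts==1)/M);       //fraction of runs whose sequence starts in state 1
//------------------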

From the help page I understand that:

- if I want the chain to start in state '1', then I just have to set 'x0 = [1]', as in the code above;

- if I want the state sequence to start from state '2', then I should set 'x0 = [2]';

- if I want to get paths starting from both states, I would set x0=[1 2], and each column of q (the output) would represent a path starting at the corresponding state (see the small sketch after this list).
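
A minimal sketch of what I mean in that last point (N here is just a small example length):

//-------------------
N=10;                           //short example length
P=[0.6 0.4;0.1 0.9];
q2=grand(N,'markov',P,[1 2]);   //one path per entry of x0, as I read the help page
disp(size(q2));                 //shows how the two paths are arranged in the output
disp(q2);
//------------------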

However, with x0=[1] or x0=[2] I still get realizations starting in either state (sometimes state 1, sometimes state 2), and that does not make much sense to me: if the function picks the initial state at random, what limiting-state probabilities is it using to do so?
According to what is explained in the grand() help page, if I set, for example, x0=[1], shouldn't the sequence 'always' start in state 1?
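
For reference, by limiting-state probabilities I mean the stationary distribution of P, which a small sketch like this should compute (for this particular P it works out to [0.2 0.8]):

//-------------------
P=[0.6 0.4;0.1 0.9];
[V,D]=spec(P');                       //right eigenvectors of P' = left eigenvectors of P
[m,i]=max(real(diag(D)));             //pick the eigenvalue 1 of the stochastic matrix
pi_=real(V(:,i))'/sum(real(V(:,i)));  //rescale that eigenvector into probabilities
disp(pi_);                            //displays [0.2 0.8] for this P
//------------------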

I appreciate any help and clarifications on the behaviour of this function.

Best regards to all,

Dan