Markov
Markov Chain vs Markov Decision Process
Markov decision processes are an extension of Markov chains; the difference is the addition of actions (allowing choice) and rewards (giving motivation). Conversely, if only one action exists for each state and all rewards are the same (e.g., zero), a Markov decision process reduces to a Markov chain.
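A minimal Python sketch of this reduction, assuming a toy two-state MDP (the dictionary layout and the single action label "only" are illustrative, not from the source): collapsing the one available action in every state leaves nothing but a transition matrix, i.e., a Markov chain.

```python
import numpy as np

# Illustrative MDP: exactly one action ("only") per state and zero reward
# everywhere. Under these conditions the MDP carries no more information
# than a Markov chain.
mdp_transitions = {
    0: {"only": {0: 0.5, 1: 0.5}},
    1: {"only": {0: 0.3, 1: 0.7}},
}
mdp_rewards = {0: 0.0, 1: 0.0}  # identical rewards give no motivation to choose

# Collapsing the single action leaves a plain transition matrix P,
# i.e., a Markov chain over the same states.
states = sorted(mdp_transitions)
P = np.array([[mdp_transitions[s]["only"].get(t, 0.0) for t in states]
              for s in states])
print(P)
# [[0.5 0.5]
#  [0.3 0.7]]
```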
Recurrent vs Transient
Recurrent states
A state i is said to be persistent (or recurrent) if the probability of returning to i, having started at i, is 1.
If you start at a recurrent state, you are certain to return to that state at some point in the future.
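A rough simulation sketch of this property on a made-up two-state chain in which each state can always reach the other, so both states are recurrent; the transition probabilities are illustrative only. The estimated return probability for state 0 should come out at (essentially) 1:

```python
import random

# Illustrative recurrent chain: from either state there is always a
# positive-probability path back to the other, so both are recurrent.
P = {0: [(0, 0.5), (1, 0.5)],
     1: [(0, 0.3), (1, 0.7)]}

def step(state):
    """Sample the next state from the transition distribution P[state]."""
    r, acc = random.random(), 0.0
    for nxt, p in P[state]:
        acc += p
        if r < acc:
            return nxt
    return nxt

def returned(start, max_steps=10_000):
    """Did a trajectory from `start` revisit `start` within max_steps?"""
    state = step(start)
    for _ in range(max_steps):
        if state == start:
            return True
        state = step(state)
    return False

trials = 10_000
print(sum(returned(0) for _ in range(trials)) / trials)  # ~1.0
```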
Transient states
A state i is said to be transient if a return to i is not guaranteed having started at i, i.e., the probability of returning to i, having started at i, is less than 1.
There is some positive probability that, once you leave a transient state, you will never return to it.
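A similar sketch, assuming a made-up three-state chain in which state 2 is absorbing: state 0 is transient, because every excursion risks being absorbed in state 2 and never coming back. By first-step analysis the exact return probability for state 0 is 0.4 + 0.3 × 0.5 = 0.55, and the simulation should land near that value:

```python
import random

# Illustrative transient chain: state 2 is absorbing, so trajectories
# leaving state 0 may be trapped there forever.
P = {0: [(0, 0.4), (1, 0.3), (2, 0.3)],
     1: [(0, 0.5), (2, 0.5)],
     2: [(2, 1.0)]}  # absorbing: once here, stay here

def step(state):
    """Sample the next state from the transition distribution P[state]."""
    r, acc = random.random(), 0.0
    for nxt, p in P[state]:
        acc += p
        if r < acc:
            return nxt
    return nxt

def returned(start, max_steps=1_000):
    """Did a trajectory from `start` revisit `start` before absorption?"""
    state = step(start)
    for _ in range(max_steps):
        if state == start:
            return True
        if state == 2:  # absorbed, can never return
            return False
        state = step(state)
    return False

trials = 10_000
print(sum(returned(0) for _ in range(trials)) / trials)  # ~0.55, clearly < 1
```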