  1. Probability of a Markov chain $X_n \sim U(1, 2X_{n-1})$ reaching ...

    Feb 24, 2026 · I am analyzing a discrete-time Markov chain that can grow exponentially but also suffers from frequent, severe drops. I want to find the exact probability that it reaches a certain threshold …
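The question above does not include the asker's setup, but the dynamics it names can be explored with a quick Monte Carlo sketch; the starting value, threshold, and step cap below are illustrative assumptions, not values from the question:

```python
import random

def hits_threshold(x0=2.0, threshold=100.0, max_steps=1000):
    """Simulate X_n ~ Uniform(1, 2*X_{n-1}) and report whether the
    chain crosses `threshold` within `max_steps` steps.
    (x0, threshold, and max_steps are illustrative choices.)"""
    x = x0
    for _ in range(max_steps):
        x = random.uniform(1.0, 2.0 * x)  # grow up to 2x, or drop toward 1
        if x >= threshold:
            return True
    return False

def estimate_hit_probability(trials=10_000, **kwargs):
    """Crude Monte Carlo estimate of the hitting probability."""
    return sum(hits_threshold(**kwargs) for _ in range(trials)) / trials

print(estimate_hit_probability())
```

A simulation like this only estimates the probability; the question itself asks for the exact value, which needs an analytic treatment of the chain.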

  2. What is the difference between all types of Markov Chains?

    Apr 25, 2017 · A Markov process is basically a stochastic process in which the past history of the process is irrelevant if you know the current system state. In other words, all information about the …

  3. what is the difference between a markov chain and a random walk?

    Jun 17, 2022 · Then $\text{it's a Markov Chain}$. If you use another definition: From the first line of each random walk and Markov Chain, I think a Markov chain models a type of random walk, but it …

  4. Is every stochastic process "approximately" Markov? [closed]

    Apr 13, 2026 · When I started doing machine learning in the 1990s, many practitioners and researchers used Hidden Markov Models which are measure theoretically isomorphic to Markov models, but have …

  5. Intuition behind positive recurrent and null recurrent Markov Chains

    Jul 22, 2025 · For irreducible Markov chains, if a state is recurrent, then every other state in the state space is automatically recurrent as well. This holds analogously for positive recurrence and null …

  6. 'Intuitive' difference between Markov Property and Strong Markov …

    Aug 14, 2016 · My question is a bit more basic, can the difference between the strong Markov property and the ordinary Markov property be intuited by saying: "the Markov property implies that a Markov …

  7. Why Markov matrices always have 1 as an eigenvalue

    Now, in a Markov chain, a steady-state vector is one left unchanged by the transition matrix (multiplying the probability state vector by the transition matrix yields the same vector): $qP = q$, where $P$ is the probability transition matrix. This means Y = …
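The claim in the title can be checked directly: for a row-stochastic matrix, every row sums to 1, so the all-ones vector is a right eigenvector with eigenvalue 1. A minimal pure-Python check on an example matrix (the entries are an arbitrary illustration):

```python
# A row-stochastic (Markov) matrix: each row sums to 1.
P = [
    [0.5, 0.3, 0.2],
    [0.1, 0.6, 0.3],
    [0.4, 0.4, 0.2],
]

ones = [1.0, 1.0, 1.0]

# Each entry of P @ ones is a row sum of P, so P * ones = 1 * ones:
# 1 is an eigenvalue with eigenvector (1, ..., 1).
Pv = [sum(P[i][j] * ones[j] for j in range(3)) for i in range(3)]
print(Pv)  # every entry is 1.0
```

The same row-sum argument works for any row-stochastic matrix, independent of its size or entries.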

  8. reference request - What are some modern books on Markov Chains …

    I would like to know what books people currently like in Markov Chains (with syllabus comprising discrete MC, stationary distributions, etc.), that contain many good exercises. Some such book on

  9. Real Applications of Markov's Inequality - Mathematics Stack Exchange

    Mar 11, 2015 · Markov's Inequality and its corollary Chebyshev's Inequality are extremely important in a wide variety of theoretical proofs, especially limit theorems. A previous answer provides an example.
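The inequality being discussed, $P(X \ge a) \le E[X]/a$ for nonnegative $X$, is easy to verify empirically; the sketch below uses an Exponential(1) variable as an illustrative choice of $X$:

```python
import random

# Empirical check of Markov's inequality P(X >= a) <= E[X] / a
# for a nonnegative random variable (here X ~ Exponential(1)).
random.seed(0)
samples = [random.expovariate(1.0) for _ in range(100_000)]
mean = sum(samples) / len(samples)

for a in (1.0, 2.0, 5.0):
    tail = sum(x >= a for x in samples) / len(samples)
    print(f"a={a}: P(X>=a)={tail:.4f} <= E[X]/a={mean / a:.4f}")
```

For Exponential(1) the true tail is $e^{-a}$, so the bound is loose here, as Markov's inequality usually is; its value lies in requiring nothing about $X$ beyond nonnegativity and a finite mean.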

  10. Generalisation of the Markov property to stopping times

    Aug 1, 2023 · So apparently it is a different way of generalising the weak Markov property. Broadly speaking, I would like to know whether this property ($\star$) has a name and under what conditions …