Markov Processes
Markov processes are a special class of stochastic processes in which the future behaviour of the process is determined entirely by its present state. This means that the distributions of events (rates of occurrence) are independent of the history of the system. Furthermore, the transition rates are independent of the time at which the system arrived at the present state. Thus, the basic assumption of the Markov process is that the behaviour of the system in each state is memoryless: the transition from the current state is determined only by the present state, not by the previous states or by the time at which the present state was reached. The time spent in each state before a transition occurs follows an exponential distribution.
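The memoryless property can be checked numerically: for an exponentially distributed holding time, the probability of remaining in a state for a further t hours does not depend on the time already spent there. A minimal sketch, with an assumed illustrative rate:

```python
import math

# Memoryless check for an exponential sojourn time.
# P(T > s + t | T > s) should equal P(T > t).
lam = 0.5          # transition rate per hour (illustrative assumption)
s, t = 2.0, 3.0    # hours already spent in the state, additional hours

def survival(x, rate):
    """P(T > x) for an exponentially distributed time T."""
    return math.exp(-rate * x)

conditional = survival(s + t, lam) / survival(s, lam)  # P(T > s+t | T > s)
unconditional = survival(t, lam)                       # P(T > t)
print(conditional, unconditional)  # the two probabilities coincide
```

Because the two probabilities agree for any s, the time already spent in a state carries no information about the time remaining, which is exactly the memoryless assumption described above.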
In reliability engineering analysis, these conditions are satisfied if all events (failures, repairs, switch-overs, etc.) in each state occur with constant occurrence rates (failure rate, repair rate, switch-over rate, etc.). Because the basic behaviour of the process is time-independent, these processes are also called Time Homogeneous Markov processes or simply Homogeneous Markov processes. However, the failure and repair rates of a component can depend on the current state. Because of the constant transition rate restriction, the Homogeneous Markov process should not be used to model the behaviour of systems subject to component wear-out characteristics; general stochastic processes should be used instead.
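Under the constant-rate assumption, the survival (reliability) function of a component reduces to a simple exponential, and the mean time to failure is the reciprocal of the failure rate. A brief sketch, using an assumed illustrative failure rate:

```python
import math

# Constant failure rate assumption: R(t) = exp(-lam * t), MTTF = 1/lam.
lam = 2e-4  # failures per hour (illustrative assumption)

def reliability(t):
    """Probability that the component survives to time t
    under a constant failure rate."""
    return math.exp(-lam * t)

mttf = 1.0 / lam   # mean time to failure: 5000 hours for this rate
print(reliability(1000.0), mttf)
```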
In most cases, special classes of stochastic processes that are generalizations of the Homogeneous Markov process are used. The corresponding models include:
• Semi-Markov models. These are very similar to Homogeneous Markov models, except that the transition probabilities (distributions) depend on the time already spent in the present state. The transition rates in a particular state therefore depend on the time spent in that state, but not on the path by which the present state was reached. Thus, transition distributions can be non-exponential.
• Non-homogeneous models. These are very similar to Homogeneous Markov models, except that the transition rates depend on the global system time rather than on the time at which the system reached the current state.
Note: A non-exponential distribution (such as normal or Weibull) can be approximated by a set of exponential distributions. In this way, even when the distributions are non-exponential, the homogeneous Markov models discussed in this chapter can be used; however, the results are approximate. Further information about this topic is beyond the scope of this document.
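As a rough illustration of the approximation mentioned in the note above, a non-exponential holding time can be replaced by a chain of k identical exponential stages (an Erlang distribution), which preserves the mean while reducing the scatter below that of a single exponential. The target mean and stage count here are assumptions chosen for illustration:

```python
# Sketch: approximating a non-exponential holding time by a chain of
# exponential stages (an Erlang / phase-type approximation).
m = 100.0     # target mean holding time in hours (illustrative assumption)
k = 4         # number of exponential stages (illustrative assumption)
rate = k / m  # rate of each stage, so the stage means sum to m

erlang_mean = k / rate                  # mean of the k-stage chain
erlang_var = k / rate**2                # variance of the k-stage chain
cv = (erlang_var ** 0.5) / erlang_mean  # coefficient of variation = 1/sqrt(k)

print(erlang_mean, cv)  # mean preserved; CV 0.5 vs 1.0 for one exponential
```

Increasing the number of stages k drives the coefficient of variation toward zero, which is why a peaked distribution such as a wear-out time can be approximated this way while keeping every individual transition exponential.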
As noted earlier, Markov processes are classified based on state space and index space characteristics.
Table 8-1 lists the characteristics of the four types of Markov processes and their corresponding model names.
Table 8-1. Markov Model Types
State Space | Index Space | Common Model Name
Discrete | Discrete | Discrete Time Markov Chains
Discrete | Continuous | Continuous Time Markov Chains
Continuous | Discrete | Continuous State, Discrete Time Markov Processes
Continuous | Continuous | Continuous State, Continuous Time Markov Processes
In most reliability engineering applications, the state space is discrete and the index space (time scale) is continuous. Thus, this chapter focuses on Discrete State Space, Continuous Index Space Homogeneous Markov processes. Because the term Markov chain is generally used whenever the state space is discrete, Table 8-1 refers to these models as Continuous Time Markov Chains. In many textbooks, these models are simply called Continuous Markov Models.
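The simplest Continuous Time Markov Chain used in reliability work is a single repairable component with two states (up, down), a constant failure rate λ, and a constant repair rate μ. Its point availability has the well-known closed-form solution A(t) = μ/(λ+μ) + (λ/(λ+μ))·e^(−(λ+μ)t). A minimal sketch with assumed illustrative rates:

```python
import math

# Two-state CTMC: a single repairable component.
lam = 1e-3   # failure rate, per hour (illustrative assumption)
mu = 1e-1    # repair rate, per hour (illustrative assumption)

def availability(t):
    """Point availability A(t): closed-form solution of the
    state equations for the two-state up/down model."""
    s = lam + mu
    return mu / s + (lam / s) * math.exp(-s * t)

steady = mu / (lam + mu)   # limiting (steady-state) availability
print(availability(0.0))   # 1.0 -- the system starts in the up state
print(round(steady, 6))
```

As t grows, the exponential term vanishes and A(t) settles at the steady-state availability μ/(λ+μ), which for these rates is roughly 0.990099.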
In addition to being an important concept in reliability analysis, Markov models find wide applications in other areas, including:
• Artificial music.
• Spread of epidemics.
• Traffic on highways.
• Occurrence of accidents.
• Growth and decay of living organisms.
• Emission of particles from radioactive sources.
• Number of people waiting in a line (queue).
• Arrival of telephone calls at a particular telephone exchange.
Note: Markov models are included in this guide because they are often the only method that accurately models complex situations. Although the detailed proofs related to these models are not included here, they can be found in many reliability engineering handbooks and related publications.