Suppose we have two Markov chains defined on the same state space. What happens if we alternate them? If both chains converge to the same stationary distribution, will the chain obtained by alternating them also converge? These questions are motivated by the possible use of two different updating schemes for MCMC estimation, in cases where much faster convergence can be achieved by alternating the two schemes than by using either one alone.
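As a minimal sketch, in notation introduced here rather than taken from the source: write $P_1$ and $P_2$ for the transition kernels of the two chains, and suppose both preserve the same distribution $\pi$, i.e. $\pi P_1 = \pi$ and $\pi P_2 = \pi$. The alternating chain has kernel $P_1 P_2$, and stationarity of $\pi$ is immediate,
\[
  \pi (P_1 P_2) \;=\; (\pi P_1) P_2 \;=\; \pi P_2 \;=\; \pi ,
\]
so the substantive question is not whether $\pi$ remains invariant but whether the alternating chain actually converges to it.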