Collins Dictionary: Markov chain /ˈmɑːkɒf/ 1. N a sequence of events, the probability for each of which is dependent only on the event immediately preceding it [statistics]
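The defining property is that the next event's probability is conditioned only on the current event, not on the earlier history. A minimal sketch in Python, assuming a hypothetical two-state weather chain whose states and transition probabilities are invented purely for illustration:

    import random

    # Hypothetical two-state chain; the probabilities below are invented
    # for illustration. From each state, the next state depends only on
    # the current state -- the defining property named in the entry above.
    TRANSITIONS = {
        "sunny": {"sunny": 0.8, "rainy": 0.2},
        "rainy": {"sunny": 0.4, "rainy": 0.6},
    }

    def step(state):
        """Sample the next state using only the current state."""
        probs = TRANSITIONS[state]
        return random.choices(list(probs), weights=probs.values())[0]

    def walk(state, n):
        """Generate a length-n sequence of events from the chain."""
        seq = [state]
        for _ in range(n - 1):
            state = step(state)
            seq.append(state)
        return seq

    print(walk("sunny", 10))

Note that walk never inspects anything but the latest state when sampling the next one, which is exactly the "dependent only on the event immediately preceding it" condition in the definition.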