A Markov chain is a sequence of random variables X0, X1, ..., Xn, ... with the following property:

P{Xn = j | X0 = i0, X1 = i1, ..., Xm-1 = im-1, Xm = i} = P{Xn = j | Xm = i}

for all integer times n > m and for all states i0, i1, ..., im-1, i, j in the state space S, where S is a countable (finite or countably infinite) set and P denotes probability.

In other words, once the present state of the process is known, knowledge of its past is irrelevant when calculating the probability distribution of its future values.
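
To make the property concrete, here is a minimal simulation sketch in Python. The three-state chain and its transition matrix are illustrative assumptions, not taken from the text; the experiment checks empirically that the distribution of the next state given the present state X3 = 1 is the same whether or not we also condition on part of the past.

import random
from collections import Counter

# Hypothetical 3-state chain on S = {0, 1, 2}; this transition matrix
# is an assumption for illustration only.
# P[i][j] = P{X_{m+1} = j | X_m = i}; each row sums to 1.
P = [
    [0.5, 0.3, 0.2],
    [0.1, 0.6, 0.3],
    [0.4, 0.4, 0.2],
]

def step(i):
    """Sample the next state given the current state i."""
    return random.choices(range(len(P)), weights=P[i])[0]

def simulate(x0, n):
    """Generate the path X_0, X_1, ..., X_n starting from state x0."""
    path = [x0]
    for _ in range(n):
        path.append(step(path[-1]))
    return path

# Empirical check of the Markov property: among many sample paths,
# the distribution of X_4 given X_3 = 1 should not change when we
# additionally condition on the past event X_2 = 0.
random.seed(0)
next_given_present = Counter()
next_given_history = Counter()
for _ in range(200_000):
    path = simulate(0, 4)           # X_0 = 0; observe X_1, ..., X_4
    if path[3] == 1:                # condition on the present: X_3 = 1
        next_given_present[path[4]] += 1
        if path[2] == 0:            # also condition on part of the past
            next_given_history[path[4]] += 1

def freqs(counts):
    total = sum(counts.values())
    return {j: round(counts[j] / total, 3) for j in sorted(counts)}

print(freqs(next_given_present))    # both printouts should approximate
print(freqs(next_given_history))    # row 1 of P: (0.1, 0.6, 0.3)

Both empirical distributions converge to the same values (row 1 of the transition matrix) as the number of sample paths grows, which is exactly what the defining equation above asserts.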