11 Mar 2018, 8:25 a.m.
Let S be a finite set with |S| = s. Let A be a random sequence of a elements of S, each drawn independently and uniformly at random, and let B likewise be a random sequence of b elements of S. What is the expected length f(s, a, b) of a longest common subsequence of A and B?

As an example, let S be the set of decimal digits, so s = 10, and take a = b = 10. Two possible values of A and B are

A = (3, 1, 4, 1, 5, 9, 2, 6, 5, 3)
B = (1, 4, 1, 4, 2, 1, 3, 5, 6, 2)

A longest common subsequence of A and B is (1, 4, 1, 2, 5), of length 5.
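For concreteness, here is a minimal sketch of how one might check the example and estimate f(s, a, b) numerically, assuming the i.i.d. uniform model above. The function names lcs_length and estimate_f are my own; lcs_length is the standard O(a*b) dynamic program for LCS length, and estimate_f is a plain Monte Carlo average, not an exact formula for f.

```python
import random

def lcs_length(a, b):
    """Length of a longest common subsequence of sequences a and b,
    via the standard O(len(a) * len(b)) dynamic program."""
    prev = [0] * (len(b) + 1)
    for x in a:
        curr = [0]
        for j, y in enumerate(b, 1):
            if x == y:
                curr.append(prev[j - 1] + 1)   # extend a match diagonally
            else:
                curr.append(max(prev[j], curr[j - 1]))  # drop one element
        prev = curr
    return prev[-1]

def estimate_f(s, a, b, trials=10_000):
    """Monte Carlo estimate of f(s, a, b): average LCS length over
    sequences of i.i.d. uniform elements from a set of size s."""
    total = 0
    for _ in range(trials):
        A = [random.randrange(s) for _ in range(a)]
        B = [random.randrange(s) for _ in range(b)]
        total += lcs_length(A, B)
    return total / trials

# The worked example from the text:
A = (3, 1, 4, 1, 5, 9, 2, 6, 5, 3)
B = (1, 4, 1, 4, 2, 1, 3, 5, 6, 2)
print(lcs_length(A, B))        # prints 5
print(estimate_f(10, 10, 10))  # rough estimate of f(10, 10, 10)
```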