Saturday, 22 August 2020

Why am I getting slightly off outputs for this DP problem?

The Question:
Given an array A of N distinct integers and an array B of M integers (not necessarily distinct), find the minimum number of integers that must be inserted into B so that A becomes a subsequence of B.

My Strategy:
Quite simple: compute the length of the longest common subsequence (LCS) of A and B; the answer is then N - LCS.
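
For example, with A = [1, 2, 3] and B = [3, 1, 2], the LCS is [1, 2], so the answer is 3 - 2 = 1: inserting a single 3 after the 2 makes A a subsequence of B.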

My Code:

int minInsertions(const vector<int>& A, const vector<int>& B, int n, int m)
{
    /* Pass the vectors by const reference: the original
       pass-by-value copies both arrays on every call, which
       adds needless O(n + m) work. */

    /* L[i][j] holds the length of the LCS of B[0..i-1] and
       A[0..j-1]. A vector-based table replaces the variable-
       length array int L[m+1][n+1], which is not standard C++
       and can overflow the stack for large inputs. */
    vector<vector<int>> L(m + 1, vector<int>(n + 1, 0));

    /* Build L in bottom-up fashion; row 0 and column 0 stay 0. */
    for (int i = 1; i <= m; i++)
    {
        for (int j = 1; j <= n; j++)
        {
            if (B[i - 1] == A[j - 1])
                L[i][j] = L[i - 1][j - 1] + 1;
            else
                L[i][j] = max(L[i - 1][j], L[i][j - 1]);
        }
    }

    /* L[m][n] is the LCS length of A and B; every element of A
       that is not part of the LCS must be inserted into B. */
    return n - L[m][n];
}
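
One idea for the TLE: since the elements of A are distinct, the LCS should reduce to a longest increasing subsequence. Mapping each element of B to its index in A (and dropping elements that never occur in A), a strictly increasing subsequence of those indices is exactly a common subsequence of A and B, so a patience-sorting LIS runs in O(M log M) instead of O(N*M). A rough sketch of that idea (untested; minInsertionsFast is just my placeholder name):

#include <algorithm>
#include <unordered_map>
#include <vector>
using namespace std;

int minInsertionsFast(const vector<int>& A, const vector<int>& B)
{
    // Index of each value of A (valid because A is distinct).
    unordered_map<int, int> pos;
    for (int i = 0; i < (int)A.size(); i++)
        pos[A[i]] = i;

    // tails[k] = smallest possible tail of a strictly increasing
    // subsequence of length k + 1 over the mapped indices.
    vector<int> tails;
    for (int x : B)
    {
        auto it = pos.find(x);
        if (it == pos.end())
            continue;                  // x can never match anything in A
        int idx = it->second;
        // Strict increase: overwrite the first tail >= idx.
        auto t = lower_bound(tails.begin(), tails.end(), idx);
        if (t == tails.end())
            tails.push_back(idx);
        else
            *t = idx;
    }

    // tails.size() equals LCS(A, B); the rest of A must be inserted.
    return (int)A.size() - (int)tails.size();
}

On small inputs this should agree with the DP above, which would also help me isolate the off-by-one.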

My Output:
I am getting wrong answers (usually off by 1), and I was also getting TLE on some test cases.
Can someone spot where I am going wrong, in the logic or in the code?
