• About computational complexity

    From Wisdom90@21:1/5 to All on Fri Jan 10 08:53:19 2020
    Hello,

    Read this:


    About computational complexity

    I have just read the following webpage of a PhD computer scientist and
    researcher from Montreal, Quebec, Canada, where I have been living since
    1989; here is the webpage, so read it carefully:

    Better computational complexity does not imply better speed

    https://lemire.me/blog/2019/11/26/better-computational-complexity-does-not-imply-better-speed/

    And here is his resume:

    https://lemire.me/pdf/resume/resumelemire.pdf

    So as you can notice on the webpage above, he says the following
    about computational complexity:

    "But it gets worse: these are not scientific models. A scientific model
    would predict the running time of the algorithm given some
    implementation, within some error margin. However, these models do
    nothing of the sort. They are purely mathematical. They are not
    falsifiable. As long as they are mathematically correct, then they are
    always true. To be fair, some researchers like Knuth came up with models
    that closely mimic reasonable computers, but that’s not what people
    pursuing computational complexity bounds do, typically."

    But I don't agree with him, because I think he is not understanding
    the goal of computational complexity. When we say that an algorithm
    has a time complexity of n*log(n), you have to understand that it is,
    by logical analogy, like speaking in physics of material resistance:
    the n*log(n) tells us how well the algorithm "amortizes" (that means
    reduces) the time that it takes, taking as a reference of measure the
    average computational complexity of the given algorithm. And we can
    easily notice that an algorithm becomes faster, in front of the data
    that is given to it, by going from an exponential time complexity
    towards a logarithmic time complexity. So we notice that computational
    complexity is "universal", and this is why computational complexity is
    scientific and also engineering, and it gives us information on the
    physical world.
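
    To illustrate this, here is a minimal sketch in Python (my own
    illustration, not taken from the webpage above): it prints the number
    of "steps" predicted by three complexity classes for growing input
    sizes n, so you can see how the complexity class itself governs how
    the cost scales. The exponential count is reported by its number of
    decimal digits, because the value itself is enormous:

    import math

    # Compare how the step counts predicted by three complexity classes
    # grow with the input size n.  The exponential count 2^n is reported
    # by its number of decimal digits, because the value is enormous.
    for n in (10, 100, 1000, 10000):
        log_steps = math.log2(n)            # logarithmic growth
        nlogn_steps = n * math.log2(n)      # n*log(n) growth
        exp_digits = len(str(2 ** n))       # size of 2^n in decimal digits
        print(f"n={n:6d}  log2(n)={log_steps:8.1f}  "
              f"n*log2(n)={nlogn_steps:12.1f}  2^n has {exp_digits} digits")

    Already at n = 100 the exponential count has 31 decimal digits, while
    log2(n) is still below 7, which is exactly the kind of scaling
    behaviour that the complexity classes predict.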

    So to give an interesting example of science in computing, we can ask
    what the time complexity of the binary search algorithm is, and here
    are my mathematical calculations of its time complexity:

    The recurrence relation of the binary search algorithm is: T(n)=T(n/2)+1

    Because the "1" stands for the one comparison that we do in each step of
    the divide and conquer method of the binary search algorithm.
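
    Here is a minimal sketch in Python of the binary search algorithm
    itself (my own illustration; the function name and the step counter
    are chosen just for this post): each halving of the range does exactly
    one "step", which is the "+ 1" of the recurrence above:

    def binary_search(arr, target, lo=0, hi=None, steps=0):
        """Search a sorted list; return (index or None, number of halving steps)."""
        if hi is None:
            hi = len(arr) - 1
        if lo > hi:
            return None, steps          # empty range: target is not present
        mid = (lo + hi) // 2
        steps += 1                      # the "+ 1" of T(n) = T(n/2) + 1
        if arr[mid] == target:
            return mid, steps
        elif arr[mid] < target:
            return binary_search(arr, target, mid + 1, hi, steps)   # right half
        else:
            return binary_search(arr, target, lo, mid - 1, steps)   # left half

    data = list(range(1000000))         # a sorted input of size n = 10^6
    print(binary_search(data, 999999))  # finds the index in about log2(10^6) ~ 20 steps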

    So the calculation of the recurrence equation gives:

    1st step=> T(n)=T(n/2) + 1

    2nd step=> T(n/2)=T(n/4) + 1 ... [ T(n/4) = T(n/2^2) ]

    3rd step=> T(n/4)=T(n/8) + 1 ... [ T(n/8) = T(n/2^3) ]

    .

    .

    kth step=> T(n/2^(k-1))=T(n/2^k) + 1

    Adding all the k equations, the intermediate terms cancel and we get:
    T(n) = T(n/2^k) + k

    This is the final equation.

    So how many times do we need to divide n by 2 until we have only one
    element left?

    So it must be:

    n/2^k= 1

    This gives: n=2^k

    This gives: log n = k [taking log (base 2) on both sides]

    Putting k = log n in the final equation above gives:

    T(n) = T(1) + log n

    T(n) = 1 + log n [we know that T(1) = 1, because it is the base
    condition: we are left with only one element in the array, and that is
    the element to be searched, so we return 1]

    So it gives:

    T(n) = O(log n) [keeping only the dominant term, which is log n here]

    This is how we got “log n” time complexity for binary search.
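
    As a quick numerical check of this result (again a minimal sketch in
    Python of my own, not part of the derivation), we can evaluate the
    recurrence T(n) = T(n/2) + 1 with the base condition T(1) = 1
    directly, and compare it with the closed form 1 + log n derived above
    (using floor of log2(n), since n is halved with integer division):

    import math

    def T(n):
        if n == 1:
            return 1                 # base condition: only one element is left
        return T(n // 2) + 1         # one comparison, then the range is halved

    for n in (1, 2, 16, 1024, 1000000):
        print(n, T(n), 1 + math.floor(math.log2(n)))   # the two counts agree

    For each n the two numbers agree, which confirms the "log n" time
    complexity computed above.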



    Thank you,
    Amine Moulay Ramdane.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)