• history of coding

    From gah4@u.washington.edu@21:1/5 to Eric Jacobsen on Mon Jan 27 02:43:24 2020
    On Monday, June 10, 2019 at 4:11:53 PM UTC-7, Eric Jacobsen wrote:
    On Mon, 10 Jun 2019 12:51:17 -0700 (PDT), RichD
    <r_delaney2001@yahoo.com> wrote:

    Recently I attended a coding seminar. The speaker
    very briefly reviewed the relevant history;
    first Hamming codes; ~1960: BCH codes; ~1970: convolutional;
    1995: polar; 2002: LDPC

    Clue me in - what was the advance in each case?
    I worked on a BCH project once; it seemed fairly
    efficient. How is it, or how are the other algorithms, deficient?

    (snip)

    You'll probably find different histories out there from different
    sources, as the lineage of some of these isn't all that clear. It's
    like asking when the FFT algorithm was invented; you may get vastly different answers.

    It seems to me that part of the story is the ability to process such
    signals at an appropriate speed.

    My favorite in the history of transforms and coding is the ICT,
    the Integer Cosine Transform. It is similar to the DCT, but the
    coefficients are chosen to speed up the forward transform,
    at the expense of slowing down the inverse transform:

    https://ntrs.nasa.gov/search.jsp?R=19940025116

    Specifically, the forward transform is done on the CDP1802,
    a microprocessor from the 8080 and 6502 days that doesn't have
    hardware multiply.
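
    Here is a minimal sketch in C of that idea, for the 8-point case.
    The coefficient matrix below is only illustrative (it follows the
    sign and rough magnitude pattern of the DCT basis, not the actual
    ICT coefficients from the NASA report above), and the shift-and-add
    multiply stands in for how small-constant products would be coded
    on a multiplier-less CPU like the 1802:

        /* Illustrative integer-cosine-style forward transform.
         * Every coefficient is a small integer, so each product can be
         * formed with a few shifts and adds -- no hardware multiply. */
        #include <stdio.h>

        #define N 8

        /* Illustrative integer approximation of the 8-point DCT basis
         * (NOT the actual ICT coefficients). */
        static const int T[N][N] = {
            { 1,  1,  1,  1,  1,  1,  1,  1},
            { 5,  4,  3,  1, -1, -3, -4, -5},
            { 3,  1, -1, -3, -3, -1,  1,  3},
            { 4, -1, -5, -3,  3,  5,  1, -4},
            { 1, -1, -1,  1,  1, -1, -1,  1},
            { 3, -5,  1,  4, -4, -1,  5, -3},
            { 1, -3,  3, -1, -1,  3, -3,  1},
            { 1, -3,  4, -5,  5, -4,  3, -1},
        };

        /* Multiply by a small non-negative constant using only shifts
         * and adds, as a multiplier-less CPU would. */
        static int mul_small(int x, int c)
        {
            int acc = 0;
            while (c) {
                if (c & 1) acc += x;
                x <<= 1;
                c >>= 1;
            }
            return acc;
        }

        /* Forward transform: y = T * x, integer arithmetic only. */
        static void ict_forward(const int x[N], int y[N])
        {
            for (int k = 0; k < N; k++) {
                int acc = 0;
                for (int n = 0; n < N; n++) {
                    int c = T[k][n];
                    acc += (c < 0) ? -mul_small(x[n], -c)
                                   :  mul_small(x[n],  c);
                }
                y[k] = acc;
            }
        }

        int main(void)
        {
            int x[N] = {10, 12, 14, 16, 16, 14, 12, 10};
            int y[N];
            ict_forward(x, y);
            for (int k = 0; k < N; k++)
                printf("%d ", y[k]);
            printf("\n");
            return 0;
        }

    The asymmetry is the whole point: the encoder only ever multiplies
    by small integers, while undoing the scaling exactly is pushed to
    the inverse transform on the ground, where floating point is cheap.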

    It seems to me that the math for the more complicated coding
    systems may have been around for years, but there was no practical
    way to use them until computing hardware caught up.
