• End of quantum computers?

    From Jan Panteltje@21:1/5 to All on Mon Feb 12 05:26:38 2024
    Researchers show classical computers can keep up with, and surpass, their quantum counterparts
    https://www.sciencedaily.com/releases/2024/02/240209134402.htm
    Researchers adopt innovative method to boost speed and accuracy of traditional computing

  • From Jeroen Belleman@21:1/5 to Martin Brown on Mon Feb 12 11:41:05 2024
    On 2/12/24 11:24, Martin Brown wrote:
    On 12/02/2024 05:26, Jan Panteltje wrote:

    Researchers show classical computers can keep up with, and surpass,
    their quantum counterparts
    https://www.sciencedaily.com/releases/2024/02/240209134402.htm
    Researchers adopt innovative method to boost speed and accuracy of
    traditional computing

    It is a bold claim but not backed up by any convincing evidence.

    I can believe that classical computing, and in particular NN-type AI, can
    be speeded up by making some gross heuristic approximations that are
    usually true. Ignoring almost-irrelevant noisy information may work.

    Nothing can surpass an N-bit quantum computer for factoring products of impossibly long primes. [snip...]

    Entirely hypothetical. My expectation is that quantum computers
    will never be able to factor numbers with prime factors much beyond
    10^18 or so, if even that.

    Jeroen Belleman
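
    [For scale on that 10^18 figure: such a prime is only a ~60-bit number,
    so a product of two of them is a ~120-bit semiprime, far below the
    2048-bit moduli in everyday RSA use and comfortably within reach of
    classical factoring. A quick check in plain Python:]

        import math

        # A prime near 10^18 is a ~60-bit number.
        p_scale = 10**18
        print(math.log2(p_scale))      # ~59.79 bits per prime factor
        print(2 * math.log2(p_scale))  # ~119.6 bits for their product

        # Compare: RSA moduli in current use are 2048 bits or more, and
        # ~120-bit semiprimes are routinely factored on classical hardware,
        # so a quantum computer capped at this scale would break nothing.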

  • From Martin Brown@21:1/5 to Jan Panteltje on Mon Feb 12 10:24:33 2024
    On 12/02/2024 05:26, Jan Panteltje wrote:

    Researchers show classical computers can keep up with, and surpass, their quantum counterparts
    https://www.sciencedaily.com/releases/2024/02/240209134402.htm
    Researchers adopt innovative method to boost speed and accuracy of traditional computing

    It is a bold claim but not backed up by any convincing evidence.

    I can believe that classical computing, and in particular NN-type AI, can
    be speeded up by making some gross heuristic approximations that are
    usually true. Ignoring almost-irrelevant noisy information may work.
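
    [As a toy illustration of that kind of gross-but-usually-true
    approximation, a minimal NumPy sketch of magnitude pruning: zero out
    the weights of a dense layer that sit nearest zero and see how much
    the layer's output actually moves. The layer size and the 50% pruning
    fraction are arbitrary choices for the example:]

        import numpy as np

        rng = np.random.default_rng(0)
        W = rng.normal(size=(256, 256))   # stand-in dense-layer weights
        x = rng.normal(size=256)          # stand-in input activations

        # Heuristic: discard the half of the weights nearest zero.
        threshold = np.quantile(np.abs(W), 0.50)
        W_pruned = np.where(np.abs(W) >= threshold, W, 0.0)

        y_exact = W @ x
        y_approx = W_pruned @ x
        rel_err = np.linalg.norm(y_exact - y_approx) / np.linalg.norm(y_exact)
        print(f"kept {np.count_nonzero(W_pruned) / W.size:.0%} of weights, "
              f"relative output error {rel_err:.2f}")

        # Note: in trained networks the weights cluster far more tightly
        # around zero than in this random example, which is why pruning
        # (usually plus a little retraining) often costs little accuracy.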

    Nothing can surpass an N-bit quantum computer for factoring products of
    impossibly long primes. The whole of modern public-key cryptography is
    predicated on that task being well beyond present-day computing power. A
    quantum computer with a decent-length register could change that overnight.
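
    [The algorithm behind that threat is Shor's: it factors N by finding
    the multiplicative order r of a random base a modulo N, the one step
    where the quantum speed-up lives; the factors then fall out with
    ordinary gcd arithmetic. A Python sketch, with brute-force order
    finding standing in for the quantum period-finding step:]

        from math import gcd

        def order(a, n):
            """Multiplicative order of a mod n, by brute force.
            (This is the step Shor's algorithm does in polynomial time.)"""
            r, x = 1, a % n
            while x != 1:
                x = (x * a) % n
                r += 1
            return r

        def shor_classical_part(n, a):
            g = gcd(a, n)
            if g != 1:
                return g                  # lucky guess: a shares a factor
            r = order(a, n)
            if r % 2 == 1 or pow(a, r // 2, n) == n - 1:
                return None               # unlucky base, retry with new a
            return gcd(pow(a, r // 2, n) - 1, n)

        print(shor_classical_part(15, 7))  # -> 3, and 15 // 3 == 5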

    Dedicated hardware can always do better than general-purpose computers
    at specific tasks, but that is a different issue altogether.

    Turing's Bombe or Colossus would have beaten anything less than a 386 PC
    at codebreaking, despite being built from plug boards, paper tape, relays
    and valve logic. They were incredibly cunning designs, able to short-circuit
    the codebreaking by ruling out big chunks of the search space.
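
    [A toy version of that pruning idea, nothing like the Bombe's actual
    hardware: given a "crib" (a fragment of known plaintext), each crib
    letter rules out a large slice of the key space without ever decrypting
    a whole message. Here against a made-up repeating-key Caesar cipher:]

        from itertools import product
        import string

        ALPHA = string.ascii_uppercase

        def encrypt(plain, key):
            """Repeating-key Caesar (Vigenère-style) encryption."""
            return "".join(
                ALPHA[(ALPHA.index(p) + ALPHA.index(key[i % len(key)])) % 26]
                for i, p in enumerate(plain))

        ciphertext = encrypt("ATTACKATDAWN", "KEY")
        crib = "ATTACK"        # plaintext known to sit at offset 0

        # Start from all 26^3 = 17,576 possible 3-letter keys and let each
        # crib letter eliminate every key it contradicts.
        survivors = ["".join(k) for k in product(ALPHA, repeat=3)]
        for pos, p in enumerate(crib):
            want = ALPHA[(ALPHA.index(ciphertext[pos]) - ALPHA.index(p)) % 26]
            survivors = [k for k in survivors if k[pos % 3] == want]
            print(f"after crib letter {pos}: {len(survivors)} keys remain")
        print("recovered key(s):", survivors)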

    --
    Martin Brown

  • From Martin Brown@21:1/5 to Jeroen Belleman on Mon Feb 12 11:18:53 2024
    On 12/02/2024 10:41, Jeroen Belleman wrote:
    On 2/12/24 11:24, Martin Brown wrote:
    [snip...]

    Nothing can surpass an N-bit quantum computer for factoring products
    of impossibly long primes. [snip...]

    Entirely hypothetical. My expectation is that quantum computers
    will never be able to factor numbers with prime factors much beyond
    10^18 or so, if even that.

    I remain unconvinced that quantum computers can be made reliable enough
    to do anything remotely useful in the real world. OTOH they have been
    making real progress, and today's mobile phones look like magic compared
    to the big iron of yesteryear. My first mainframe, an IBM 370/165, had a
    whopping 4 MB of main memory, and you had to get a special ticket to use
    more than 500 KB at once. Algebra systems wouldn't run in less than 2 MB.

    --
    Martin Brown

  • From Don Y@21:1/5 to Martin Brown on Mon Feb 12 10:34:52 2024
    On 2/12/2024 4:18 AM, Martin Brown wrote:
    [snip...]

    I remain unconvinced that quantum computers can be made reliable enough
    to do anything remotely useful in the real world. OTOH they have been
    making real progress, and today's mobile phones look like magic compared
    to the big iron of yesteryear. [snip...]

    Predicting that something "can't" be done/happen is usually folly.
    Predicting that something WON'T (volition) be done is a safer bet.

    My first AI course (mid-'70s) predicted it would be "about 10 years"
    before AI was practical. That prediction was repeated "about every
    10 years" and came to be a bit of a standing joke.

    Now, almost "suddenly", it's "here" (in a particular form) and
    folks are unprepared for it (both to exploit it and safeguard
    against it).

    Not only is it available but it is also ACCESSIBLE (early predictions
    treated it as an "ivory tower" sort of technology; clearly not something
    that Joe Average User could access!)

  • From Jeroen Belleman@21:1/5 to Don Y on Mon Feb 12 20:26:48 2024
    On 2/12/24 18:34, Don Y wrote:
    [snip...]
    Predicting that something "can't" be done/happen is usually folly.
    Predicting that something WON'T (volition) be done is a safer bet.

    [...]

    What will probably happen is that quantum computing will fail
    to live up to expectations and at some point, funding will run
    dry.

    Jeroen Belleman

  • From Don Y@21:1/5 to Jeroen Belleman on Mon Feb 12 15:48:18 2024
    On 2/12/2024 12:26 PM, Jeroen Belleman wrote:
    [snip...]

    What will probably happen is that quantum computing will fail
    to live up to expectations and at some point, funding will run
    dry.

    Possible. Or, some breakthrough will provide a way around
    existing problems. Or, some OTHER technology will offer the
    same promise with a different approach.

    As long as there is the potential for a "desirable outcome",
    folks will explore options in "that direction". (Fusion
    has been 10 years off for about as long as AI!)

    [E.g., bubble memory was supposed to be a great, compact
    nonvolatile memory... until solid-state technologies
    tipped the balance.]

  • From Jan Panteltje@21:1/5 to blockedofcourse@foo.invalid on Tue Feb 13 06:06:37 2024
    On a sunny day (Mon, 12 Feb 2024 10:34:52 -0700) it happened Don Y <blockedofcourse@foo.invalid> wrote in <uqdkrv$1kq04$1@dont-email.me>:

    [snip...]

    My first AI course (mid-'70s) predicted it would be "about 10 years"
    before AI was practical. That prediction was repeated "about every
    10 years" and came to be a bit of a standing joke.

    Now, almost "suddenly", it's "here" (in a particular form) and
    folks are unprepared for it (both to exploit it and safeguard
    against it).

    Not only is it available but it is also ACCESSIBLE (early predictions
    treated it as an "ivory tower" sort of technology; clearly not something
    that Joe Average User could access!)

    Now <when> would this breakthrough happen for fusion power?

  • From Martin Brown@21:1/5 to Jeroen Belleman on Tue Feb 13 16:10:33 2024
    On 12/02/2024 19:26, Jeroen Belleman wrote:
    [snip...]

    What will probably happen is that quantum computing will fail
    to live up to expectations and at some point, funding will run
    dry.

    A bit like AI in the late '70s, then. Draughts (checkers) fell to it
    almost immediately; chess and machine vision were much tougher nuts to
    crack. Things in AI research only really hotted up again after Deep Blue
    beat Kasparov, and then again later when AlphaGo beat the human world
    champion at Go under match conditions.

    The latter was a feat that no-one in the field really expected to come
    so quickly. Not far behind that were the large language models.

    The next iteration of quantum computing, when it is in fashion again,
    might well get over the line to being truly useful (but I'm not holding
    my breath).

    It's a bit like cheap fusion power: remaining elusively about 50 years
    away, as it has done now for over 50 years.

    --
    Martin Brown
