• These days what percentage of a CPU's work involves doing arithmetic computations?

    From Roger L Costello@21:1/5 to All on Wed Jul 14 18:30:24 2021
    Hello Compiler Experts!

    As I understand it, computers were originally designed to do arithmetic computations and in the old days nearly 100% of a CPU's work involved arithmetic computations.

    I look at what I now do on a daily basis with computers and it is primarily text processing. My guess is that "text processing" at the machine level
    mostly means doing comparisons and moving things into and out of memory/registers; that is, not much in the way of arithmetic computations. Is that correct?

    These days what percentage of a CPU's work involves doing arithmetic computations versus other, non-arithmetic computations?

    /Roger
    [I don't think it was ever true except perhaps on the ENIAC. Also, what do
    you mean by arithmetic? Are the additions and multiplications to do indexing and array addressing arithmetic? If you mean floating point, there wasn't
    any floating point hardware until the IBM 704 in 1954, but there was plenty
    of computing before that. -John]

  • From gah4@21:1/5 to Roger L Costello on Thu Jul 15 02:31:11 2021
    On Wednesday, July 14, 2021 at 12:42:37 PM UTC-7, Roger L Costello wrote:
    Hello Compiler Experts!

    As I understand it, computers were originally designed to do arithmetic computations and in the old days nearly 100% of a CPU's work involved arithmetic computations.

    It seems that people might have believed that, even for a long time, but
    I suspect it was rarely true. There are stories about the IBM 704 Fortran compiler:
    the authors believed that they had to make optimal use of the hardware,
    or no one would use their compiler. At the time, the competition would have been
    assembly programmers, in some form or other. Then, when they were testing
    the compiler, they were surprised to find the generated code doing things
    better than they would have thought of doing themselves.

    Early computers were sold with minimal, if any, software.

    Then IBM designed System/360 and OS/360 to go along with it.
    About that time (I am sure some will disagree about exactly when), the cost of writing software surpassed the cost of the hardware. So anything that can reduce
    the cost of software is worth considering; hence more and more use of
    high-level languages, even at the cost of wasted CPU time.

    I remember wondering, back in the Cray-1 days, whether it was a waste to run
    a compiler on a machine designed to be very fast at floating point.
    It seemed to me that it would have been better to use a cross compiler,
    so that the Cray's floating-point hardware could be put to its best use. As far as I
    know, that was mostly not done.

    I look at what I now do on a daily basis with computers and it is primarily text processing. My guess is that "text processing" at the machine level mostly means doing comparisons and moving things into and out of memory/registers; that is, not much in the way of arithmetic computations. Is that correct?

    Good text processing is reasonably numerically intensive. TeX uses dynamic programming to find the optimal line-breaking points in a paragraph. It is less thorough about page breaks, as computers weren't so fast at the time.
    But computers have gotten faster, so the amount of time that takes has decreased.
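
    For readers who haven't seen the idea, here is a minimal sketch of that
    kind of dynamic program (hypothetical Fortran with made-up word widths;
    it is not TeX's actual algorithm, which also models stretchable glue,
    hyphenation, and penalties). It minimises the total squared slack over
    all ways of breaking a word list into lines:

      program linebreak
        implicit none
        integer, parameter :: nwords = 8, linewidth = 20
        integer :: w(nwords) = [4, 6, 3, 8, 5, 2, 7, 4]  ! hypothetical word widths
        integer :: best(0:nwords)      ! best(i) = minimal badness of words 1..i
        integer :: start(nwords)       ! start(i) = first word of the line ending at i
        integer :: i, j, lw, cost
        best(0) = 0
        best(1:nwords) = huge(0) / 2   ! "infinity"
        start = 1
        do i = 1, nwords
           lw = -1                     ! no space before the first word on a line
           do j = i, 1, -1             ! try putting words j..i on one line
              lw = lw + w(j) + 1
              if (lw > linewidth) exit
              cost = best(j-1) + (linewidth - lw)**2   ! squared slack as badness
              if (cost < best(i)) then
                 best(i) = cost
                 start(i) = j
              end if
           end do
        end do
        print *, 'total badness =', best(nwords)
        i = nwords                     ! recover the chosen breaks, last line first
        do while (i > 0)
           print *, 'line: words', start(i), 'to', i
           i = start(i) - 1
        end do
      end program linebreak

    Roughly the same recurrence underlies TeX's glue-and-penalty model, with a
    more elaborate badness function.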

    These days what percentage of a CPU's work involves doing arithmetic computations versus other, non-arithmetic computations?

    Close to zero. Remember, the CPU spends most of its time sitting there waiting
    for you to do something. Some systems have an actual "null job",
    accumulating the CPU time not used for anything else. Others don't tell you about it, but might keep track of how much is used. IBM S/360 processors
    have a "wait state" to stop the CPU when there isn't anything to do. Rental charges depended on how much of the time the machine was actually computing.

    But note also that the power used by CMOS logic (most CPUs today)
    depends almost linearly on how much is being done. The CPU gets
    much hotter when it is actually working. This wasn't always true.
    ECL power use is almost independent of how much it is doing.
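
    (For reference, the usual first-order model behind that observation is the
    dynamic switching power of CMOS,

        P_dyn \approx \alpha \, C \, V^2 \, f

    where \alpha is the fraction of gates that actually switch in a cycle,
    C the switched capacitance, V the supply voltage, and f the clock
    frequency. An idle CPU switches few gates, so \alpha and hence the
    dynamic power drop, leaving mostly leakage.)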
    [I generally agree except to note that modern PCs and particularly phones display a lot of high quality images and video, both of which require
    extensive arithmetic to get from the internal representation to the bitmap on the screen. General purpose CPUs have extended instruction sets like
    Intel's SSE and AVX, and often there are GPUs on the same chip as the
    CPU, as in the Apple M1. I get the impression that compilers don't
    deal very well with these things, so vendors provide large libraries
    of assembler code to use them. -John]

  • From Hans-Peter Diettrich@21:1/5 to All on Thu Jul 15 22:02:58 2021
    On 7/15/21 11:31 AM, gah4 wrote:
    On Wednesday, July 14, 2021 at 12:42:37 PM UTC-7, Roger L Costello wrote:

    I look at what I now do on a daily basis with computers and it is primarily
    text processing. My guess is that "text processing" at the machine level
    mostly means doing comparisons and moving things into and out of
    memory/registers; that is, not much in the way of arithmetic computations. Is
    that correct?

    Good text processing is reasonably numeric intensive. TeX uses dynamic programming to find the optimal line breaking points on a page.

    Much arithmetic is required for rendering glyphs on screen, depending on
    fonts, text attributes, etc. That's already true for a browser; there is no need
    to look at an editor or a sophisticated text-processing system.

    An editor does not do much computation; it mostly stitches together updated
    text snippets. Together with an Undo function, this results in a list of
    changes that are only committed when the text is saved to disk again.
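
    As a purely illustrative sketch of such a change list (hypothetical Fortran,
    not taken from any real editor): each edit records what was removed and what
    was inserted at a position, so applying and undoing it are symmetric, and an
    undo history is just a stack of these records.

      module edit_list
        implicit none
        type :: edit
           integer :: pos                               ! 1-based position in the buffer
           character(len=:), allocatable :: removed     ! text replaced by this edit
           character(len=:), allocatable :: inserted    ! text that replaced it
        end type
      contains
        subroutine apply(buf, e)                        ! redo: put the new text in
           character(len=:), allocatable, intent(inout) :: buf
           type(edit), intent(in) :: e
           buf = buf(1:e%pos-1) // e%inserted // buf(e%pos+len(e%removed):)
        end subroutine
        subroutine undo(buf, e)                         ! undo: put the old text back
           character(len=:), allocatable, intent(inout) :: buf
           type(edit), intent(in) :: e
           buf = buf(1:e%pos-1) // e%removed // buf(e%pos+len(e%inserted):)
        end subroutine
      end module edit_list

      program demo
        use edit_list
        implicit none
        character(len=:), allocatable :: buf
        type(edit) :: e
        buf = 'hello world'
        e = edit(pos=7, removed='world', inserted='there')
        call apply(buf, e);  print *, buf   ! hello there
        call undo(buf, e);   print *, buf   ! hello world
      end program demo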

    DoDi

  • From Robin Vowels@21:1/5 to All on Fri Jul 16 15:12:02 2021
    ----- Original Message -----
    From: "Roger L Costello" <costello@mitre.org>
    Sent: Thursday, July 15, 2021 4:30 AM

    As I understand it, computers were originally designed to do arithmetic computations and in the old days nearly 100% of a CPU's work involved arithmetic computations.

    [I don't think it was ever true except perhaps on the ENIAC.

    From ENIAC onward, computers were designed to perform arithmetic
    computations. The motivation was to reduce the
    amount of time it took to deliver results (compared to
    manual methods using mechanical adding machines). At the same
    time, it was expected that human errors would be reduced.

    (Even earlier, Charles Babbage, appalled by errors in tables
    produced by hand methods, designed machines to do
    the work.)

    Also, what do
    you mean by arithmetic? Are the additions and multiplications to do indexing and array addressing arithmetic?

    Of course.

    If you mean floating point, there wasn't
    any floating point hardware until the IBM 704 in 1954

    It is said that the Z3 (1941) was designed with floating-point.

    But even if that were not true, floating point was already in use
    in the 1940s, at least in the design of the ACE and the Pilot ACE.
    Even before a machine was built, those involved were designing
    and refining instruction tables (subroutines) for carrying out
    numerical work. In the 1950s, floating-point software
    (including block floating point) was extensively used on the Pilot ACE and,
    from 1955, on DEUCE.

    but there was plenty of computing before that. -John]
    [I don't think the Z3 was ever built other than as a much later
    retrocomputing project. Von Neumann apparently considered floating
    point for the EDVAC and IAS machine but rejected it as too complex
    and anyway doing the scaling in software was easy, which it
    probably was if you were Von Neumann. -John]

  • From Robin Vowels@21:1/5 to All on Fri Jul 16 14:47:25 2021
    ----- Original Message -----
    From: "gah4" <gah4@u.washington.edu>
    Sent: Thursday, July 15, 2021 7:31 PM

    On Wednesday, July 14, 2021 at 12:42:37 PM UTC-7, Roger L Costello wrote:
    Hello Compiler Experts!

    As I understand it, computers were originally designed to do arithmetic
    computations and in the old days nearly 100% of a CPU's work involved
    arithmetic computations.

    It seems that people might have believed that, even for a long time, but
    I suspect it was rarely true.
    .
    Certainly early computers were designed to perform numerical computations.
    From ENIAC, which was designed to compute range tables, to Pilot ACE, DEUCE,
    and most others, they were all designed to perform numerical work.
    They were intended to take over the work of human computers (those who used mechanical calculating machines), and were designed to reduce the time taken
    to carry out numerical computations.

    There are stories about the IBM 704 Fortran compiler,
    and the authors believed that they had to make optimal use of the hardware, or no one would use their compiler. At the time, the competition would have been assembly programmers, in some form or other.

    In the early days, there was no assembler. It was all machine code,
    laboriously crafted by programmers.

    Then when they were testing
    the compiler, they were surprised at the code generated doing things
    better than they thought of doing.

    Early computers were sold with minimal, if any, software.

    That's not true. For the DEUCE, over 1,000 programs and subroutines
    were published by The English Electric Company and distributed free
    to the owners of their computers. Many of these programs and subroutines
    were designed by the users (customers). The cards occupied several
    punch card cabinets holding about 50,000 cards, while the writeups
    (user manuals) occupied a couple of filing cabinets. Everything was in duplicate.

    For SILLIAC, a large handbook containing programs and subroutines
    was published.

    For the S/360 cited, IBM published volumes of scientific subroutines
    in both FORTRAN and PL/I.
    [On the other hand, the first commercial computers, LEO in the UK and
    Univac in the US, were used for business work which involved only
    modest amounts of arithmetic. IBM's 701 "defense calculator" was
    primarily for arithmetic but the 702 was character addressed for
    business use.
    The goal of S/360 was to unify IBM's product lines
    so there'd be only one set of hardware and operating system for
    commercial and scientific use. Memory limits soon forced multiple
    operating systems (BOS, DOS/TOS, OS) but I gather that on all of them
    the most used application was sort/merge. -John]

  • From gah4@21:1/5 to then our moderator on Thu Jul 15 23:49:55 2021
    On Thursday, July 15, 2021 at 10:22:58 AM UTC-7, gah4 wrote:

    (snip, I wrote)

    Close to zero. Remember, the CPU spends most of its time sitting there waiting for you to do something. Some systems have an actual "null job",
    accumulating the CPU time not used for anything else.

    (snip, then our moderator wrote)

    [I generally agree except to note that modern PCs and particularly phones display a lot of high quality images and video, both of which require extensive arithmetic to get from the internal representation to the bitmap on the screen. General purpose CPUs have extended instruction sets like
    Intel's SSE and AVX, and often there are GPUs on the same chip as the
    CPU, as in the Apple M1. I get the impression that compilers don't
    deal very well with these things, so vendors provide large libraries
    of assembler code to use them. -John]

    Yes, I wasn't so sure what was "olden days" and what is "new days".

    There has been a pretty much continuous change: as processors get faster,
    less efficient processing makes more sense.
    Among other things, less efficient, interpreted languages have become
    more popular.

    It is interesting, though. For much of the 1990s, faster and faster
    processors became available for compute-intensive applications like computational physics, but that was mostly driven by demand from other uses.

    Some of that demand came from people who bought faster processors because they
    could, and some from gaming. For the most part, processors haven't been
    built for compute-intensive use since about the 1990s.

    In the 1980s, there were some coprocessors to speed up compute-intensive problems, such as those from FPS (Floating Point Systems). But as desktop computers,
    and especially x86 machines, got faster, there was less need for them.

    And then came GPUs to speed up graphics, mostly for games, but compute-intensive users found that they could use them, too. Except that most are fast only in single precision.

    As for compilers: in Fortran 95, the FORALL statement was added, a non-loop parallel assignment statement supposedly well suited to vector processors, just when vector processors (like the ones made by Cray) were going away.

    FORALL requires (at least the effect of) the whole right-hand side being evaluated before any of the left-hand side is changed. So it isn't actually well suited to vector processors with vector registers, like the Cray-1.

    So now there is DO CONCURRENT, completely different from FORALL
    and hopefully better adapted to modern processors. But I don't know
    how well it does with SSE and the like.
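
    To make the difference concrete, a small standard-Fortran sketch (the array
    values are chosen here purely for illustration):

      program forall_vs_do_concurrent
        implicit none
        integer :: i
        real :: a(5) = [1., 2., 3., 4., 5.]
        real :: b(5)

        ! FORALL (now obsolescent): semantically, every right-hand side is
        ! evaluated with the *old* values of b before any element of b is
        ! updated, so this is a parallel assignment, not a loop.
        b = a
        forall (i = 2:5) b(i) = b(i-1) + b(i)   ! each b(i) uses the old b(i-1)

        ! DO CONCURRENT: the programmer asserts the iterations are independent,
        ! so the compiler may run them in any order (or in SIMD lanes).  A
        ! cross-iteration dependence like the FORALL above would be invalid
        ! here; this one is fine because each iteration touches only a(i).
        do concurrent (i = 1:5)
           a(i) = 2.0 * a(i)
        end do

        print *, b   ! 1. 3. 5. 7. 9.
        print *, a   ! 2. 4. 6. 8. 10.
      end program forall_vs_do_concurrent

    The full-array evaluation FORALL requires may need a temporary the size of
    the whole array, which is the mismatch with vector registers described
    above; DO CONCURRENT instead leaves the ordering (and any vectorization)
    up to the compiler.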

  • From Philipp Klaus Krause@21:1/5 to All on Fri Jul 16 18:31:05 2021
    any floating point hardware until the IBM 704 in 1954

    It is said that the Z3 (1941) was designed with floating-point.

    […]
    [I don't think the Z3 was ever built other than as a much later retrocomputing project.  Von Neumann apparently considered floating
    point for the EDVAC and IAS machine but rejected it as too complex
    and anyway doing the scaling in software was easy, which it
    probably was if you were Von Neumann. -John]

    AFAIK, the Z1 (built in 1939, working but, like many early computers, not
    very reliable; destroyed in 1944; working replica in a museum in
    Berlin), the Z3 (built in 1941, destroyed in 1943, working replica in a
    museum in Hünfeld), the Z4 (built in 1945, in use until 1959, original in a museum in Munich) and the Z5 (built in 1953, in use until 1958, current whereabouts unknown, probably scrapped) all had binary floating point.

    Philipp
    [I've seen the replica Z1, which is entirely mechanical. I didn't realize
    it used floating point. -John]

  • From George Neuner@21:1/5 to All on Fri Jul 16 16:22:34 2021
    On Thu, 15 Jul 2021 23:49:55 -0700 (PDT), gah4 <gah4@u.washington.edu>
    wrote:

    There has been a pretty much continuous change: as processors get faster,
    less efficient processing makes more sense.
    Among other things, less efficient, interpreted languages have become
    more popular.

    It is interesting, though. For much of the 1990s, faster and faster processors became available for compute-intensive applications like computational physics, but that was mostly driven by demand from other uses.

    Some of that demand came from people who bought faster processors because they
    could, and some from gaming. For the most part, processors haven't been
    built for compute-intensive use since about the 1990s.

    In the 1980s, there were some coprocessors to speed up compute-intensive problems, such as those from FPS (Floating Point Systems). But as desktop computers, and especially x86 machines, got faster, there was less need for them.

    And then came GPUs to speed up graphics, mostly for games, but compute-intensive users found that they could use them, too. Except that most are fast only in
    single precision.

    But processors /aren't/ getting faster (much) anymore - they're near
    the limits both of feature size reduction and of ability to dissipate
    heat.

    The wires and insulators now are just a few atoms thick, and since
    there are insulators /inside/ transistors, the transistors themselves
    can't get much smaller [they can change shape, which is how things are progressing currently].

    Modern CPUs live in a perpetual state of "rolling blackout" in which
    functional units are turned on and off, cycle by cycle, as needed.
    This is /NOT/ done for "green" minded energy conservation [that's just
    self serving PR by the manufacturers] - it's /necessary/ to prevent
    the chips from burning up.


    And GPUs are /very/ slow relative to CPUs. The only reason they seem
    to perform well is because the problems they target are embarrassingly parallel. Try solving a problem that requires lots of array reduction
    steps and you'll see your performance go straight into the toilet.
    [Yes, I know that there are tree methods for parallelizing reductions
    ... they are not always straightforward to implement, and they only
    work for /some/ reduction problems.]
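
    (For anyone who hasn't seen the pattern, here is a minimal, purely
    illustrative sketch of such a pairwise "tree" sum; the DO CONCURRENT inner
    loop marks the part a SIMD unit or GPU could do in parallel.  Each pass
    halves the number of partial sums, so the depth is about log2(n) instead
    of n-1 serial additions.)

      program tree_reduce
        implicit none
        integer, parameter :: n = 8
        real :: x(n) = [1., 2., 3., 4., 5., 6., 7., 8.]
        real, allocatable :: s(:)
        integer :: m, h, i

        s = x
        m = n
        do while (m > 1)
           h = m / 2
           ! each of these additions is independent of the others
           do concurrent (i = 1:h)
              s(i) = s(i) + s(i + h)
           end do
           if (mod(m, 2) == 1) then
              s(h + 1) = s(m)       ! carry an odd leftover element forward
              m = h + 1
           else
              m = h
           end if
        end do
        print *, 'sum =', s(1)      ! 36.0 for this x
      end program tree_reduce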


    I have worked with Connection Machines (CM-2), DSPs, FPGAs, and I have
    written a lot of SIMD code for image and array processing tasks. I am
    well aware of what is possible using various styles of parallel
    processing. There's a lot that can be done ... and a lot /more/ that
    can't: the vast majority of all computing problems do not have any
    known parallel solutions.

    It's true that there is a lot of instruction-level (micro-thread)
    parallelism available in most programs, but it is difficult to exploit
    with current hardware. This is a topic frequently discussed in
    comp.arch.

    YMMV,
    George

  • From Hans-Peter Diettrich@21:1/5 to Philipp Klaus Krause on Sat Jul 17 23:14:19 2021
    On 7/16/21 6:31 PM, Philipp Klaus Krause wrote:

    [I've seen the replica Z1, which is entirely mechanical. I didn't realize
    it used floating point. -John]

    Horst Zuse claims the Z1 had 22 bit floating point arithmetic. <http://www.horst-zuse.homepage.t-online.de/z1.html>

    DoDi

  • From Derek Jones@21:1/5 to All on Mon Jul 19 15:35:30 2021
    Roger,

    As I understand it, computers were originally designed to do arithmetic computations and in the old days nearly 100% of a CPU's work involved arithmetic computations.

    Knight did lots of work trying to compare computer performance,
    which required measuring instruction usage across the most common
    kinds of applications.

    Links to his papers and some data here: http://shape-of-code.coding-guidelines.com/2016/04/30/costperformance-analysis-of-1944-1967-computers-knights-data/

  • From minforth@arcor.de@21:1/5 to Roger L Costello on Tue Jul 27 14:07:35 2021
    Roger L Costello wrote on Wednesday, July 14, 2021 at 21:42:37 UTC+2:
    Hello Compiler Experts!

    As I understand it, computers were originally designed to do arithmetic computations and in the old days nearly 100% of a CPU's work involved arithmetic computations.

    I look at what I now do on a daily basis with computers and it is primarily text processing. My guess is that "text processing" at the machine level mostly means doing comparisons and moving things into and out of memory/registers; that is, not much in the way of arithmetic computations. Is that correct?

    These days what percentage of a CPU's work involves doing arithmetic computations versus other, non-arithmetic computations?

    /Roger
    [I don't think it was ever true except perhaps on the ENIAC. Also, what do you mean by arithmetic? Are the additions and multiplications to do indexing and array addressing arithmetic? If you mean floating point, there wasn't any floating point hardware until the IBM 704 in 1954, but there was plenty
    of computing before that. -John]

    Cryptocurrency mining does not involve much text processing. ;-) Neither does computational weather forecasting, nor medical image processing, etc.
    Define your application domain and you will get a different answer.

    From a historical perspective, a big driver for developing "computation machines" was military applications, specifically artillery computations.
    [Unless someone can return this thread to compilers, I think it would better fit
    in comp.arch and alt.folklore.computers, both of which regularly discuss old computer designs. -John]
