Hello Compiler Experts!
As I understand it, computers were originally designed to do arithmetic computations and in the old days nearly 100% of a CPU's work involved arithmetic computations.
I look at what I now do on a daily basis with computers and it is primarily text processing. My guess is that "text processing" at the machine level mostly means doing comparisons and moving things into and out of memory/registers; that is, not much in the way of arithmetic computations. Is that correct?
These days, what percentage of a CPU's work involves doing arithmetic computations versus other, non-arithmetic computations?

/Roger
On Wednesday, July 14, 2021 at 12:42:37 PM UTC-7, Roger L Costello wrote:
> I look at what I now do on a daily basis with computers and it is primarily
> text processing. My guess is that "text processing" at the machine level
> mostly means doing comparisons and moving things into and out of
> memory/registers; that is, not much in the way of arithmetic computations.
> Is that correct?
Good text processing is reasonably numerically intensive. TeX, for example, uses dynamic programming to find the optimal line-breaking points in a paragraph.
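To make the dynamic-programming point concrete, here is a minimal C sketch of least-squares line breaking, much simpler than TeX's actual Knuth-Plass algorithm (no hyphenation, stretchable glue, or demerits); the word lengths, line width, and cost function are illustrative assumptions. Note how much of the inner loop is plain integer arithmetic:

/* A minimal sketch of dynamic-programming line breaking, in the spirit
 * of (but much simpler than) the algorithm TeX uses.  Words, spaces,
 * and line width are measured in abstract units; the cost of a line is
 * the square of its leftover slack, and we minimize the total cost.
 */
#include <stdio.h>
#include <limits.h>

#define NWORDS 8
#define WIDTH  17

int main(void) {
    int len[NWORDS] = {4, 2, 5, 3, 8, 2, 6, 4};  /* word lengths */
    long best[NWORDS + 1];  /* best[i] = min cost of laying out words i.. */
    int  brk[NWORDS + 1];   /* brk[i]  = index after the line starting at i */

    best[NWORDS] = 0;
    for (int i = NWORDS - 1; i >= 0; i--) {
        best[i] = LONG_MAX;
        int used = -1;                   /* -1 cancels the first word's space */
        for (int j = i; j < NWORDS; j++) {
            used += len[j] + 1;          /* word plus one inter-word space */
            if (used > WIDTH) break;     /* line overfull, stop extending */
            long slack = WIDTH - used;
            long cost = (j == NWORDS - 1) ? 0 : slack * slack; /* last line free */
            if (best[j + 1] != LONG_MAX && cost + best[j + 1] < best[i]) {
                best[i] = cost + best[j + 1];
                brk[i] = j + 1;
            }
        }
    }
    /* print the chosen break points */
    for (int i = 0; i < NWORDS; i = brk[i])
        printf("line: words %d..%d\n", i, brk[i] - 1);
    return 0;
}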
> As I understand it, computers were originally designed to do arithmetic
> computations and in the old days nearly 100% of a CPU's work involved
> arithmetic computations.
[I don't think it was ever true except perhaps on the ENIAC. Also, what do
you mean by arithmetic? Are the additions and multiplications to do indexing
and array addressing arithmetic? If you mean floating point, there wasn't
any floating point hardware until the IBM 704 in 1954, but there was plenty
of computing before that. -John]
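To put John's question about indexing in concrete terms, here is a tiny C example (the function and data are illustrative): the loop does no arithmetic on the characters themselves, yet every s[i] implies an address addition, and the blank counter is an add as well:

/* Even "pure" text processing leans on address arithmetic.  Counting
 * blanks in a string does no arithmetic on the text itself, yet s[i]
 * compiles to roughly "load byte at (base of s) + i", an addition on
 * every iteration.
 */
#include <stdio.h>

static int count_blanks(const char *s) {
    int n = 0;
    for (int i = 0; s[i] != '\0'; i++)  /* index -> base+i address add */
        if (s[i] == ' ')                /* comparison, not arithmetic on data */
            n++;                        /* but the counter is an add, too */
    return n;
}

int main(void) {
    printf("%d\n", count_blanks("how much arithmetic is hiding here?"));
    return 0;
}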
On Wednesday, July 14, 2021 at 12:42:37 PM UTC-7, Roger L Costello wrote:
> Hello Compiler Experts!
>
> As I understand it, computers were originally designed to do arithmetic
> computations and in the old days nearly 100% of a CPU's work involved
> arithmetic computations.
It seems that people might have believed that, even for a long time, but
I suspect it was rarely true.

There are stories about the IBM 704 Fortran compiler: the authors believed
they had to make optimal use of the hardware, or no one would use their
compiler. At the time, the competition was assembly programmers, in some
form or other. Then, when they were testing the compiler, they were
surprised to find the generated code doing things better than they had
thought of doing themselves.
Early computers were sold with minimal, if any, software.
Close to zero. Remember, most of the time the CPU is sitting there waiting
for you to do something. Some systems have an actual "null job" that
accumulates the CPU time not used for anything else.
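On Linux, that claim is easy to check for yourself: the kernel publishes aggregate per-state tick counters in /proc/stat (field layout as documented in proc(5)). A minimal Linux-only sketch that prints the fraction of ticks spent idle since boot:

/* Read the first line of /proc/stat, which looks like
 * "cpu user nice system idle iowait irq softirq steal guest guest_nice",
 * and report the idle fraction since boot.
 */
#include <stdio.h>

int main(void) {
    FILE *f = fopen("/proc/stat", "r");
    if (!f) { perror("/proc/stat"); return 1; }

    unsigned long long t[10] = {0};
    if (fscanf(f, "cpu %llu %llu %llu %llu %llu %llu %llu %llu %llu %llu",
               &t[0], &t[1], &t[2], &t[3], &t[4],
               &t[5], &t[6], &t[7], &t[8], &t[9]) < 4) {
        fclose(f);
        return 1;
    }
    fclose(f);

    unsigned long long total = 0;
    for (int i = 0; i < 10; i++) total += t[i];
    printf("idle since boot: %.1f%%\n", 100.0 * t[3] / total);
    return 0;
}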
[I generally agree except to note that modern PCs and particularly phones display a lot of high quality images and video, both of which require extensive arithmetic to get from the internal representation to the bitmap on the screen. General purpose CPUs have extended instruction sets like
Intel's SSE and AVX, and often there are GPUs on the same chip as the
CPU, as in the Apple M1. I get the impression that compilers don't
deal very well with these things, so vendors provide large libraries
of assembler code to use them. -John]
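For a sense of what those extended instruction sets look like from C, here is a minimal SSE sketch using compiler intrinsics, which is roughly the level at which those vendor libraries operate; the data is illustrative. A single _mm_add_ps performs four float additions at once:

/* One SSE instruction adds four floats in parallel. */
#include <stdio.h>
#include <xmmintrin.h>   /* SSE: __m128, _mm_loadu_ps, _mm_add_ps, ... */

int main(void) {
    float a[4] = {1.0f, 2.0f, 3.0f, 4.0f};
    float b[4] = {10.0f, 20.0f, 30.0f, 40.0f};
    float c[4];

    __m128 va = _mm_loadu_ps(a);           /* load 4 unaligned floats */
    __m128 vb = _mm_loadu_ps(b);
    _mm_storeu_ps(c, _mm_add_ps(va, vb));  /* 4 additions in one instruction */

    printf("%g %g %g %g\n", c[0], c[1], c[2], c[3]);
    return 0;
}

On x86-64, SSE is available by default; older 32-bit targets may need a compiler flag such as -msse.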
> any floating point hardware until the IBM 704 in 1954
It is said that the Z3 (1941) was designed with floating-point arithmetic.
[…]
[I don't think the Z3 was ever built other than as a much later retrocomputing project. Von Neumann apparently considered floating
point for the EDVAC and IAS machine but rejected it as too complex
and anyway doing the scaling in software was easy, which it
probably was if you were Von Neumann. -John]
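For readers who have not met it, "doing the scaling in software" generally means fixed-point arithmetic: the programmer tracks an implicit binary point and rescales by hand after each multiply. A hedged C sketch, with the Q16.16 format chosen purely for illustration (not what EDVAC-era code actually looked like):

/* Fixed point: keep an implicit binary point and rescale by hand
 * after each multiply.  Q16.16 = 16 integer bits, 16 fraction bits.
 */
#include <stdio.h>
#include <stdint.h>

typedef int32_t q16;
#define Q_ONE (1 << 16)

static q16 q_mul(q16 a, q16 b) {
    /* widen, multiply, then shift the binary point back into place */
    return (q16)(((int64_t)a * b) >> 16);
}

int main(void) {
    q16 x = (q16)(1.5  * Q_ONE);     /* 1.5 in Q16.16 */
    q16 y = (q16)(2.25 * Q_ONE);     /* 2.25 */
    q16 z = q_mul(x, y);             /* should be 3.375 */
    printf("%f\n", z / (double)Q_ONE);
    return 0;
}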
There has been pretty much continuous change: as processors get faster,
less efficient processing makes more sense. Among other things, less
efficient, interpreted languages have become more popular.
It is interesting, though. For much of the 1990's, faster and faster
processors became available for compute intensive applications like
computational physics, but mostly driven by demand from other uses.
Some of that was people who bought faster processors because they
could, and some by gaming. For the most part, processors haven't been
built for compute intensive use since about the 1990's.
In the 1980's, there were some coprocessors to speed up compute intensive
problems, such as those from FPS (Floating Point Systems). But as desktop
computers, and especially x86 machines, got faster there was less need
for them.
And then came GPUs to speed up graphics, mostly for games, but then compute
intensive users found that they could use them, too. Except that most
support only single precision.
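To see why single precision is a real limitation for long-running numeric work, here is a small self-contained C demonstration of accumulation drift; the loop count is arbitrary and the example is not tied to any particular GPU:

/* A float carries only about 7 decimal digits, so long accumulations
 * drift.  Summing 0.1 ten million times should give 1,000,000.
 */
#include <stdio.h>

int main(void) {
    float  fsum = 0.0f;
    double dsum = 0.0;
    for (int i = 0; i < 10000000; i++) {
        fsum += 0.1f;
        dsum += 0.1;
    }
    printf("float:  %.1f\n", fsum);   /* visibly off from 1000000.0 */
    printf("double: %.1f\n", dsum);   /* correct to the digits shown */
    return 0;
}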
[I've seen the replica Z1, which is entirely mechanical. I didn't realize
it used floating point. -John]
> As I understand it, computers were originally designed to do arithmetic
> computations and in the old days nearly 100% of a CPU's work involved
> arithmetic computations.

Knight did lots of work trying to compare computer performance, […]