I've been pursuing the idea of what I call algorithm optimization.
It's the idea that algorithms coded by individuals may not be optimal,
and may require refactoring / re-engineering to be made optimal based on what's trying to be achieved.
On Sun, 13 Sep 2020, Rick C. Hodgin wrote:
> I've been pursuing the idea of what I call algorithm optimization.
> It's the idea that algorithms coded by individuals may not be optimal,
> and may require refactoring / re-engineering to be made optimal based on
> what's trying to be achieved.
I agree with John: generally, it should be the programmer's job to choose
the right algorithm. Otherwise, the compiler just becomes an algorithm library, and--well, if you're going to build an algorithm library, why not just expose it directly to your language's users?
The main problem is that algorithms are /heavily/ dependent on data
structures. So if you want to change an algorithm significantly, you'll
need to change its data structures, and usually the compiler can't tell
whether external code is going to look at those data structures.
In the case of your example, though it's hard to tell, I expect that the optimal structure would be a freelist, but because 'first' is a global variable, modifications to it have to be the same with and without optimizations. (You might be able to get around this by making 'first' a static variable local to iAlloc_new_sdata; but this approach doesn't
scale, and it's already putting pressure on the programmer's algorithms, which is what you were trying to avoid.)
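To make the freelist idea concrete, here is a minimal sketch (Python for brevity; `Node`, `Pool`, and their fields are invented for illustration, not taken from Rick's code): freed nodes are chained through their next pointers and handed back out by later allocations, which is exactly the kind of data-structure change a compiler would have to prove invisible to external code before substituting it.

```python
class Node:
    """A list node like the one iAlloc_new_sdata might allocate."""
    def __init__(self):
        self.x = None
        self.next = None

class Pool:
    """A minimal freelist: freed nodes are kept on a chain and reused,
    instead of going back to the general-purpose allocator each time."""
    def __init__(self):
        self._free = None            # head of the freelist chain

    def alloc(self, x):
        node = self._free
        if node is None:
            node = Node()            # freelist empty: really allocate
        else:
            self._free = node.next   # pop a recycled node off the chain
        node.x, node.next = x, None
        return node

    def free(self, node):
        node.next = self._free       # push onto the freelist
        self._free = node
```

Allocating, freeing, and allocating again hands back the very same object, so the allocator's traffic never reaches the heap after warm-up.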
A secondary problem is compile time: recognizing algorithms is likely to
be very expensive, which is not a great user experience. (Though llvm/gcc
do recognize some things, like the summation loop for triangle numbers.)
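For concreteness, the triangle-number case looks like this (a Python sketch; the function names are mine): llvm and gcc can spot the summation loop and emit the closed form n(n+1)/2 instead, turning O(n) work into O(1).

```python
def triangle_loop(n):
    # the naive summation loop a programmer might write
    total = 0
    for i in range(1, n + 1):
        total += i
    return total

def triangle_closed(n):
    # the closed form an optimizer can substitute: n(n+1)/2
    return n * (n + 1) // 2
```

This substitution is safe because the loop has no side effects and the closed form is provably equal for every n, which is precisely what makes it one of the few algorithm rewrites compilers actually do.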
> I've been pursuing the idea of what I call algorithm optimization. It's
> the idea that algorithms coded by individuals may not be optimal, and
> may require refactoring / re-engineering to be made optimal based on
> what's trying to be achieved.
Are these all standard optimization techniques that already exist, or is
this something else I'm pursuing, with the big push to have optimization
take place at the BAlive level to revamp algorithms based on fundamental
data types and data-flow analyses of the intent of the algorithms?
Note: All of this is my original thinking. I have not read books or
articles or papers from others on how to do things. I look at the code
and I think. I used to discuss these ideas with Walter Banks, but he
passed away in late 2019.
> I've been pursuing the idea of what I call algorithm optimization. It's
> the idea that algorithms coded by individuals may not be optimal, and
> may require refactoring / re-engineering to be made optimal based on
> what's trying to be achieved.
Compilers have done to death figuring out how best to optimize what
the developer wrote. The future is optimizing what the developer
intended to write.
A while back I had the idea of trying to figure out what floating-point calculation was being attempted, e.g., using a Taylor series when a Chebyshev series would be more efficient.
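As a sketch of why that substitution pays off (Python; the helper names are mine, and the choice of exp, degree 4, and the interval [-1, 1] is arbitrary): a Chebyshev interpolant of the same degree as a Taylor polynomial spreads its error across the whole interval instead of concentrating accuracy near the expansion point, so its worst-case error is markedly smaller.

```python
import math

def taylor_exp(x, deg):
    # degree-`deg` Taylor polynomial of exp about 0: sum of x^k / k!
    return sum(x ** k / math.factorial(k) for k in range(deg + 1))

def cheb_coeffs(f, deg):
    # coefficients of the degree-`deg` Chebyshev interpolant of f on
    # [-1, 1], sampled at the deg+1 Chebyshev nodes
    n = deg + 1
    nodes = [math.cos(math.pi * (j + 0.5) / n) for j in range(n)]
    fv = [f(t) for t in nodes]
    c = [2.0 / n * sum(fv[j] * math.cos(math.pi * k * (j + 0.5) / n)
                       for j in range(n))
         for k in range(n)]
    c[0] /= 2.0
    return c

def cheb_eval(c, x):
    # Clenshaw recurrence for sum_k c[k] * T_k(x)
    b1 = b2 = 0.0
    for ck in c[:0:-1]:            # c[deg] down to c[1]
        b1, b2 = ck + 2.0 * x * b1 - b2, b1
    return c[0] + x * b1 - b2
```

Comparing worst-case errors on a fine grid over [-1, 1], the degree-4 Taylor polynomial for exp is off by about 1e-2 at the endpoint, while the degree-4 Chebyshev interpolant stays roughly an order of magnitude closer everywhere.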
> I've been pursuing the idea of what I call algorithm optimization. It's
> the idea that algorithms coded by individuals may not be optimal, and
> may require refactoring / re-engineering to be made optimal based on
> what's trying to be achieved.
>
> Rick C. Hodgin
[I think the usual way to do this is to provide a way to express higher level algorithms in your programming language so the compiler doesn't have to try to reverse engineer them. -John]
On Monday, September 14, 2020 at 9:53:57 PM UTC-5, Rick C. Hodgin wrote:
> I've been pursuing the idea of what I call algorithm optimization. It's
> the idea that algorithms coded by individuals may not be optimal, and
> may require refactoring / re-engineering to be made optimal based on
> what's trying to be achieved.
>
> [Example elided for space.]
>
> Rick C. Hodgin
>
> [I think the usual way to do this is to provide a way to express higher level
> algorithms in your programming language so the compiler doesn't have to try
> to reverse engineer them. -John]
I agree that this should usually be the programmer's domain. However
there has been some work done in this area. A book I remember is:
Metzger, Robert. _Automatic Algorithm Recognition and Replacement: A New Approach to Program Optimization_
This approaches the issue more from an "I want to replace serial
algorithms with parallel algorithms" angle, if I recall correctly, so it
may not be exactly what you are looking for.
> This approaches the issue more from an "I want to replace serial
> algorithms with parallel algorithms" angle, if I recall correctly, so it
> may not be exactly what you are looking for.
Now, what if someone is doing their CS project for class, where they are
supposed to write, and time, bubblesort?
I suppose you can find a Chebyshev series that closely approximates
the series coded, but takes fewer terms.
But what about the person who wants to compare two series?
If you replace one or both, then the comparison will be wrong.
Not that there haven't been problems since the beginning of
optimizing compilers, where the results were different than
expected.
[Back when people cared about Whetstone and Dhrystone benchmarks,
compilers recognized code sequences from those benchmarks for, uh,
special processing. But it doesn't generalize very well. -John]
On Wednesday, September 16, 2020 at 8:14:44 AM UTC-7,
mwmar...@gmail.com wrote:
(snip)
> This approaches the issue more from an "I want to replace serial
> algorithms with parallel algorithms" angle, if I recall correctly, so it
> may not be exactly what you are looking for.
That might make more sense. So, an algorithm that is mathematically
equivalent, but not necessarily numerically equivalent.
One thought was that someone codes bubblesort, and the compiler
generates quicksort. Small complication that bubblesort is stable, and quicksort isn't. (Add an array with the original position to break
ties.)
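That tiebreak can be sketched as follows (Python; `quicksort` here is a toy stand-in for whatever unstable sort the compiler might substitute, and the function names are mine): tagging each element with its original index makes every sort key unique, so even an unstable sort reproduces the stable order bubblesort would have given.

```python
def quicksort(a):
    # toy quicksort; no stability guarantee is relied on
    if len(a) <= 1:
        return list(a)
    pivot = a[len(a) // 2]
    return (quicksort([x for x in a if x < pivot])
            + [x for x in a if x == pivot]
            + quicksort([x for x in a if x > pivot]))

def stable_sort(items, key):
    # decorate each element with its original position so that ties
    # break by position; since (key, index) pairs are unique, the
    # payload itself is never compared
    decorated = [(key(v), i, v) for i, v in enumerate(items)]
    return [v for _, _, v in quicksort(decorated)]
```

The decoration costs one extra integer per element, which is exactly the "array with the original position" mentioned above.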
On 16.09.2020 at 07:25, gah4 wrote:
> One thought was that someone codes bubblesort, and the compiler
> generates quicksort. Small complication that bubblesort is stable, and
> quicksort isn't. (Add an array with the original position to break ties.)
Right, algorithm or control-flow optimization belongs in an earlier
project stage, not in compilation. It also smells like the dream of
automated "proof of correctness", whose basics I learned 50 years ago
but from which I have never yet seen usable results. How shall a tool
suggest a different algorithm without knowing (having determined, and
how?) the goals of a piece of code?
> I've been pursuing the idea of what I call algorithm optimization. It's
> the idea that algorithms coded by individuals may not be optimal, and
> may require refactoring / re-engineering to be made optimal based on
> what's trying to be achieved.
In the above example, a linked list structure is allocated and some data
is stored into it. In this example it is a single x variable, but in a
real-world case there may be many variables.
> [I think the usual way to do this is to provide a way to express higher level
> algorithms in your programming language so the compiler doesn't have to try
> to reverse engineer them. -John]
What's the best language to express algorithms in?
Or, how many languages claim that already...
I think I remember this being discussed many years ago.
One thought was that someone codes bubblesort, and the compiler
generates quicksort. Small complication that bubblesort is stable, and quicksort isn't. (Add an array with the original position to break
ties.)