On 6/6/2022 2:10 PM, Douglas Eagleson wrote:
I am working on a project. How does a synchronous system get stated in my version of AI, old Greek theory? Such systems have two or more components with a common state. Synchrony can be caused randomly/coincidentally or by a variable/human action. These point to an inferable AI relation.
Two clocks can be made synchronous by a simple relative time. Actions at known times can make this relative time solvable, making synchrony an abstract relation.

On 6/6/2022 3:11 PM, Jeff Barnett wrote:
Warning: my spelling of names is probably not accurate.
In the mid 1960s, Alan Perlis, then at CMU, was dissertation advisor to two PhD students working in the foundations of Computer Science vis-à-vis programming language semantics.
One student was Tim Standish, who wrote about data structure definition primitives. One could use the proposed set of primitives to explain data structure definition in your favorite languages. In other words, his primitives could be used as a macro language to define the intent of data declarations. This dissertation was noted by a big chunk of the CS community, which was, at the time, trying to develop better tools for inventing and implementing new languages. Last I knew, Tim was at the University of California at Irvine.
The other student was Bob(?) Fisher(?), and he did something that on the surface sounded similar to Tim's work. The difference was that he wanted primitives to define the meaning of /control/ structures. Not only did he handle the usual (sequence, parallel, conditional, etc.), he also dealt with sexier things such as atomic-with-respect-to, wait-for-condition (join), indivisible-with-respect-to, priorities (e.g., to model interrupts), and more. I think Bob(?) was at DARPA soon after school and then disappeared into the woodwork.
I don't know how you might get a copy of Bob's dissertation but, if you could, a whole panorama of interesting possibilities might become apparent to you and your endeavor.
I'm sorry that I can't be more specific with references and citations, but my encounters with the individuals mentioned happened 50+ years ago.
--
Jeff Barnett

On Thursday, June 9, 2022 at 1:21:36 AM UTC+8, Jeff Barnett wrote:
I did a little poking around and found a correct name: "Dave A. Fisher". His dissertation is also available online. Google "Fisher, Control Structures" and the first hit is a PDF at the pseudo URL "https://citeseers.ist.psu.edu>viewdoc>download". Just click on this item in the Google output and whatever your setup does for PDF will happen.
--
Jeff Barnett

Douglas Eagleson wrote:
I downloaded Fisher's dissertation. I need a while to read it.
On 6/9/2022 3:32 PM, Douglas Eagleson wrote:
I did a google search and found some later work of Fisher.
Basically, my first look is to understand the general/abstract control structure, generalizing the meaning of its primitives. Is there a control structure definable using object theory?

On Friday, June 10, 2022 at 6:45:59 AM UTC+8, Jeff Barnett wrote:
A short answer to your question is probably no, but maybe. The issue is that control cliches define behavior, not "static" relations among data. The "maybe" comes from treating local nests of related behaviors as abstract objects and then defining relations among these sorts of objects.
It's been a long time since I read it, so I can't rely on my memory for any real details. What I do remember is that it was a thrill to see a thesis take on such a difficult, abstract problem and get some of it right (IMO). There was nothing like it in the literature, so it was a first hack at nailing down one of the most important aspects of computational systems and the whole notion of a computation.
Unfortunately, this work was not followed by a second tier of research.
--
Jeff Barnett
On 6/9/2022 8:19 PM, Douglas Eagleson wrote:
Well, I read the first 50 pages and it turned into Einstein-level logic. I believe he went to help the DOD with the foundations of the Ada language.

On Friday, June 10, 2022 at 10:48:49 AM UTC+8, Jeff Barnett wrote:
That sounds right. I bumped into him once after he was done at CMU. We talked for a while - he was amazed that anyone had read his dissertation. I can't recall where this happened, but he mentioned having been at DARPA, and I assumed that he was there in the Information Technology Office as a Program Manager (PM). In the 1960s and most of the 70s, most of that office's PMs were recent PhD graduates. Later on, PMs were either military or civilians who were comfortable in suits and ties. Big change.
As I said above, I was disappointed that nobody picked up and continued his line of research. If taken to the next step, it would have an impact on hardware design and on compiling programs with tons of parallelism, and it would make it possible to better reason about covert channels when trying to determine security properties of systems.
Good luck with your endeavor.
--
Jeff Barnett
On 7/3/2022 10:29 PM, Douglas Eagleson wrote:
Hello.
He describes a control structure as defining an interpreter for an interpreter. This creator interpreter does not need to compile itself, as per Turing's advice; it can be implemented in a language such as C using subroutines, functions, and other C structures.
He goes into great detail designing a syntax for his language, Sol.
I did have difficulty telling which "interpreter" he was writing about.
He introduced the operation of a process "monitor": basically, a list of objects held in a main process. He stated quite nicely the idea of only exercising a list item when an input variable has a state change.
Maybe a poor man's object monitor can implement an object process?
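That poor man's monitor can be sketched in a few lines of Python (names and structure are my own illustration, not Fisher's mechanism): a main process holds a list of (watched variable, reaction) items and exercises an item only when its input variable actually changes state.

```python
# Poor man's process "monitor": reactions fire only on a state change,
# not on every write. All names here are illustrative.
class Monitor:
    def __init__(self):
        self.state = {}      # current value of each watched variable
        self.watchers = []   # list of (key, reaction) items
        self.log = []

    def watch(self, key, reaction):
        self.watchers.append((key, reaction))

    def write(self, key, value):
        old = self.state.get(key)
        self.state[key] = value
        if value != old:                      # exercise items only on change
            for k, reaction in self.watchers:
                if k == key:
                    reaction(old, value)

m = Monitor()
m.watch("x", lambda old, new: m.log.append(f"x: {old}->{new}"))
m.write("x", 1)   # change: reaction fires
m.write("x", 1)   # no change: reaction skipped
m.write("x", 2)   # change: reaction fires
print(m.log)      # ['x: None->1', 'x: 1->2']
```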
I am still looking at the synchrony issue. Basically, I need to make a black box that checks for this state. The issue of clock error arises, so one class of input must be a degree of accuracy, maybe expressed as a percentage.
I got lost in the realm of math. Does a function simply define a synchronous path? I need some advice.
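One way to make that black box concrete (a sketch under my own assumptions, not anything from the thesis): call two clocks synchronous when, after removing a fixed relative offset, every pair of readings agrees to within a tolerance stated as a percentage of the sampling interval.

```python
# Hypothetical synchrony "black box": two timestamp streams are declared
# synchronous if their offset stays constant to within tolerance_pct
# percent of the sampling interval (absorbing clock error as requested).
def synchronous(times_a, times_b, interval, tolerance_pct):
    if len(times_a) != len(times_b):
        return False
    offset = times_b[0] - times_a[0]          # the "simple relative time"
    allowed = interval * tolerance_pct / 100.0
    return all(abs((b - a) - offset) <= allowed
               for a, b in zip(times_a, times_b))

# Clock B runs 5.0 units ahead of clock A, with small jitter.
a = [0.0, 10.0, 20.0, 30.0]
b = [5.0, 15.02, 24.97, 35.01]
print(synchronous(a, b, interval=10.0, tolerance_pct=1.0))   # True
print(synchronous(a, b, interval=10.0, tolerance_pct=0.1))   # False
```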
I do believe two numbers always existing together can be called synchronous.

I'll start with the last questions first: you ask what he meant by a function, and that is not so easy to answer. Mathematically, a function is an entity that maps some values to other values, where the same input values are always mapped to the same output values. In the world of software we don't mean that at all. Take a similar question: what's a structure? The answer is that it's the thing defined by your language's primitive with a name like DEFSTRUCT. Similarly, a function, subroutine, etc., is the thing defined by the primitive your language provides to define such things. Does a function define a synchronous path? It depends on the language in which it is defined.
I don't know what it means for two numbers to always exist together, so I couldn't determine whether they are synchronous.
It's been (quite) a while since I read this thesis, but I might be able to add some to what you have got out of it so far. The thing said about multiple interpreters was the following: in order to interpret a control structure, the interpreter must "do" the control structure. Let's take an example: PARALLEL(x, y, z), where x, y, and z are program pieces. In order to really get the effect of parallel execution, the interpreter must, in general, start Ix, Iy, and Iz: three interpreter routines, one to interpret x, one to interpret y, and one to interpret z, and they must execute in parallel. The same thing needs to happen when each of the other control primitives is encountered. (There are, of course, optimizations, such as subsuming a sequential element that appears inside another sequential.) One may think that you could simulate PARALLEL by some sort of interleaving on sequential hardware, but when you mix in other control relations the interpreter can't be faithful to the implied semantics.
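The flavor of "the interpreter must do the control structure" can be sketched in Python (my notation, not Fisher's): on a PARALLEL node the interpreter really does launch one interpreter instance per piece, the Ix, Iy, Iz above, rather than merely stepping through them itself.

```python
import threading

# Tiny interpreter sketch: a program piece is either a leaf action (a
# callable) or a tuple ("SEQ", ...) / ("PARALLEL", ...) of sub-pieces.
# On PARALLEL the interpreter starts one interpreter thread per piece
# and the construct ends only when every piece has ended.
def interpret(piece):
    if callable(piece):
        piece()
    elif piece[0] == "SEQ":
        for sub in piece[1:]:
            interpret(sub)
    elif piece[0] == "PARALLEL":
        threads = [threading.Thread(target=interpret, args=(sub,))
                   for sub in piece[1:]]
        for t in threads:
            t.start()
        for t in threads:
            t.join()

out = []
prog = ("SEQ",
        lambda: out.append("a"),
        ("PARALLEL", lambda: out.append("x"),
                     lambda: out.append("y"),
                     lambda: out.append("z")),
        lambda: out.append("b"))
interpret(prog)
print(out[0], sorted(out[1:4]), out[4])   # a ['x', 'y', 'z'] b
```

The middle three entries may land in any order, which is exactly the point: the interpreter commits to parallel semantics rather than one fixed interleaving.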
The synchronization issue is that, for example, a monitor must instantaneously spot that its condition has been satisfied so that a declared reaction will occur. This is a hell of a burden on any interpretation scheme. Let's look at an example: let the variable X be a sixteen-bit integer; let the variable H be the high-order 8 bits of X and L be the low-order 8 bits of X. Assume that there is a monitor on the value of X; then that monitor must actively take a peek when either H or L is modified. Similarly, if there is a monitor on either H or L, it must take a peek any time X is modified. This example may seem quite artificial, but it isn't. Consider the interrupt structure of your favorite computer: bits are flipped in registers and interpreted as signals to and by the OS. Describing and simulating such capabilities as they actually work is quite difficult.
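The aliasing burden can be made concrete with a small sketch (invented names): H and L are views onto the same sixteen bits as X, so a write through any one name must poke the monitors on every overlapping name.

```python
# X is a 16-bit integer; H and L are its high and low 8 bits. A monitor
# placed on any of the three names must be notified when an overlapping
# name is written, because all three share storage.
class Register16:
    def __init__(self):
        self.x = 0
        self.monitors = {"X": [], "H": [], "L": []}

    def _notify(self, names):
        for n in names:
            for m in self.monitors[n]:
                m(self)

    def write_x(self, v):
        self.x = v & 0xFFFF
        self._notify(["X", "H", "L"])   # X overlaps both halves

    def write_h(self, v):
        self.x = ((v & 0xFF) << 8) | (self.x & 0xFF)
        self._notify(["H", "X"])        # H overlaps X but not L

    def write_l(self, v):
        self.x = (self.x & 0xFF00) | (v & 0xFF)
        self._notify(["L", "X"])

r = Register16()
hits = []
r.monitors["X"].append(lambda reg: hits.append(("X", reg.x)))
r.write_l(0x34)   # monitor on X fires even though only L was written
r.write_h(0x12)
print(hits)       # [('X', 52), ('X', 4660)]
```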
For a moment, set aside the issue of interpreting programs written in the control structure language and consider using the language to write a detailed spec for a modern CPU with multiple cores and multiple threads per core. You want to specify what range of behaviors is allowed. If you think about this for a while, I believe you will appreciate why the dissertation seems so convoluted. It's too bad that a second dissertation on the same topic did not follow and clarify all of these issues.
At one point in the 1970s, I wanted to abstract the control flow and data flow within a speech understanding system, so I invented a language called CSL (Control Structure Language) in which modules did not know about each other. Data communication was over a set of software buses (think pipes) and common data stores. CSL provided the primitives to move data from module to module and to enforce sequential execution among threads, an "I don't care what order they run in" mode (pseudo-parallel), and condition monitors. There were some tokens pushed around to simulate control signals, etc., something like Petri nets. The point of this exercise was to put together a problem solver that did not commit to order-of-computation constraints when there was no reason to do so. As we learned more, we could modify the CSL to exhibit more directed behavior.
By the way, the pseudo-parallel directive dynamically assigned random numbers to parallel threads as priorities, so that running the system on the same data multiple times could exhibit multiple behaviors and generate different answers.
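The random-priority trick can be sketched as follows (my reconstruction, not CSL itself): each run draws fresh priorities for the "don't care" threads, so the execution order, and potentially the final answer, can legitimately differ from run to run on identical input.

```python
import random

# Pseudo-parallel scheduling sketch: tasks whose order is declared
# "don't care" get random priorities drawn anew on every run, so the
# same input can produce different execution orders.
def run_pseudo_parallel(tasks, seed=None):
    rng = random.Random(seed)
    prioritized = [(rng.random(), name, fn) for name, fn in tasks]
    order = []
    for _, name, fn in sorted(prioritized):   # run in priority order
        fn()
        order.append(name)
    return order

tasks = [("t1", lambda: None), ("t2", lambda: None), ("t3", lambda: None)]
print(run_pseudo_parallel(tasks, seed=1))
print(run_pseudo_parallel(tasks, seed=2))   # possibly a different order
```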
I don't necessarily recommend reading another obscure paper (on CSL) but
if you are interested, a pdf copy is at
https://notatt.com/large-systems.pdf
--
Jeff Barnett
On 7/3/2022 10:29 PM, Douglas Eagleson wrote:
I am still writing a reply.