• Re: Why Bloat Is Still Software’s Biggest Vulnerability

    From John Larkin@21:1/5 to bloggs.fredbloggs.fred@gmail.com on Sat Feb 10 09:06:03 2024
    On Sat, 10 Feb 2024 08:10:13 -0800 (PST), Fred Bloggs <bloggs.fredbloggs.fred@gmail.com> wrote:

    Another failure of 'let the market decide.'

    https://spectrum.ieee.org/lean-software-development

    Complexity is a game that some people enjoy.

    And some people like simplicity. Their stuff works better.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Jan Panteltje@21:1/5 to jl@997PotHill.com on Sun Feb 11 06:43:31 2024
    On a sunny day (Sat, 10 Feb 2024 09:06:03 -0800) it happened John Larkin <jl@997PotHill.com> wrote in <g4bfsidsbmg316togaaff19e63vv1pnqbo@4ax.com>:

    On Sat, 10 Feb 2024 08:10:13 -0800 (PST), Fred Bloggs <bloggs.fredbloggs.fred@gmail.com> wrote:

    Another failure of 'let the market decide.'

    https://spectrum.ieee.org/lean-software-development

    Complexity is a game that some people enjoy.

    And some people like simplicity. Their stuff works better.

    It is cool coding in asm without using external libraries.
    I can do anything I like in KILOBYTES:
    https://panteltje.nl/panteltje/pic/scope_pic/index.html
    nice to do a Fourier transform in a few bytes... sine lookup table.
    It has a Usenet-compatible output; use a fixed-size font:
    https://panteltje.nl/panteltje/pic/scope_pic/screen_dump2.txt

    Most web things I have coded in a few lines of C,
    started on a browser too, but that is a moving target.. takes too much time.
    Also wrote this Newsreader I am posting this with, it runs on a Raspberry Pi4:
    raspberrypi: ~ # whereis NewsFleX
    NewsFleX: /usr/local/bin/NewsFleX
    raspberrypi: ~ # lb /usr/local/bin/NewsFleX
    -rwxr-xr-x 1 root root 383796 Mar 13 2023 /usr/local/bin/NewsFleX*

    'lb' is short for ls -rtl --color=none
    383,796 bytes
    So < 400 kB
    Linked in is libforms for the GUI.
    Old version for x86 here:
    https://panteltje.nl/panteltje/newsflex/index.html
    libforms however changed, so unless you use a very old version of that it won't work.

    I have dropped that xforms lib too and still have a GUI...
    https://panteltje.nl/pub/boats_and_planes.gif
    runs 24/7
    -rwxr-xr-x 1 root root 329604 Feb 7 2021 xgpspc
    329,604 bytes
    monitors planes and boat traffic, does navigation, auto-pilot, what not.
    latest version even has a fire solution.. for defence of course
    Only uses these libs, from the Makefile:
    $(COMPILER) -o xgpspc $(XGPSPC) -lm -lpthread -lXaw -ljpeg
    libmath, libjpeg and libXaw (for the display).

    Simplicity, or simple city or whatever it was
    of course gcc as compiler.
    Or gpasm for the PIC asm code.

    I think the ever more bloat comes from trying to sell ever more,
    a capitalist trick to suck you for money.
    More bloat causes a need for ever more powerful hardware.
    So bloat writers get shares in hardware manufacturers and get rich.
    Microsore or whatever is a big example.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Cursitor Doom@21:1/5 to All on Sun Feb 11 09:44:22 2024
    On Sun, 11 Feb 2024 06:43:31 GMT, Jan Panteltje <alien@comet.invalid>
    wrote:

    <snip>

    That's all very impressive, Jan, but if you were *truly* a hardcore
    programmer, you'd be using machine code. ;-)
    More seriously, bloat enables coders to hide back doors much more
    effectively. They'd never get away with that kind of subterfuge with
    ASM.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Bill Sloman@21:1/5 to Cursitor Doom on Sun Feb 11 21:50:53 2024
    On 11/02/2024 8:44 pm, Cursitor Doom wrote:
    On Sun, 11 Feb 2024 06:43:31 GMT, Jan Panteltje <alien@comet.invalid> wrote:
    On a sunny day (Sat, 10 Feb 2024 09:06:03 -0800) it happened John Larkin <jl@997PotHill.com> wrote in <g4bfsidsbmg316togaaff19e63vv1pnqbo@4ax.com>:
    On Sat, 10 Feb 2024 08:10:13 -0800 (PST), Fred Bloggs <bloggs.fredbloggs.fred@gmail.com> wrote:

    <snip>

    That's all very impressive, Jan, but if you were *truly* a hardcore programmer, you'd be using machine code. ;-)

    Nobody writes machine code. Assembler has a one-to-one relationship with machine code, but it is easier to write and read.

    More seriously, bloat enables coders to hide back doors much more effectively. They'd never get away with that kind of subterfuge with
    ASM.

    Of course they would. Have you ever tried to make sense of poorly
    documented and commented assembly code?

    And it is possible to make machine code self-modifying - at least on
    some machines - which offers even more opportunity to put in back doors
    (and take them away again after you've exploited them).

    --
    Bill Sloman, Sydney

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Jan Panteltje@21:1/5 to cd@notformail.com on Sun Feb 11 11:26:15 2024
    On a sunny day (Sun, 11 Feb 2024 09:44:22 +0000) it happened Cursitor Doom <cd@notformail.com> wrote in <nh5hsit657809ebhciaseg2vgprofkhfv1@4ax.com>:

    <snip>

    That's all very impressive, Jan, but if you were *truly* a hardcore programmer, you'd be using machine code. ;-)

    I have used machine code in the long ago past.
    Here is a nice Z80 disassembler I wrote:
    https://panteltje.nl/panteltje/z80/index.html
    from emails I know people still use it.


    More seriously, bloat enables coders to hide back doors much more effectively. They'd never get away with that kind of subterfuge with
    ASM.

    Yes, all those libraries.. I follow the news and sometimes things are loaded that have backdoors.

    But asm, long ago I was involved with card hacking,
    things are read only, and how to list the code of a PIC micro
    (in those days in the TV smart cards for encrypted TV channels).
    That is how I got interested and came to use Microchip PICs..
    It is not always easy to list those codes to get the secret algo they use to encrypt TV transmissions.
    I stopped when some EU politician got upset.. some persisted and got sentenced....
    But that is how I learned about PICs and got interested in crypto.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Wanderer@21:1/5 to Jan Panteltje on Sun Feb 11 10:47:05 2024
    On Sun, 11 Feb 2024 06:43:31 GMT, Jan Panteltje wrote:

    It is cool coding in asm without using external libraries.
    I can do anything I like in KILOBYTES:


    Back in the 20th century, I knew how to program in C. I
    knew what the assembly code would look like after I compiled it.

    This is C++.


    https://en.cppreference.com/w/cpp/links/libs


    Now I program in Python. I really don't know how to program
    in Python. I'm googlesmart. I google what I want to do,
    download the appropriate library and follow the documentation.
    I don't know if there is something malicious in there. That's
    why I really hate every little stupid program and app that
    thinks it needs to auto-update and needs admin approval to
    install and screw with the operating system. If there is
    a portable option, I get that and I keep old versions until
    they break.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Jan Panteltje@21:1/5 to Wanderer on Sun Feb 11 17:13:40 2024
    On a sunny day (Sun, 11 Feb 2024 10:47:05) it happened Wanderer <dont@emailme.com> wrote in <980294@dontemail.com>:

    On Sun, 11 Feb 2024 06:43:31 GMT, Jan Panteltje wrote:

    It is cool coding in asm without using external libraries.
    I can do anything I like in KILOBYTES:


    Back in the 20th century, I knew how to program in C. I
    knew what the assembly code would look like after I compiled it.

    This is C++.


    https://en.cppreference.com/w/cpp/links/libs


    Now I program in Python. I really don't know how to program
    in Python. I'm googlesmart. I google what I want to do,
    download the appropriate library and follow the documentation.
    I don't know if there is something malicious in there. That's
    why I really hate every little stupid program and app that
    thinks it needs to auto-update and needs admin approval to
    install and screw with the operating system. If there is
    a portable option, I get that and I keep old versions until
    they break.

    I do not speak Python...
    No need...
    Cplushplush is a crime against humanity, operator overloading etc.
    If I see some open source C++ code I like, then I usually recode it in C,
    makes it simpler much of the time, did that with some Arduino code.

    Sometimes you really need libraries,
    I just came across this voice to text program for the Raspberry Pi last week:
    https://www.tomshardware.com/raspberry-pi/raspberry-pi-project-lets-you-generate-ai-art-for-your-tv-using-voice-commands
    leads to
    https://www.hackster.io/petewarden/recognizing-speech-with-a-raspberry-pi-50b0e6
    seems to be 1 GB size, for voice recognition you need a lot..
    Have not tried or downloaded it yet.

    tomshardware.com often has Raspberry Pi projects (all the way down on their main page).

    All that said, I run Firefox browser on a Raspberry Pi4 8 GB..
    I think it forwards everything I do to anybody ;-) ;-)

    I have disabled WiFi and Bluetooth now in the startup file.
    But still use a wireless keyboard....
    So there is room for improvement as far as security goes.
    Am using a Huawei USB stick for 4G internet access that works everywhere in the country or even Europe here.
    But of course one could log / decode the RF...

    see-eye-aaa must know everything about me by now...
    May send them in a loop!
    If they were not there yet....

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Cursitor Doom@21:1/5 to Wanderer on Sun Feb 11 18:04:55 2024
    On Sun, 11 Feb 2024 10:47:05, Wanderer <dont@emailme.com> wrote:

    On Sun, 11 Feb 2024 06:43:31 GMT, Jan Panteltje wrote:

    It is cool coding in asm without using external libraries.
    I can do anything I like in KILOBYTES:


    Back in the 20th century, I knew how to program in C. I
    knew what the assembly code would look like after I compiled it.

    This is C++.


    https://en.cppreference.com/w/cpp/links/libs

    I never got on with C++. C has a certain elegance to it that I very
    much like and I've never moved on from it. In fact I'm such a purist,
    I stay faithful to the K&R variant. They tell me it's limiting to do
    that, but it does *everything* I need to do so why go further? I find
    the simplicity and lack of unnecessary bloat very appealing. I'd
    probably still be coding in ASM if C hadn't come along. For me at
    least, K&R C is perfection.


    Now I program in Python. I really don't know how to program
    in Python. I'm googlesmart. I google what I want to do,
    download the appropriate library and follow the documentation.
    I don't know if there is something malicious in there. That's
    why I really hate every little stupid program and app that
    thinks it needs to auto-update and needs admin approval to
    install and screw with the operating system. If there is
    a portable option, I get that and I keep old versions until
    they break.

    Very wise. I like your style, Wanderer!

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Cursitor Doom@21:1/5 to All on Sun Feb 11 17:56:28 2024
    On Sun, 11 Feb 2024 11:26:15 GMT, Jan Panteltje <alien@comet.invalid>
    wrote:

    <snip>

    That's all very impressive, Jan, but if you were *truly* a hardcore programmer, you'd be using machine code. ;-)

    I have used machine code in the long ago past.
    Here is a nice Z80 disassembler I wrote:
    https://panteltje.nl/panteltje/z80/index.html
    from emails I know people still use it.


    More seriously, bloat enables coders to hide back doors much more effectively. They'd never get away with that kind of subterfuge with
    ASM.

    Yes, all those libraries.. I follow the news and sometimes things are loaded that have backdoors.

    But asm, long ago I was involved with card hacking,
    things are read only, and how to list the code of a PIC micro
    (in those days in the TV smart cards for encrypted TV channels).
    That is how I got interested and came to use Microchip PICs..
    It is not always easy to list those codes to get the secret algo they use to encrypt TV transmissions.
    I stopped when some EU politician got upset.. some persisted and got sentenced....
    But that is how I learned about PICs and got interested in crypto.

    Many thanks for that well thought-out and well-reasoned response, Jan.
    Nice to hear from someone who knows what they're talking about instead
    of some half-baked garbage from a moron like Bill Sloman who wouldn't
    even be able to set up something as elementary as an Antikythera
    orrery. ;-)

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Jeroen Belleman@21:1/5 to Cursitor Doom on Sun Feb 11 21:05:23 2024
    On 2/11/24 18:56, Cursitor Doom wrote:
    <snip>

    Many thanks for that well thought-out and well-reasoned response, Jan.
    Nice to hear from someone who knows what they're talking about instead
    of some half-baked garbage from a moron like Bill Sloman who wouldn't
    even be able to set up something as elementary as an Antikythera
    orrery. ;-)

    Now that you mention it: that piece of hardware was actually pretty
    sophisticated, and I think that even today, only a few people would
    have been able to use it to good effect.

    There is a series of videos of someone who built a replica and he
    explains its workings at some length. Search for "clickspring
    antikythera" on youtube. I found it fascinating, and also somewhat
    humbling to realize that my knowledge of our solar system is nothing
    compared to what was encoded in this mechanism.

    Of course, these days software does it better.

    Jeroen Belleman

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Don Y@21:1/5 to Wanderer on Sun Feb 11 13:43:33 2024
    On 2/11/2024 10:47 AM, Wanderer wrote:
    On Sun, 11 Feb 2024 06:43:31 GMT, Jan Panteltje wrote:

    It is cool coding in asm without using external libraries.
    I can do anything I like in KILOBYTES:

    Back in the 20th century, I knew how to program in C. I
    knew what the assembly code would look like after I compiled it.

    A minor point; you THOUGHT you knew what the ASM would look like...
    you knew what the processor should be *doing*.

    Newer compilers are often considerably smarter than the
    programmers using them. They will rearrange code (where
    dependencies allow it) to avoid pipeline stalls. Or,
    realign structures to avoid misaligned memory accesses.
    Or even eliminate calls to functions that it can inline
    more efficiently. Or, avoid generating code that will
    tickle a "bug" in the targeted processor's HARDWARE!

    But, the point of knowing what the processor is expected
    to be doing is important.

    This is C++.

    https://en.cppreference.com/w/cpp/links/libs

    C++ (and other OOPS) adds complexity to help the programmer
    manage the complexity in his program/solution.

    A big (huge!) part of software development is modeling
    the application and its domain. A good model makes the
    implementation intuitive... it just *fits*. This is
    important because code is meant to be READ, not WRITTEN;
    if the next guy (that you likely have never met and with
    unknown capabilities) can't understand what you've written,
    expecting him to make fixes or enhancements is a fool's hope.

    E.g., my system is entirely object *based* despite not
    being written in an OO language (think about the difference).
    It makes sense for a developer (or user) to think of verbs
    and nouns -- OPEN the GARAGE DOOR.

    In a procedural language, you would have a plethora of
    "routines" cluttering up the namespace: open_garage_door(),
    close_garage_door(), open_front_door(), open_side_door(),
    open_car_door(), open_access_panel_to_furnace(), etc.

    They would all share some common characteristics -- yet,
    each would have to REMEMBER to include those in its
    implementation. (e.g., do you have to UNLOCK the door
    before you can open it? Even if this only applies to
    SOME doors, having it present in a base class reminds
    you that you have to address that -- instead of waiting
    for the issue to manifest as a bug!)

    In my world, I can "move" an object to a different
    "backing server" (the active piece of code that handles
    requests to operate on objects of a particular type).
    Or, even to a different backing server on another
    processor elsewhere in the network (and, moving the
    server -- which, of course, is also an object! -- there
    to be waiting for the object to arrive!)

    I.e., there are verbs (methods) that apply to all objects.
    Defining the system as object based reminds me that I have to
    address each of these verbs for EVERY object type.

    It also makes it easier for me to address OPENing *any*
    door, regardless of the actual "type" of door -- because
    anything that derives from "Door" has an "open" method.
    I don't have to say:

    case (door_type) {
    GARAGE => open_garage_door();
    HOUSE => open_front_door();
    FURNACE => open_access_panel_to_furnace();
    ...
    }

    [What happens when there's a new door type that THIS code
    doesn't explicitly recognize?? DOGGIE => open_doggie_door()]

    Now I program in Python. I really don't know how to program
    in Python. I'm googlesmart. I google what I want to do,
    download the appropriate library and follow the documentation.

    That's part of the bloat problem (and the decline of software
    quality, in general). It's *programming*, not software engineering.
    ANYONE can program... all you have to do is throw keystrokes at
    it until it APPEARS to work! You don't need to understand
    the hardware, the operating system, the libraries, etc.

    Another part of the problem is fat interfaces; too many BUILT ways
    to solve the same problem. And, nothing that enforces your choice
    of solutions. The fact that these "mechanisms" are so poorly
    characterized means you are free to IMAGINE how it will work IN
    YOUR CASE instead of having a contract that you can both rely on
    ("you will use me in this way and I will provide this result").

    Imagine semiconductors being as loosely characterized: a diode
    allows for current to flow one way (how MUCH current? what is the
    drop across the junction? how much power can the packaged device
    dissipate? at what reverse voltage will it breakdown?).

    How is this code supposed to work:
    memcpy(LAST_LOGICAL_MEMORY_ADDRESS-VALUE, some_address, VALUE+7)
    Or this?
    memcpy(some_address, LAST_LOGICAL_MEMORY_ADDRESS-VALUE, VALUE+7)

    How *will* it work on a 68K? 80386? ARMv8?

    On bare metal? Under a toy OS? Under a "real" OS?

    Operator overloading is a HUGE win, esp for arithmetic operators.
    I can say:
    temp = A.x * (B.y - C.y)
         + B.x * (C.y - A.y)
         + C.x * (A.y - B.y);
    area = temp/2;
    and:
    - have a greater chance of getting it right
    - have a greater chance of The Next Guy recognizing what I've done!
    when all of those operators have been properly overloaded for:
    Point A,B,C;
    data -- which, today, I may have decided have components that are
    Q24.8 but, tomorrow, I may decide should be Q40.24!

    And, instead of just declaring A, B and C as simple structs made of
    integers, a constructor for each must be (silently) invoked... in
    case there are any niggling details involved.

    [did you miss the implied casts to "temp" and "area"'s respective
    data types? What if I want to change those types? how much of
    THIS code will change???]

    Imagine coding that in a procedural language and The Poor Bloke
    who has to read what you've written!

    coord_size_t t1 = coordinate_sub(B.y, C.y);
    coord_size_t t2 = coordinate_mul(A.x, t1);
    ...

    But, there are costs to this, imposed by the language. I run
    similar operations MILLIONS of times each second in my gesture
    recognizer... overhead has a cost! :<

    I don't know if there is something malicious in there. That's
    why I really hate every little stupid program and app that
    thinks it needs to auto-update and needs admin approval to
    install and screw with the operating system. If there is
    a portable option, I get that and I keep old versions until
    they break.

    Portable just bundles the dependencies into the executable.
    So, you end up with larger binaries -- that *can't* be
    upgraded (someone has to build you a new portable version
    with whichever dependencies -- or application -- updated).

    Much of the reason for "bloat" can be attributed to (imagined?)
    user demands. The Microsoft Mentality has users looking for
    something to click on when they are performing a task.
    E.g., spell check a document (no, it must do this WHILE they
    are typing cuz they LOVE being distracted by decorated text
    alerting them to the POSSIBILITY of a misspelling!).

    The UNIX Mentality had smarter users who knew how to plumb
    applications together to get a desired result.

    E.g., to look for duplicate files on a machine, you
    could recursively walk the hierarchy (or the portion of
    interest) with "find <hierarchy> -type f ! -name '.*' -print" -- to
    ignore all of the ".<whatever>" files. And, while
    doing so, compute the MD5 hash of each file. Storing
    these as (hash, pathname) lines in a "flat database" (i.e., FILE!),
    you could then sort | uniq -d -w32 (comparing just the hashes)
    and get a list of duplicates.

    In the Microsoft world, you need an app -- with its own
    GUI! -- to do this.

    Much, also, is a result of programmers being unable to
    grasp (grok) all of an application's detail. So, they
    may implement the same functionality many different
    times (ways?) to address the same problem in different
    places. Why do Windows apps all report the sizes of
    files differently? Why is a "0 byte" file shown as "1K"?
    Is that MB or MiB? (and why do you have to keep explaining
    it to people?)

    Try unpacking a deep file hierarchy from an archive
    to a point deep in the windows filesystem. Why can't
    it create: /some/deep/point/in/the/windows/file/system/archive/with/a/long/path?
    Yet, I can unpack the archive to C:\ and then *move* it to that
    point in the filesystem!

    If a program surprises the user with its behavior, is that
    a bug?

    In embedded devices, much bloat is due to folks overprovisioning
    their solutions: "Let's use Linux!" Really? That's like
    taking a drive to the beach in the "semi" tractor trailer (lorry)!

    [Any idea how many lines of code -- i.e., latent bugs -- that
    "component" brings into your design? Do you ANNUALLY send a
    salary-scale contribution to the community that you are hoping
    will fix problems that your customers uncover?? Or, are you
    just a leech?]

    Then, to justify the bloat that they've just built into their
    product, they create reasons to use the extra features that
    it makes available!

    "We can create a folder for each DRIVER. And, under those, store
    the preferred seat position in a file called seat_position. Likewise,
    the mirror settings in mirror_settings. And, favorite radio
    stations in a Radio subfolder with a file for each preset:
    1, 2, 3, 4..."

    So, they have now BURDENED their design with all of that "extra
    stuff" that they didn't THINK they needed in the original
    design document. Plus, imagined uses for it. And, of course,
    the extra code that using it requires! (being able to search
    for each file in the hierarchy, give the driver a NAME,
    parse the contents of each file to ensure it is appropriate
    for the type of data EXPECTED within, report errors to the
    user when it is NOT, etc.)

    And, they now have dependencies on a bunch of code that they
    likely are not competent to maintain (show of hands: how many
    kernel hackers??) let alone fully understand.

    How big should the ARP cache be? What sort of admission policy
    should it use? How to ensure your DNS isn't poisoned? What
    if the filesystem gets corrupted (power outage during a write)?
    Where do you keep the defaults to reinitialize all of those
    corrupted files?

    How do you FORMALLY test that design? Esp the parts that
    you likely don't understand -- do you know where to probe
    for weaknesses? And, their consequences??

    "as simple as can be -- and no simpler"

    Can you keep the details of your design IN YOUR SINGLE CRANIUM
    in enough detail that you can explain how it works to a new
    hire (and respond to detailed questions)?

    We have a saying, here:
    Enjoy your visit! Take someone BACK home with you!
    You should have the same attitude when revisiting your code;
    what can I remove or how can I refactor this to make it
    SIMPLER?

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Don Y@21:1/5 to Wanderer on Sun Feb 11 13:56:36 2024
    On 2/11/2024 10:47 AM, Wanderer wrote:
    Back in the 20th century, I knew how to program in C. I
    knew what the assembly code would look like after I compiled it.

    This is C++.

    https://en.cppreference.com/w/cpp/links/libs

    You can also look at different "strains" of C++
    (e.g., EC++) to avoid some of the cruft/overhead

    [And, there are other OO languages that have
    friendlier characteristics]

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From John Larkin@21:1/5 to blockedofcourse@foo.invalid on Sun Feb 11 15:08:50 2024
    On Sun, 11 Feb 2024 13:56:36 -0700, Don Y
    <blockedofcourse@foo.invalid> wrote:

    On 2/11/2024 10:47 AM, Wanderer wrote:
    Back in the 20th century, I knew how to program in C. I
    knew what the assembly code would look like after I compiled it.

    This is C++.

    https://en.cppreference.com/w/cpp/links/libs

    You can also look at different "strains" of C++
    (e.g., EC++) to avoid some of the cruft/overhead

    [And, there are other OO languages that have
    friendlier characteristics]

    https://en.wikipedia.org/wiki/List_of_programming_languages

    Applications are boring. It's much more fun to invent programming
    languages.

    What's telling is that new programming languages are popular and older
    ones aren't.

    https://stackoverflow.blog/2017/10/31/disliked-programming-languages/

    In other words, programming languages are fads.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Don Y@21:1/5 to Don Y on Sun Feb 11 17:06:29 2024
    On 2/11/2024 1:43 PM, Don Y wrote:
    How is this code supposed to work:
        memcpy(LAST_LOGICAL_MEMORY_ADDRESS-VALUE, some_address, VALUE+7)
    Or this?
        memcpy(some_address, LAST_LOGICAL_MEMORY_ADDRESS-VALUE, VALUE+7)

    How *will* it work on a 68K?  80386?  ARMv8?

    On bare metal?  Under a toy OS?  Under a "real" OS?

    This is actually a delightful example of how poorly characterized
    software components are (and why you should have your own sources
    for everything that you use in a design!).

    Taking just the first memcpy(3c) example...

    *ASSUME*[1] that data is copied from "from" to "to" in ascending
    sequential order, starting at "from". Eventually, "to" will
    hit the LAST_LOGICAL_MEMORY_ADDRESS. Then, *pass* it. Will
    this wrap around to "0x0"?

    What if the LAST_LOGICAL_MEMORY_ADDRESS isn't the largest
    representable as an integer? E.g., a device that has a limit
    on "code space" that is smaller than the total address space?

    What if "to" is "from+1"? E.g., will ABCDEFGHIJ end up
    as AABCDEFGHIJ? Or, AAAAAAAAAA? Or, ABCDABCDAB? Or...

    What if "to" is "from"? Will *any* reads (or writes) be
    performed?

    What if you naively use this to initialize a (memory-mapped)
    I/O device? Will the "registers" in the device be accessed
    in ascending, sequential order? Or, could they be
    accessed in the order B A D C E F H J I? Is there anything
    that ensures a location is only updated *exactly* once?
    Is A A A A A A A A A A A A B C D E F G H I J possible?

    What if the OS SIGSEGV's (as expected)? How much of this
    "work" will have been done? Can you just abort the balance
    of the operation on the assumption that "the first part"
    did what you wanted it to do??

    Defend each answer! :> Now, pick a different library
    implementation or a different processor and make the same
    claims... (and that's a trivial STANDARD LIBRARY function!)

    -----------
    [1] there are no guarantees that this assumption is correct
    or any BETTER than any of the others that follow!

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Dan Purgert@21:1/5 to Don Y on Tue Feb 13 12:00:59 2024
    On 2024-02-11, Don Y wrote:
    [...]
    E.g., my system is entirely object *based* despite not
    being written in an OO language (think about the difference).
    It makes sense for a developer (or user) to think of verbs
    and nouns -- OPEN the GARAGE DOOR.

    You open the garage door. It's dark inside.


    ( sorry, had to :) )

    --
    |_|O|_|
    |_|_|O| Github: https://github.com/dpurgert
    |O|O|O| PGP: DDAB 23FB 19FA 7D85 1CC1 E067 6D65 70E5 4CE7 2860

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Don Y@21:1/5 to Dan Purgert on Tue Feb 13 11:24:55 2024
    On 2/13/2024 5:00 AM, Dan Purgert wrote:
    On 2024-02-11, Don Y wrote:
    [...]
    E.g., my system is entirely object *based* despite not
    being written in an OO language (think about the difference).
    It makes sense for a developer (or user) to think of verbs
    and nouns -- OPEN the GARAGE DOOR.

    You open the garage door. It's dark inside.

    You cast a spell of continual light -- then, extinguish it
    lest your companions ALSO notice the giant spider (McLaren
    765LT) hiding therein!

    Slipping inside, you drive off, crushing the Sorcerer's foot
    in the process. Incensed, he throws a fireball in
    your direction but, alas, too slow to catch up with the 750
    horses under the hood...

    ( sorry, had to :) )


    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From albert@spenarnc.xs4all.nl@21:1/5 to Wanderer on Wed Feb 14 14:09:27 2024
    In article <980294@dontemail.com>, Wanderer <dont@emailme.com> wrote:
    On Sun, 11 Feb 2024 06:43:31 GMT, Jan Panteltje wrote:

    It is cool coding in asm without using external libraries.
    I can do anything I like in KILOBYTES:


    Back in the 20th century, I knew how to program in C. I
    knew what the assembly code would look like after I compiled it.

    This is C++.


    https://en.cppreference.com/w/cpp/links/libs


    Now I program in Python. I really don't know how to program
    in Python. I'm googlesmart. I google what I want to do,
    download the appropriate library and follow the documentation.
    I don't know if there is something malicious in there. That's
    why I really hate every little stupid program and app that
    thinks it needs to auto-update and needs admin approval to
    install and screw with the operating system. If there is
    a portable option, I get that and I keep old versions until
    they break.

    Totally agree. I'm waiting till one manages to subvert one
    of the mainstream browsers with a backdoor via the obligatory
    daily updates.

    Groetjes Albert
    --
    Don't praise the day before the evening. One swallow doesn't make spring.
    You must not say "hey" before you have crossed the bridge. Don't sell the
    hide of the bear until you shot it. Better one bird in the hand than ten in
    the air. First gain is a cat purring. - the Wise from Antrim -

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From albert@spenarnc.xs4all.nl@21:1/5 to bill.sloman@ieee.org on Wed Feb 14 14:05:16 2024
    In article <uqa8qg$ui04$1@dont-email.me>,
    Bill Sloman <bill.sloman@ieee.org> wrote:
    On 11/02/2024 8:44 pm, Cursitor Doom wrote:
    On Sun, 11 Feb 2024 06:43:31 GMT, Jan Panteltje <alien@comet.invalid> wrote:
    On a sunny day (Sat, 10 Feb 2024 09:06:03 -0800) it happened John Larkin <jl@997PotHill.com> wrote in <g4bfsidsbmg316togaaff19e63vv1pnqbo@4ax.com>:
    On Sat, 10 Feb 2024 08:10:13 -0800 (PST), Fred Bloggs <bloggs.fredbloggs.fred@gmail.com> wrote:

    <snip>

    That's all very impressive, Jan, but if you were *truly* a hardcore
    programmer, you'd be using machine code. ;-)

    Nobody writes machine code. Assembler has a one-to-one relationship with machine code, but it is easier to write and read.

    Nobody, huh? Smith does. He wrote a compiler in hex code using only
    a hex-to-bin converter.
    https://dacvs.neocities.org/SF/
    The takeaway is, it is easier than you expect.


    More seriously, bloat enables coders to hide back doors much more
    effectively. They'd never get away with that kind of subterfuge with
    ASM.

    Of course they would. Have you ever tried to make sense of poorly
    documented and commented assembly code?

    And it is possible to make machine code self-modifying - at least on
    some machines - which offers even more opportunity to put in back doors
    (and take them away again after you've exploited them).
    You must silence hysterical virus detectors before you can do that.

    --
    Bill Sloman, Sydney


    Groetjes Albert
    --
    Don't praise the day before the evening. One swallow doesn't make spring.
    You must not say "hey" before you have crossed the bridge. Don't sell the
    hide of the bear until you shot it. Better one bird in the hand than ten in
    the air. First gain is a cat purring. - the Wise from Antrim -

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Don Y@21:1/5 to albert@spenarnc.xs4all.nl on Wed Feb 14 11:09:15 2024
    On 2/14/2024 6:05 AM, albert@spenarnc.xs4all.nl wrote:
    Nobody hu? Smith does. Written a compiler in hex code using only
    a hex to bin converter.
    https://dacvs.neocities.org/SF/
    The take away is, it is easier than you expect.

    One writes code to be *read*. Just because you CAN do something
    doesn't mean you SHOULD do that something. People spend insane
    amounts of time arranging dominos... just to knock them over
    (what's the point in that?)

    A kid I attended school with built his own little computer (pre-CP/M),
    wrote a monitor in machine code that he then burned into ROM.
    Used that to write an assembler. Then an OS, etc. Interesting
    "hobby" and worthwhile only if your time has no value.

    I had a job where we had a cheap, *live* system monitor that would
    let us watch variables and patch code while the system was running.
    But, the UI was limited to a six digit *numeric* display -- which
    means "split octal" (0xFFFF is 377377) instead of hexadecimal -- and
    keypad. So, you had to memorize opcodes in octal and convert
    all arguments to that prior to use/recognition.

    "Walking" (ADDRESS++) through the code required you to recognize
    opcodes and recall how many bytes followed before the next opcode
    would be encountered. Or *if* it would be encountered (as absolute
    and relative jumps/calls could interrupt the sequential flow).

    Having that *live* ability to interact with the system was a huge
    asset (at a time when ICE was uncommon -- and expensive!) and was
    present in every product that we released (so, you could carry a
    tiny piece of hardware to a site and interact with the system).
    You could twiddle data and code and watch how the system reacted
    without having to go back to the development environment and
    turn the crank for a "what if".

    But, the requirement to "hand disassemble/assemble" was just ridiculous!
    (why not the same hardware interface augmented with some code to make
    the UX less risky? Why not tied into the symbol table of the
    running executable so you KNEW what you were seeing and tweaking?)

    Prior to that, I'd written machine code (again in octal) for the Nova.
    Data entry via the 16 toggle switches on the front panel. Data
    readout via the 16 indicator lamps associated with them.

    Again, a convenient capability (when access to an assembler/compiler
    wasn't possible in the field... "I need to throw together a little
    routine to exercise some particular bit of hardware so I can
    'scope the hardware") but annoyingly complex and not a very portable
    skillset.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Cursitor Doom@21:1/5 to All on Wed Feb 14 18:34:26 2024
    On Tue, 13 Feb 2024 12:00:59 -0000 (UTC), Dan Purgert <dan@djph.net>
    wrote:

    On 2024-02-11, Don Y wrote:
    [...]
    E.g., my system is entirely object *based* despite not
    being written in an OO language (think about the difference).
    It makes sense for a developer (or user) to think of verbs
    and nouns -- OPEN the GARAGE DOOR.

    You open the garage door. It's dark inside.


    ( sorry, had to :) )

    In German they would say "can you the garage door open make?"

    Kind of makes more sense to a computer (or that gnome in the Star Wars
    films).

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Dan Green@21:1/5 to blockedofcourse@foo.invalid on Wed Feb 14 18:28:29 2024
    On Wed, 14 Feb 2024 11:09:15 -0700, Don Y
    <blockedofcourse@foo.invalid> wrote:

    On 2/14/2024 6:05 AM, albert@spenarnc.xs4all.nl wrote:
    Nobody hu? Smith does. Written a compiler in hex code using only
    a hex to bin converter.
    https://dacvs.neocities.org/SF/
    The take away is, it is easier than you expect.

    One writes code to be *read*. Just because you CAN do something
    doesn't mean you SHOULD do that something. People spend inane
    amounts of time arranging dominos... just to knock them over
    (what's the point in that?)

    A kid I attended school with built his own little computer (pre-CP/M),
    wrote a monitor in machine code that he then burned into ROM.
    Used that to write an assembler. Then an OS, etc. Interesting
    "hobby" and worthwhile only if your time has no value.

    I had a job where we had a cheap, *live* system monitor that would
    let us watch variables and patch code while the system was running.
    But, the UI was limited to a six digit *numeric* display -- which
    means "split octal" (0xFFFF is 377377) instead of hexadecimal -- and
    keypad. So, you had to memorize opcodes in octal and convert
    all arguments to that prior to use/recognition.

    "Walking" (ADDRESS++) through the code required you to recognize
    opcodes and recall how many bytes followed before the next opcode
    would be encountered. Or *if* it would be encountered (as absolute
    and relative jumps/calls could interrupt the sequential flow).

    Having that *live* ability to interact with the system was a huge
    asset (at a time when ICE was uncommon -- and expensive!) and was
    present in every product that we released (so, you could carry a
    tiny piece of hardware to a site and interact with the system).
    You could twiddle data and code and watch how the system reacted
    without having to go back to the development environment and
    turn the crank for a "what if".

    But, the requirement to "hand disassemble/assemble" was just ridiculous!
    (why not the same hardware interface augmented with some code to make
    the UX less risky? Why not tied into the symbol table of the
    running executable so you KNEW what you were seeing and tweaking?)

    Prior to that, I'd written machine code (again in octal) for the Nova.
    Data entry via the 16 toggle switches on the front panel. Data
    readout via the 16 indicator lamps associated with them.

    Again, a convenient capability (when access to an assembler/compiler
    wasn't possible in the field... "I need to throw together a little
    routine to exercise some particular bit of hardware so I can
    'scope the hardware) but annoyingly complex and not a very portable
    skillset.

    I write in machine code sometimes when it's the best approach. On the comp.lang.c newsgroup, we've had a *lot* of entries for the
    'obfuscated C contest' over the years and a sub-set of us decided it
    would be a hoot to have an obfuscated machine code contest as well.
    Personally I found it really, really enjoyable (I was in the minority
    as we never had another one, though).

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Don Y@21:1/5 to Dan Green on Wed Feb 14 11:57:11 2024
    On 2/14/2024 11:28 AM, Dan Green wrote:
    Again, a convenient capability (when access to an assembler/compiler
    wasn't possible in the field... "I need to throw together a little
    routine to exercise some particular bit of hardware so I can
    'scope the hardware) but annoyingly complex and not a very portable
    skillset.

    I write in machine code sometimes when it's the best approach. On the comp.lang.c newsgroup, we've had a *lot* of entries for the
    'obfuscated C contest' over the years and a sub-set of us decided it
    would be a hoot to have an obfuscated machine code contest as well. Personally I found it really, really enjoyable (I was in the minority
    as we never had another one, though).

    Core Wars, anyone? :>

    But, nowadays, you are -- most often -- interfacing to a system that
    was written in a HLL. So, knowing where you are in the algorithm isn't
    as easily discerned as the compiler could have moved code around, elided
    things it thought superfluous, etc. If your goal is to get productive
    work done, you'd likely want more assurances that your code would
    be doing what you intended (and, sooner or later, you're going to
    have to "write it for real").

    The boot ROM (bipolar) on the Reading Machine was, IIRC, just sixteen
    16-bit words. So, you didn't have the luxury of being able to
    write what you *wanted* but, instead, had to settle for what would
    *fit*. E.g., I think the software loaded to some random address
    and then immediately copied itself to the correct address. The
    "random address" happened to be the opcode for one of the other
    instructions in the ROM.

    In my early designs, we relied heavily on self-modifying code to
    provide features at low cost. E.g., to disable a task, you
    would change the opcode of the CALL to that task to another
    "benign" opcode for another 3-byte instruction (cuz CALL was a
    3-byte instruction). This allowed you to preserve the entry
    point to the task (the target of the CALL) while disabling
    its execution (by converting the instruction to something
    that effectively ignored the "address" argument)

    When I worked on i4004's, we each found utility in carrying
    a cheat sheet of opcodes in our wallets -- a "pocket assembler".
    The development tools were slow and klunky (and only one set
    shared among us all!) so it was effective to patch binary images,
    if you could fit your change into the space you're overwriting.

    This was an efficient use of time when your access to the
    "real" tools was limited to one or two turns of the crank
    in an 8 hour day! While a colleague had *his* turn, you
    could load the image from whichever 1702 was to be modified
    into the programmer and manually patch the bytes. Then, write
    the new image into a new 1702, plug it into the prototype
    and see how your proposed change would perform. Then, mark up
    the (ASM) listing to incorporate a cleaner patch next time
    you had access to the development system.

    Nowadays, everyone effectively has access to BETTER tools,
    simulators, etc. so dealing with real hardware is more of
    a nuisance...

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Don Y@21:1/5 to albert@spenarnc.xs4all.nl on Wed Feb 14 11:58:59 2024
    On 2/14/2024 6:09 AM, albert@spenarnc.xs4all.nl wrote:
    I don't know if there is something malicious in there. That's
    why I really hate every little stupid program and app that
    thinks it needs to auto-update and needs admin approval to
    install and screw with the operating system. If there is
    a portable option, I get that and I keep old versions until
    they break.

    Totally agree. I'm waiting till one manages to subvert one
    of the mainstream browsers with a backdoor via the obligatory
    daily updates.

    How do you know that what you've "frozen" hasn't already been compromised?
    A latent virus can wreak havoc on your system *years* after infection.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Peter@21:1/5 to Don Y on Thu Feb 15 10:13:44 2024
    Great to see you, Don, after all these years - 2006!!

    I had a customer many years ago who did write a ton of code in hex. To
    enable modifications they had a bit of space after each function, so
    edits to a function did not need shifting everything after it :)

    Peter

    Don Y <blockedofcourse@foo.invalid> wrote:

    One writes code to be *read*. Just because you CAN do something
    doesn't mean you SHOULD do that something. People spend inane
    amounts of time arranging dominos... just to knock them over
    (what's the point in that?)

    A kid I attended school with built his own little computer (pre-CP/M),
    wrote a monitor in machine code that he then burned into ROM.
    Used that to write an assembler. Then an OS, etc. Interesting
    "hobby" and worthwhile only if your time has no value.

    I had a job where we had a cheap, *live* system monitor that would
    let us watch variables and patch code while the system was running.
    But, the UI was limited to a six digit *numeric* display -- which
    means "split octal" (0xFFFF is 377377) instead of hexadecimal -- and
    keypad. So, you had to memorize opcodes in octal and convert
    all arguments to that prior to use/recognition.

    "Walking" (ADDRESS++) through the code required you to recognize
    opcodes and recall how many bytes followed before the next opcode
    would be encountered. Or *if* it would be encountered (as absolute
    and relative jumps/calls could interrupt the sequential flow).

    Having that *live* ability to interact with the system was a huge
    asset (at a time when ICE was uncommon -- and expensive!) and was
    present in every product that we released (so, you could carry a
    tiny piece of hardware to a site and interact with the system).
    You could twiddle data and code and watch how the system reacted
    without having to go back to the development environment and
    turn the crank for a "what if".

    But, the requirement to "hand disassemble/assemble" was just ridiculous!
    (why not the same hardware interface augmented with some code to make
    the UX less risky? Why not tied into the symbol table of the
    running executable so you KNEW what you were seeing and tweaking?)

    Prior to that, I'd written machine code (again in octal) for the Nova.
    Data entry via the 16 toggle switches on the front panel. Data
    readout via the 16 indicator lamps associated with them.

    Again, a convenient capability (when access to an assembler/compiler
    wasn't possible in the field... "I need to throw together a little
    routine to exercise some particular bit of hardware so I can
    'scope the hardware) but annoyingly complex and not a very portable
    skillset.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Wanderer@21:1/5 to All on Thu Feb 15 12:02:31 2024
    On Wed, 14 Feb 2024 11:58:59 -0700 Don Y wrote

    How do you know that what you've "frozen" hasn't already been compromised?
    A latent virus can wreak havoc on your system *years* after infection.

    I update the anti-virus, spyware and malware programs.
    I got fan-made Kerbal Space Program mods that want access
    to the internet to check for updates. Or you download some
    freeware to open some .crap extension that some fool used on
    a file and this app wants admin privileges so it can integrate
    into the operating system and become a default program. When I was in
    college I had a programming professor who taught Pascal and his
    big thing was 'scoping'. Every procedure should be self-contained
    and have a simple, defined connection to the global program. Now every
    program wants to pepper my computer with DLLs that the programmer
    picked up from a package he got from who knows where.

    You're right. I don't know if my current system is infected but I know
    the odds of getting infected don't get better with more interactions and
    more partners.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Don Y@21:1/5 to Wanderer on Thu Feb 15 13:59:43 2024
    On 2/15/2024 12:02 PM, Wanderer wrote:
    On Wed, 14 Feb 2024 11:58:59 -0700 Don Y wrote

    How do you know that what you've "frozen" hasn't already been compromised?
    A latent virus can wreak havoc on your system *years* after infection.

    I update the anti-virus, spyware and malware programs.

    But, if the particular "infestation" hasn't "manifested", in
    public, previously, it is possible that the AV tools don't
    KNOW about it -- yet.

    I don't run AV on this machine (which is the only out-facing
    machine here). I keep nothing on it (just a browser and mail
    client) so can easily reconstruct it, if ever hosed.

    Every 6 months, I pull the disk and set it on a shelf. I
    install a fresh disk with the original machine image on it.
    And, run a virus scan on the disk pulled 6 months *earlier*
    (recycling that media once it has been verified as clean).

    I got fan-made Kerbal Space Program mods that want access
    to the internet to check for updates. Or you download some
    freeware to open some .crap extension that some fool used on
    a file and this app wants admin privileges so it can integrate
    into the operating system and become a default program. When I was in
    college I had a programming professor who taught Pascal and his
    big thing was 'scoping'. Every procedure should be self-contained
    and have a simple, defined connection to the global program.

    Yes. "Compartmentalization". Most applications fail on this
    score (whether for desktop or for an embedded instrument/appliance)
    because they reside in a single process container -- any thread
    can dick with any other thread's data.

    Ideally, you want to partition the problem into small units
    that have *slim* interfaces (minimize information sharing) as
    this leads to a more robust solution -- and a more *performant*
    one, once you put up barriers between those units (processes).

    Now every
    program wants to pepper my computer with DLLs that the programmer
    picked up from a package he got from who knows where.

    Yup. People have no idea what they are "baking into" their "programs".
    So many dweebs thinking "let's use Linux" (as the basis for a product)
    without any understanding of what that entails, what risks it presents
    and how to *maintain* it ("We'll just ask on some forum and hope
    someone takes an interest in solving OUR CUSTOMERS' PROBLEMS").

    My current system enforces *every* contract so a program can't
    access anything that it hasn't *declared* a need to use (data
    as well as code). This exposes all of the interconnects so
    a user (customer) can decide if he wants to install that app
    ("Why does this app need to access...?") or if he wants to
    further thwart any abuses he might suspect of the app ("Yeah,
    you can access my address book! But, the version that YOU see
    won't have any names in it!")

    You're right. I don't know if my current system is infected but I know
    the odds of getting infected don't get better with more interactions and
    more partners.

    I think you have to just think about what you want *from* your machine
    and use that as a filter when deciding if a new app/upgrade should be installed, or not.

    I keep a set of laptops for "play"; install an app on them to see if
    it adds any value (or, to just get some temporary use from it) and
    then reinstall the disk image (3 minutes). I used to use disposable
    VMs for this (make a clean copy of a VM; install app; play; flush
    dirty VM) but laptops are quicker than spinning up all the disks
    on my ESXi server!

    I avoid updates/upgrades, in general, as they tend to just replace
    one set of "bad behaviors" with another. *AND*, take time for me
    to identify those! Much easier to live with the problems I already
    know about than to keep swapping them for a new set!

    [It would be nice if each update came in two flavors: ONLY bug fixes
    and bug fixes PLUS changes]

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Don Y@21:1/5 to Peter on Fri Feb 16 14:13:42 2024
    On 2/15/2024 3:13 AM, Peter wrote:
    Great to see you, Don, after all these years - 2006!!

    Hey there, Mr "Pool" :>

    I trust all is well, remodel long completed, kids now grown
    (which of them was first to make you "Gramps"? and wasn't your
    youngest looking for his pilot's license?), thus PBfH having
    less of an impact on your life, etc.

    I had a customer many years ago who did write a ton of code in hex. To
    enable modifications they had a bit of space after each function, so
    edits to a function did not need shifting everything after it :)

    But what was their *reason* for this? I had an employer (who *had* been
    an engineer and deluded himself into thinking he could still *do*
    engineering) who was stuck in the past -- as if the tools and
    techniques he had used were still relevant, even a few years later!

    When it took hours to assemble, link, burn images, it made sense to
    have mechanisms to support minor tweaks to the code (overwriting
    instructions with NOPs and filling in a "0xFF" postamble with new
    code). But, nowadays, make world on even large projects is just
    a coffee break -- and, you can dump your code into RAM to watch
    it run (assuming you have to run on a target and not in a
    simulator).

    [Nowadays, I netboot images just for the savings that one step
    makes possible!]

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From albert@spenarnc.xs4all.nl@21:1/5 to blockedofcourse@foo.invalid on Sat Feb 17 11:49:30 2024
    In article <uqivkh$2ontb$2@dont-email.me>,
    Don Y <blockedofcourse@foo.invalid> wrote:
    On 2/14/2024 6:05 AM, albert@spenarnc.xs4all.nl wrote:
    Nobody, huh? Smith does. He has written a compiler in hex code using only
    a hex-to-bin converter.
    https://dacvs.neocities.org/SF/
    The takeaway is, it is easier than you expect.

    One writes code to be *read*. Just because you CAN do something
    doesn't mean you SHOULD do that something. People spend inane
    amounts of time arranging dominos... just to knock them over
    (what's the point in that?)

    This project is meant to be read. You can't be serious suggesting
    that this is a tool to be used.
    If you spend the time looking at the code, you'd discover that it
    is quite educational, and it makes you wonder where the software bloat
    comes from.


    <SNIP>

    Groetjes Albert
    --
    Don't praise the day before the evening. One swallow doesn't make spring.
    You must not say "hey" before you have crossed the bridge. Don't sell the
    hide of the bear until you shot it. Better one bird in the hand than ten in
    the air. First gain is a cat purring. - the Wise from Antrim -

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From John Larkin@21:1/5 to blockedofcourse@foo.invalid on Sat Feb 17 13:59:38 2024
    On Sun, 11 Feb 2024 13:43:33 -0700, Don Y
    <blockedofcourse@foo.invalid> wrote:

    On 2/11/2024 10:47 AM, Wanderer wrote:
    On Sun, 11 Feb 2024 06:43:31 GMT, Jan Panteltje wrote:

    It is cool coding in asm without using external libraries.
    I can do anything I like in KILOBYTES:

    Back in the 20th century, I knew how to program in C. I
    knew what the assembly code would look like after I compiled it.

    A minor point; you THOUGHT you knew what the ASM would look like...
    you knew what the processor should be *doing*.

    Newer compilers are often considerably smarter than the
    programmers using them. They will rearrange code (where
    dependencies allow it) to avoid pipeline stalls. Or,
    realign structures to avoid misaligned memory accesses.
    Or even eliminate calls to functions that it can inline
    more efficiently.


    Or it may skip doing things that it thinks are unnecessary. As FPGA
    compilers will do.

    One trick is to do stuff so complex that the compiler optimizer gives
    up. Or use terms that it can't know at compile time.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From John Larkin@21:1/5 to langwadt@fonz.dk on Sat Feb 17 15:39:42 2024
    On Sat, 17 Feb 2024 15:22:44 -0800 (PST), Lasse Langwadt Christensen <langwadt@fonz.dk> wrote:

    lørdag den 17. februar 2024 kl. 23.01.21 UTC+1 skrev John Larkin:
    On Sun, 11 Feb 2024 13:43:33 -0700, Don Y
    <blocked...@foo.invalid> wrote:

    On 2/11/2024 10:47 AM, Wanderer wrote:
    On Sun, 11 Feb 2024 06:43:31 GMT, Jan Panteltje wrote:

    It is cool coding in asm without using external libraries.
    I can do anything I like in KILOBYTES:

    Back in the 20th century, I knew how to program in C. I
    knew what the assembly code would look like after I compiled it.

    A minor point; you THOUGHT you knew what the ASM would look like...
    you knew what the processor should be *doing*.

    Newer compilers are often considerably smarter than the
    programmers using them. They will rearrange code (where
    dependencies allow it) to avoid pipeline stalls. Or,
    realign structures to avoid misaligned memory accesses.
    Or even eliminate calls to functions that it can inline
    more efficiently.
    Or it may skip doing things that it thinks are unnecessary. As FPGA
    compilers will do.

    One trick is to do stuff so complex that the compiler optimizer gives
    up. Or use terms that it can't know at compile time.

    if you need that you are doing something wrong..

    I'm doing stuff that works.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Don Y@21:1/5 to RichD on Sat Feb 17 17:29:08 2024
    On 2/17/2024 1:18 PM, RichD wrote:
    On February 11, Don Y wrote:
    Newer compilers are often considerably smarter than the
    programmers using them. They will rearrange code (where
    dependencies allow it) to avoid pipeline stalls. Or,
    realign structures to avoid misaligned memory accesses.
    Or even eliminate calls to functions that it can inline
    more efficiently.

    Indeed. This was the motivation for, and result of, the original RISC
    architecture. Revolutionary in its day, as processors were
    becoming astonishingly complex, with trig functions wired into
    a single machine op code!

    Compilers can also have broader knowledge of YOUR code than
    you do -- or WANT to! So, can deduce relationships that
    may not have been obvious to you when writing the code.

    Why EVER have "Code not reached"? Did you make some bad
    assumptions about the data and control that led you to THINK
    that code would be executed?

    And, simplify things that you would foolishly try to
    "hand optimize". E.g., writing:
    foo <<= 1;
    can make you THINK you are being clever. But, it causes a
    cognitive hiccup in the reader's mind if the intent is:
    foo /= 2;
    or, more explicitly:
    foo = foo / 2;
    A good compiler will generate the same code for each
    case (assuming integer data types) so why force the
    reader to perform that extra step in recognition?

    And, eliminate some common syntax "errors" that aren't
    actually prohibited by the language:
    x = foo & bar;
    (are you sure you don't mean "foo && bar"?)
    x = y = z;
    (are you sure you don't mean "(y == z)"?)

    Your goal should always be to make it MORE work for you to do something
    wrong than to do it *right*!

    And still today, those ideas are misunderstood, as many engineers
    say "yeah, RISC is faster because they have fewer instructions,
    it's obvious." (also cheaper, in cost/benefit)

    Not *fewer* instructions but, rather, SIMPLER instructions.
    Less silicon.

    Given these considerations, does anybody write assembly code for
    modern RISC processors?

    ASM or machine code? The two differ by the presence of a compiler
    in the former that is absent in the latter.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Don Y@21:1/5 to albert@spenarnc.xs4all.nl on Sat Feb 17 17:16:14 2024
    On 2/17/2024 3:49 AM, albert@spenarnc.xs4all.nl wrote:
    In article <uqivkh$2ontb$2@dont-email.me>,
    Don Y <blockedofcourse@foo.invalid> wrote:
    On 2/14/2024 6:05 AM, albert@spenarnc.xs4all.nl wrote:
    Nobody, huh? Smith does. He has written a compiler in hex code using only
    a hex-to-bin converter.
    https://dacvs.neocities.org/SF/
    The takeaway is, it is easier than you expect.

    One writes code to be *read*. Just because you CAN do something
    doesn't mean you SHOULD do that something. People spend inane
    amounts of time arranging dominos... just to knock them over
    (what's the point in that?)

    This project is meant to be read. You can't be serious suggesting
    that this is a tool to be used.

    Why write code that isn't going to be used?

    If you spend the time looking at the code, you'd discover that it
    is quite educational, and it makes you wonder where the software bloat
    comes from.

    Why does your PHONE have a clock in it? AND a calendar?
    Don't you remember today's date? Don't you wear a wristwatch?
    Why does your PC have these data, too? Don't you have a
    *phone* (er, wristwatch and a memory for dates)?

    Which of these has the CORRECT information? How do you know?

    [I have an "atomic clock" that is supposed to keep correct
    time/date. Except, twice a year, it diddles the time for
    the DST start and end. *But*, we don't observe DST! So, while
    all of the DUMB clocks in the house correctly track the REAL
    time all year round, this one has to be "fixed", twice
    a year, to get it to IGNORE the DST changes! So, twice
    a year I look at the clock and wonder why *my* notion of
    the current time differs from its -- and then tell
    it to move me to the next time zone, east or west, as
    appropriate]

    Why do I *need* an icon IN my word processor to invoke the
    spell-checker? Thesaurus? Grammar check? Reading complexity
    assessment? How does having them accessible *in* the word
    processor help me if I am writing an email -- in an entirely
    different "text editor" that mimics much of the functionality
    of the word processor?

    Does each possible application that allows for text entry
    merit the availability of these tools? What if I wanted
    to send an SMS?

    Are you just LAZY and can't invoke each of these as STAND-ALONE
    tools, as needed?

    So, EVERY app gets burdened with new implementations of
    tools that should be separate entities. To economize on
    resources AND ensure more consistent results!

    [Windows lists files in "user-friendly order" -- which differs
    from ALPHABETICAL order! Other applications opt for a more
    traditional LRAlpha. Comparing file lists between the two
    just ADDS to confusion (loss of productivity).]

    Here is a trivial, ubiquitous, algorithm written in machine code:

    DD217A00DD7E01FE323805DD3401185FDD360100DD7E02FE3C3805DD3402184F
    DD360200DD7E03FE3C3805DD3403183FDD360300DD7E04FE183805DD3404182F
    DD360400216F000600DD4E0609DD7E05BE3805DD34051817DD360500DD7E06FE
    0C3805DD34061807DD360600DD34071F1F1C1F1E1F1E1F1F1E1F1E1F

    No comments -- machines don't read comments! And, machines don't
    need the code to be formatted to show instruction breaks -- it's
    just a large array of bytes!

    Of course, you would have to sort out where the code resides in
    memory as all of the transfers of control *could* use absolute
    addresses (I deliberately chose a processor that offers relative
    addressing as a native addressing mode AND USED THAT to make the
    binary less obscure).

    Likewise, all absolute DATA addressing would require knowing WHAT
    was being referenced, esp if the data are not as interrelated as
    in this example.

    Machine code *teaches* nothing -- beyond as a historical curio.


    A bit easier to understand in ASM:

    LD IX,TIMING

    LD A,(IX+JIFFY)
    CP A,JPS
    JR C,NXTSEC

    INC (IX+JIFFY)
    JR DONE

    NXTSEC:
    LD (IX+JIFFY),0

    LD A,(IX+SECOND)
    CP A,SPM
    JR C,NXTMIN

    INC (IX+SECOND)
    JR DONE

    NXTMIN:
    LD (IX+SECOND),0

    LD A,(IX+MINUTE)
    CP A,MPH
    JR C,NXTHR

    INC (IX+MINUTE)
    JR DONE

    NXTHR:
    LD (IX+MINUTE),0

    LD A,(IX+HOUR)
    CP A,HPD
    JR C,NXTDAY

    INC (IX+HOUR)
    JR DONE

    NXTDAY:
    LD (IX+HOUR),0

    LD HL,DAYMON
    LD B,0
    LD C,(IX+MONTH)
    ADD HL,BC

    LD A,(IX+DAY)
    CP A,(HL)
    JR C,NXTMON

    INC (IX+DAY)
    JR DONE

    NXTMON:
    LD (IX+DAY),0

    LD A,(IX+MONTH)
    CP A,MPY
    JR C,NXTYR

    INC (IX+MONTH)
    JR DONE

    NXTYR:
    LD (IX+MONTH),0

    INC (IX+YEAR)

    DONE:

    six-significant-character labels (external linkage) not being unusual,
    historically. (In practice, one would use a pointer to walk through the
    list of counters instead of accessing them directly.)

    In this form, the structure and intent are clearer. And, it uses exactly
    the same run-time resources as the machine language variant, before!


    This is how people THINK about the algorithm (assuming 0-based ordinals):

    ASSERT( ( (jiffy >= 0) && (jiffy <= JIFFIES_PER_SECOND) ) )
    if ( !(jiffy >= JIFFIES_PER_SECOND-1) ) {
        jiffy++;
    } else {
        jiffy = 0;
        ASSERT( ( (second >= 0) && (second <= SECONDS_PER_MINUTE) ) )
        if ( !(second >= SECONDS_PER_MINUTE-1) ) {
            second++;
        } else {
            second = 0;
            ASSERT( ( (minute >= 0) && (minute <= MINUTES_PER_HOUR) ) )
            if ( !(minute >= MINUTES_PER_HOUR-1) ) {
                minute++;
            } else {
                minute = 0;
                ASSERT( ( (hour >= 0) && (hour <= HOURS_PER_DAY) ) )
                if ( !(hour >= HOURS_PER_DAY-1) ) {
                    hour++;
                } else {
                    hour = 0;
                    ASSERT( ( (day >= 0) && (day <= DAYS_PER_MONTH[month]) ) )
                    if ( !(day >= DAYS_PER_MONTH[month]-1) ) {
                        day++;
                    } else {
                        if ( (0 == year % 4)
                          && ((0 != year % 100) || (0 == year % 400)) ) {
                            day++;
                        } else {
                            day = 0;
                            ASSERT( ( (month >= 0)
                                   && (month <= MONTHS_PER_YEAR) ) )
                            if ( !(month >= MONTHS_PER_YEAR-1) ) {
                                month++;
                            } else {
                                month = 0;
                                year++;
                            }
                        }
                    }
                }
            }
        }
    }

    Use of *longer* symbolic names makes the app easier to understand -- and
    recognize! Even in the absence of explanatory comments. (Joe Average likely
    doesn't understand that 1900 and 2100 would NOT be leap years, as the only
    "century mark" in their lifetime -- 2000 -- *was* one.)

    Note that the inclusion of the leap year test is intuitive -- yet absent in
    the ASM and machine language versions. It would *likely* occur to Joe
    Average if asked to explain how "time" is tracked.

    And, I can add contractual declarations in a HLL that improve the quality
    of the code and its readability, AND detect *some* types of data corruption
    at run time (as most folks code in a single process container, there is no
    protection from *themselves* or from other threads co-executing). Note
    that people *do* think of these constraints, even if on a purely intuitive
    level. Expressing them explicitly makes this more formal (and allows the
    compiler, runtime and future developers to be aware of these "intuitions").

    But only a newbie would code it like that! THIS form is fraught with the
    potential for syntax and logic errors. Quick: which closing brace goes
    with which block -- if the indentation (which may be incorrect, as the
    compiler doesn't enforce indentation rules!) wasn't there to "help"?
    (Have *I* botched it??)

    A smarter way of expressing the algorithm:

    {
        while (FOREVER) {
            ASSERT( ( (jiffy >= 0) && (jiffy <= JIFFIES_PER_SECOND) ) )
            if ( !(jiffy >= JIFFIES_PER_SECOND-1) ) {
                jiffy++;
                break;
            }

            jiffy = 0;
            ASSERT( ( (second >= 0) && (second <= SECONDS_PER_MINUTE) ) )
            if ( !(second >= SECONDS_PER_MINUTE-1) ) {
                second++;
                break;
            }

            second = 0;
            ASSERT( ( (minute >= 0) && (minute <= MINUTES_PER_HOUR) ) )
            if ( !(minute >= MINUTES_PER_HOUR-1) ) {
                minute++;
                break;
            }

            minute = 0;
            ASSERT( ( (hour >= 0) && (hour <= HOURS_PER_DAY) ) )
            if ( !(hour >= HOURS_PER_DAY-1) ) {
                hour++;
                break;
            }

            hour = 0;
            ASSERT( ( (day >= 0) && (day <= DAYS_PER_MONTH[month]) ) )
            if ( !(day >= DAYS_PER_MONTH[month]-1) ) {
                day++;
                break;
            }

            if ( (0 == year % 4) && ((0 != year % 100) || (0 == year % 400)) ) {
                day++;
                break;
            }

            day = 0;
            ASSERT( ( (month >= 0) && (month <= MONTHS_PER_YEAR) ) )
            if ( !(month >= MONTHS_PER_YEAR-1) ) {
                month++;
                break;
            }

            month = 0;
            ASSERT( (year >= 0) )
            year++;
            break;
        }

        ASSERT( ( (jiffy >= 0) && (jiffy <= JIFFIES_PER_SECOND) ) )
        ASSERT( ( (second >= 0) && (second <= SECONDS_PER_MINUTE) ) )
        ASSERT( ( (minute >= 0) && (minute <= MINUTES_PER_HOUR) ) )
        ASSERT( ( (hour >= 0) && (hour <= HOURS_PER_DAY) ) )
        ASSERT( ( (day >= 0) && (day <= DAYS_PER_MONTH[month]) ) )
        ASSERT( ( (month >= 0) && (month <= MONTHS_PER_YEAR) ) )
        ASSERT( (year >= 0) )

        return;
    }

    This leads to a more structured way of recognizing and exploiting
    the similarities in the code:

    index = 0;
    while (0 != divisors[index]) {
        if (counters[index] < divisors[index]) {
            counters[index]++;
            break;
        } else {
            counters[index] = 0;
            index++;
        }
    }


    const int
    divisors[] = {
        JIFFIES_PER_SECOND,
        SECONDS_PER_MINUTE,
        MINUTES_PER_HOUR,
        HOURS_PER_DAY,
        DAYS_PER_MONTH,
        MONTHS_PER_YEAR,
        EOF
    };

    int
    counters[] = {
        0,    // jiffy
        0,    // second
        0,    // minute
        0,    // hour
        0,    // day
        0,    // month
        0,    // year
    };

    STATIC_ASSERT( ( sizeof(counters)/sizeof(counters[0]) )
                == ( sizeof(divisors)/sizeof(divisors[0]) ) );

    but, this requires a "fixup" routine to address the leap-year handling
    (which happens at 00:00 on 29 Feb, when we reset the date to 1 Mar in
    NON-leap-years). It also makes it harder to add the rest of the
    contractual constraints!

    Should we use signed/unsigned chars instead of ints to save a few bytes
    of DATA *and* TEXT? How hard would it be to make that change, here -- vs.
    in a machine language implementation?

    Which of all of these implementations will be easiest to *accurately* modify
    to handle Daylight Saving Time? Not just knowing *how* to apply the time
    shift, but *when* (date AND time-of-day) to apply it? Or, to generalize
    timezone handling (Newfoundland, anyone?)

    How does a person set the time, in Boston, to 2024 Mar 10 02:45? And,
    what point in time does 2024 Nov 3 01:45 reference? What time is it in Chicago, then?? Or, the date to 5 October 1582?

    Of course, all of this data would have to be wrapped in a mutex to
    ensure some other actor doesn't see partial updates (like seeing Mar 1
    00:00 while the time was being updated to Apr 1 at Mar 31 24:00!)
    Or, better, provide an accessor instead of exposing the raw data!


    An OOPS approach would make each of these "counters" an *object* and then
    just increment one, see if it signals a wrap-around and, if so, increment
    the next more significant one, etc. The "day" counter would have friends
    in the month and year counters so it could decide whether 28 becomes 29,
    or not.

    And, all of those annoying details would be hidden from the person
    READING the code; jiffies become seconds, seconds become minutes,
    minutes become hours, etc. Do we care if the hours are presented
    as [0..23]? Or, [1..24]? Or, [1..12][AP]?


    Adding textual names would bury those *in* their respective definitions
    instead of further polluting the namespace:

    month_names[] = {
    "January",
    "February",
    ...
    "November",
    "December",
    };
    STATIC_ASSERT( ( sizeof(month_names)/sizeof(month_names[0]) )
        == MONTHS_PER_YEAR )

    And, what if you had to I18N these?

    month_names_french[] = {
    "Janvier",
    "Février",
    ...
    "Novembre",
    "Décembre",
    };
    STATIC_ASSERT( ( sizeof(month_names_french)/sizeof(month_names_french[0]) )
        == MONTHS_PER_YEAR )

    month_names_italian[] = {
    "Gennaio",
    "Febbraio",
    ...
    "Novembre",
    "Dicembre",
    };
    STATIC_ASSERT( ( sizeof(month_names_italian)/sizeof(month_names_italian[0]) )
        == MONTHS_PER_YEAR )

    And, then likely have to add abbreviations for each:

    month_abbrevs_french[] = {
    "Janv.",
    "Février",
    ...
    "Nov.",
    "Déc.",
    };
    STATIC_ASSERT( ( sizeof(month_abbrevs_french)/sizeof(month_abbrevs_french[0]) )
        == MONTHS_PER_YEAR )

    [Oh, my! They don't all fit in a 3+1 character representation!! So much
    for fixed-width date fields... would you have noticed that and coded to
    accommodate it? Or, would you blindly copy the abbreviation into a char[3]
    because that's how you *think* of month abbreviations???]


    You'd likely add a tool to determine day-of-week from date. And,
    I18N that, as well. Along with their abbreviations.

    And, as this is utility that many apps would likely avail themselves
    of, you'd probably wrap it in a library -- that other apps could
    link into their own executables! But, gee, what if I don't really
    want/need support for Tamil? Or, Marathi? Why does my app have
    to bear that cost?


    How much of this is bloat? Feeping Creaturism? Essential functionality?
    MARKETABLE functionality? E.g., I like being able to let my PC tell
    me what day of the week it is, as I don't have a normal exogenous
    synchronizer in my lifestyle! Should the PC assume that functionality?
    What's wrong with a WRISTWATCH? Why does a cell phone provide that?
    Surely, I could be REQUIRED to use something other than the PC (or
    cell phone) for timekeeping, right (in the interest of minimizing bloat!)?


    The *justifiable* reasons for "code growth" (which differs from "bloat")
    are to add functionality, structure and readability to the codebase.
    These improve the quality of the code as well as its accuracy, reliability
    and maintainability.


    My first commercial product had 12KB of code and 256 bytes of RAM. And,
    it took 3 engineers to develop. Entirely in ASM. Largely because there
    were insufficient resources (memory, MIPS, real-time and tools) to solve
    the problem as tasked. Yet this was a huge improvement on the i4004 design
    that preceded it -- both from the development point of view and the UX!

    I suspect I could recreate the entire codebase in a HLL in a few weekends.
    I wouldn't have to write -- and PROFILE (to verify real-time performance!)
    and debug -- a floating point package; I could just write expressions in
    infix notation. I wouldn't have to consider which values were represented
    in binary vs. BCD.

    I likely wouldn't have to share *bits* in a byte as flags for the code
    to govern its execution -- just write what you *want* the code to do and
    let the compiler sort out the most efficient way of doing so! And, I
    wouldn't have to read the code of my fellow developers to see if THEY
    had decided to use a previously unused bit for THEIR routines; I'd declare
    MY data PRIVATE to my modules and count on the compiler to enforce that
    barrier -- yet still allow each of them to reuse names that I had
    "already" used -- privately!! (i, j, x, y, index, count, etc.)

    I could stub the code to emit key values for me to verify its operation
    "on the bench" as well as in the actual implementation. I could sprinkle invariants through the code to catch *my* errors.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Cursitor Doom@21:1/5 to blockedofcourse@foo.invalid on Sun Feb 18 10:13:31 2024
    On Sat, 17 Feb 2024 17:16:14 -0700, Don Y
    <blockedofcourse@foo.invalid> wrote:

    On 2/17/2024 3:49 AM, albert@spenarnc.xs4all.nl wrote:
    In article <uqivkh$2ontb$2@dont-email.me>,
    Don Y <blockedofcourse@foo.invalid> wrote:
    On 2/14/2024 6:05 AM, albert@spenarnc.xs4all.nl wrote:
    Nobody, huh? Smith does. He has written a compiler in hex code using only
    a hex-to-bin converter.
    https://dacvs.neocities.org/SF/
    The takeaway is, it is easier than you expect.

    One writes code to be *read*. Just because you CAN do something
    doesn't mean you SHOULD do that something. People spend inane
    amounts of time arranging dominos... just to knock them over
    (what's the point in that?)

    This project is meant to be read. You can't be serious suggesting
    that this is a tool to be used.

    Why write code that isn't going to be used?

    One perfectly good reason is to make life hard for the
    reverse-engineers. Build in a pile of redundant code, only release
    executables, never the source, and thereby improve your software
    sales.

    If you spend the time looking at the code, you'd discover that it
    is quite educational, and it makes you wonder where the software bloat
    comes from.

    Why does your PHONE have a clock in it? AND a calendar?
    Don't you remember today's date? Don't you wear a wristwatch?
    Why does your PC have these data, too? Don't you have a
    *phone* (er, wristwatch and a memory for dates)?

    Which of these has the CORRECT information? How do you know?

    [I have an "atomic clock" that is supposed to keep correct
    time/date. Except, twice a year, it diddles the time for
    the DST start and end. *But*, we don't observe DST! So, while
    all of the DUMB clocks in the house correctly track the REAL
    time all year round, this one has to be "fixed", twice
    a year, to get it to IGNORE the DST changes! So, twice
    a year I look at the clock and wonder why *my* notion of
    the current time differs from its -- and then tell
    it to move me to the next time zone, east or west, as
    appropriate]

    Atomic clocks don't do that. Sounds like you have one of those
    radio-controlled jobs that works off a time signal.

    Why do I *need* an icon IN my word processor to invoke the
    spell-checker? Thesaurus? Grammar check? Reading complexity
    assessment? How does having them accessible *in* the word
    processor help me if I am writing an email -- in an entirely
    different "text editor" that mimics much of the functionality
    of the word processor?

    Does each possible application that allows for text entry
    merit the availability of these tools? What if I wanted
    to send an SMS?

    Are you just LAZY and can't invoke each of these as STAND-ALONE
    tools, as needed?

    So, EVERY app gets burdened with new implementations of
    tools that should be separate entities. To economize on
    resources AND ensure for more consistent results!

    [Windows lists files in "user-friendly order" -- which differs
    from ALPHABETICAL order! Other applications opt for a more
    traditional LRAlpha. Comparing file lists between the two
    just ADDS to confusion (loss of productivity).]

    Here is a trivial, ubiquitous, algorithm written in machine code:

    DD217A00DD7E01FE323805DD3401185FDD360100DD7E02FE3C3805DD3402184F
    DD360200DD7E03FE3C3805DD3403183FDD360300DD7E04FE183805DD3404182F
    DD360400216F000600DD4E0609DD7E05BE3805DD34051817DD360500DD7E06FE
    0C3805DD34061807DD360600DD34071F1F1C1F1E1F1E1F1F1E1F1E1F

    That's hex, not MC.

    No comments -- machines don't read comments! And, machines don't
    need the code to be formatted to show instruction breaks -- it's
    just a large array of bytes!

    .... and this is why obfuscated language competitions are such fun.
    Except obfuscated Perl, of course - as there's no point! :-)

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Don Y@21:1/5 to Lasse Langwadt Christensen on Sun Feb 18 05:00:57 2024
    On 2/18/2024 3:26 AM, Lasse Langwadt Christensen wrote:
    søndag den 18. februar 2024 kl. 05.11.25 UTC+1 skrev Anthony William Sloman:
    On Sunday, February 18, 2024 at 11:29:27 AM UTC+11, Don Y wrote:
    On 2/17/2024 1:18 PM, RichD wrote:
    On February 11, Don Y wrote:
    <snip>
    Given these considerations, does anybody write assembly code for
    modern RISC processors?

    ASM or machine code? The two differ by the presence of a compiler
    in the former that is absent in the latter.
    Technically speaking, the software that converts assembly code into machine code is an assembler, not a compiler.

    There's a one-to-one relationship between assembler mnemonics and numbers that constitute the machine code.

    usually but not always

    Optimizing assemblers have been around since the 70's. The most common
    optimization is for branch encoding on processors that support different
    branch styles (e.g., long, absolute, relative, etc.). Most often, the
    "best" (shortest/fastest) opcode has limitations on how *far* it can
    reach. You don't want to have to PICK a particular opcode
    based on the layout of the code, *now*... and, after inserting/moving
    a few instructions, have to go back to choose a *different* opcode
    because the target is now beyond the reach of the opcode you
    previously chose!

    The UNIX assembler has done this... forever. You can find tools that
    will do this on VAX, 68K, 6809, Z80, SPARC, etc.

    It's an NP-complete problem, so expecting an "optimal" solution isn't
    practical. However, the alternative -- MANUALLY trying to determine the
    distance between branch opcode and targeted location -- is silly to
    expect programmers to do with any degree of success (can you remember
    how many bytes *each* instruction between "here" and "there" occupies?
    Do you WANT to????)

    [If the target of a branch lies in another module, then the linkage editor
    has to sort out the required displacement.]

    Other tools are smart enough to know that, for example, a "register load
    of #0" is equivalent to XORing the register with itself, subtracting it
    from itself, etc. Or, the instruction set may have an alternate form for
    "small" constants versus "larger" constants. The costs (time and space)
    of each approach vary, so deciding how you want the code to be optimized
    is important (if at all).

    Is subtracting 1 faster/smaller than decrementing a register?

    Is shifting a register left cheaper than adding a register to itself?

    Considering that most programmers (in ASM) use symbols instead of absolute values, it's not uncommon for you to see code like:
    LD A,SOME_VALUE
    and, discover (at compile time), that SOME_VALUE is actually 0. Or,
    some very small (e.g., 4 bit) integer. So, taking the instruction as
    written, literally, is more costly than it needs to be! Should the
    developer HAVE to know all of this? (Why?)
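    The kind of choice the tool can make from the symbol's actual value can be sketched as follows (the forms, value ranges, and costs here are hypothetical, not any particular CPU's):

```c
#include <assert.h>

/* Hypothetical encoding choice for "load a constant into the accumulator".
 * Which form wins depends on the value the symbol turns out to have --
 * something the assembler knows at assembly time and the programmer
 * shouldn't have to track by hand. */
typedef enum {
    XOR_A,        /* 1 byte, clears the accumulator: XOR A,A       */
    LD_A_SHORT,   /* 1 byte, hypothetical short form, values 1-15  */
    LD_A_IMM      /* 2 bytes: LD A,#imm8                           */
} LoadForm;

LoadForm pick_load(int value) {
    if (value == 0)
        return XOR_A;
    if (value > 0 && value < 16)
        return LD_A_SHORT;
    return LD_A_IMM;
}
```

    If SOME_VALUE later changes from 0 to 200, the next assembly silently picks the two-byte form -- no source edits required.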

    Should the developer have to know how to keep the pipeline from stalling?
    Why not let the assembler reorder opcodes to ensure it doesn't?!

    The assembler has to guarantee that the same *functionality* is provided
    as defined in the ASM source; if it can do this more effectively than
    the nominal instruction sequence presented by the programmer, more power
    to it! (If you don't want it to dick with your code, then disable the optimizations!)

    Consider accessing the Nth element (where N is a variable) in an array
    of structures.

    STRUCT_SIZE equ <something>

    LD A,N
    LD B,#STRUCT_SIZE
    MUL
    LDA B,X (assumes product < 256)

    Imagine if STRUCT_SIZE was 2. Or 4. Or 16. Or *1*!

    Or, 3, 5, 9, etc.

    This could easily be reduced in time and/or space! The developer
    wouldn't want to have to go through and examine every symbolic
    reference to see how the associated code MIGHT be tweaked given
    the *current* numeric bindings (i.e., the STRUCT_SIZE may change
    as the code evolves so this could potentially need to be revisited
    each time the code is compiled!) -- the *tool* should do this,
    instead!
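    The MUL itself is a classic candidate for strength reduction. A sketch of replacing multiply-by-constant with one shift-and-add per set bit of the constant (purely illustrative; a real tool would weigh the sequence's cost against the MUL's):

```c
#include <assert.h>

/* Multiply by a compile-time constant using only shifts and adds:
 * one "shift and add" per set bit of the constant.  STRUCT_SIZE == 2,
 * 4, or 16 becomes a single shift; 3, 5, or 9 becomes a shift plus an
 * add; 1 disappears entirely. */
unsigned mul_by_const(unsigned x, unsigned k) {
    unsigned r = 0;
    for (int bit = 0; k != 0; bit++, k >>= 1)
        if (k & 1)
            r += x << bit;
    return r;
}
```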

    Nowadays, we rely on HLLs to do the optimization *before* generating
    their ASM "output". But, who (what!) optimizes ASM itself?

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Don Y@21:1/5 to RichD on Sun Feb 18 13:57:54 2024
    On 2/18/2024 12:42 PM, RichD wrote:
    On February 18, skrev Lasse Langwadt Christensen:
    Given these considerations, does anybody write assembly code for
    modern RISC processors?

    ASM or machine code? The two differ by the presence of a compiler
    in the former that is absent in the latter.

    Technically speaking, the software that converts assembly code into machine code is an assembler,
    not a compiler.
    There's a one-to-one relationship between assembler mnemonics and numbers that constitute the machine code.

    usually but not always

    ?
    Exceptions?

    <https://ftp.gnu.org/old-gnu/Manuals/gas-2.9.1/html_chapter/as_23.html#SEC258> this behavior is inherited from the early days of as(1).

    <http://www.bitsavers.org/pdf/sun/sunos/4.1/800-3807-10A_Sun-3_Assembly_Language_Reference_Manual_199003.pdf>
    section 6.2
    (30+ years ago)

    <https://www.roug.org/retrocomputing/os/flex/ASM09-6809-assembler.pdf>
    and SWTPC wasn't a software trailblazer (40+ years ago):

    Optimize Assembly. The "F" option will cause the
    assembler to suppress any optimization of object code.
    Foreward references will be assembled using the least
    restrictive addressing modes. This option will force the
    assembler to complete in two passes, but object code may be
    considerably larger than required. This option is especially
    useful while debugging a program which will later be
    optimized. Note that the "R" option takes priority over this

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From albert@spenarnc.xs4all.nl@21:1/5 to bill.sloman@ieee.org on Thu Feb 29 14:37:18 2024
    In article <815fc7fb-5b54-4493-9651-ec4a5e96d6f5n@googlegroups.com>,
    Anthony William Sloman <bill.sloman@ieee.org> wrote:
    On Sunday, February 18, 2024 at 11:29:27 AM UTC+11, Don Y wrote:
    On 2/17/2024 1:18 PM, RichD wrote:
    On February 11, Don Y wrote:

    <snip>

    Given these considerations, does anybody write assembly code for
    modern RISC processors?

    ASM or machine code? The two differ by the presence of a compiler
    in the former that is absent in the latter.

    Technically speaking, the software that converts assembly code into
    machine code is an assembler, not a compiler.

    There's a one-to-one relationship between assembler mnemonics and numbers that constitute the machine code. Compilers can generate strings of
    machine code from a higher-level language command, so it is a real distinction. I wasn't conscious of this when I first learned Fortran
    coding, but I was taught how the Fortran compiler expanded single lines of code into strings of assembler when I did a year's course on "Theory of Computation" the following year (1966).

    One-to-one relationship between assembler mnemonics and numbers?
    This is a myth.
    You seem oblivious that Intel's
    MOV AX,BX
    chooses between two instructions at the discretion of the assembler at hand? (not to speak of shorter forms that involve AX only.)

    I've made a reverse engineering assembler that allows disassembled code
    to assemble to a copy of the original, obliged to differentiate between
    the two.
    See https://github.com/albertvanderhorst/ciasdis

    The two forms are
    MOV, X| T| BX'| R| AX|
    MOV, X| F| AX'| R| BX|

    Move primary register AX to secondary register BX using default size (X).
    Move primary register BX from secondary register AX

    It is hard to analyse viruses without that finesse.
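    The two encodings are real 8086 byte sequences: MOV AX,BX assembles as either 8B C3 (opcode 0x8B, MOV r16,r/m16) or 89 D8 (opcode 0x89, MOV r/m16,r16). A minimal decoder for the register-to-register case (mod bits == 11) shows both byte pairs name the same transfer:

```c
#include <assert.h>

/* 8086 register numbers: AX=0, CX=1, DX=2, BX=3, ... */
typedef struct { int dst, src; } Move;

/* Decode a 2-byte register-to-register MOV (assumes mod bits == 11). */
Move decode_mov(unsigned char opcode, unsigned char modrm) {
    int reg = (modrm >> 3) & 7;
    int rm  = modrm & 7;
    Move m;
    if (opcode == 0x8B) {        /* MOV r16, r/m16: reg field is destination */
        m.dst = reg; m.src = rm;
    } else {                     /* 0x89, MOV r/m16, r16: rm field is destination */
        m.dst = rm;  m.src = reg;
    }
    return m;
}
```

    A disassembler that must round-trip back to the original bytes has to record which of the two forms it saw -- hence the T/F flag in the notation above.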


    --
    Bill Sloman. Sydney

    Groetjes Albert
    --
    Don't praise the day before the evening. One swallow doesn't make spring.
    You must not say "hey" before you have crossed the bridge. Don't sell the
    hide of the bear until you shot it. Better one bird in the hand than ten in
    the air. First gain is a cat purring. - the Wise from Antrim -

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Don Y@21:1/5 to albert@spenarnc.xs4all.nl on Thu Feb 29 13:33:33 2024
    On 2/29/2024 6:37 AM, albert@spenarnc.xs4all.nl wrote:
    One-to-one relationship between assembler mnemonics and numbers?
    This is a myth.

    It has never been true. An assembler is free to generate whatever code satisfies the desire stated by the programmer.

    Just like:
    X = X - X
    X = 0 * (whatever)
    X = 0
    all achieve the same results, so can:
    LD A,0
    SUB A,A
    XOR A
    in "assembly language".

    The author/designer of the "assembler" states the rules that *he* will
    apply and observe in his selection of MACHINE LANGUAGE instructions.
    If the programmer wants to force a particular bit of generated code,
    then the assembler writer provides a mechanism for doing this.
    Most programmers wouldn't care as long as the code DID what they
    wanted.

    Historically, the most common "optimization" was branch reduction,
    because it's foolish to require an ASM programmer to KNOW
    "how far" a destination is from the current instruction: he would
    have to know the space required by each instruction between there
    and here, AND the "actual" PC value at the time of the branch
    (it likely points to the NEXT instruction, not the current one,
    and branch ranges are usually relative to that).

    [I've posted several examples of this dating back to the late 70's
    and early versions of as(1). So, it's not like tool makers
    "suddenly" realized that optimizations could be applied to ASM
    code. OTOH, there were lots of schlock assemblers written that
    were little more than lexical/syntactic analyzers and easily
    confused:
    LD HL,(256*HI_BYTE+LOW_BYTE)
    was an obvious flaw in the parse for some Z80 assemblers -- should
    it be treated as:
    VALUE EQU (256*HI_BYTE+LOW_BYTE)
    LD HL,VALUE
    or:
    LD HL,(VALUE)
    the two being semantically very different!]
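    On the Z80 the two readings assemble to entirely different instructions: LD HL,nn (load immediate) is opcode 0x21, while LD HL,(nn) (load from memory) is opcode 0x2A. A sketch of the two emissions a confused parser would silently swap:

```c
#include <assert.h>

/* Z80: LD HL,nn = 0x21 nn nn (immediate); LD HL,(nn) = 0x2A nn nn
 * (indirect).  An assembler that "simplifies away" parentheses around
 * an operand expression silently turns one into the other. */
void emit_ld_hl(int indirect, unsigned addr, unsigned char out[3]) {
    out[0] = indirect ? 0x2A : 0x21;
    out[1] = addr & 0xFF;            /* operands are little-endian */
    out[2] = (addr >> 8) & 0xFF;
}
```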

    Also, programmers want (and are encouraged) to code symbolically
    and avoid "magic constants". So, the previous ASM example could
    have been:
    LD A,OFFSET
    which will be arithmetically applied to (perhaps) an address
    computation AT RUNTIME. If the assembler knows OFFSET is "0"
    (as a manifest constant), then it can replace the instruction
    with one that is more economical (in space AND time!) and,
    further, recognize stanzas of code that it can replace with
    simpler code sequences (e.g., DIRECTLY use the address that
    the code is trying to offset and elide the instructions that
    would have done so).

    There are also global (most commonly peephole) optimizations
    that can be applied by the linkage editor that are outside
    the scope of the assembler, itself.

    You seem oblivious that Intel's
    MOV AX,BX
    chooses between two instructions at the discretion of the assembler at hand? (not to speak of shorter forms that involve AX only.)

    I've made a reverse engineering assembler that allows disassembled code
    to assemble to a copy of the original, obliged to differentiate between
    the two.
    See https://github.com/albertvanderhorst/ciasdis

    The two forms are
    MOV, X| T| BX'| R| AX|
    MOV, X| F| AX'| R| BX|

    Move primary register AX to secondary register BX using default size (X).
    Move primary register BX from secondary register AX.

    It is hard to analyse viruses without that finesse.

    Viruses deal with machine code. Their authors want to know EXACTLY what
    the processor will be executing. Most *programmers* just want the processor
    to "achieve a desired result".

    Tool makers want their users to find VALUE in their tools, not more opportunities to introduce bugs!

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From albert@spenarnc.xs4all.nl@21:1/5 to blockedofcourse@foo.invalid on Fri Mar 8 09:38:04 2024
    In article <urqpn1$pptq$2@dont-email.me>,
    Don Y <blockedofcourse@foo.invalid> wrote:
    On 2/29/2024 6:37 AM, albert@spenarnc.xs4all.nl wrote:
    One-to-one relationship between assembler mnemonics and numbers?
    This is a myth.

    It has never been true. An assembler is free to generate whatever code satisfies the desire stated by the programmer.

    That is a silly response to a description of an assembler that
    accomplishes a one-to-one correspondence between machine code and
    mnemonics. I don't care what other assemblers do or don't do.
    --
    Don't praise the day before the evening. One swallow doesn't make spring.
    You must not say "hey" before you have crossed the bridge. Don't sell the
    hide of the bear until you shot it. Better one bird in the hand than ten in
    the air. First gain is a cat purring. - the Wise from Antrim -

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Don Y@21:1/5 to albert@spenarnc.xs4all.nl on Fri Mar 8 03:01:57 2024
    On 3/8/2024 1:38 AM, albert@spenarnc.xs4all.nl wrote:
    In article <urqpn1$pptq$2@dont-email.me>,
    Don Y <blockedofcourse@foo.invalid> wrote:
    On 2/29/2024 6:37 AM, albert@spenarnc.xs4all.nl wrote:
    One-to-one relationship between assembler mnemonics and numbers?
    This is a myth.

    It has never been true. An assembler is free to generate whatever code
    satisfies the desire stated by the programmer.

    That is a silly response to a description of an assembler that
    accomplishes a one-to-one correspondence between machine code and
    mnemonics. I don't care what other assemblers do or don't do.

    The assumption that an assembler -- and, thus, a qualification for
    ALL assemblers -- generates a one-to-one correspondence between
    mnemonic and machine code is false.

    What term do you apply to tools that take "assembly language"
    mnemonics and DON'T generate one-to-one correspondences?
    Are these NOT "assemblers"?

    What term do you apply to tools that take "assembly language"
    mnemonics and DO generate one-to-one correspondences?
    Are these ALSO not assemblers?

    I.e., the one-to-one correspondence is an *imagined* requirement
    of "an assembler".

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Bill Sloman@21:1/5 to Don Y on Fri Mar 8 21:22:58 2024
    On 8/03/2024 9:01 pm, Don Y wrote:
    On 3/8/2024 1:38 AM, albert@spenarnc.xs4all.nl wrote:
    In article <urqpn1$pptq$2@dont-email.me>,
    Don Y  <blockedofcourse@foo.invalid> wrote:
    On 2/29/2024 6:37 AM, albert@spenarnc.xs4all.nl wrote:
    One-to-one relationship between assembler mnemonics and numbers?
    This is a myth.

    It has never been true.  An assembler is free to generate whatever code satisfies the desire stated by the programmer.

    That is a silly response to a description of an assembler that
    accomplishes a one-to-one correspondence between machine code and
    mnemonics. I don't care what other assemblers do or don't do.

    The assumption that an assembler -- and, thus, a qualification for
    ALL assemblers -- generates a one-to-one correspondence between
    mnemonic and machine code is false.

    What term do you apply to tools that take "assembly language"
    mnemonics and DON'T generate one-to-one correspondences?
    Are these NOT "assemblers"?

    What term do you apply to tools that take "assembly language"
    mnemonics and DO generate one-to-one correspondences?
    Are these ALSO not assemblers?

    I.e., the one-to-one correspondence is an *imagined* requirement
    of "an assembler".

    It's not an "imagined requirement" of an assembler. It's a description
    of the way the original assemblers worked. Compilers came later, and
    some of their features got grafted into assemblers that were still being
    used to convert low level code into machine code.

    The technology changed and the language changed to reflect that. No
    imagination involved - except in the design of the up-graded assemblers.

    --
    Bill Sloman, Sydney

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Martin Brown@21:1/5 to Don Y on Fri Mar 8 11:03:39 2024
    On 08/03/2024 10:01, Don Y wrote:
    On 3/8/2024 1:38 AM, albert@spenarnc.xs4all.nl wrote:
    In article <urqpn1$pptq$2@dont-email.me>,
    Don Y  <blockedofcourse@foo.invalid> wrote:
    On 2/29/2024 6:37 AM, albert@spenarnc.xs4all.nl wrote:
    One-to-one relationship between assembler mnemonics and numbers?
    This is a myth.

    It has never been true.  An assembler is free to generate whatever code satisfies the desire stated by the programmer.

    No. An assembler is required to generate the opcode that corresponds to
    the mnemonic that the programmer specified. A compiler or autocoder is
    free to use whichever way of say loading zero into the accumulator it
    likes. But if the programmer writes movi acc, #0 then in assembler then
    that is what they get (even if xor acc,acc is much faster).

    Once you start with macro assemblers and have smart macros then all bets
    are off but for intrinsic mnemonics they are a clean translation to hex.

    That is a silly response to a description of an assembler that
    accomplishes a one-to-one correspondence between machine code and
    mnemonics. I don't care what other assemblers do or don't do.

    The assumption that an assembler -- and, thus, a qualification for
    ALL assemblers -- generates a one-to-one correspondence between
    mnemonic and machine code is false.

    I can't offhand think of any assembler that didn't have a regular and reproducible mapping of its opcode mnemonics to hexadecimal numbers (and
    vice versa although going backwards tended to be a many to one mapping).

    What term do you apply to tools that take "assembly language"
    mnemonics and DON'T generate one-to-one correspondences?
    Are these NOT "assemblers"?

    MOV for instance often has several hexadecimal codes that it corresponds
    to because there are so many different sorts. But that is more of a
    problem for disassemblers than anything else.

    What term do you apply to tools that take "assembly language"
    mnemonics and DO generate one-to-one correspondences?
    Are these ALSO not assemblers?

    I.e., the one-to-one correspondence is an *imagined* requirement
    of "an assembler".

    The way I obtain unsupported genuine modern opcodes in some archaic
    inline assemblers that lack support or any other way of doing it is to
    hack together a jump forward a few bytes and a load immediate long hex constant. It is only worth doing this as an absolute last resort.

    --
    Martin Brown

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Don Y@21:1/5 to Martin Brown on Fri Mar 8 05:12:13 2024
    On 3/8/2024 4:03 AM, Martin Brown wrote:
    On 08/03/2024 10:01, Don Y wrote:
    On 3/8/2024 1:38 AM, albert@spenarnc.xs4all.nl wrote:
    In article <urqpn1$pptq$2@dont-email.me>,
    Don Y  <blockedofcourse@foo.invalid> wrote:
    On 2/29/2024 6:37 AM, albert@spenarnc.xs4all.nl wrote:
    One-to-one relationship between assembler mnemonics and numbers?
    This is a myth.

    It has never been true.  An assembler is free to generate whatever code satisfies the desire stated by the programmer.

    No. An assembler is required to generate the opcode that corresponds to the mnemonic that the programmer specified.

    An assembler interprets an ASSEMBLY LANGUAGE -- that the author of the assembler created -- to generate some (usually intermediate) form of
    the operation desired.

    What "code" does this generate:

    FOO EQU 27

    Or:
    ORG $-1

    Or:
    JMP FOOBAR

    Ans: you can't tell me because they all rely on the manner in which
    the PARTICULAR author defined the assembly language for HIS assembler.
    (What the hell is a FOOBAR? What is the machine language -- a series
    of bits -- for it?)

    If you've ever had to port code to a different toolchain, you
    quickly learn that ASM is NOT portable (gee, shouldn't it be?
    esp if there is a one-to-one correspondence between mnemonics and
    machine code???!)

    [We'll ignore the fact that the actual bits you *assume* will be
    generated will rely on the value of "FOOBAR" AFTER linkage editor
    has had a pass at it as, clearly, any changes to its value -- most
    typically a symbolic location reference (though it can be ANY
    value created with any mechanism the assembler author has provided...
    even a manifest constant!) -- would change the bits generated in
    the final binary]

    Pick a processor. Any processor. What's the assembly language for THAT processor? Ans: it depends on the assembler that you choose to use.
    A processor manufacturer can define mnemonics for "machine code"
    but the assembler author is under no obligation to use those.

    A compiler or autocoder is free to use
    whichever way of say loading zero into the accumulator it likes. But if the programmer writes movi acc, #0 then in assembler then that is what they get (even if xor acc,acc is much faster).

    No. You are relying on a particular assembler's (author) interpretation of those character sequences. I can write an assembler that disables interrupts when it encounters that sequence of characters (it would confuse YOU but would be a perfectly acceptable way of defining a means of causing interrupts to
    be disabled).

    Once you start with macro assemblers and have smart macros then all bets are off but for intrinsic mnemonics they are a clean translation to hex.

    Again, no.

    That is a silly response to a description of an assembler that
    accomplishes a one-to-one correspondence between machine code and
    mnemonics. I don't care what other assemblers do or don't do.

    The assumption that an assembler -- and, thus, a qualification for
    ALL assemblers -- generates a one-to-one correspondence between
    mnemonic and machine code is false.

    I can't offhand think of any assembler that didn't have a regular and reproducible mapping of its opcode mnemonics to hexadecimal numbers (and vice versa although going backwards tended to be a many to one mapping).

    Perhaps you've not had as much experience as I? Historical perspective
    is often enlightening.

    Here are some examples I previously posted:

    <https://ftp.gnu.org/old-gnu/Manuals/gas-2.9.1/html_chapter/as_23.html#SEC258> this behavior is inherited from the early days of as(1).

    <http://www.bitsavers.org/pdf/sun/sunos/4.1/800-3807-10A_Sun-3_Assembly_Language_Reference_Manual_199003.pdf>
    section 6.2
    (30+ years ago)

    "as supports extended branch instructions in addition to the
    instructions which explicitly specify the instruction length.
    These instruction's names are, in most cases, constructed from
    the word versions by replacing the b with j. For example,
    compare beq with jeq.

    "as's rules for handling branch instructions are as follows:
    - as automatically generates the corresponding short
    branch instruction if the operand of the extended branch
    instruction is a simple address in the text segment, and
    the offset to that address is sufficiently small. as
    generates the corresponding branch instruction if the
    offset is too large for a short branch, but small enough
    for a branch.
    - as implements an extended branch instruction when the operand
    either references an external address or is complex (see
    below) as follows:
    1. By a jmp or jsr (for jra or jbsr).
    2. If the target processor is the MC68010, by a conditional
    branch (with the sense of the condition inverted) around
    a jmp for the extended conditional branches.
    3. If the target processor is the MC68020, by using
    the corresponding long branch.

    Note that last bit: your "jump if carry" is converted to a "jump if NO carry" in some cases. Clearly an entirely different opcode! oops! so much for
    the "one-to-one mapping"...
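    The inversion scheme itself is simple to sketch. The opcodes and sizes below are invented, not the MC68010's; only the branch-around-a-jump structure follows what the manual describes:

```c
#include <assert.h>

/* Out-of-range conditional branch: emit the INVERTED condition as a
 * short branch hopping over an absolute jump to the real target.
 * Condition codes are assumed to pair up as even/odd (eq/ne, cs/cc...). */
enum { BCC_BASE = 0x20, JMP_ABS = 0x4E };     /* invented opcodes */

static int invert(int cond) { return cond ^ 1; }

/* Returns the number of bytes emitted (2 short, 6 long). */
int emit_cond_branch(int cond, long target, unsigned char out[6]) {
    if (target >= -128 && target <= 127) {     /* short form reaches */
        out[0] = (unsigned char)(BCC_BASE | cond);
        out[1] = (unsigned char)(target & 0xFF);
        return 2;
    }
    out[0] = (unsigned char)(BCC_BASE | invert(cond));
    out[1] = 4;                                /* skip the 4-byte jump */
    out[2] = JMP_ABS;                          /* absolute jump opcode */
    out[3] = (unsigned char)((target >> 16) & 0xFF);
    out[4] = (unsigned char)((target >> 8) & 0xFF);
    out[5] = (unsigned char)(target & 0xFF);
    return 6;
}
```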

    <https://www.roug.org/retrocomputing/os/flex/ASM09-6809-assembler.pdf>
    and SWTPC wasn't a software trailblazer (40+ years ago):

    "Optimize Assembly. The "F" option will cause the
    assembler to suppress any optimization of object code.
    Foreward [sic] references will be assembled using the least
    restrictive addressing modes. This option will force the
    assembler to complete in two passes, but object code may be
    considerably larger than required. This option is especially
    useful while debugging a program which will later be
    optimized. Note that the "R" option takes priority over this

    Note that this is nothing "new"; most of the references are for
    DECADES old tools. Because tool developers realized that there is
    a lot of annoying "cruft" that thinking just one level above machine
    code forces on the developer.

    Why should I have to know how far away "FOOBAR" is from the current
    program counter associated with the JMP instruction that references
    it in order to know whether to use JR (jump relative -- which
    places limits on how far away the target can be but takes up less
    space and executes faster) vs. JP (which takes up more space and
    is slower but can go "anywhere"?).

    You (tool maker) don't require me to specify numerical addresses
    as the target of the jump -- because that would be extra cruft
    that the developer would have to track (and the goal of the tool
    is to make it EASIER to write CORRECT code). Why would you force
    me to know how many program locations EACH instruction between
    "here" and "FOOBAR" requires?

    Or, do you expect me to try *an* encoding (i.e., choice of mnemonics)
    and see if the assembler (or, linkage editor if the displacement isn't
    visible to the assembler, directly) throws an error -- and on which instructions? Then, REWRITE those instructions, reassemble (and relink!)
    which could cause OTHER instructions to now be "out of range"?

    Should I (developer) have to KNOW that a certain variable will
    reside in a special page of memory that can be accessed more
    efficiently using a different opcode? If I (developer) opt to
    move that variable to a different place in memory, should I
    have to rewrite every reference to it?

    Should I have to know the FINAL layout of a data structure in
    order to access elements of it? Should I be forced to assume
    a worst-case implementation that gives *me* (the developer)
    the most flexibility in accommodating the evolution of that
    structure -- at the expense of efficiency? (e.g., using a
    relative addressing mode even when direct addressing could be
    more efficient, based on the values of those "relative offsets"...
    CONSTANTS!)

    What a silly way to waste a developer's time! Why not have him
    type in constants for all symbolic references, if we really want
    to make him WORK?! Think how much simpler the assembler-writer's
    task would be if he required the developer to generate machine code
    directly!!

    "Dumb" assemblers are/were limited to one-to-one mappings -- because
    they were usually written with sparse resources. One can write
    such an assembler INSIDE a macro-assembler and letting the *macros*
    generate all of the code (a common trick for ASLs) as all you
    need is syntactic/lexical analysis and symbol table maintenance.
    Things that are available in most macro-assemblers.

    Such a tool doesn't need to *understand* your code. So, you can
    do "silly" things like arbitrarily move the "location counter" or
    reference locations *inside* (multibyte) instructions.

    Better assemblers give you a way to force a particular encoding
    of an instruction (near/far/long, etc.) *or* allow the tool to pick
    it for you based on criteria that you specify (space, time, etc.)

    And, as seen above, "better" doesn't mean "newer"!

    Running a tool on an MDS/EXORMAC limits you to what you can do
    *in* the tool because it had a simple OS and sorely limited
    hardware resources (8" floppies). A different tool running on
    a bigger machine doesn't have the same limitations (you will note
    that small development systems, like the MDS, appeared LATER than
    larger machines, e.g., hosting as(1), so were a step BACKWARDS
    in terms of capabilities)

    What term do you apply to tools that take "assembly language"
    mnemonics and DON'T generate one-to-one correspondences?
    Are these NOT "assemblers"?

    MOV for instance often has several hexadecimal codes that it corresponds to because there are so many different sorts. But that is more of a problem for disassemblers than anything else.

    My question was apparently not clear; what name do you apply to the
    TOOLS that DON'T have one-to-one correspondences between the mnemonics
    used and the bitstream produced (i.e., the tools that I cited).
    Are they NOT "assemblers" because they have violated this (artificial) constraint?

    What term do you apply to tools that take "assembly language"
    mnemonics and DO generate one-to-one correspondences?
    Are these ALSO not assemblers?

    I.e., the one-to-one correspondence is an *imagined* requirement
    of "an assembler".

    The way I obtain unsupported genuine modern opcodes in some archaic inline assemblers that lack support or any other way of doing it is to hack together a
    jump forward a few bytes and a load immediate long hex constant. It is only worth doing this as an absolute last resort.

    Unsupported opcodes are a different issue. I've been discussing
    *supported* operations that have different (but functionally equivalent) machine code implementations.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Peter@21:1/5 to Don Y on Tue Mar 12 13:05:16 2024
    Don Y <blockedofcourse@foo.invalid> wrote:

    On 2/15/2024 3:13 AM, Peter wrote:
    Great to see you, Don, after all these years - 2006!!

    Hey there, Mr "Pool" :>

    Haha hello :)

    The pump packed up; turned out that the 25uF (400V AC) starting cap
    degraded to 15uF.

    I trust all is well, remodel long completed, kids now grown
    (which of them was first to make you "Gramps"? and wasn't your
    youngest looking for his pilot's license?), thus PBfH having
    less of an impact on your life, etc.

    Divorced the witch in 1999, then the next one (2003-2023) sadly ended
    in 2023.

    Youngest has a PPL (UK and FAA) and flies, both mine and his RV6.
    Chases females on Tinder and Hinge, like everybody else :)

    I had a customer many years ago who did write a ton of code in hex. To
    enable modifications they had a bit of space after each function, so
    edits to a function did not need shifting everything after it :)

    But what was their *reason* for this? I had an employer (*had* been
    an engineer and deluded himself into thinking he could still *do*
    engineering) who was stuck in the past -- as if the tools and
    techniques he had used were still relevant, even a few years later!

    Stupidity - assemblers have always been around.

    When it took hours to assemble, link, burn images, it made sense to
    have mechanisms to support minor tweaks to the code (overwriting
    instructions with NOPs and filling in a "0xFF" postamble with new
    code). But, nowadays, make world on even large projects is just
    a coffee break -- and, you can dump your code into RAM to watch
    it run (assuming you have to run on a target and not in a
    simulator).

    [Nowadays, I netboot images just for the savings that one step
    makes possible!]

    Indeed.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Peter@21:1/5 to RichD on Tue Mar 12 13:10:48 2024
    RichD <r_delaney2001@yahoo.com> wrote:

    Given these considerations, does anybody write assembly code for
    modern RISC processors?

    Yes; some stuff cannot be done in C. Start with loading SP. No way in
    C!

    Some code in an RTOS is not possible in C. Look at the FreeRTOS
    sourcecode. There are bits of asm in there.
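    A concrete instance of "no way in C": even just *reading* SP takes inline asm. A GCC/Clang extended-asm sketch, shown here in x86-64 flavor (the ARM equivalent in an RTOS port would read sp/msp instead of rsp):

```c
#include <assert.h>
#include <stddef.h>

/* Reading the stack pointer -- inexpressible in portable C.  The
 * FreeRTOS ports do the ARM equivalent when setting up task stacks. */
static inline void *current_sp(void) {
    void *sp;
    __asm__ volatile ("mov %%rsp, %0" : "=r"(sp));
    return sp;
}
```

    Writing SP (e.g., switching to a task's stack during a context switch) is the same story with the operands reversed -- and even less possible to express in C.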

    Also asm has great uses for protecting from optimisation (which can
    change silently by upgrading the compiler!). Asm never gets modified;
    essential when talking to devices needing specific minimum /CS timing
    etc.

    Another example, for accurate delays (STM32F417, 168MHz)



    // Hang around for delay in microseconds

    __attribute__((noinline))
    void hang_around_us(uint32_t delay)
    {
        delay *= (SystemCoreClock / 4100000L);

        asm volatile (
            "1: subs %[delay], %[delay], #1 \n"
            "   nop                         \n"
            "   bne  1b                     \n"
            : [delay] "+l"(delay)
        );
    }

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Don Y@21:1/5 to Peter on Tue Mar 12 11:39:19 2024
    On 3/12/2024 6:10 AM, Peter wrote:

    RichD <r_delaney2001@yahoo.com> wrote:

    Given these considerations, does anybody write assembly code for
    modern RISC processors?

    Yes; some stuff cannot be done in C. Start with loading SP. No way in
    C!

    Doing anything that isn't memory-mapped; how would you generate "I/O" instructions (for processors with I/O spaces)?

    Using any part of the instruction set that isn't directly mapped
    to native C constructs (how would you access support for BCD
    data types? special commands to control interrupts? opcodes to
    control atomic operations/synchronization?)

    Some code in an RTOS is not possible in C. Look at the FreeRTOS
    sourcecode. There are bits of asm in there.

    Also asm has great uses for protecting from optimisation (which can
    change silently by upgrading the compiler!). Asm never gets modified;
    essential when talking to devices needing specific minimum /CS timing
    etc.

    This is changing. Lots of ongoing work in optimizing and
    super-optimizing assembly language code. Even arguments
    being made that compilers should NOT be generating (final)
    ASM, from HLL sources cuz it forces them to know too
    much about the underlying hardware... things that a
    (truly) "optimizing assembler" is better suited to knowing.

    [E.g., how bondout options can change the costs of different
    opcodes from one processor in a "family" to another]

    Another example, for accurate delays (STM32F417, 168 MHz)

    // Hang around for delay in microseconds

    __attribute__((noinline))
    void hang_around_us(uint32_t delay)
    {
        delay *= (SystemCoreClock / 4100000L);

        asm volatile (
            "1: subs %[delay], %[delay], #1 \n"
            "   nop                         \n"
            "   bne  1b                     \n"
            : [delay] "+l" (delay)
        );
    }


    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Don Y@21:1/5 to Peter on Tue Mar 12 12:43:05 2024
    On 3/12/2024 6:05 AM, Peter wrote:
    I trust all is well, remodel long completed, kids now grown
    (which of them was first to make you "Gramps"? and wasn't your
    youngest looking for his pilot's license?), thus PBfH having
    less of an impact on your life, etc.

    Divorced the witch in 1999,

    Yes. But, IIRC, there was still a lot of "interaction" as a
    result of the boys. Now that they are grown, presumably that
    is less of an issue, limiting the intensity of any such interactions?

    then the next one (2003-2023) sadly ended in 2023.

    (sigh) Sorry to hear that. I recall you had high hopes and,
    hopefully, some of those were realized.

    "One can't get divorced TWICE; the first takes HALF of everything,
    the second would take the OTHER half!"

    Youngest has a PPL (UK and FAA) and flies, both mine and his RV6.

    But, is his interest purely recreational? Or, might he pursue
    that "commercially"?

    Chases females on Tinder and Hinge, like everybody else :)

    Thankfully, I've never been down that road.

    I had a customer many years ago who did write a ton of code in hex. To
    enable modifications they had a bit of space after each function, so
    edits to a function did not need shifting everything after it :)

    But what was their *reason* for this? I had an employer (*had* been
    an engineer and deluded himself into thinking he could still *do*
    engineering) who was stuck in the past -- as if the tools and
    techniques he had used were still relevant, even a few years later!

    Stupidity - assemblers have always been around.

    I think a lot has to do with wanting to THINK that an imagined skillset
    is still valuable. With UV/OTP EPROM, that tactic *might* make sense
    (as a rebuild could be time consuming vs. patching an image, on-the-fly).

    But, with FLASH and RAM-based solutions, there's no time to be saved
    (to outweigh the potential for screwing up "manually")

    When it took hours to assemble, link, burn images, it made sense to
    have mechanisms to support minor tweaks to the code (overwriting
    instructions with NOPs and filling in a "0xFF" postamble with new
    code). But, nowadays, make world on even large projects is just
    a coffee break -- and, you can dump your code into RAM to watch
    it run (assuming you have to run on a target and not in a
    simulator).

    [Nowadays, I netboot images just for the savings that one step
    makes possible!]

    Indeed.

    It's delightful to see what can now be done on-the-cheap! No
    more playing games with hardware (and its costs/reliability)
    when you can just emulate any functionality you want!

    (I have a design where a '7180 acted as an EPROM emulator in
    a production design to give me debugging support via a
    serial console that it provided; i.e., let the 7180 "fetch"
    bytes over the serial console instead of having to store them
    *in* its EPROM -- a predecessor to netbooting! :> )

    I've been tempted to try reimplementing some early designs just to
    see how quickly the development would proceed AND how much faster
    the code would execute... big change from a ~700KHz i4004 to an
    800MHz quad-core (costing a tenth as much!). It would be
    depressing to discover that a man-year effort can be reduced to
    a long weekend! :<

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Peter@21:1/5 to Don Y on Wed Mar 13 16:28:25 2024
    Don Y <blockedofcourse@foo.invalid> wrote:

    On 3/12/2024 6:05 AM, Peter wrote:
    I trust all is well, remodel long completed, kids now grown
    (which of them was first to make you "Gramps"? and wasn't your
    youngest looking for his pilot's license?), thus PBfH having
    less of an impact on your life, etc.

    Divorced the witch in 1999,

    Yes. But, IIRC, there was still a lot of "interaction" as a
    result of the boys. Now that they are grown, presumably that
    is less of an issue, limiting the intensity of any such interactions?

    Not dealt with her for ~10 years, and never will :)

    then the next one (2003-2023) sadly ended in 2023.

    (sigh) Sorry to hear that. I recall you had high hopes and,
    hopefully, some of those were realized.

    "One can't get divorced TWICE; the first takes HALF of everything,
    the second would take the OTHER half!"

    Hahaha. It's actually a power series: 1/2 + 1/4 + 1/8 + 1/16.
    According to https://en.wikipedia.org/wiki/1/2_%E2%88%92_1/4_%2B_1/8_%E2%88%92_1/16_%2B_%E2%8B%AF
    it converges to 1/3 so you will always eat afterwards ;)

    Youngest has a PPL (UK and FAA) and flies, both mine and his RV6.

    But, is his interest purely recreational? Or, might he pursue
    that "commercially"?

    I think Tinder has curtailed commercial ambitions ;)


    I've been tempted to try reimplementing some early designs just to
    see how quickly the development would proceed AND how much faster
    the code would execute... big change from a ~700KHz i4004 to an
    800MHz quad-core (costing a tenth as much!). It would be
    depressing to discover that a man-year effort can be reduced to
    a long weekend! :<

    My last design is 100x faster than anything done before, and the CPU
    costs about $7.

    But the software takes as long - because 90% of the functionality is
    now connectivity! MbedTLS etc. There is even an HTTP server (simple: I
    wrote it myself) for config.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Peter@21:1/5 to Don Y on Wed Mar 13 16:29:47 2024
    Don Y <blockedofcourse@foo.invalid> wrote:

    On 3/12/2024 6:10 AM, Peter wrote:

    RichD <r_delaney2001@yahoo.com> wrote:

    Given these considerations, does anybody write assembly code for
    modern RISC processors?

    Yes; some stuff cannot be done in C. Start with loading SP. No way in
    C!

    Doing anything that isn't memory-mapped; how would you generate "I/O" instructions (for processors with I/O spaces)?

    I think I/O is rare; it tends to be memory mapped.

    Using any part of the instruction set that isn't directly mapped
    to native C constructs (how would you access support for BCD
    data types? special commands to control interrupts? opcodes to
    control atomic operations/synchronization?)

    Some code in an RTOS is not possible in C. Look at the FreeRTOS
    sourcecode. There are bits of asm in there.

    Also asm has great uses for protecting from optimisation (which can
    change silently by upgrading the compiler!). Asm never gets modified;
    essential when talking to devices needing specific minimum /CS timing
    etc.

    This is changing. Lots of ongoing work in optimizing and
    super-optimizing assembly language code. Even arguments
    being made that compilers should NOT be generating (final)
    ASM, from HLL sources cuz it forces them to know too
    much about the underlying hardware... things that a
    (truly) "optimizing assembler" is better suited to knowing.

    I'll leave that to the next generation. I want to make a bit of money
    now :)

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Peter@21:1/5 to Peter on Wed Mar 13 16:33:26 2024
    Peter <occassionally-confused@nospam.co.uk> wrote:

    "One can't get divorced TWICE; the first takes HALF of everything,
    the second would take the OTHER half!"

    Hahaha. It's actually a power series: 1/2 + 1/4 + 1/8 + 1/16.
    According to https://en.wikipedia.org/wiki/1/2_%E2%88%92_1/4_%2B_1/8_%E2%88%92_1/16_%2B_%E2%8B%AF
    it converges to 1/3 so you will always eat afterwards ;)

    I was wrong: https://www.quora.com/How-does-1-2-1-4-1-8-1-16-till-infinity-have-a-sum-2

    So yes: if you keep divorcing, you will eventually lose 100%
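    The correction checks out numerically (a throwaway sketch, not from
    the thread): the all-positive series from the Quora link tends to 1,
    while the alternating series from the earlier Wikipedia link tends
    to 1/3.

    ```python
    # Partial sums of 1/2 + 1/4 + 1/8 + ... approach 1 (lose everything),
    # while the alternating 1/2 - 1/4 + 1/8 - ... approaches 1/3
    # (a third left over). 59 terms is far beyond double precision.
    plain = sum(1 / 2**k for k in range(1, 60))
    alternating = sum((-1) ** (k + 1) / 2**k for k in range(1, 60))
    print(plain, alternating)
    ```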

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Don Y@21:1/5 to Peter on Wed Mar 13 14:49:03 2024
    On 3/13/2024 9:28 AM, Peter wrote:
    I've been tempted to try reimplementing some early designs just to
    see how quickly the development would proceed AND how much faster
    the code would execute... big change from a ~700KHz i4004 to an
    800MHz quad-core (costing a tenth as much!). It would be
    depressing to discover that a man-year effort can be reduced to
    a long weekend! :<

    My last design is 100x faster than anything done before, and the CPU
    costs about $7.

    Yup. I can recall paying $60 for an i4004 -- in 1970's money!
    And, a time when 2716's hit $50/each.

    For me, it's the difference between a 700KHz processor and an 800MHz
    quad processor (at less money).

    The idea of trying to save on hardware costs is just silly-speak
    (for most designs).

    But the software takes as long - because 90% of the functionality is
    now connectivity! MbedTLS etc. There is even an HTTP server (simple: I
    wrote it myself) for config.

    And, there is *increased* functionality. You do things that you wouldn't
    ever have considered, previously.

    E.g., that first i4004 product required an offline PDP-11 to
    calculate initialization coefficients for each customer/deployment.
    Customer moves to a different part of the world and we have to
    rerun the initialization code.

    *Second* product did all of that *in* the device -- something you
    would expect, nowadays (but were wowed by, 45 years ago!). A lot
    harder to get that sort of code working in a device that you (as
    engineer) will no longer be able to stand watch over ("What if the
    code throws an error? How will the customer handle that situation?")

    But, realizing that connectivity/intercommunications is the
    heart of the problem (always has been... "Eschew Globals", etc.)
    coaxes you to address THAT issue in your design framework.

    My current biggest challenge is designing for 24/7/365 runtimes
    where hardware changes and software upgrades happen without
    power cycling or rebooting (the MULTICS "Software as a Service"
    mantra). A completely new set of design challenges... (how do
    you test devices WHILE they are providing services?? how do
    you flag them as failed AND replace them without interrupting
    the rest of the system?)

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Don Y@21:1/5 to Peter on Wed Mar 13 16:47:10 2024
    On 3/13/2024 9:29 AM, Peter wrote:

    Don Y <blockedofcourse@foo.invalid> wrote:

    On 3/12/2024 6:10 AM, Peter wrote:

    RichD <r_delaney2001@yahoo.com> wrote:

    Given these considerations, does anybody write assembly code for
    modern RISC processors?

    Yes; some stuff cannot be done in C. Start with loading SP. No way in
    C!

    Doing anything that isn't memory-mapped; how would you generate "I/O"
    instructions (for processors with I/O spaces)?

    I think I/O is rare; it tends to be memory mapped.

    For *new* hardware, yes. But, still present in older designs
    (that likely need to be maintained)

    Using any part of the instruction set that isn't directly mapped
    to native C constructs (how would you access support for BCD
    data types? special commands to control interrupts? opcodes to
    control atomic operations/synchronization?)

    Some code in an RTOS is not possible in C. Look at the FreeRTOS
    sourcecode. There are bits of asm in there.

    Also asm has great uses for protecting from optimisation (which can
    change silently by upgrading the compiler!). Asm never gets modified;
    essential when talking to devices needing specific minimum /CS timing
    etc.

    This is changing. Lots of ongoing work in optimizing and
    super-optimizing assembly language code. Even arguments
    being made that compilers should NOT be generating (final)
    ASM, from HLL sources cuz it forces them to know too
    much about the underlying hardware... things that a
    (truly) "optimizing assembler" is better suited to knowing.

    I'll leave that to the next generation. I want to make a bit of money
    now :)

    There are lots of ways to make money. The joy of engineering is
    that you can have *fun* -- and learn stuff -- while doing so!
    (imagine being an *accountant*, lawyer, doctor, etc. -- fields where
    "new knowledge" drips out at a trickle...)

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Bill Sloman@21:1/5 to Don Y on Thu Mar 14 16:04:52 2024
    On 14/03/2024 10:47 am, Don Y wrote:
    On 3/13/2024 9:29 AM, Peter wrote:
      Don Y <blockedofcourse@foo.invalid> wrote
    On 3/12/2024 6:10 AM, Peter wrote:
       RichD <r_delaney2001@yahoo.com> wrote:

    There are lots of ways to make money.  The joy of engineering is
    that you can have *fun* -- and learn stuff -- while doing so!
    (imagine being an *accountant*, lawyer, doctor, etc. -- fields where
    "new knowledge" drips out at a trickle...)
    Medical doctors have to cope with a flood of new knowledge.

    The peer-reviewed literature where most of it comes out isn't as well
    regulated in medicine as it is in most sciences - medical professors
    still have the god-professor status that all professors had in Germany in
    the 1920s, and they get to publish a lot of half-baked papers.


    This means that a lot of what is touted as new knowledge is pretentious nonsense.

    The regular literature contains a lot of stuff that wasn't worth
    publishing, but it tends to be more unhelpful than actively wrong.

    Of course medical doctors are dealing with the same old problems that
    human beings have always had, while engineers have invented new problems
    to solve.

    --
    Bill Sloman, Sydney

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)