• Another Programming (Not Coding) Challenge

    From Lester Thorpe@21:1/5 to All on Sun Jan 7 18:48:41 2024
    GNU/Linux, despite all the candy-ass distros, is intended for
    programmers (not coders).

    A programmer is knowledgeable in computer science, which requires
    a deep knowledge of mathematics, including probability mathematics.

    Solve this problem (then hand it to the lackey code monkeys):

    In a certain town, a taxi cab sideswipes a parked car then flees.
    This is a big crime.

    In this town, there are only two taxi cab companies: Blue and Green,
    with Blue operating 15% of the taxi cabs.

    An eye witness says that he saw a Blue taxi cab do the crime, but
    this eyewitness is known to be reliable only 80% of the time.

    What is the probability that a Blue taxi cab committed the crime?

    GNU/Linux is for highly skilled programmers (not coders).

    If you cannot solve this problem then get the fuck out of here.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Tyrone@21:1/5 to Lester Thorpe on Mon Jan 8 01:56:54 2024
    On Jan 7, 2024 at 1:48:41 PM EST, "Lester Thorpe" <lt@gnu.rocks> wrote:

    GNU/Linux, despite all the candy-ass distros, is intended for
    programmers (not coders).

    A programmer is knowledgeable in computer science, which requires
    a deep knowledge of mathematics, including probability mathematics.

    Mathematics has nothing to do with programming.

    Solve this problem (then hand it to the lackey code monkeys):

    No "computer program" is needed to solve simple brain teasers like this.
    Once you solve it in your head, what's the point of a "computer program"?

    Again, you are utterly clueless about programming.

    In a certain town, a taxi cab sideswipes a parked car then flees.
    This is a big crime.

    In this town, there are only two taxi cab companies: Blue and Green,
    with Blue operating 15% of the taxi cabs.

    An eye witness says that he saw a Blue taxi cab do the crime, but
    this eyewitness is known to be reliable only 80% of the time.

    What is the probability that a Blue taxi cab committed the crime?

    15%. The "80% eye witness" is completely meaningless and is only there in a simple-minded attempt to confuse the issue.

    Suppose the eye witness was 100% reliable? Still 15%. What if the eyewitness was a known 100% liar? Still 15%. What if there are no eyewitnesses at all?

    That's right, still 15%.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From rbowman@21:1/5 to Joel on Mon Jan 8 04:27:55 2024
    On Sun, 07 Jan 2024 21:13:09 -0500, Joel wrote:

    Tyrone <none@none.none> wrote:

    Mathematics has nothing to do with programming.


    Tell that to my college. My highest math in high school was part of the advanced algebra/trig class, which is basically where I began in
    college: five credits that semester, five more for precalculus in my
    second semester, and *then* the first math that even *counted* toward my computer science major, calculus 1 in summer school. I completed calc 2
    in my third semester, before dropping out to pursue my drug career.

    Colleges often stuck CS in the math department, particularly when CS was just beginning to emerge as a separate discipline,

    https://www.cs.purdue.edu/undergraduate/curriculum/bachelor.html

    https://www.cs.cornell.edu/undergrad/csmajor

    Cornell mentions calculus but unless it's otherwise required Purdue
    doesn't. While I had calculus and differential equations I never had much direct use of them, although knowing the concepts helped. RPI didn't have a
    CS degree when I graduated. Programming was seen as a useful tool, like
    a slide rule on steroids, rather than a career.

    A working programmer may or may not need much math. For example, neural
    nets require extensive matrix manipulations and the ability to find
    derivatives to implement gradient descent and so forth, but packages like PyTorch or TensorFlow abstract the low-level stuff away. Other areas may require much more hands-on math.
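
    (Not rbowman's code, just a minimal sketch of what he's describing: gradient descent on a one-parameter least-squares fit, with the derivative worked out by hand -- the step PyTorch or TensorFlow would normally handle via autograd. The toy data are made up.)

    #include <stdio.h>

    /* Toy gradient descent: fit y = w*x to a handful of points by
       minimizing the squared error, updating w with the hand-derived
       gradient dL/dw = (2/n) * sum((w*x - y) * x). */
    int main(void)
    {
        const double x[] = {1.0, 2.0, 3.0, 4.0};
        const double y[] = {2.1, 3.9, 6.2, 7.8};   /* roughly y = 2x */
        const int n = 4;
        double w = 0.0;                            /* parameter to learn */
        const double lr = 0.01;                    /* learning rate */

        for (int epoch = 0; epoch < 1000; epoch++) {
            double grad = 0.0;
            for (int i = 0; i < n; i++)
                grad += 2.0 * (w * x[i] - y[i]) * x[i];
            w -= lr * grad / n;
        }

        printf("learned w = %f\n", w);             /* lands near 2 */
        return 0;
    }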

    I've always seen programming as logic rather than math for the most part,
    which leads to a philosophical discussion of their differences.

    https://www.memoriapress.com/articles/logic-is-not-math/

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Lester Thorpe@21:1/5 to Physfitfreak on Mon Jan 8 08:58:00 2024
    On Sun, 7 Jan 2024 15:56:10 -0600, Physfitfreak wrote:


    80%.


    No. The answer is 41%.

    The first step is to determine the outcome space. There
    are 4 possible outcomes:

    1) Cab is Blue and witness is correct.

    2) Cab is Green and witness is correct.

    3) Cab is Blue and witness is not correct.

    4) Cab is Green and witness is not correct.

    Now determine the probabilities for each outcome and
    then apply the basic relation for conditional probabilities:

    https://www.freecodecamp.org/news/content/images/2020/07/Screenshot-2020-07-19-at-22.58.48.png

    Here, P(A/B) = P(Blue Cab/Witness Correct)
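
    Spelling that recipe out numerically (this is just the standard reading of the puzzle, where "reliable 80% of the time" means the witness misidentifies the color 20% of the time), a minimal sketch:

    #include <stdio.h>

    /* Check of the conditional-probability recipe above, using the puzzle's
       numbers: P(Blue) = 0.15 and the witness is right 80% of the time.
       "Witness says Blue" happens when the cab is Blue and he is right, or
       the cab is Green and he is wrong (so he misreports it as Blue). */
    int main(void)
    {
        const double p_blue = 0.15, p_green = 0.85;
        const double p_right = 0.80, p_wrong = 0.20;

        double blue_and_says_blue  = p_blue  * p_right;   /* 0.12 */
        double green_and_says_blue = p_green * p_wrong;   /* 0.17 */
        double says_blue = blue_and_says_blue + green_and_says_blue; /* 0.29 */

        printf("P(Blue | witness says Blue) = %.4f\n",
               blue_and_says_blue / says_blue);           /* about 0.41 */
        return 0;
    }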

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Lester Thorpe@21:1/5 to Tyrone on Mon Jan 8 09:01:14 2024
    On Mon, 08 Jan 2024 01:56:54 +0000, Tyrone wrote:


    Mathematics has nothing to do with programming.


    Certainly not for the kind of bullshit "programming" that
    you may be involved with.

    But I am referring to REAL PROGRAMMING.





    That's right, still 15%.


    Fail.

    Correct answer is 41%.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From DFS@21:1/5 to Lester Thorpe on Mon Jan 8 09:02:27 2024
    On 1/7/2024 1:48 PM, Lester Thorpe wrote:

    GNU/Linux, despite all the candy-ass distros, is intended for

    everyone


    programmers (not coders).

    Only a frustrated, failed programmer would make up such a false
    distinction.

    In the real world: programming = coding = developing = engineering



    A programmer is knowledgeable in computer science, which requires
    a deep knowledge of mathematics, including probability mathematics.

    Solve this problem (then hand it to the lackey code monkeys):

    No programming knowledge is required for this problem.


    In a certain town, a taxi cab sideswipes a parked car then flees.
    This is a big crime.

    In this town, there are only two taxi cab companies: Blue and Green,
    with Blue operating 15% of the taxi cabs.

    An eye witness says that he saw a Blue taxi cab do the crime, but
    this eyewitness is known to be reliable only 80% of the time.

    What is the probability that a Blue taxi cab committed the crime?

    15%

    Had you asked what is the probability that the eyewitness correctly
    reported a blue taxi cab, that would be 80%.



    Now solve this:

    A certain self-proclaimed "programming genius" on comp.os.linux.advocaca
    is known to be steadfast only 0.80% of the time. What is the
    probability that this clownish individual is a GuhNoo adherent?




    GNU/Linux is for highly skilled programmers (not coders).

    If you cannot solve this problem then get the fuck out of here.

    If you can't write working code to present a simple algorithm or
    solution, turn in your "REAL IT MAN" card and apply at Subway.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From DFS@21:1/5 to Lester Thorpe on Mon Jan 8 09:45:16 2024
    On 1/8/2024 3:58 AM, Lester Thorpe wrote:
    On Sun, 7 Jan 2024 15:56:10 -0600, Physfitfreak wrote:


    80%.


    No. The answer is 41%.

    The first step is to determine the outcome space. There
    are 4 possible outcomes:

    1) Cab is Blue and witness is correct.

    2) Cab is Green and witness is correct.

    3) Cab is Blue and witness is not correct.

    4) Cab is Green and witness is not correct.

    Now determine the probabilities for each outcome and
    then apply the basic relation for conditional probabilities:

    https://www.freecodecamp.org/news/content/images/2020/07/Screenshot-2020-07-19-at-22.58.48.png

    Here, P(A/B) = P(Blue Cab/Witness Correct)


    You're answering a question different from the one you asked.

    You asked:
    "What is the probability that a Blue taxi cab committed the crime?"

    And that answer is 15%.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Tyrone@21:1/5 to DFS on Mon Jan 8 14:50:49 2024
    On Jan 8, 2024 at 9:02:27 AM EST, "DFS" <nospam@dfs.com> wrote:

    On 1/7/2024 1:48 PM, Lester Thorpe wrote:

    GNU/Linux, despite all the candy-ass distros, is intended for

    everyone


    programmers (not coders).

    Only a frustrated, failed programmer would make up such a false distinction.

    In the real world: programming = coding = developing = engineering

    Exactly correct.

    A programmer is knowledgeable in computer science, which requires
    a deep knowledge of mathematics, including probability mathematics.

    Solve this problem (then hand it to the lackey code monkeys):

    No programming knowledge is required for this problem.

    Exactly correct. What IS required is the ability to understand and apply
    logic. Which is what all programming comes down to.


    In a certain town, a taxi cab sideswipes a parked car then flees.
    This is a big crime.

    In this town, there are only two taxi cab companies: Blue and Green,
    with Blue operating 15% of the taxi cabs.

    An eye witness says that he saw a Blue taxi cab do the crime, but
    this eyewitness is known to be reliable only 80% of the time.

    What is the probability that a Blue taxi cab committed the crime?

    15%

    Had you asked what is the probability that the eyewitness correctly
    reported a blue taxi cab, that would be 80%.

    Correct again.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Chris Ahlstrom@21:1/5 to Lord Master on Mon Jan 8 10:35:22 2024
    Lord Master wrote this copyrighted missive and expects royalties:

    On Monday, January 8, 2024 at 9:45:14 AM UTC-5, DFS wrote:
    You asked:
    "What is the probability that a Blue taxi cab committed the crime?"
    And that answer is 15%.

    Nope. Fail again.

    If no one had come forth as a witness, then yes.

    But a witness, with a known 80% reliability, claimed that the cab was Blue. This adds a whole new dimension.

    The problem is similar to the cancer-test problem.

    A person has a cancer test that indicates a 90% probability of being
    positive (has cancer). There is thus a 10% chance of not having
    cancer.

    But the test is known to fail 5% of the time giving a false positive.

    What then is the actual probability of having cancer?

    Probably (no pun intended) 99% of all college graduates cannot
    give the correct answer which reveals just how useless the
    education system is in the USA.

    Bayes theorem is a good antidote.

    Didn't bite for your time-waster, but did look at the Wikipedia entry for the "Monty Hall Problem", and what a long article it turned out to be. Informative on more than one aspect of decision-making.

    --
    You will stop at nothing to reach your objective, but only because your
    brakes are defective.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Tyrone@21:1/5 to All on Mon Jan 8 15:43:15 2024
    On Jan 8, 2024 at 10:35:22 AM EST, "Chris Ahlstrom" <OFeem1987@teleworm.us> wrote:

    Lord Master wrote this copyrighted missive and expects royalties:

    On Monday, January 8, 2024 at 9:45:14 AM UTC-5, DFS wrote:
    You asked:
    "What is the probability that a Blue taxi cab committed the crime?"
    And that answer is 15%.

    Nope. Fail again.

    If no one had come forth as a witness, then yes.

    But a witness, with a known 80% reliability, claimed that the cab was Blue. This adds a whole new dimension.

    The problem is similar to the cancer-test problem.

    A person has a cancer test that indicates a 90% probability of being
    positive (has cancer). There is thus a 10% chance of not having
    cancer.

    But the test is known to fail 5% of the time giving a false positive.

    What then is the actual probability of having cancer?

    Probably (no pun intended) 99% of all college graduates cannot
    give the correct answer which reveals just how useless the
    education system is in the USA.

    Bayes theorem is a good antidote.

    Didn't bite for your time-waster, but did look at the Wikipedia entry for the "Monty Hall Problem", and what a long article it turned out to be. Informative
    on more than one aspect of decision-making.

    I was going to bring up the 3 card monty problem, but I assumed Feeb would get that wrong too. Long Wikipedia articles notwithstanding, the answer is obvious (with basic probability knowledge) and is easily proven by using a deck of cards.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From rbowman@21:1/5 to Lord Master on Mon Jan 8 17:27:38 2024
    On Mon, 8 Jan 2024 08:00:05 -0800 (PST), Lord Master wrote:

    On Monday, January 8, 2024 at 10:35:26 AM UTC-5, Chris Ahlstrom wrote:


    Bayes theorem is a good antidote.


    Probability theory is the basis for statistics, and statistics is
    extremely important in computer programming.

    Definitely. Any program Lord Master writes has a 15% probability of
    working. AI solutions are stochastic, but for many areas statistics do not
    enter into the design.

    Statistics always made me uneasy. The goal of the class appeared to be how
    many widgets per thousand you needed to test to ensure only 5% of the
    output was defective.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Chris Ahlstrom@21:1/5 to Lord Master on Mon Jan 8 13:03:36 2024
    Lord Master wrote this copyrighted missive and expects royalties:

    On Monday, January 8, 2024 at 10:35:26 AM UTC-5, Chris Ahlstrom wrote:

    Bayes theorem is a good antidote.

    Probability theory is the basis for statistics, and statistics is extremely important in computer programming.

    Also useful in advertising, except it isn't used; could lend actual likelihoods or risk/benefit to the results of those random-named drugs.

    Reminds me, I need to take my Dammitol.

    --
    Whoever has lived long enough to find out what life is, knows how deep a debt of gratitude we owe to Adam, the first great benefactor of our race. He brought death into the world.
    -- Mark Twain, "Pudd'nhead Wilson's Calendar"

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From rbowman@21:1/5 to Chris Ahlstrom on Mon Jan 8 20:03:39 2024
    On Mon, 8 Jan 2024 13:03:36 -0500, Chris Ahlstrom wrote:

    Lord Master wrote this copyrighted missive and expects royalties:

    On Monday, January 8, 2024 at 10:35:26 AM UTC-5, Chris Ahlstrom wrote:

    Bayes theorem is a good antidote.

    Probability theory is the basis for statistics, and statistics is
    extremely important in computer programming.

    Also useful in advertising, except it isn't used; could lend actual likelihoods or risk/benefit to the results of those random-named drugs.

    Reminds me, I need to take my Dammitol.

    I do like those ads that state 'ZippidyDooDah reduces your risk of
    bilateral plaziomosis by 37%!' while omitting the 'Your risk of
    contracting plaziomosis is 0.023%'.

    There never is a probability given for encountering the side effects, up to and including sudden death, either.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Lester Thorpe@21:1/5 to rbowman on Mon Jan 8 20:39:00 2024
    On 8 Jan 2024 17:27:38 GMT, rbowman wrote:


    Statistics always made me uneasy. The goal of the class appeared to be how many widgets per thousand you needed to test to ensure only 5% of the
    output was defective.


    Programmers deal with data routinely and thus they should be
    masters of statistics (with its probability basis).

    For example, the very first thing when analyzing data is to
    determine what type of distribution the data follow.
    This involves applying the Kolmogorov–Smirnov test:

    https://en.wikipedia.org/wiki/Kolmogorov%E2%80%93Smirnov_test
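
    As a minimal sketch of what the K-S test actually measures (not the R implementation; the sample data and the Uniform(0,1) reference CDF below are just placeholders): compute the statistic D, the largest gap between the empirical CDF and the reference CDF. Turning D into a p-value requires Kolmogorov's distribution, which is the part library code such as R's ks.test() handles for you.

    #include <stdio.h>
    #include <stdlib.h>

    /* One-sample K-S statistic D = max |F_n(x) - F(x)| against a
       hypothetical Uniform(0,1) reference CDF (F(x) = x). */
    static int cmp_double(const void *a, const void *b)
    {
        double x = *(const double *)a, y = *(const double *)b;
        return (x > y) - (x < y);
    }

    static double ks_statistic_uniform(double *x, int n)
    {
        double d = 0.0;
        qsort(x, n, sizeof(double), cmp_double);
        for (int i = 0; i < n; i++) {
            double cdf = x[i];                     /* Uniform(0,1): F(x) = x */
            double d_plus  = (i + 1.0) / n - cdf;  /* empirical CDF steps above */
            double d_minus = cdf - (double) i / n; /* empirical CDF steps below */
            if (d_plus  > d) d = d_plus;
            if (d_minus > d) d = d_minus;
        }
        return d;
    }

    int main(void)
    {
        double sample[] = {0.05, 0.21, 0.34, 0.48, 0.61, 0.72, 0.88, 0.93};
        int n = (int)(sizeof sample / sizeof sample[0]);
        printf("D = %f\n", ks_statistic_uniform(sample, n));
        return 0;
    }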

    Now, some "programmers" could resort to "R" to perform the
    K-S test, but who wrote the R code? Ultimately, the buck
    has to stop somewhere and that is where we find the REAL
    PROGRAMMER.

    And that mutherfucker had better be a master at numerical
    analysis as well as statistics, otherwise we'll have junk
    code everywhere -- and the pseudo-programmers won't ever
    notice.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From rbowman@21:1/5 to Lester Thorpe on Tue Jan 9 00:32:09 2024
    On Mon, 08 Jan 2024 20:39:00 +0000, Lester Thorpe wrote:

    Programmers deal with data routinely and thus they should be masters of statistics (with its probability basis).

    Some do.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From rbowman@21:1/5 to Physfitfreak on Tue Jan 9 01:30:34 2024
    On Mon, 8 Jan 2024 18:53:35 -0600, Physfitfreak wrote:

    That's why it is taught in first year college, usually the last part of
    their college algebra course, which often falls on the 2nd semester in school.

    https://catalog.rpi.edu/preview_program.php?catoid=11&poid=2508

    College algebra? When I took statistics it was a stand-alone 4 credit
    hour course. I am surprised by the Physics schedule. We used Resnick & Halliday.

    https://en.wikipedia.org/wiki/Fundamentals_of_Physics

    The freshman year covered volume 1 and the sophomore year volume 2. The
    spring semester of the sophomore year was when it got weird with quantum theory. Resnick was a professor at RPI at the time which may have
    explained the emphasis.

    Of course 'C programming for Engineers' was FORTRAN IV programming for engineers.


    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From rbowman@21:1/5 to Physfitfreak on Tue Jan 9 03:38:54 2024
    On Mon, 8 Jan 2024 20:18:45 -0600, Physfitfreak wrote:

    Yes, we had a FORTRAN 4 course as well. Back then they were telling us,
    "C is the language of the future", and yet they didn't teach it to us!..


    C wasn't even a gleam in someone's eye in '65. PL/I was going to be the
    Next Big Thing. It wasn't one of IBM's successes. The business types had
    COBOL, the weirdos had Lisp, and the academics had ALGOL. Fartmouth had
    cooked up BASIC but it hadn't escaped into the wild.

    I think they upgraded it but RPI's System/360 30 only had 32K of core and
    the PL/I compiler needed 44k so we were safe. I was surprised to find
    there is a PL/I compiler that supposedly runs on Linux. It must be a labor
    of love. It supposedly also runs on OS/2.

    http://www.iron-spring.com/

    When I read of Wirth's passing it reminded me of Pascal, Modula-2 and
    Oberon. While he was a prolific creator of languages with ALGOL DNA, they haven't made much of an impact. Object Pascal was probably the closest.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From rbowman@21:1/5 to Physfitfreak on Tue Jan 9 03:57:18 2024
    On Mon, 8 Jan 2024 20:27:00 -0600, Physfitfreak wrote:

    Sorry, anywhere I said "modern algebra" I meant to say, "college
    algebra". Modern algebra is something else, pure math, and an optional
    course for those who want to take it.

    That's not linear algebra, is it? Multivariate calculus is often a
    prerequisite for that.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From rbowman@21:1/5 to Physfitfreak on Tue Jan 9 06:41:24 2024
    On Mon, 8 Jan 2024 22:43:02 -0600, Physfitfreak wrote:

    College algebra, on the other hand, is an absolute must to learn before anything else is done in physics. A light cover of it is done in high schools, but the nice, full, treatment it gets in universities under the "college algebra" course is absolutely essential to learn. From
    beginning of that text to its end (which is the probability theory in
    fact) a student should not leave _anything_ uncomprehended, because
    literally every concept in it will soon be used.

    The freshman year started with Thomas' 'Calculus and Analytic Geometry'.
    The only algebra I had was in high school. However I did take a linear equations course in summer school followed by calculus in my senior year.
    The course was taught by an RPI professor after the normal school hours and
    also used Thomas. The high school had the advantage of being basically
    across the street from RPI. The prof's name was Dis Maly which summed up
    his teaching style. His wife had taught the summer linear equations class
    and was much better.

    I don't recall any college algebra course prior to calculus. Calculus 1 and Physics 1 were both in the freshman fall semester so they played into each other.

    I sort of regret taking calculus in high school since the normal senior
    level math was spherical trig and GIS gets into that.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From rbowman@21:1/5 to Physfitfreak on Tue Jan 9 07:06:23 2024
    On Mon, 8 Jan 2024 23:39:23 -0600, Physfitfreak wrote:


    In finding averages, often the knowledge of integrals is required, and
    where in the programming world don't you encounter the concept of
    "averages"? It's all over. So you must know how to do integrals.

    If you really haven't seen a need for knowing this material, then you
    have been, as Farley named it, "Code Monkeys". And I'm not joking.

    Let me see... Controlling semiconductor sputtering equipment? No averages. Robotic arm? Cartesian space, no averages. Fuel measurement and management?
    Not a lot there. pH and ion concentration meters? Yeah, maybe some light
    DSP when processing the A/D input from the Ross electrodes. Environmental chamber controllers? Nope. Computer aided dispatch? No. GIS? No. Crime analysis? Maybe a little, although most of the interesting stuff is GIS-related
    heat maps and so forth. I guess PID loops would count.

    That's not to say there isn't math involved, but not a whole lot of
    integration. Where I've gotten into that is in machine learning / neural networks. That is heavy into linear algebra and calculus. Back propagation relies on derivatives for gradient descent.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From rbowman@21:1/5 to Physfitfreak on Tue Jan 9 07:30:35 2024
    On Mon, 8 Jan 2024 22:30:01 -0600, Physfitfreak wrote:


    Much of what you say (in general) goes above my head. And if you were in university in 1965 trying to learn programming then you were from a generation that were 10 years ahead of me.

    It was a bit of a different world. There was still an analog computer lab
    even as they were on the way out.


    My exposure to computer programming began at UTD in 1980 in graduate
    school, and the only computer programming course that was taught as part
    of physics department courses was this FORTRAN 4 course which I never
    took. When I wanted, first time, to take a programming course it was
    Summer and they didn't offer that course in Summer, so instead, I took a
    PL/I course in computer science which was I think still part of the math department.

    PL/I was weird. It was supposed to be a Swiss Army knife to replace
    Fortran and Cobol, inheriting the quirks of both. It was disliked by both Fortran and Cobol programmers.


    In that course, taught by an extremely pedantic professor (and ardent teacher), I heard of C for the first time, and as I mentioned it above,
    he referred to it as "the programming language of the future". But the
    funny thing was that there and after, I never heard of a C programming
    course in any of the physics or math or computer departments in that
    school.

    I don't know when C took off in the academic world. For some reason
    academics like Pascal, Modula, or Scheme for didactic languages.


    But soon, I saw the probable reasons why they didn't teach C. As soon as
    I began preparing for writing scientific programs for my own projects, I realized that literally everything that I looked into was in FORTRAN.
    There were all these bits and pieces of various subroutines that were
    being used all the time by scientists, some lingering around since 1960s (they had dates!), which were all in FORTRAN. So I put the PL/I aside
    and read a FORTRAN 77 (I believe) text from begin to end to start
    writing in that language. Or was it FORTRAN 9 ? Or 99? I'm not sure
    anymore. Everybody in the department was writing their programs in
    FORTRAN.

    Fortran is alive and well. gfortran is mostly up to 2008 although there
    still are aliases for f77 and f95.

    About Wirth, all I can say is that even in the dead comp.programming
    forum there was a sudden rush of posts about his recent passing. So he must've indeed been an important guy in that field.


    He was in an academic sort of way. I think one of his quotes was that software
    gets slower faster than the hardware gets faster, or something like that.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Farley Flud@21:1/5 to Physfitfreak on Tue Jan 9 09:16:39 2024
    On Mon, 8 Jan 2024 22:43:02 -0600, Physfitfreak wrote:


    Modern Algebra is totally abstract. An optional thing. It is a pure math course often taught only in math departments, but sometimes some
    students (including myself) went there and took it. I almost never used
    it in anything. It only gets a bit of its application in physics in
    advanced graduate courses, and even then in only some subjects. It deals
    with groups, rings, etc.


    The entire field of digital cryptography is based in abstract (modern)
    algebra. Its importance is profound.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From candycanearter07@21:1/5 to rbowman on Tue Jan 9 08:24:19 2024
    On 1/9/24 01:30, rbowman wrote:
    On Mon, 8 Jan 2024 22:30:01 -0600, Physfitfreak wrote:
    My exposure to computer programming began at UTD in 1980 in graduate
    school, and the only computer programming course that was taught as part
    of physics department courses was this FORTRAN 4 course which I never
    took. When I wanted, first time, to take a programming course it was
    Summer and they didn't offer that course in Summer, so instead, I took a
    PL/I course in computer science which was I think still part of the math
    department.

    PL/I was weird. It was supposed to be a Swiss Army knife to replace
    Fortran and Cobol, inheriting the quirks of both. It was disliked by both Fortran and Cobol programmers.

    I've never heard of PL/I, was it any good?
    --
    user <candycane> is generated from /dev/urandom

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From DFS@21:1/5 to rbowman on Tue Jan 9 09:49:18 2024
    On 1/9/2024 1:41 AM, rbowman wrote:

    The freshman year started with Thomas' 'Calculus and Analytic Geometry'.

    How tf can you remember the author and title of a math book from 55
    years ago?

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From DFS@21:1/5 to Larry Andrew Pietraskiewicz on Tue Jan 9 10:25:38 2024
    On 1/8/2024 3:39 PM, Larry Andrew Pietraskiewicz wrote:


    Programmers deal with data routinely and thus they should be
    masters of statistics (with its probability basis).

    For example, the very first thing when analyzing data is to
    determine what type of distribution does the data follow.
    This involves applying the Kolmogorov–Smirnov test:

    https://en.wikipedia.org/wiki/Kolmogorov%E2%80%93Smirnov_test

    Now, some "programmers" could resort to "R" to perform the
    K-S test,

    Why is programmers in quotes?

    Every time you open your piehole, out comes some bullshit implication
    that you're a "REAL programmer" that would never rely on R or python.
    But the reality?

    * "Why reinvent the fucking wheel? There's wget and curl for that
    purpose which I regularly use."

    * "A REAL MAN uses GNU split. Why reinvent the wheel? That's
    for masochists."

    * "My Gentoo system automagically..."


    Has the Bash Scripter / GuhNoo and Python Dependent toddler spoken?



    but who wrote the R code? Ultimately, the buck
    has to stop somewhere and that is where we find the REAL
    PROGRAMMER.

    And that mutherfucker had better be a master at numerical
    analysis as well as statistics,


    Don't you worry your greasy head about the R developers.

    https://www.r-project.org/contributors.html

    "R was initially written by Robert Gentleman and Ross Ihaka—also known
    as “R & R” of the Statistics Department of the University of Auckland."

    https://en.wikipedia.org/wiki/Robert_Gentleman_(statistician)
    https://en.wikipedia.org/wiki/Ross_Ihaka

    They make R easy to code in so programming frauds like you can get stats
    work done, too.



    But just in case there's a concern, YOU should spend years pretending to
    review all the R source code (mostly C) for your version of correctness.

    https://cran.case.edu/src/base/R-4/R-4.3.2.tar.gz

    Or you can choose to trust R because it was written by experts in their
    fields, and expert users report issues that are confirmed and fixed. And
    so it goes.

    Unfortunately, like most GuhNoo code, the R source is barely commented.



    otherwise we'll have junk code everywhere -- and
    the pseudo-programmers

    ie YOU



    Note: in R, you do K-S tests with the dgof package: https://cran.r-project.org/web/packages/dgof/index.html

    Some code from file ks.c
    ================================================================
    static double
    K(int n, double d)
    {
        /* Compute Kolmogorov's distribution.
           Code published in
           George Marsaglia and Wai Wan Tsang and Jingbo Wang (2003),
           "Evaluating Kolmogorov's distribution".
           Journal of Statistical Software, Volume 8, 2003, Issue 18.
           URL: http://www.jstatsoft.org/v08/i18/.
        */

        int k, m, i, j, g, eH, eQ;
        double h, s, *H, *Q;

        /*
           The faster right-tail approximation is omitted here.
           s = d*d*n;
           if(s > 7.24 || (s > 3.76 && n > 99))
               return 1-2*exp(-(2.000071+.331/sqrt(n)+1.409/n)*s);
        */
        k = (int) (n * d) + 1;
        m = 2 * k - 1;
        h = k - n * d;
        H = (double*) Calloc(m * m, double);
        Q = (double*) Calloc(m * m, double);
        for(i = 0; i < m; i++)
            for(j = 0; j < m; j++)
                if(i - j + 1 < 0)
                    H[i * m + j] = 0;
                else
                    H[i * m + j] = 1;
        for(i = 0; i < m; i++) {
            H[i * m] -= pow(h, i + 1);
            H[(m - 1) * m + i] -= pow(h, (m - i));
        }
        H[(m - 1) * m] += ((2 * h - 1 > 0) ? pow(2 * h - 1, m) : 0);
        for(i = 0; i < m; i++)
            for(j = 0; j < m; j++)
                if(i - j + 1 > 0)
                    for(g = 1; g <= i - j + 1; g++)
                        H[i * m + j] /= g;
        eH = 0;
        m_power(H, eH, Q, &eQ, m, n);
        s = Q[(k - 1) * m + k - 1];
        for(i = 1; i <= n; i++) {
            s = s * i / n;
            if(s < 1e-140) {
                s *= 1e140;
                eQ -= 140;
            }
        }
        s *= pow(10., eQ);
        Free(H);
        Free(Q);
        return(s);
    }
    ======================================================================

    Whoops. They copied that function from a stats journal. Now you're toast.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From rbowman@21:1/5 to All on Tue Jan 9 19:27:30 2024
    On Tue, 9 Jan 2024 08:24:19 -0600, candycanearter07 wrote:

    On 1/9/24 01:30, rbowman wrote:
    On Mon, 8 Jan 2024 22:30:01 -0600, Physfitfreak wrote:
    My exposure to computer programming began at UTD in 1980 in graduate
    school, and the only computer programming course that was taught as
    part of physics department courses was this FORTRAN 4 course which I
    never took. When I wanted, first time, to take a programming course it
    was Summer and they didn't offer that course in Summer, so instead, I
    took a PL/I course in computer science which was I think still part of
    the math department.

    PL/I was weird. It was supposed to be a Swiss Army knife to replace
    Fortran and Cobol, inheriting the quirks of both. It was disliked by
    both Fortran and Cobol programmers.

    I've never heard of PL/I, was it any good?

    Not particularly. One of the problems was the early compilers sucked. It
    was sort of the Ada of its day, all things to all men, with the definition
    of the language evolving. By the time IBM got through fiddling around it
    had been superseded by C outside of IBM. DEC used it too.

    It was influenced by Algol 60 but then you can make the case that
    everything was influenced by Algol. CPL never quite made it but it was
    the basis for BCPL which became B which became C. Along the way curly
    braces and so forth replaced the BEGIN ... END BLOCK sort of structure.

    IBM was never bashful:
    PL/I Programming Language One
    APL A Programming Language

    APL has held up better despite having a strange character set that
    usually required a special keyboard. Some of its quirks persist in R, like
    the <- assignment operator.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From rbowman@21:1/5 to DFS on Tue Jan 9 19:50:16 2024
    On Tue, 9 Jan 2024 09:49:18 -0500, DFS wrote:

    On 1/9/2024 1:41 AM, rbowman wrote:

    The freshman year started with Thomas' 'Calculus and Analytic
    Geometry'.

    How tf can you remember the author and title of a math book from 55
    years ago?

    Okay, I cheated and looked up the exact title, but we always referred to
    the calculus text as 'Thomas' and the 2 volume physics text as 'Resnick & Halliday'. R&H morphed into H&R along the way, probably in 1970.

    https://en.wikipedia.org/wiki/Fundamentals_of_Physics

    The economics text was Samuelson. Unfortunately he was a Keynesian. For trivia, Larry Summers is his nephew. His father was an economist too and
    had changed the family name.

    The chemistry text was sort of a spiral bound thing by Bunce, one of the professors. afaik that never was published outside of RPI.

    I don't have a clue for any of the others like the diff-e or statistics
    texts.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From chrisv@21:1/5 to rbowman on Tue Jan 9 14:40:33 2024
    rbowman wrote:

    some dumb fsck wrote:

    How tf can you remember the author and title of a math book from 55
    years ago?

    Okay, I cheated and looked up the exact title, but we always referred to
    the calculus text as 'Thomas' and the 2 volume physics text as 'Resnick & Halliday'. R&H morphed into H&R along the way, probably in 1970.

    https://en.wikipedia.org/wiki/Fundamentals_of_Physics

    The economics text was Samuelson. Unfortunately he was a Keynesian. For trivia, Larry Summers is his nephew. His father was an economist too and
    had changed the family name.

    The chemistry text was sort of a spiral bound thing by Bunce, one of the professors. afaik that never was published outside of RPI.

    I don't have a clue for any of the others like the diff-e or statistics
    texts.

    I took a course in Statistics for Engineering. Text book was by Jay
    L. Devore.

    But I don't know that computer programs need to be tested to make sure
    that they work. According to the dumbest fscking snit to ever haunt
    cola, anyway.

    --
    "ALL non-idiots support the use of testing over compile-time warnings
    to determine if the code functions correctly. You're one of the few
    idiots who thinks otherwise." - DumFSck, lying shamelessly

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Chris Ahlstrom@21:1/5 to rbowman on Tue Jan 9 17:05:48 2024
    rbowman wrote this copyrighted missive and expects royalties:

    On Tue, 9 Jan 2024 08:24:19 -0600, candycanearter07 wrote:

    On 1/9/24 01:30, rbowman wrote:
    On Mon, 8 Jan 2024 22:30:01 -0600, Physfitfreak wrote:
    My exposure to computer programming began at UTD in 1980 in graduate
    school, and the only computer programming course that was taught as
    part of physics department courses was this FORTRAN 4 course which I
    never took. When I wanted, first time, to take a programming course it was Summer and they didn't offer that course in Summer, so instead, I
    took a PL/I course in computer science which was I think still part of the math department.

    PL/I was weird. It was supposed to be a Swiss Army knife to replace
    Fortran and Cobol, inheriting the quirks of both. It was disliked by
    both Fortran and Cobol programmers.

    I've never heard of PL/I, was it any good?

    Not particularly. One of the problems was the early compilers sucked. It
    was sort of the Ada of its day, all things to all men, with the definition
    of the language evolving. By the time IBM got through fiddling around it
    had been superseded by C outside of IBM. DEC used it too.

    It was influenced by Algol 60 but then you can make the case that
    everything was influenced by Algol.

    I used Algol in the early 70s over a phone line from school to a mainframe somewhere. A bit like Pascal.

    CPL never quite made it but it was
    the basis for BCPL which became B which became C. Along the way curly
    braces and so forth replaced the BEGIN ... END BLOCK sort of structure.

    IBM was never bashful:
    PL/I Programming Language One
    APL A Programming Language

    APL has held up better despite having a strange character set that
    usually required a special keyboard. Some of its quirks persist in R like
    the <- assignment operator.

    Thanks for the history. As your reward, here's a definition example from Stan Kelly-Bootle's "The Devil's DP Dictionary":

    "There are three things a man must do / Before his life is done /
    Write 3 lines in APL / And make the buggers run."

    --
    Q: What's the difference between the 1950's and the 1980's?
    A: In the 80's, a man walks into a drugstore and states loudly, "I'd
    like some condoms," and then, leaning over the counter, whispers,
    "and some cigarettes."

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From candycanearter07@21:1/5 to rbowman on Tue Jan 9 16:27:41 2024
    On 1/9/24 13:27, rbowman wrote:
    On Tue, 9 Jan 2024 08:24:19 -0600, candycanearter07 wrote:

    On 1/9/24 01:30, rbowman wrote:
    On Mon, 8 Jan 2024 22:30:01 -0600, Physfitfreak wrote:
    My exposure to computer programming began at UTD in 1980 in graduate
    school, and the only computer programming course that was taught as
    part of physics department courses was this FORTRAN 4 course which I
    never took. When I wanted, first time, to take a programming course it was Summer and they didn't offer that course in Summer, so instead, I
    took a PL/I course in computer science which was I think still part of the math department.

    PL/I was weird. It was supposed to be a Swiss Army knife to replace
    Fortran and Cobol, inheriting the quirks of both. It was disliked by
    both Fortran and Cobol programmers.

    I've never heard of PL/I, was it any good?

    Not particularly. One of the problems was the early compilers sucked. It
    was sort of the Ada of its day, all things to all men, with the definition
    of the language evolving. By the time IBM got through fiddling around it
    had been superseded by C outside of IBM. DEC used it too.

    It was influenced by Algol 60 but then you can make the case that
    everything was influenced by Algol. CPL never quite made it but it was
    the basis for BCPL which became B which became C. Along the way curly
    braces and so forth replaced the BEGIN ... END BLOCK sort of structure.

    Figures.

    IBM was never bashful:
    PL/I Programming Language One
    APL A Programming Language

    APL has held up better despite having a strange character set that
    usually required a special keyboard. Some of its quirks persist in R like
    the <- assignment operator.

    Yeah, I've seen the custom APL keyboards and they are crazy.
    --
    user <candycane> is generated from /dev/urandom

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From chrisv@21:1/5 to Physfitfreak on Tue Jan 9 16:29:59 2024
    Physfitfreak wrote:

    No, linear algebra is another rather advanced course (offered in third
    year undergraduate) in math that is taught in physics department. It has
    a lot of applications in physics.

    I took a pair of optional Linear Algebra classes at the tail end of my
    EE program. They were the toughest classes of the entire curriculum.
    The prof didn't use the textbook at all!

    Those classes damn near kicked my ass. After getting my ass handed to
    me in the first test, I considered dropping the class, not wanting to
    have my GPA slaughtered by a D. I talked to the prof and he convinced
    me to stick it out. I ended-up getting a B and (shockingly) an A...

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Chris Ahlstrom@21:1/5 to chrisv on Tue Jan 9 18:22:17 2024
    chrisv wrote this copyrighted missive and expects royalties:

    Physfitfreak wrote:

    No, linear algebra is another rather advanced course (offered in third
    year undergraduate) in math that is taught in physics department. It has
    a lot of applications in physics.

    I took a pair of optional Linear Algebra classes at the tail end of my
    EE program. They were the toughest classes of the entire curriculum.
    The prof didn't use the textbook at all!

    Those classes damn near kicked my ass. After getting my ass handed to
    me in the first test, I considered dropping the class, not wanting to
    have my GPA slaughtered by a D. I talked to the prof and he convinced
    me to stick it out. I ended-up getting a B and (shockingly) an A...

    Good for you!

    In grad school I was flummoxed by quantum mechanics and Hamiltonians.
    The prof, amazingly, told me I seemed to grok the material better than
    others. Nah.

    In a basic mechanics class, the prof said "Why do you not understand these mechanics problems? They are easy!" :-D :-* :-(

    I switched to another field. Taking signals and systems courses later I realized what the hell quantum mechanics was about. I find much of the time
    my initial confusion is cleared up with another go at it years later.

    The "Theoretical Minimum" books are good, they give you another crack at physics in your "senior years" :-D

    --
    Your mode of life will be changed for the better because of new developments.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From rbowman@21:1/5 to Physfitfreak on Wed Jan 10 03:42:44 2024
    On Tue, 9 Jan 2024 16:46:46 -0600, Physfitfreak wrote:

    On 1/9/2024 1:50 PM, rbowman wrote:
    The economics text was Samuelson.


    Hehe :) That was one of the texts in Tehran University's business school where one of my cousins was studying. He always thought Samuelson was a genius and told us that on many occasions. And when one of his friends
    once made an amazingly brilliant remark, from then on the cousin was
    calling him "Samuelson" :) This went on for years.

    The professor was Asian but didn't have much of an accent compared to a
    physics TA. I think she was teaching in Hindi but I'm not sure.

    This was the '60s with ITT under Geneen buying everything in sight. It
    was the start of the multinational mega-corporations, and there were others.
    One of his favorite examples was the Prima Corporation. I wasn't familiar
    with it but I got the general concept. It was towards the end of the
    semester when he used Prima, Ford, and Chevroret in a sentence and the
    dime dropped.

    Yeah, a stereotype... Chip Wilson, the founder of Lululemon, is back in
    hot water for his remarks on the DEI direction the company has taken. Well before that he admitted to taking sadistic pleasure in listening to
    Japanese customers trying to pronounce the brand name.

    But back to Samuelson... It was actually my mentor in my second job who
    taught me practical economics when it comes to engineering. That was
    during the Carter inflationary years and trying to justify a project when
    you could park your money in a bank at over 10% was difficult.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From rbowman@21:1/5 to All on Wed Jan 10 03:53:22 2024
    On Tue, 9 Jan 2024 16:27:41 -0600, candycanearter07 wrote:


    Yeah, I've seen the custom APL keyboards and they are crazy.

    A company I worked for bought an IBM 5120 which was IBM's ideal of a small business computer. It was about $20k in 1980 dollars with two 8" floppies
    and a printer. It came with BASIC and APL in ROM with a toggle to switch between them. The key caps had the APL hieroglyphics.

    I never found a use for the APL capability. It came with BRADS II which
    was sort of a bargain basement Rexsx in BASIC.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From rbowman@21:1/5 to Physfitfreak on Wed Jan 10 03:25:20 2024
    On Tue, 9 Jan 2024 16:53:30 -0600, Physfitfreak wrote:

    In Tehran University, our calculus book was a translation of the old
    timers' Granville calculus. It was excellent! Then later, an Iranian
    wrote his own calculus book which was even better, but that was
    something that some students read and learned on their own.

    https://en.wikipedia.org/wiki/Calculus_Made_Easy

    'Calculus Made Easy' Silvanus Thompson.

    It's over 100 years old but calculus hasn't changed much.

    It's rare for a mathematician to explain anything in English. I've started reading 'Before Machine Learning Volume 1 - Linear Algebra for A.I' by
    Jorge Brasil. He is a little more informal. Avoid if you're offended by
    the f-bomb. 'Linear Algebra: Theory, Intuition, Code' by Mike Cohen is
    more formal but still takes a conversational tone and explains the
    nuances. You need to build the vocabulary but starting with what seems to
    be arcane symbology isn't a good first step.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From rbowman@21:1/5 to Physfitfreak on Wed Jan 10 04:06:23 2024
    On Tue, 9 Jan 2024 17:16:56 -0600, Physfitfreak wrote:

    He was a famous physicist in Iran (nuclear physics) but not a very good teacher, mainly because of his German school background. His
    pronunciation of many physics terms confused the hell out of us. And he
    had a nasty temper on top of all that.

    Ah, Herr Doktor Wunderlich...

    http://old.polyacs.org/1001.html

    He taught chemistry 3 or maybe 4. Between the accent and the excursions
    into crystallization it was an interesting semester.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From rbowman@21:1/5 to Physfitfreak on Wed Jan 10 03:59:47 2024
    On Tue, 9 Jan 2024 16:14:37 -0600, Physfitfreak wrote:

    I keep seeing this confusion among engineers about physics as the field
    of study in natural philosophy departments of a university. With physics
    you have no slacks to allow yourself; you cannot afford to leave
    something behind in preparing yourself for what's coming. So the
    curricula are designed to make sure of that. Inclusion of college
    algebra is just one of those steps.

    The four semesters of physics was the most valuable part of my college education. It really goes back to the pre-Socratics trying to figure out
    how the world works rather than blaming the Gods. My real fascination was
    with what would become cognitive science ten or twenty years later.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From rbowman@21:1/5 to Chris Ahlstrom on Wed Jan 10 04:12:28 2024
    On Tue, 9 Jan 2024 18:22:17 -0500, Chris Ahlstrom wrote:

    In a basic mechanics class, the prof said "Why do you not understand
    these mechanics problems? They are easy!" :-D :-* :-(

    There was a heavily used phrase like 'This is inherently obvious to the
    astute observer.' which generally was greeted with WTF? It was as good as
    'That value approaches 0 for large values of theta so we'll ignore it' in
    a derivation.

    I guess it's a standard spiel, but in freshman orientation the speaker said 'Look to your right and then look to your left. One of you will graduate.' Sometimes I thought the first two years were designed to fulfill that
    prophecy.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From rbowman@21:1/5 to chrisv on Wed Jan 10 04:28:40 2024
    On Tue, 09 Jan 2024 14:40:33 -0600, chrisv wrote:

    But I don't know that computer programs need to be tested to make sure
    that they work. According to the dumbest fscking snit to ever haunt
    cola, anyway.

    I've worked with several programmers who thought testing was for the QA department. Unfortunately they also were of the opinion that nothing could possibly go wrong.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From rbowman@21:1/5 to Chris Ahlstrom on Wed Jan 10 04:24:51 2024
    On Tue, 9 Jan 2024 17:05:48 -0500, Chris Ahlstrom wrote:


    I used Algol in the early 70s over a phone line from school to a
    mainframe somewhere. A bit like Pascal.

    Algol was the antecedent. Most of Wirth's creations resembled Algol right
    down to having no or very limited I/O capabilities. I've heard unaugmented Pascal described as a language very good at telling secrets to itself.

    "There are three things a man must do / Before his life is done /
    Write 3 lines in APL / And make the buggers run."

    https://www.goodreads.com/quotes/12051-a-human-being-should-be-able-to-change-a-diaper

    I can check most of Heinlein's boxes but I'll take a pass on APL. I did
    start an MIT video course on R, which I think is sort of the 21st century
    APL. Python with the right packages can handle the same sort of tasks and
    is faster than R. That must put R in the tortoise class of programming languages.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From rbowman@21:1/5 to Physfitfreak on Wed Jan 10 04:44:36 2024
    On Tue, 9 Jan 2024 14:32:12 -0600, Physfitfreak wrote:

    I don't have experience in those areas to say much about them. But I can
    say this about each one of them: In every single area that you pointed
    at, when you attempt to add the error analysis to your program results,
    the concept of averages pops up! ...

    How's that for saying something with certainty about something I don't
    even know jack about

    Error analysis is a big part of ML both prior to and after deployment. For process/machine control, not so much. The effects are in the real world
    and are readily apparent. Problems are picked up during development.

    For the computer aided dispatch systems I've worked on for the last 20
    years I don't know if it would even be applicable. There are errors in configuration, spatial data and so forth, but it wouldn't be a statistical approach.

    I understand what you're saying but it isn't the whole programming
    universe.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Chris Ahlstrom@21:1/5 to rbowman on Wed Jan 10 07:12:09 2024
    rbowman wrote this copyrighted missive and expects royalties:

    On Tue, 9 Jan 2024 17:05:48 -0500, Chris Ahlstrom wrote:


    I used Algol in the early 70s over a phone line from school to a
    mainframe somewhere. A bit like Pascal.

    Algol was the antecedent. Most of Wirth's creations resembled Algol right down to having no or very limited I/O capabilities. I've heard unaugmented Pascal described as a language very good at telling secrets to itself.

    Yeah, I took a course in the original Pascal, not the Borland version. The size of a declared array was part of its "type".

    Years later our project group made the mistake of buying Borland C++ Builder, on the strength of the reputation of the older Borland C++ compiler, which was very good.

    Stepping through the debugger, I saw some crazy stuff going on.... it turns out that VCL was based on <gasp!> Delphi (the successor to Turbo Pascal).

    No wonder Microsoft cleaned Borland's (now Embarcadero) clock with Visual Studio (as bragged about in Jim McCarthy's book, "Dynamics of Software Development").

    "There are three things a man must do / Before his life is done /
    Write 3 lines in APL / And make the buggers run."

    https://www.goodreads.com/quotes/12051-a-human-being-should-be-able-to-change-a-diaper

    I wonder how our local transphobes would feel about Heinlein's book, "I Will Fear No Evil" :-D

    I can check most of Heinlein's boxes but I'll take a pass on APL. I did start an MIT video course on R, which I think is sort of the 21st century
    APL. Python with the right packages can handle the same sort of tasks and
    is faster than R. That must put R in the tortoise class of programming languages.

    I used to try to djinn up uses for R :-D

    --
    Never give an inch!

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Chris Ahlstrom@21:1/5 to rbowman on Wed Jan 10 06:57:36 2024
    rbowman wrote this copyrighted missive and expects royalties:

    On Tue, 9 Jan 2024 16:53:30 -0600, Physfitfreak wrote:

    In Tehran University, our calculus book was a translation of the old
    timers' Granville calculus. It was excellent! Then later, an Iranian
    wrote his own calculus book which was even better, but that was
    something that some students read and learned on their own.

    https://en.wikipedia.org/wiki/Calculus_Made_Easy

    'Calculus Made Easy' Silvanus Thompson.

    It's over 100 years old but calculus hasn't changed much.

    It's rare for a mathematician to explain anything in English. I've started reading 'Before Machine Learning Volume 1 - Linear Algebra for A.I' by
    Jorge Brasil. He is a little more informal. Avoid if you're offended by
    the f-bomb. 'Linear Algebra: Theory, Intuition, Code' by Mike Cohen is
    more formal but still takes a conversational tone and explains the
    nuances. You need to build the vocabulary but starting with what seems to
    be arcane symbology isn't a good first step.

    Can probably find old texts like that on Project Gutenberg.

    Or, for a relatively low price, from Dover Publications.

    Heh heh:

    --
    I have never let my schooling interfere with my education.
    -- Mark Twain

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From DFS@21:1/5 to shitv on Wed Jan 10 08:54:58 2024
    On 1/9/2024 3:40 PM, shitv wrote:

    But I don't know that computer programs need to be tested to make sure
    that they work. According to the dumbest fscking snit to ever haunt
    cola, anyway.

    You're lying about me again, shitbag.

    I never once said you didn't know code needed to be tested.



    -- "ALL non-idiots support the use of testing over compile-time warnings
    to determine if the code functions correctly. You're one of the few
    idiots who thinks otherwise." - DFS, stating reality


    Here we go again, for the 1000th time:

    Relf's reasonable and correct statement:
    "Testing is how you know if the code works or not, not compile-time
    warnings."

    shitv's idiocy:
    "Heh. One doesn't need to be a pro, to know how stupid that is..."


    Own your dumbassery for a change.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From chrisv@21:1/5 to rbowman on Wed Jan 10 08:06:08 2024
    rbowman wrote:

    chrisv wrote:

    But I don't know that computer programs need to be tested to make sure
    that they work. According to the dumbest fscking snit to ever haunt
    cola, anyway.

    I've worked with several programmers who thought testing was for the QA department. Unfortunately they also were of the opinion that nothing could possibly go wrong.

    That's difficult to believe.

    --
    "What's wrong with it?" - DumFSck, regarding the use of a #pragma to
    disable 17 different types of compiler warnings

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From vallor@21:1/5 to All on Thu Jan 11 00:59:02 2024
    On Wed, 10 Jan 2024 07:12:09 -0500, Chris Ahlstrom <OFeem1987@teleworm.us> wrote in <unm1ir$2fsus$1@dont-email.me>:

    rbowman wrote this copyrighted missive and expects royalties:

    On Tue, 9 Jan 2024 17:05:48 -0500, Chris Ahlstrom wrote:


    I used Algol in the early 70s over a phone line from school to a
    mainframe somewhere. A bit like Pascal.

    Algol was the antecedent. Most of Wirth's creations resembled Algol
    right down to having no or very limited I/O capabilities. I've heard
    unaugmented Pascal described as a language very good at telling secrets
    to itself.

    Yeah, I took a course in the original Pascal, not the Borland version.
    The size of a declared array was part of its "type".

    Years later our project group made the mistake of buying Borland C++
    Builder, on the strength of the reputation of the older Borland C++
    compiler, which was very good.

    Stepping through the debugger, I saw some crazy stuff going on.... it
    turns out that VCL was based on <gasp!> Delphi (the successor to Turbo Pascal).

    No wonder Microsoft cleaned Borland's (now Embarcadero) clock with
    Visual Studio (as bragged about in Jim McCarthy's book, "Dynamics of
    Software Development").

    "There are three things a man must do / Before his life is done /
    Write 3 lines in APL / And make the buggers run."

    https://www.goodreads.com/quotes/12051-a-human-being-should-be-able-to-change-a-diaper

    I wonder how our local transphobes would feel about Heinlein's book, "I
    Will Fear No Evil" :-D

    I've read it. Not his best work, but one of his most
    interesting reads. I guess if someone can write
    scientific papers about "What's it like to be a bat?",
    we can certainly have novels about "what happens when
    an old man finds themself in a woman's body...and it's
    occupied already?"

    My favorite Heinlein is the last complete novel
    he wrote before his stroke: _The Moon is a Harsh Mistress_.
    Didn't mind the Loonie patois once I got used to it.

    I can check most of Heinlein's boxes but I'll take a pass on APL. I
    did start an MIT video course on R, which I think is sort of the 21st
    century APL. Python with the right packages can handle the same sort of
    tasks and is faster than R. That must put R in the tortoise class of
    programming languages.

    I used to try to djinn up uses for R :-D

    When I write a regular program, I use perl. Can't
    decide if python is worth learning -- but the power
    of pytorch-enabled apps is certainly appealing.

    When I write a program that I know needs to
    go *scat*, though, I use C. Also use it
    for setuid-wrappers, or anything else that needs
    to be bulletproof.

    Once wrote a curses program that accepted student
    information for signing up for an account, but that
    was when I was working at college in 1992.

    (The student-access Unix host was in great demand at
    the time. Only, it wasn't Unix -- it was Linux. 🙌️ )

    Anyway, the tough part of learning C is knowing
    what's available in your libraries. For beginners
    on Linux, I recommend learning "apropos"
    as well as "man" to find what you're looking
    for.
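
    For instance, suppose you want to list the files in a directory (a made-up
    beginner task, just to show the workflow): "apropos directory" turns up,
    among a lot of other hits, opendir(3) and readdir(3), and "man 3 readdir"
    fills in the details. A minimal sketch:

    /* found via "apropos directory" -> opendir(3), readdir(3) */
    #include <dirent.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        const char *path = (argc > 1) ? argv[1] : ".";
        DIR *dir = opendir(path);
        if (dir == NULL) {
            perror("opendir");
            return 1;
        }
        struct dirent *entry;
        while ((entry = readdir(dir)) != NULL)
            printf("%s\n", entry->d_name);
        closedir(dir);
        return 0;
    }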

    Or use Eclipse. (Ducks and runs.)

    --
    -v

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From rbowman@21:1/5 to chrisv on Thu Jan 11 02:45:20 2024
    On Wed, 10 Jan 2024 08:06:08 -0600, chrisv wrote:

    rbowman wrote:

    chrisv wrote:

    But I don't know that computer programs need to be tested to make sure
    that they work. According to the dumbest fscking snit to ever haunt
    cola, anyway.

    I've worked with several programmers who thought testing was for the QA
    department. Unfortunately they also were of the opinion that nothing
    could possibly go wrong.

    That's difficult to believe.

    I'll have to introduce you to the staff of our Vietnam division. It's a different culture, to put it nicely.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From rbowman@21:1/5 to Physfitfreak on Thu Jan 11 02:42:21 2024
    On Wed, 10 Jan 2024 02:01:40 -0600, Physfitfreak wrote:

    But generally, any activity with a sought result, can have an error
    analysis associated with it. Does it involve counting? Measuring? Speed
    of something? Frequency of something? Size of something? Does something
    vary in time, size, volume, frequency? Is there some relationship
    between things varying here to other things varying somewhere else? And
    so on and so on...

    Okay, for example a simple robotic arm ignoring any grippers, tools, and
    so forth. It exists in three dimensional Cartesian space. There is a home position which is the origin 0, 0, 0 for simplicity.

    To allow for three-axis motion there are three independent stepper motors.
    You know how far one step of the motor will move the device in the X,
    Y, and Z axes and how long each step will take. You want to go to 53, 117,
    42. The clunky way would be to go 53 units along the X axis, 117 in the Y,
    and so forth. A more graceful way is to control the three steppers
    simultaneously so it looks like it is tracing a vector directly to the
    desired point. Of course, since you're in a discrete world it will be an approximation.
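
    A minimal sketch of that interleaving (assuming positive moves from the
    home position, with a made-up step_axis() standing in for whatever
    actually pulses the motor drivers):

    #include <stdio.h>
    #include <stdlib.h>

    static void step_axis(char axis) { printf("step %c\n", axis); }

    /* drive all three axes together: the longest axis sets the loop count,
     * the others accumulate fractional steps until one is due */
    static void move_to(int x, int y, int z)
    {
        int steps = abs(x);
        if (abs(y) > steps) steps = abs(y);
        if (abs(z) > steps) steps = abs(z);

        double ex = 0.0, ey = 0.0, ez = 0.0;
        for (int i = 0; i < steps; i++) {
            ex += (double)x / steps;
            ey += (double)y / steps;
            ez += (double)z / steps;
            if (ex >= 1.0) { step_axis('X'); ex -= 1.0; }
            if (ey >= 1.0) { step_axis('Y'); ey -= 1.0; }
            if (ez >= 1.0) { step_axis('Z'); ez -= 1.0; }
        }
    }

    int main(void)
    {
        move_to(53, 117, 42);   /* the target from the example above */
        return 0;
    }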

    It's deterministic. Either you get to the desired point or you don't.

    I can envision a more complex scenario. Take the old video game where you're controlling the elevation and muzzle velocity of a cannon to hit a target.
    It would ruin the game but you could measure the error distance from the
    target and feed that back into the two controllable parameters for the
    next shot. Obviously if you missed by 100' you will make grosser
    adjustments than if you missed by 1 foot and take that into account. Congratulations, you've invented a PID loop.
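
    For the curious, here is a toy version of that loop. The fire() physics
    and the gains are made up purely for illustration, and only one of the
    two parameters (muzzle velocity) gets corrected:

    #include <stdio.h>

    /* pretend the shot lands at velocity * 9.8 and the target is at 1000,
     * so the return value is the signed miss distance */
    static double fire(double velocity)
    {
        return velocity * 9.8 - 1000.0;
    }

    int main(void)
    {
        double kp = 0.05, ki = 0.01, kd = 0.02;
        double velocity = 80.0, integral = 0.0, prev_error = 0.0;

        for (int shot = 0; shot < 25; shot++) {
            double error = fire(velocity);
            double derivative = error - prev_error;
            integral += error;
            prev_error = error;

            /* PID correction fed back into the controllable parameter */
            velocity -= kp * error + ki * integral + kd * derivative;
            printf("shot %2d: velocity %7.2f, missed by %8.2f\n",
                   shot, velocity, error);
        }
        return 0;
    }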

    That's not a lot different than backprop in a neural network during
    training. You're measuring the loss and using it to adjust the weights and biases.

    In either case you're not analyzing the error after the fact. It's fed
    back into the iterative process. If you can't achieve stability, it's back
    to the drawing board.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From rbowman@21:1/5 to Chris Ahlstrom on Thu Jan 11 03:03:20 2024
    On Wed, 10 Jan 2024 07:12:09 -0500, Chris Ahlstrom wrote:

    No wonder Microsoft cleaned Borland's (now Embarcadero) clock with
    Visual Studio (as bragged about in Jim McCarthy's book, "Dynamics of
    Software Development").

    I bought Turbo Pascal for a CP/M machine. What can you lose for $50? I
    thought it was broken after the first compilation. The C compiler I was
    using got there eventually but it certainly didn't get there in a couple
    of seconds.

    Years later I bought the Borland C++/OWL package on the strength of that.
    I really liked it better than Microsoft's C++/MFC but the 700 pound
    gorilla won. The event/dispatch loop seemed a lot cleaner to me.

    Borland's biggest sin in my eyes was buying the Brief editor and burying
    it. I bought Brief back when it was still UnderWare and it was excellent.
    Of course, anything was a step up from WordStar. That came bundled with
    the Osborne and served its purpose though.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From rbowman@21:1/5 to vallor on Thu Jan 11 03:17:29 2024
    On Thu, 11 Jan 2024 00:59:02 -0000 (UTC), vallor wrote:

    When I write a regular program, I use perl. Can't decide if python is
    worth learning -- but the power of pytorch-enabled apps is certainly appealing.

    I haven't used Perl in about 20 years. I should donate my old books. Perl
    6.0 never quite happened so they're still more or less current.

    Python became Esri's scripting language of choice after VBA died and
    installing their products installs their version called ArcPy. Then when
    you throw in numpy, pytorch, and the rest of the well documented libraries
    it's hard not to like it when you don't need blazing speed. That may change
    a little. Python 3.13 is still in alpha 2 but it's getting a JIT.

    https://tonybaloney.github.io/posts/python-gets-a-jit.html

    That has the overhead of spinning up the JIT but should show improvements
    on longer running processes.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From rbowman@21:1/5 to Physfitfreak on Thu Jan 11 03:45:45 2024
    On Tue, 9 Jan 2024 23:02:41 -0600, Physfitfreak wrote:


    But logically, they say, the best calculus book was the one written by Apostol of Caltech. To logically advance forward, he doesn't start with differentiation before discussing integration, but he covers integration first. Understanding integration first, and from it, the differentiation
    is easier than going from differentiation to integration as is done in
    almost all calculus texts. Historically also, it was integration that
    was invented first (by Newton, or Leibniz, depending on which one you believe).

    iirc I started with integration. The classic was 'how do you find the area under a curve? Well, if you draw these little rectangles you can
    approximate it. If you make the rectangles really, really narrow...' and
    off to the races. One problem I recall was 'given a rectangular plate
    with these dimensions in the side of a tank, three feet below the surface
    of the water, what is the force on the plate?' Stuff like that made more
    sense than 'what is the slope of the curve? How rapidly is the slope of
    the curve changing?'
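
    For anyone rusty, that plate problem works out cleanly if you leave the
    unspecified dimensions symbolic: take a plate of width $w$ and height $h$
    with its top edge a depth $d$ below the surface (the recalled 'three feet'
    would set $d$), and let $\gamma = \rho g$ be the weight density of water,
    about 62.4 lb/ft^3. Then

    \[
    F \;=\; \int_{d}^{d+h} \gamma \, y \, w \, dy
      \;=\; \frac{\gamma w}{2}\left[(d+h)^2 - d^2\right].
    \]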


    It is rare that outside school and especially in a computer programming
    job, you'd need anything beyond calculus, because as soon as the need
    for forming and solving differential equations come up, somebody else
    has already solved it before the task even gets down to a programmer.

    Unfortunately I have an overdeveloped curiosity. For example, in a PyTorch tutorial it might say 'We'll use a ReLU activation function', which is one
    line of code. WTF is a ReLU? Nothing really magical, you just toss any
    value less than 0 so you don't wind up dithering around a point or blowing
    up the gradient, but what happens under the hood is conveniently
    abstracted away. I'm all for abstraction but there is a point when you
    become a monkey programmer that will be replaced by AI next month. They
    are already working on that with NAS although it's very resource
    intensive:

    https://en.wikipedia.org/wiki/Neural_architecture_search

    Now there is an area where you'd want to do some sort of analysis.
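
    And since the ReLU came up: stripped of the framework magic it really is
    a one-liner (a trivial sketch, not PyTorch's actual implementation):

    #include <stdio.h>

    /* rectified linear unit: anything below zero is thrown away,
     * everything else passes through unchanged */
    static double relu(double x)
    {
        return x > 0.0 ? x : 0.0;
    }

    int main(void)
    {
        double samples[] = { -2.0, -0.5, 0.0, 0.7, 3.1 };
        for (int i = 0; i < 5; i++)
            printf("relu(%.1f) = %.1f\n", samples[i], relu(samples[i]));
        return 0;
    }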

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From rbowman@21:1/5 to Chris Ahlstrom on Thu Jan 11 04:08:03 2024
    On Wed, 10 Jan 2024 06:57:36 -0500, Chris Ahlstrom wrote:

    Can probably find old texts like that on Project Gutenberg.

    https://www.gutenberg.org/ebooks/33283

    A true Gnu person will download the TeX version.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From rbowman@21:1/5 to Physfitfreak on Thu Jan 11 04:01:25 2024
    On Wed, 10 Jan 2024 18:16:38 -0600, Physfitfreak wrote:


    He believes linear algebra should be taught before any calculus is
    taught. Therefore, I suspect he's aiming at pure applicaton-oriented discussion of it; as, logically, linear algebra can be thoroughly
    understood after completely understanding calculus of single and multivariables.


    What is the best prep for AI? is a common argument.

    Computers do not replace what's required of students and professionals
    and scientists. This is that important point!

    Yeah but I'll stick with computers. In a far distant era I had a slim,
    pocket-sized book of nine-place tables. That's so far gone a google search
    only turns up dining room tables or kids' multiplication tables. That, and
    an ivory-faced bamboo slide rule with a magnifier, and you could conquer the world.

    Sad to think the US put a man on the moon with little better and now the Peregrine can't even get a lander there without running out of fuel. DDS.
    India did it on the second attempt with a shoestring budget.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Chris Ahlstrom@21:1/5 to rbowman on Thu Jan 11 07:02:22 2024
    rbowman wrote this copyrighted missive and expects royalties:

    On Wed, 10 Jan 2024 06:57:36 -0500, Chris Ahlstrom wrote:

    Can probably find old texts like that on Project Gutenberg.

    https://www.gutenberg.org/ebooks/33283

    A true Gnu person will download the TeX version.

    I've partially read that one. The PDF, since I was using a small Android tablet. Also useful for reading Barnes & Noble purchases, though B&N stopped supporting their marketplace on their Android Nook app. Have to order from another device, then load it up on the Nook app. The latest book I ordered was Philip Farmer's "Flesh", which was actually better than I had feared.

    --
    Good day to let down old friends who need help.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Chris Ahlstrom@21:1/5 to vallor on Thu Jan 11 06:56:13 2024
    vallor wrote this copyrighted missive and expects royalties:

    (The student-access Unix host was in great demand at
    the time. Only, it wasn't Unix -- it was Linux. 🙌️ )

    Anyway, the tough part to learning C is knowing
    what's available in your libraries. For beginners
    on Linux, I recommend learning "apropos"
    as well as "man" to find what you're looking
    for.

    Or use Eclipse. (Ducks and runs.)

    Michael Kerrisk's "The Linux Programming Interface" is great.
    I've used it heavily, especially for writing daemons.
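
    For anyone curious what that involves, the usual skeleton is short. A
    bare-bones sketch of the standard double-fork dance (not lifted from the
    book, and skipping signal handling, pid files, and logging):

    #include <fcntl.h>
    #include <sys/stat.h>
    #include <unistd.h>

    static int daemonize(void)
    {
        switch (fork()) {               /* first fork: parent returns to the shell */
        case -1: return -1;
        case 0:  break;
        default: _exit(0);
        }

        if (setsid() == -1)             /* new session, no controlling terminal */
            return -1;

        switch (fork()) {               /* second fork: can't reacquire a terminal */
        case -1: return -1;
        case 0:  break;
        default: _exit(0);
        }

        umask(0);
        if (chdir("/") == -1)
            return -1;

        int fd = open("/dev/null", O_RDWR);   /* detach the standard fds */
        if (fd == -1)
            return -1;
        dup2(fd, STDIN_FILENO);
        dup2(fd, STDOUT_FILENO);
        dup2(fd, STDERR_FILENO);
        if (fd > STDERR_FILENO)
            close(fd);
        return 0;
    }

    int main(void)
    {
        if (daemonize() == -1)
            return 1;
        sleep(60);                      /* stand-in for the real daemon work */
        return 0;
    }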

    But no need for a fancy-assed IDE. I used vim/gvim, xfce4-terminal windows, tmux, cgdb/gdb, and automake (though I've started another project using meson, which is what I will use for all new projects).

    With 300-page manuals written using vi and LaTeX.

    I do use QtCreator to make the GUIs, though. Lazy-ass!

    With SomaFM playing via mpd/ncmpcpp to keep me sane.

    --
    Q: What's the difference between USL and the Graf Zeppelin?
    A: The Graf Zeppelin represented cutting edge technology for its time.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From rbowman@21:1/5 to Physfitfreak on Tue Jan 16 01:21:19 2024
    On Mon, 15 Jan 2024 17:20:51 -0600, Physfitfreak wrote:

    Then just 4 years back (when Covid hit) I began reading and using a C++
    book out of curiosity and having nothing better to do. I don't remember
    the title of it (it was a pdf). In there, for the first time, I saw the correct picture of how multidimensional arrays are handled in the
    computer and how to use pointers with clear understanding of them.

    https://www.amazon.com/Puzzle-Book-Alan-R-Feuer/dp/0201604612

    iirc the first edition came out in the '80s and used the K&R style. I
    assume it's been updated. Again my memory may be fuzzy but I think many of
    the puzzles involved pointers and arrays. You know you're in the deep end
    when you see

    int foo = ***ptr;

    A friend and I used to argue whether it should be 'char** argv' or 'char* argv[]'. I always used 'char** argv'. I outlived him so I win.

    https://c-faq.com/aryptr/aryptrparam.html
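
    For anyone following along at home, the argument is moot inside a
    parameter list anyway, since an array parameter decays to a pointer
    (a minimal demo, not from the puzzle book):

    #include <stdio.h>

    static void show_a(int argc, char **argv)
    {
        for (int i = 0; i < argc; i++)
            printf("%s\n", argv[i]);
    }

    static void show_b(int argc, char *argv[])   /* same type as show_a's argv */
    {
        for (int i = 0; i < argc; i++)
            printf("%s\n", argv[i]);
    }

    int main(int argc, char **argv)
    {
        show_a(argc, argv);
        show_b(argc, argv);   /* no cast needed; the parameter types are identical */
        return 0;
    }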

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From rbowman@21:1/5 to Physfitfreak on Tue Jan 16 02:23:38 2024
    On Mon, 15 Jan 2024 16:40:13 -0600, Physfitfreak wrote:


    Glad to hear that I have 4 years of fond memories of successfully using
    it (mid 1980s).

    If you're feeling nostalgic

    https://geodesy.noaa.gov/PC_PROD/SPCS83/

    I edited the makefile to substitute f95 for the Windows paths and it
    compiled to a working utility. They have updated to the interactive NCAT
    but the geodetic software is still in Fortran.

    For reference geodetic coordinates are in the form of latitude/longitude.
    The assumption is made that the longitude is positive although it should
    be negative in the western hemisphere. SPCS is a localized projection to reduce distortion for a particular zone.

    The comments say many of the subroutines were written by T. Vincenty.

    https://en.wikipedia.org/wiki/Thaddeus_Vincenty

    I've used pieces after rewriting them in C. There is an f2c program that converts f77 to C. While it may compile, you don't want to try to read
    the C code or maintain it.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From rbowman@21:1/5 to Physfitfreak on Tue Jan 16 06:09:06 2024
    On Mon, 15 Jan 2024 20:23:49 -0600, Physfitfreak wrote:


    I don't think AI is ready to get worked on yet. What they're doing seems
    to be just making a more efficient Wikipedia lookup. It still cannot
    think.

    AI is an unfortunate choice of words. I prefer machine learning.

    https://en.wikipedia.org/wiki/Machine_learning

    That's as good an overview as any. In the early '80s neural networks were oversold, in part because the processing power wasn't there. The focus
    switched to expert systems which are relatively deterministic. Fuzzy logic introduced a stochastic factor. There still wasn't much intelligence. The current approaches are back to neural networks. I'd say the intelligence
    part comes from not being able to say what the system is doing exactly.
    You can train a network to recognize cats from dogs in the classic 'hello world' of image recognition. You present a lot of images labeled cat or
    dog. In truth these are only images to the human eye. They're presented to
    the neural net as matrices of pixel values and all operations are done on matrices. During the training phase you compare the output against the
    label (cat or dog) to compute an error rate, and feed it back into the weights and
    biases of the network. Eventually you get an acceptable system that can
    tell a cat from a dog 99% of the time but you're not really sure how. Is
    it intelligent?
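
    To make that feedback loop concrete, here is the same idea shrunk down to
    one 'neuron' with one weight and one bias, on made-up one-dimensional data
    instead of cat and dog images (compile with -lm):

    #include <math.h>
    #include <stdio.h>

    static double sigmoid(double z) { return 1.0 / (1.0 + exp(-z)); }

    int main(void)
    {
        /* toy data: inputs below 0.5 are labeled 0, above are labeled 1 */
        double x[]     = { 0.1, 0.2, 0.3, 0.6, 0.8, 0.9 };
        double label[] = { 0.0, 0.0, 0.0, 1.0, 1.0, 1.0 };
        int n = 6;
        double w = 0.0, b = 0.0, rate = 1.0;

        for (int epoch = 0; epoch < 5000; epoch++) {
            double dw = 0.0, db = 0.0;
            for (int i = 0; i < n; i++) {
                double out = sigmoid(w * x[i] + b);
                double err = out - label[i];   /* compare output to label */
                dw += err * x[i];              /* gradient for the weight */
                db += err;                     /* gradient for the bias */
            }
            w -= rate * dw / n;                /* feed the error back */
            b -= rate * db / n;
        }

        for (int i = 0; i < n; i++)
            printf("x=%.1f label=%.0f predicted=%.2f\n",
                   x[i], label[i], sigmoid(w * x[i] + b));
        return 0;
    }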

    Expand that to facial recognition, obstacle recognition for autonomous vehicles, and so forth, do you ever get to intelligence? I don't think so although you do have a very useful system that would be difficult to do
    with deterministic programming.

    Another example is anomaly detection with unsupervised learning. You hook
    a number of sensors to a turbine for example, temperature sensors, accelerometers, microphones, and so forth. Turbines rarely fail and
    you're not sure how to detect potential failures. The network 'learns' the normal operating parameters and can classify non-normal conditions and
    alert the maintenance people. Intelligent? The same concept is starting to
    be applied to medical devices.


    First, and most importantly, one has to come up with a way to give a
    robot a rewarding system. If this important step is taken already, I do
    not know about that at all. It's not a trivial matter and I don't know
    how one would even try doing that. This fact is right at the core of the problem.

    Robots aren't white rats. Will they ever be to the point where you can
    toss them a chunk of cheese for a job well done? Maybe in Isaac Asimov's
    world; I'm not sure about this one.

    https://girlfriend.myanima.ai/

    That's not the only site and it's stuff like that which scares me. I've
    read too many sci-fi scenarios where most of the population spends its
    time in virtual worlds.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From candycanearter07@21:1/5 to Physfitfreak on Wed Jan 17 09:46:28 2024
    On 1/16/24 15:42, Physfitfreak wrote:
    On 1/16/2024 12:09 AM, rbowman wrote:
    The
    current approaches are back to neural networks. I'd say the intelligence
    part comes from not being able to say what the system is doing exactly.
    You can train a network to recognize cats from dogs in the classic 'hello
    world' of image recognition. You present a lot of images labeled cat or
    dog. In truth these are only images to the human eye. They're
    presented to
    the neural net as matrices of pixel values and all operations are done on
    matrices. During the training phase you compare the output against the
    label (cat or dog) to error rate, and feed it back into the weights and
    biases of the network. Eventually you get an acceptable system that can
    tell a cat from a dog 99% of the time but you're not really sure how. Is
    it intelligent?


    Intelligence is a down-the-road consequence of having a rewarding
    mechanism. This was what I was trying to point at. It is not time to
    concentrate on creating intelligence unless your aim is not to create AI
    at all, but to create a much more efficient _tool_. A tool that _you_
    use, not the AI.

    AI, as a robot that can think, do, decide, and so on, requires a
    rewarding mechanism first. With the rewarding mechanism in place, the
    robot itself will become extremely intelligent all by himself, and very
    fast. You wouldn't have to tell him things and label them for him. He'll
    know what's best for him to do and he will do it, one of them being
    getting intelligent enough, because a more intelligent robot can achieve
    the goals that his rewarding system demands, much better than a newbie
    robot. So robots, themselves, will go for learning stuff they need.

    As you see, I don't think the purpose has been to create a machine like
    that, because humans cannot compete with them. Instead, you guys are yet again creating another tool to do what _you_ want. Something that
    pursues _your_ goals, not theirs.

    So this AI business, to me, is just another typical sham. You guys don't
    mean what you say.

    Agreed, it seems to be marketed as a way to "avoid work", but that work
    and learning is the point of art. I'm more worried about execs and
    scammers who want money to abuse artists further.
    --
    user <candycane> is generated from /dev/urandom

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)