GNU/Linux, despite all the candy-ass distros, is intended for
programmers (not coders).
A programmer is knowledgeable in computer science, which requires
a deep knowledge of mathematics, including probability mathematics.
Solve this problem (then hand it to the lackey code monkeys):
In a certain town, a taxi cab sideswipes a parked car then flees.
This is a big crime.
In this town, there are only two taxi cab companies: Blue and Green,
with Blue operating 15% of the taxi cabs.
An eye witness says that he saw a Blue taxi cab do the crime, but
this eyewitness is known to be reliable only 80% of the time.
What is the probability that a Blue taxi cab committed the crime?
Tyrone <none@none.none> wrote:
Mathematics has nothing to do with programming.
Tell that to my college. My highest math in high school was the advanced
algebra/trig class, which is basically where I began in college: five
credits that semester, five more for precalculus in my second semester,
and *then* the first math that even *counted* toward my computer science
major, calculus 1 in summer school. I completed calc 2 in my third
semester, before dropping out to pursue my drug career.
80%.
Mathematics has nothing to do with programming.
That's right, still 15%.
GNU/Linux, despite all the candy-ass distros, is intended for
programmers (not coders).
A programmer is knowledgeable in computer science, which requires
a deep knowledge of mathematics, including probability mathematics.
Solve this problem (then hand it to the lackey code monkeys):
In a certain town, a taxi cab sideswipes a parked car then flees.
This is a big crime.
In this town, there are only two taxi cab companies: Blue and Green,
with Blue operating 15% of the taxi cabs.
An eye witness says that he saw a Blue taxi cab do the crime, but
this eyewitness is known to be reliable only 80% of the time.
What is the probability that a Blue taxi cab committed the crime?
GNU/Linux is for highly skilled programmers (not coders).
If you cannot solve this problem then get the fuck out of here.
On Sun, 7 Jan 2024 15:56:10 -0600, Physfitfreak wrote:
80%.
No. The answer is 41%.
The first step is to determine the outcome space. There
are 4 possible outcomes:
1) Cab is Blue and witness is correct.
2) Cab is Green and witness is correct.
3) Cab is Blue and witness is not correct.
4) Cab is Green and witness is not correct.
Now determine the probabilities for each outcome and
then apply the basic relation for conditional probabilities:
https://www.freecodecamp.org/news/content/images/2020/07/Screenshot-2020-07-19-at-22.58.48.png
Here, P(A|B) = P(Blue Cab | Witness Correct)
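For anyone who wants to check the arithmetic, here is a rough Python sketch
of that calculation (the variable names are mine, not part of the problem):

    # Bayes' rule for the cab problem: 15% of cabs are Blue, witness right 80% of the time.
    p_blue = 0.15                      # prior: share of cabs that are Blue
    p_green = 1.0 - p_blue             # prior: share of cabs that are Green
    p_say_blue_if_blue = 0.80          # witness says "Blue" when the cab really is Blue
    p_say_blue_if_green = 0.20         # witness says "Blue" when the cab is actually Green

    # Total probability that the witness says "Blue"
    p_say_blue = p_say_blue_if_blue * p_blue + p_say_blue_if_green * p_green

    # P(cab was Blue | witness says "Blue")
    posterior = p_say_blue_if_blue * p_blue / p_say_blue
    print(round(posterior, 3))         # 0.414, i.e. about 41%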
On 1/7/2024 1:48 PM, Lester Thorpe wrote:
GNU/Linux, despite all the candy-ass distros, is intended for
everyone
programmers (not coders).
Only a frustrated, failed programmer would make up such a false distinction.
In the real world: programming = coding = developing = engineering
A programmer is knowledgeable in computer science, which requires
a deep knowledge of mathematics, including probability mathematics.
Solve this problem (then hand it to the lackey code monkeys):
No programming knowledge is required for this problem.
In a certain town, a taxi cab sideswipes a parked car then flees.
This is a big crime.
In this town, there are only two taxi cab companies: Blue and Green,
with Blue operating 15% of the taxi cabs.
An eye witness says that he saw a Blue taxi cab do the crime, but
this eyewitness is known to be reliable only 80% of the time.
What is the probability that a Blue taxi cab committed the crime?
15%
Had you asked what is the probability that the eyewitness correctly
reported a blue taxi cab, that would be 80%.
On Monday, January 8, 2024 at 9:45:14 AM UTC-5, DFS wrote:
You asked:
"What is the probability that a Blue taxi cab committed the crime?"
And that answer is 15%.
Nope. Fail again.
If no one had come forth as a witness, then yes.
But a witness, with a known 80% reliability, claimed that the cab was Blue. This adds a whole new dimension.
The problem is similar to the cancer-test problem.
A person has a cancer test that indicates a 90% probability of being
positive (has cancer). There is thus a 10% chance of not having
cancer.
But the test is known to fail 5% of the time, giving a false positive.
What, then, is the actual probability of having cancer?
Probably (no pun intended) 99% of all college graduates cannot
give the correct answer which reveals just how useless the
education system is in the USA.
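As stated, the cancer version is missing one number: the base rate of the
disease among those tested. A sketch in the same spirit, with an assumed 1%
prevalence (my number, not the poster's), shows why the answer is not simply 90%:

    # Same Bayes machinery for the cancer test. The prevalence is an assumption
    # made only for illustration; the problem above does not give one.
    prevalence = 0.01                  # ASSUMED: 1% of people tested actually have cancer
    sensitivity = 0.90                 # reading "90%" as P(positive | cancer)
    false_positive = 0.05              # P(positive | no cancer)

    p_positive = sensitivity * prevalence + false_positive * (1.0 - prevalence)
    p_cancer_given_positive = sensitivity * prevalence / p_positive
    print(round(p_cancer_given_positive, 3))   # about 0.154 with these assumed numbers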
Lord Master wrote this copyrighted missive and expects royalties:
On Monday, January 8, 2024 at 9:45:14 AM UTC-5, DFS wrote:
You asked:
"What is the probability that a Blue taxi cab committed the crime?"
And that answer is 15%.
Nope. Fail again.
If no one had come forth as a witness, then yes.
But a witness, with a known 80% reliability, claimed that the cab was Blue. This adds a whole new dimension.
The problem is similar to the cancer-test problem.
A person has a cancer test that indicates a 90% probability of being
positive (has cancer). There is thus a 10% chance of not having
cancer.
But the test is known to fail 5% of the time, giving a false positive.
What, then, is the actual probability of having cancer?
Probably (no pun intended) 99% of all college graduates cannot
give the correct answer which reveals just how useless the
education system is in the USA.
Bayes theorem is a good antidote.
Didn't bite for your time-waster, but did look at the Wikipedia entry for the "Monty Hall Problem", and what a long article it turned out to be. Informative
on more than one aspect of decision-making.
On Monday, January 8, 2024 at 10:35:26 AM UTC-5, Chris Ahlstrom wrote:
Bayes theorem is a good antidote.
Probability theory is the basis for statistics, and statistics is extremely important in computer programming.
Lord Master wrote this copyrighted missive and expects royalties:
On Monday, January 8, 2024 at 10:35:26 AM UTC-5, Chris Ahlstrom wrote:
Bayes theorem is a good antidote.
Probability theory is the basis for statistics, and statistics is
extremely important in computer programming.
Also useful in advertising, except it isn't used; could lend actual likelihoods or risk/benefit to the results of those random-named drugs.
Reminds me, I need to take my Dammitol.
Statistics always made me uneasy. The goal of the class appeared to be how many widgets per thousand you needed to test to ensure only 5% of the
output was defective.
Programmers deal with data routinely and thus they should be masters of statistics (with its probability basis).
That's why it is taught in the first year of college, usually as the last part
of the college algebra course, which often falls in the second semester.
Yes, we had a FORTRAN 4 course as well. Back then they were telling us,
"C is the language of the future", and yet they didn't teach it to us!
Sorry, anywhere I said "modern algebra" I meant to say, "college
algebra". Modern algebra is something else, pure math, and an optional
course for those who want to take it.
College algebra, on the other hand, is an absolute must to learn before
anything else is done in physics. A light cover of it is done in high
schools, but the nice, full treatment it gets in universities under the
"college algebra" course is absolutely essential. From the beginning of
that text to its end (which is, in fact, probability theory) a student
should not leave _anything_ uncomprehended, because literally every
concept in it will soon be used.
In finding averages, the knowledge of integrals is often required, and
where in the programming world do you not encounter the concept of
"averages"? It's all over. So you must know how to do integrals.
If you really haven't seen a need for knowing this material, then you
have been, as Farley named it, "Code Monkeys". And I'm not joking.
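For what it's worth, here is a small Python sketch of that "average as an
integral" point, approximating the integral numerically (the exponential
density is just an arbitrary example of mine):

    # The average of a quantity with density p(x) is E[x] = integral of x * p(x) dx.
    import numpy as np

    x, dx = np.linspace(0.0, 10.0, 10_001, retstep=True)
    p = np.exp(-x)                 # unnormalised example density
    p /= p.sum() * dx              # normalise so it integrates to ~1
    mean = (x * p).sum() * dx      # numerical integral of x * p(x)
    print(round(mean, 3))          # close to 1 for a unit-rate exponential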
Much of what you say (in general) goes above my head. And if you were in university in 1965 trying to learn programming, then you were from a generation that was 10 years ahead of me.
My exposure to computer programming began at UTD in 1980 in graduate
school, and the only computer programming course that was taught as part
of physics department courses was this FORTRAN 4 course which I never
took. When I wanted, first time, to take a programming course it was
Summer and they didn't offer that course in Summer, so instead, I took a
PL/I course in computer science which was I think still part of the math department.
In that course, taught by an extremely pedantic professor (and ardent
teacher), I heard of C for the first time, and, as I mentioned above,
he referred to it as "the programming language of the future". But the
funny thing was that thereafter, I never heard of a C programming
course in any of the physics or math or computer departments in that
school.
But soon, I saw the probable reasons why they didn't teach C. As soon as
I began preparing for writing scientific programs for my own projects, I realized that literally everything that I looked into was in FORTRAN.
There were all these bits and pieces of various subroutines that were
being used all the time by scientists, some lingering around since the 1960s
(they had dates!), which were all in FORTRAN. So I put the PL/I aside
and read a FORTRAN 77 (I believe) text from beginning to end to start
writing in that language. Or was it FORTRAN 9? Or 99? I'm not sure
anymore. Everybody in the department was writing their programs in
FORTRAN.
About Wirth, all I can say is that even in the dead comp.programming
forum there was a sudden rush of posts about his recent passing. So he must've indeed been an important guy in that field.
Modern Algebra is totally abstract. An optional thing. It is a pure math course often taught only in math departments, but sometimes some
students (including myself) went there and took it. I almost never used
it in anything. It only gets a bit of its application in physics in
advanced graduate courses, and even then in only some subjects. It deals
with groups, rings, etc.
On Mon, 8 Jan 2024 22:30:01 -0600, Physfitfreak wrote:
My exposure to computer programming began at UTD in 1980 in graduate
school, and the only computer programming course that was taught as part
of physics department courses was this FORTRAN 4 course which I never
took. When I wanted, first time, to take a programming course it was
Summer and they didn't offer that course in Summer, so instead, I took a
PL/I course in computer science which was I think still part of the math
department.
PL/I was weird. It was supposed to be a Swiss Army knife to replace
Fortran and Cobol, inheriting the quirks of both. It was disliked by both Fortran and Cobol programmers.
The freshman year started with Thomas' 'Calculus and Analytic Geometry'.
Programmers deal with data routinely and thus they should be
masters of statistics (with its probability basis).
For example, the very first thing when analyzing data is to
determine what type of distribution the data follow.
This involves applying the Kolmogorov–Smirnov test:
https://en.wikipedia.org/wiki/Kolmogorov%E2%80%93Smirnov_test
Now, some "programmers" could resort to "R" to perform the
K-S test,
but who wrote the R code? Ultimately, the buck
has to stop somewhere and that is where we find the REAL
PROGRAMMER.
And that mutherfucker had better be a master at numerical
analysis as well as statistics,
otherwise we'll have junk code everywhere -- and
the pseudo-programmers
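For the curious, a one-sample K-S test is only a few lines in Python with
SciPy; the synthetic sample below just stands in for real data:

    # One-sample Kolmogorov-Smirnov test: is this sample consistent with N(0, 1)?
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    sample = rng.normal(loc=0.0, scale=1.0, size=500)   # stand-in for real data

    statistic, p_value = stats.kstest(sample, "norm")   # reference: standard normal
    print(statistic, p_value)
    # A small p-value is evidence the data do not follow the reference
    # distribution; a large one means the test found no such evidence.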
On 1/9/24 01:30, rbowman wrote:
On Mon, 8 Jan 2024 22:30:01 -0600, Physfitfreak wrote:
My exposure to computer programming began at UTD in 1980 in graduate
school, and the only computer programming course that was taught as
part of physics department courses was this FORTRAN 4 course which I
never took. When I wanted, first time, to take a programming course it
was Summer and they didn't offer that course in Summer, so instead, I
took a PL/I course in computer science which was I think still part of
the math department.
PL/I was weird. It was supposed to be a Swiss Army knife to replace
Fortran and Cobol, inheriting the quirks of both. It was disliked by
both Fortran and Cobol programmers.
I've never heard of PL/I, was it any good?
On 1/9/2024 1:41 AM, rbowman wrote:
The freshman year started with Thomas' 'Calculus and Analytic
Geometry'.
How tf can you remember the author and title of a math book from 55
years ago?
some dumb fsck wrote:
How tf can you remember the author and title of a math book from 55
years ago?
Okay, I cheated and looked up the exact title, but we always referred to
the calculus text as 'Thomas' and the 2-volume physics text as 'Resnick &
Halliday'. R&H morphed into H&R along the way, probably in 1970.
https://en.wikipedia.org/wiki/Fundamentals_of_Physics
The economics text was Samuelson. Unfortunately he was a Keynesian. For
trivia, Larry Summers is his nephew. His father was an economist too and
had changed the family name.
The chemistry text was sort of a spiral-bound thing by Bunce, one of the
professors. AFAIK that never was published outside of RPI.
I don't have a clue for any of the others like the diff-e or statistics
texts.
On Tue, 9 Jan 2024 08:24:19 -0600, candycanearter07 wrote:
On 1/9/24 01:30, rbowman wrote:
On Mon, 8 Jan 2024 22:30:01 -0600, Physfitfreak wrote:
My exposure to computer programming began at UTD in 1980 in graduate
school, and the only computer programming course that was taught as
part of physics department courses was this FORTRAN 4 course which I
never took. When I wanted, first time, to take a programming course it
was Summer and they didn't offer that course in Summer, so instead, I
took a PL/I course in computer science which was I think still part of
the math department.
PL/I was weird. It was supposed to be a Swiss Army knife to replace
Fortran and Cobol, inheriting the quirks of both. It was disliked by
both Fortran and Cobol programmers.
I've never heard of PL/I, was it any good?
Not particularly. One of the problems was that the early compilers sucked. It
was sort of the Ada of its day, all things to all men, with the definition
of the language evolving. By the time IBM got through fiddling around it
had been superseded by C outside of IBM. DEC used it too.
It was influenced by Algol 60 but then you can make the case that
everything was influenced by Algol.
CPL never quite made it but it was
the basis for BCPL which became B which became C. Along the way curly
braces and so forth replaced the BEGIN ... END BLOCK sort of structure.
IBM was never bashful:
PL/I Programming Language One
APL A Programming Language
APL has held up better despite having a strange character set that
usually required a special keyboard. Some of its quirks persist in R, like
the <- assignment operator.
No, linear algebra is another rather advanced course (offered in third
year undergraduate) in math that is taught in physics department. It has
a lot of applications in physics.
Physfitfreak wrote:
No, linear algebra is another rather advanced course (offered in third
year undergraduate) in math that is taught in physics department. It has
a lot of applications in physics.
I took a pair of optional Linear Algebra classes at the tail end of my
EE program. They were the toughest classes of the entire curriculum.
The prof didn't use the textbook at all!
Those classes damn near kicked my ass. After getting my ass handed to
me in the first test, I considered dropping the class, not wanting to
have my GPA slaughtered by a D. I talked to the prof and he convinced
me to stick it out. I ended up getting a B and (shockingly) an A...
On 1/9/2024 1:50 PM, rbowman wrote:
The economics text was Samuelson.
Hehe :) That was one of the texts in Tehran University's business school where one of my cousins was studying. He always thought Samuelson was a genius and told us that on many occasions. And when one of his friends
once made an amazingly brilliant remark, from then on the cousin was
calling him "Samuelson" :) This went on for years.
Yeah, I've seen the custom APL keyboards and they are crazy.
In Tehran University, our calculus book was a translation of the old
timers' Granville calculus. It was excellent! Then later, an Iranian
wrote his own calculus book which was even better, but that was
something that some students read and learned on their own.
He was a famous physicist in Iran (nuclear physics) but not a very good teacher, mainly because of his German school background. His
pronunciation of many physics terms confused the hell out of us. And he
had a nasty temper on top of all that.
I keep seeing this confusion among engineers about physics as the field
of study in natural philosophy departments of a university. With physics
you have no slack to allow yourself; you cannot afford to leave
something behind in preparing yourself for what's coming. So the
curricula are designed to make sure of that. Inclusion of college
algebra is just one of those steps.
In a basic mechanics class, the prof said "Why do you not understand
these mechanics problems? They are easy!" :-D :-* :-(
But I don't know that computer programs need to be tested to make sure
that they work. According to the dumbest fscking snit to ever haunt
cola, anyway.
I used Algol in the early 70s over a phone line from school to a
mainframe somewhere. A bit like Pascal.
"There are three things a man must do / Before his life is done /
Write 3 lines in APL / And make the buggers run."
I don't have enough experience in those areas to say much about them. But I
can say this about each one of them: in every single area that you pointed
at, when you attempt to add the error analysis to your program results,
the concept of averages pops up! ...
How's that for saying something with certainty about something I don't
even know jack about?
On Tue, 9 Jan 2024 17:05:48 -0500, Chris Ahlstrom wrote:
I used Algol in the early 70s over a phone line from school to a
mainframe somewhere. A bit like Pascal.
Algol was the antecedent. Most of Wirth's creations resembled Algol right down to having no or very limited I/O capabilities. I've heard unaugmented Pascal described as a language very good at telling secrets to itself.
"There are three things a man must do / Before his life is done /https://www.goodreads.com/quotes/12051-a-human-being-should-be-able-to-change-a-diaper
Write 3 lines in APL / And make the buggers run."
I can check most of Heinlein's boxes but I'll take a pass on APL. I did
start an MIT video course on R which I think is sort of the 21st century
APL. Python with the right packages can handle the same sort of tasks and
is faster than R. That must put R in the tortoise class of programming
languages.
On Tue, 9 Jan 2024 16:53:30 -0600, Physfitfreak wrote:
In Tehran University, our calculus book was a translation of the old
timers' Granville calculus. It was excellent! Then later, an Iranian
wrote his own calculus book which was even better, but that was
something that some students read and learned on their own.
https://en.wikipedia.org/wiki/Calculus_Made_Easy
'Calculus Made Easy' Silvanus Thompson.
It's over 100 years old but calculus hasn't changed much.
It's rare for a mathematician to explain anything in English. I've started reading 'Before Machine Learning Volume 1 - Linear Algebra for A.I' by
Jorge Brasil. He is a little more informal. Avoid if you're offended by
the f-bomb. 'Linear Algebra: Theory, Intuition, Code' by Mike Cohen is
more formal but still takes a conversational tone and explains the
nuances. You need to build the vocabulary but starting with what seems to
be arcane symbology isn't a good first step.
But I don't know that computer programs need to be tested to make sure
that they work. According to the dumbest fscking snit to ever haunt
cola, anyway.
-- "ALL non-idiots support the use of testing over compile-time warnings
to determine if the code functions correctly. You're one of the few
idiots who thinks otherwise." - DFS, stating reality
chrisv wrote:
But I don't know that computer programs need to be tested to make sure
that they work. According to the dumbest fscking snit to ever haunt
cola, anyway.
I've worked with several programmers who thought testing was for the QA
department. Unfortunately they also were of the opinion that nothing could
possibly go wrong.
rbowman wrote this copyrighted missive and expects royalties:
On Tue, 9 Jan 2024 17:05:48 -0500, Chris Ahlstrom wrote:
I used Algol in the early 70s over a phone line from school to a
mainframe somewhere. A bit like Pascal.
Algol was the antecedent. Most of Wirth's creations resembled Algol
right down to having no or very limited I/O capabilities. I've heard
unaugmented Pascal described as a language very good at telling secrets
to itself.
Yeah, I took a course in the original Pascal, not the Borland version.
The size of a declared array was part of its "type".
Years later our project group made the mistake of buying Borland C++
Builder, on the strength of the reputation of the older Borland C++
compiler, which was very good.
Stepping through the debugger, I saw some crazy stuff going on.... it
turns out that VCL was based on <gasp!> Delphi (the successor to Turbo Pascal).
No wonder Microsoft cleaned Borland's (now Embarcadero) clock with
Visual Studio (as bragged about in Jim McCarthy's book, "Dynamics of
Software Development").
"There are three things a man must do / Before his life is done /https://www.goodreads.com/quotes/12051-a-human-being-should-be-able-to- change-a-diaper
Write 3 lines in APL / And make the buggers run."
I wonder how our local transphobes would feel about Heinlein's book, "I
Will Fear No Evil" :-D
I can check most of Heinlein's boxes but I'll take a pass on APL. I
did start an MIT video course on R which I think is sort of the 21st
century APL. Python with the right packages can handle the same sort of
tasks and is faster than R. That must put R in the tortoise class of
programming languages.
I used to try to djinn up uses for R :-D
rbowman wrote:
chrisv wrote:
But I don't know that computer programs need to be tested to make sure
that they work. According to the dumbest fscking snit to ever haunt
cola, anyway.
I've worked with several programmers who thought testing was for the QA
department. Unfortunately they also were of the opinion that nothing
could possibly go wrong.
That's difficult to believe.
But generally, any activity with a sought result can have an error
analysis associated with it. Does it involve counting? Measuring? Speed
of something? Frequency of something? Size of something? Does something
vary in time, size, volume, frequency? Is there some relationship
between things varying here to other things varying somewhere else? And
so on and so on...
No wonder Microsoft cleaned Borland's (now Embarcadero) clock with
Visual Studio (as bragged about in Jim McCarthy's book, "Dynamics of
Software Development").
When I write a regular program, I use perl. Can't decide if python is
worth learning -- but the power of pytorch-enabled apps is certainly appealing.
But logically, they say, the best calculus book was the one written by
Apostol of Caltech. To advance logically, he doesn't start with
differentiation; he covers integration first. Understanding integration
first, and differentiation from it, is easier than going from
differentiation to integration as is done in almost all calculus texts.
Historically also, it was integration that was invented first (by Newton,
or Leibniz, depending on which one you believe).
It is rare that outside school, and especially in a computer programming
job, you'd need anything beyond calculus, because as soon as the need
for forming and solving differential equations comes up, somebody else
has already solved it before the task even gets down to a programmer.
Can probably find old texts like that on Project Gutenberg.
He believes linear algebra should be taught before any calculus is
taught. Therefore, I suspect he's aiming at a purely application-oriented
discussion of it; as, logically, linear algebra can be thoroughly
understood after completely understanding calculus of single and
multiple variables.
Computers do not replace what's required of students and professionals
and scientists. That is the important point!
On Wed, 10 Jan 2024 06:57:36 -0500, Chris Ahlstrom wrote:
Can probably find old texts like that on Project Gutenberg.
https://www.gutenberg.org/ebooks/33283
A true Gnu person will download the TeX version.
(The student-access Unix host was in great demand at
the time. Only, it wasn't Unix -- it was Linux. 🙌️ )
Anyway, the tough part of learning C is knowing
what's available in your libraries. For beginners
on Linux, I recommend learning "apropos"
as well as "man" to find what you're looking
for.
Or use Eclipse. (Ducks and runs.)
Then just 4 years back (when Covid hit) I began reading and using a C++
book out of curiosity and having nothing better to do. I don't remember
the title of it (it was a pdf). In there, for the first time, I saw the correct picture of how multidimensional arrays are handled in the
computer and how to use pointers with a clear understanding of them.
Glad to hear that I have 4 years of fond memories of successfully using
it (mid 1980s).
I don't think AI is ready to get worked on yet. What they're doing seems
to be just making a more efficient Wikipedia lookup. It still cannot
think.
First, and most importantly, one has to come up with a way to give a
robot a rewarding system. Whether this important step has already been
taken, I do not know. It's not a trivial matter and I don't know how one
would even try doing that. This fact is right at the core of the problem.
On 1/16/2024 12:09 AM, rbowman wrote:
The
current approaches are back to neural networks. I'd say the intelligence
part comes from not being able to say what the system is doing exactly.
You can train a network to recognize cats from dogs in the classic 'hello
world' of image recognition. You present a lot of images labeled cat or
dog. In truth these are only images to the human eye. They're
presented to
the neural net as matrices of pixel values and all operations are done on
matrices. During the training phase you compare the output against the
label (cat or dog) to get an error rate, and feed it back into the weights and
biases of the network. Eventually you get an acceptable system that can
tell a cat from a dog 99% of the time but you're not really sure how. Is
it intelligent?
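A toy version of that loop, with synthetic feature vectors standing in for
images and a single logistic unit standing in for the whole network (my
example, not rbowman's):

    # Compare predictions against labels, feed the error back into the weights and bias.
    import numpy as np

    rng = np.random.default_rng(1)
    X = rng.normal(size=(200, 2))                 # 200 "images", 2 features each
    y = (X[:, 0] + X[:, 1] > 0).astype(float)     # labels: 1 = "cat", 0 = "dog"

    w, b, lr = np.zeros(2), 0.0, 0.1
    for _ in range(500):
        pred = 1.0 / (1.0 + np.exp(-(X @ w + b))) # predicted probability of "cat"
        error = pred - y                          # output vs. label
        w -= lr * (X.T @ error) / len(y)          # adjust weights by the error
        b -= lr * error.mean()                    # adjust bias by the error

    print(((pred > 0.5) == y).mean())             # training accuracy, close to 1.0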
Intelligence is a down-the-road consequence of having a rewarding
mechanism. This is what I was trying to point at. It is not time to
concentrate on creating intelligence unless your aim is not to create AI
at all, but to create a much more efficient _tool_. A tool that _you_
use, not the AI.
AI, as a robot that can think, do, decide, and so on, requires a
rewarding mechanism first. With the rewarding mechanism in place, the
robot itself will become extremely intelligent all by himself, and very
fast. You wouldn't have to tell him things and label them for him. He'll
know what's best for him to do and he will do it, one of them being
getting intelligent enough, because a more intelligent robot can achieve
the goals that his rewarding system demands, much better than a newbie
robot. So robots, themselves, will go for learning stuff they need.
As you see, I don't think the purpose has been to create a machine like
that, because humans cannot compete with them. Instead, you guys are yet again creating another tool to do what _you_ want. Something that
pursues _your_ goals, not theirs.
So this AI business, to me, is just another typical sham. You guys don't
mean what you say.