Freefall: AI Driving Cars
http://freefall.purrsia.com/ff3700/fc03655.htm

So his theory is that once AI driving cars gets real good, people will
just jaywalk everywhere, expecting the AI to avoid them. Yup, probably
true.

Lynn
On 10/8/2021 10:11 PM, Lynn McGuire wrote:
> So his theory is that once AI driving cars gets real good, people will
> just jaywalk everywhere, expecting the AI to avoid them. Yup, probably
> true.

That's basically already happening, before AI cars. Have you ever
driven through Chicago? People really are like the stereotype goes;
they just walk out in front of your car and expect you to stop, almost
daring you to hit them.
On Fri, 8 Oct 2021 21:11:48 -0500, Lynn McGuire
<lynnmcguire5@gmail.com> wrote:
> So his theory is that once AI driving cars gets real good, people will
> just jaywalk everywhere, expecting the AI to avoid them. Yup, probably
> true.
People are already doing that to human drivers, and often up the
challenge by changing to all-black clothing at sunset.
On Fri, 08 Oct 2021 21:11:48 -0500, Lynn McGuire wrote:
> So his theory is that once AI driving cars gets real good, people will
> just jaywalk everywhere, expecting the AI to avoid them. Yup, probably
> true.
I can imagine the debate now: Should road rage be added to the AI?
On Sat, 09 Oct 2021 10:30:17 -0400, Michael Trew
<michael.trew@att.net> wrote:
> That's basically already happening, before AI cars. Have you ever
> driven through Chicago? People really are like the stereotype goes;
> they just walk out in front of your car and expect you to stop, almost
> daring you to hit them.
I lived in Chicago for a couple of years. Loved the Els.
But people must have been a lot smarter, back then.
On 10/9/2021 7:30 AM, Michael Trew wrote:
> That's basically already happening, before AI cars. Have you ever
> driven through Chicago? People really are like the stereotype goes;
> they just walk out in front of your car and expect you to stop,
> almost daring you to hit them.
I suspect this is true of most major cities. In San Francisco people
will stand in the middle of the street with cameras for 10? 15?
minutes waiting for the sunrise or a cable car to reach a certain
point. And you have to take it as a given that half the population
simply ignores the stop lights and pedestrian lights at all
intersections. Yesterday I watched some man in a business suit towing
a suitcase stop crossing in the middle of the street to pull his phone
out and spend a full minute standing there fiddling with it.
On 10/9/2021 3:46 AM, Charles Packer wrote:
> I can imagine the debate now: Should road rage be added to the AI?

Funny! But we should try to understand how self-driving cars
work to understand their limitations.

For instance, you have a large truck with an attitude
tailgating you, and a cardboard box flies out in front
of you, or a dog, or a kid.

Do you want the car to panic stop, or not?
It's not going to swerve, only stop.
Can a computer make such a judgement of
whether taking the hit from behind or
swerving into an obstacle is worth it
or not? For a kid yes, for a dog no, right?

Remember the great scene in the movie 'I, Robot'
where the detective hated the robots because
one decided to save him instead of the drowning child
based on its statistical analysis of the best
decision?
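For what it's worth, the brake-or-swerve judgement posed above boils down to comparing expected harms. Here is a deliberately crude sketch; every weight and probability is invented purely for illustration, and no real AV planner is anywhere near this simple:

```python
# Toy expected-harm comparison for the panic-stop dilemma.
# All numbers are made up for illustration only; real systems
# reason probabilistically over far richer state.
HARM_WEIGHT = {"kid": 100.0, "dog": 5.0, "cardboard box": 0.1}

def choose_action(obstacle, p_rear_end, rear_end_harm=20.0):
    """Brake hard only when the expected harm of being rear-ended
    is smaller than the harm of hitting the obstacle."""
    harm_if_brake = p_rear_end * rear_end_harm   # the tailgater may hit us
    harm_if_continue = HARM_WEIGHT[obstacle]     # we hit the obstacle
    return "panic stop" if harm_if_brake < harm_if_continue else "continue"
```

With a 90% chance of being rear-ended, this toy weighting stops for the kid but not for the dog or the box, which matches the "for a kid yes, for a dog no" intuition.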
In article <PbmdnZriFvJqXvz8nZ2dnUU7-dnNnZ2d@giganews.com>,
Jonathan <Mailinstead@gmail.com> wrote:
> Remember the great scene in the movie 'I Robot'
> where the detective hated the robots because
> it decided to save him instead of the drowning child
> based on it's statistical analysis of the best
> decision?

I never saw that movie (based chiefly on having read that they'd
cast a luscious young babe as Dr. Calvin), and now I have another
reason.

A proper Asenion robot would break down with conflict paralysis
and be permanently nonfunctional.

(Rather like the telepathic robot in "Liar!".)
On 10/9/2021 5:15 PM, Your Name wrote:
> Just one of the many reasons why self-driving cars are a moronic idea
> and will never happen if the human race has any actual common sense
> left.
If there's profit in them capitalism will trump common sense (again).
On 2021-10-09 16:31:50 +0000, Jonathan said:
> Do you want the car to panic stop, or not? It's not going to
> swerve, only stop. Can a computer make such a judgement of whether
> taking the hit from behind or swerving into an obstacle is worth
> it or not? For a kid yes, for a dog no right?
Just one of the many reasons why self-driving cars are a moronic idea
and will never happen if the human race has any actual common sense
left.
On 10/9/21 1:16 PM, Dimensional Traveler wrote:
> I suspect this is true of most major cities. In San Francisco people
> will stand in the middle of the street with cameras for 10? 15?
> minutes waiting for the sunrise or a cable car to reach a certain
> point.

Years ago, some MIT students did a study that revealed that the safest
way to cross a street in Cambridge was to step right off the curb with
one’s nose buried in a book.
> So his theory is that once AI driving cars gets real good, people will
> just jaywalk everywhere, expecting the AI to avoid them. Yup, probably
> true.

If you want to drive a car, then drive the car. If you're too lazy, too
tired, or simply can't be bothered, then get a taxi, bus, or train.
On 10/9/2021 5:15 PM, Your Name wrote:
> Just one of the many reasons why self-driving cars are a moronic idea
> and will never happen if the human race has any actual common sense
> left.
Ask Musk that~

They'll find a niche, highways and delivering pizza
and such. But around town in rush hour traffic, forget it;
the technology is only useful as a safety feature
to help the driver, not replace the driver.

But the effort has been producing all kinds of safety
features, with the sensors coming along so fast.
Safety features like Automatic Braking Systems,
Blind Spot Detection, Road Sign Recognition,
Cross-Traffic Alert, Collision Warnings and
Pedestrian Detection.
Have you driven a car with active radar guided cruise
control or lane departure systems? They're fantastic.
With active cruise control you just set the maximum
speed you want to go and the following distance and
the car does the rest. The radar spots the car in
front and follows it with the programmed distance
regardless of whether it changes speed.
And the lane departure keeps the car centered in the
lane even in shallow turns. On the highway cars
already drive themselves under controlled conditions.
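The follow-at-a-set-distance behavior described above amounts to a feedback loop. Here is a toy sketch of such a control law; the gains, the two-second gap, and the whole formulation are invented for illustration, not any manufacturer's actual algorithm:

```python
def acc_speed_command(set_speed, gap, lead_speed, own_speed,
                      time_gap=2.0, k_gap=0.5, k_speed=1.0):
    """Toy adaptive-cruise law (units: metres, m/s). Track set_speed
    when the radar sees no lead car; otherwise hold a time-based gap."""
    if gap is None:                      # no lead vehicle detected
        return set_speed
    desired_gap = time_gap * own_speed   # e.g. a two-second following gap
    # Close the gap error and match the lead car's speed, but never
    # command more than the driver's set speed (or less than zero).
    correction = (k_gap * (gap - desired_gap)
                  + k_speed * (lead_speed - own_speed))
    return max(0.0, min(set_speed, own_speed + correction))
```

A real system feeds a command like this into throttle and brake actuators many times a second, with tuned controllers and fused radar/camera input; the point is just that "follow the car in front at the programmed distance" is a simple feedback idea.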
On 10/10/21 2:20 pm, Your Name wrote:
<snip>
> If you want to drive a car, then drive the car. If you're too lazy, too
> tired, or simply can't be bothered, then get a taxi, bus, or train.

In previous discussions at rasfw on this topic, I have been referred to
Waymo in Phoenix, Arizona, which has safe self-driving cars, and there
are some fascinating YouTube videos. Is there any reason why these, or
smaller versions, could not be privately owned for personal use?
On 10/9/2021 2:37 PM, John W Kennedy wrote:
> Years ago, some MIT students did a study that revealed that the safest
> way to cross a street in Cambridge was to step right off the curb with
> one's nose buried in a book.
I've seen moms pushing baby carriages stroll into the
busy street without looking up one little bit.

In Miami I'm shocked at how often a pedestrian walks out into
a busy rush hour street without even taking a glance.

That's unthinkable to me. I can't count how many times
I've almost been rear-ended because I had to stop
midway in a turn because of a pedestrian with ear buds
and a phone in front of him.

Not long ago this guy on a bicycle was pedaling
the wrong way down a busy city street, in
the...middle, with his left hand on the bar and
his right arm cradling a toddler. When I tapped
the horn at him to ask 'are you nuts?', he lifted
his left hand off the bar and gave me the finger.

Look ma, no hands and a baby.
On 10/9/2021 6:45 PM, Titus G wrote:
> In previous discussions at rasfw on this topic, I have been referred to
> Waymo in Phoenix, Arizona, which has safe self-driving cars, and there
> are some fascinating YouTube videos. Is there any reason why these, or
> smaller versions, could not be privately owned for personal use?

They're in San Francisco too. No idea what their safety record is.
On 10/9/2021 12:02 PM, Paul S Person wrote:
> I lived in Chicago for a couple of years. Loved the Els.
> But people must have been a lot smarter, back then.
It certainly would represent a change from the late 1960s. One of my
friends from college in Southern California grew up in the Chicago area.
He remarked that 9 months walking around Pasadena got him used to
drivers actually stopping when the pedestrian had right-of-way, and on
return to Chicago each summer had to relearn not to depend on this.
On 10/9/2021 3:18 PM, Dorothy J Heydt wrote:
> A proper Asenion robot would break down with conflict paralysis
> and be permanently nonfunctional.
It doesn't make any sense to design a robot
that way at all. Every time a moral dilemma
presents itself, the robot shuts off and is destroyed?
No programmer would ever think of allowing that
in the real world. Remember, our products need
to be lawyer-proofed. Shutting down would
take first place on the not-to-do list.

This question is no longer a science fiction plot
but a current problem we need to solve now. If
our cars are faced with just such a moral dilemma,
risk the driver or the pedestrian, do you want the car
to just shut off?

Of course not. That would be unthinkable.
Frontiers in Robotics and AI
Ethics in Robotics and Artificial Intelligence
Risk of Injury in Moral Dilemmas With Autonomous Vehicles
As autonomous machines, such as automated vehicles (AVs) and
robots, become pervasive in society, they will inevitably
face moral dilemmas where they must make decisions that risk
injuring humans. However, prior research has framed these
dilemmas in starkly simple terms, i.e., framing decisions
as life and death and neglecting the influence of risk of
injury to the involved parties on the outcome. Here, we
focus on this gap and present experimental work that
systematically studies the effect of risk of injury on the
decisions people make in these dilemmas.
In four experiments, participants were asked to program
their AVs to either save five pedestrians, which we refer to
as the utilitarian choice, or save the driver, which we
refer to as the nonutilitarian choice.
https://www.frontiersin.org/articles/10.3389/frobt.2020.572529/full
The movie was all about what could happen, once AI
and robots were fully integrated into society, if the
next generation of robots were given sinister programming
for a dictator to take over.

Here's a clip from the movie.

I, Robot - Whose Revolution
https://www.youtube.com/watch?v=hrGco_ztJkw&ab_channel=Avatarthelastbenderheaven

I would say cars have far more potential for
sinister acts and destruction than a robot.
On 10/9/2021 12:02 PM, Paul S Person wrote:
> I lived in Chicago for a couple of years. Loved the Els.
> But people must have been a lot smarter, back then.
Someone told me that's how people were in Chicago, and I decided, on a
road trip about 3 years ago, "for the fun of it", to hop off the freeway
at a random downtown Chicago exit. Boy, that was a mistake... It took a
half hour to find my way back, and I experienced - I can't say how many
times - pedestrians just wandering out or darting into traffic, not
looking. I was driving an old '94 Geo Metro, and they tested my brakes
a few times for sure.
On Sat, 9 Oct 2021 13:06:32 -0400, Mark Jackson
<mjackson@alumni.caltech.edu> wrote:
> It certainly would represent a change from the late 1960s. One of my
> friends from college in Southern California grew up in the Chicago area.
> He remarked that 9 months walking around Pasadena got him used to
> drivers actually stopping when the pedestrian had right-of-way, and on
> return to Chicago each summer had to relearn not to depend on this.
When I was briefly in Lincoln, NE and was walking down quiet
residential streets with no marked crosswalks, cars would stop for me
when I was at a corner if I even /looked/ like I was thinking of
crossing. That was in 1974; things may have changed since then.
On 10/9/21 7:41 PM, Jonathan wrote:
> That doesn't make any sense to design a robot
> in that way at all. Every time a moral dilemma
> presents itself the robot shuts off and is destroyed?
> No programmer would ever think of allowing that
> in the real world.
Asimov had conceptualized his robots before the announcements of the
ASSC (aka Mark I) and the ENIAC. He thought in analogue terms for his
“positronic brains”, as is especially visible in “Runaround”, which
repeatedly uses the word “potential” (as in “voltage”) with the
approximate meaning, “idea or notion in the brain of a robot”. No shame
to him, really; apart from a handful of engineers and mathematicians,
everybody thought that way.
On Sat, 9 Oct 2021 19:18:50 GMT, djheydt@kithrup.com (Dorothy J Heydt)
wrote:
> A proper Asenion robot would break down with conflict paralysis
> and be permanently nonfunctional.
You may be right, but not saving /either/ of them is not acceptable.
And saving /both/ was not possible. This exposes a problem with the
Three Laws: how does the robot handle moral dilemmas?

Indeed, the detective's hate is based on the fact that a human would
have focused on saving the child.

It's actually a fine action film. Its relation to the book is, as
others have pointed out ... remote. I would have preferred one focused
on Dr. Calvin and the two engineers (mechanics?). It comes off, at
best, as the situation toward the end of the book, where the really
large brains have quietly taken control of the world -- except, in the
film, it's not so quiet.

And Dr. Calvin works both as a robot psychologist and as the romantic
interest.
On 10/9/2021 9:48 PM, Michael Trew wrote:
> Someone told me that's how people were in Chicago, and I decided, on a
> road trip about 3 years ago "for the fun of it", to hop off the
> freeway on a random downtown Chicago exit. Boy, that was a mistake...

Anywhere else in the world the cars fill both lanes
until the construction starts, then alternate when
it's time to merge.

Not in Michigan. I naturally went up the completely
open half mile of right lane, and you should have heard
the honking and shouting, and people were swerving
out in front of me. I couldn't believe it; what's the
matter with these people, I thought~

I had a similar experience, but in NY City. I was on a long ride
on my motorcycle from D.C. to New Jersey, and since a map is
hard to use while riding I overshot the exit by about 30 miles.
I was suddenly confronted with the George Washington Bridge
going into NY City. I had to go over it and took the first exit
to try to get back, but ended up in...the Bronx.

I was all loaded up with backpacks and was an easy
target roaming around in what looked like a scene
from the movie The French Connection.

So I stopped and tried to gather my senses, and this guy
was looking at me intently; I started wondering if
this was gonna be my last ride. But astonishingly he
immediately understood my predicament.

He yelled ..."Hey dude, turn left, then right, and you're
back in New Jersey".

Thank you thank you thank you, I shouted back as I rode off.
In article <PbmdnZriFvJqXvz8nZ2dnUU7-dnNnZ2d@giganews.com>,
Jonathan <Mailinstead@gmail.com> wrote:
I never saw that movie (based chiefly on having read that they'd
Remember the great scene in the movie 'I Robot'
where the detective hated the robots because
it decided to save him instead of the drowning child
based on its statistical analysis of the best
decision?
cast a luscious young babe as Dr. Calvin), and now I have another
reason.
A proper Asenion robot would break down with conflict paralysis
and be permanently nonfunctional.
(Rather like the telepathic robot in "Liar!".)
On 10/9/2021 12:02 PM, Paul S Person wrote:
On Sat, 09 Oct 2021 10:30:17 -0400, Michael Trew
<michael.trew@att.net> wrote:
On 10/8/2021 10:11 PM, Lynn McGuire wrote:
Freefall: AI Driving Cars
http://freefall.purrsia.com/ff3700/fc03655.htm
So his theory is that once AI driving cars gets real good, people will
just jaywalk everywhere, expecting the AI to avoid them. Yup, probably
true.
Lynn
That's basically already happening, before AI cars.. have you ever
driven through Chicago? People really are like the stereotype goes;
they just walk out in front of your car, and expect you to stop.. almost
daring you to hit them.
I lived in Chicago for a couple of years. Loved the Els.
But people must have been a lot smarter, back then.
Someone told me that's how people were in Chicago, and I decided, on a
road trip about 3 years ago "for the fun of it", to hop off the freeway
on a random downtown Chicago exit. Boy, that was a mistake... It took a half hour to find my way back, and I experienced - I can't say how many
times - pedestrians just wandering out or darting into traffic, not looking. I was driving an old '94 Geo Metro, and they tested my brakes
a few times for sure.
On 2021-10-09 23:51:42 +0000, Jonathan said:
On 10/9/2021 5:15 PM, Your Name wrote:
On 2021-10-09 16:31:50 +0000, Jonathan said:
On 10/9/2021 3:46 AM, Charles Packer wrote:<snip>
On Fri, 08 Oct 2021 21:11:48 -0500, Lynn McGuire wrote:
Freefall: AI Driving Cars
http://freefall.purrsia.com/ff3700/fc03655.htm
So his theory is that once AI driving cars gets real good, people
will
just jaywalk everywhere, expecting the AI to avoid them. Yup,
probably true.
Lynn
I can imagine the debate now: Should road rage be added to the AI?
Funny! But we should try to understand how self driving cars
work to understand their limitations.
For instance, you have a large truck with an attitude
tailgating you and a cardboard box flies out in front
of you, or a dog, or a kid.
Do you want the car to panic stop, or not?
It's not going to swerve, only stop.
Can a computer make such a judgement of
whether taking the hit from behind or
swerving into an obstacle is worth it
or not? For a kid yes, for a dog no, right?
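That brake-or-swerve judgement is, at bottom, a cost comparison. Here is a toy sketch of it in Python; every cost weight below is invented purely for illustration, and no real system reduces the problem to constants like these:

```python
# Toy model of the brake-vs-swerve dilemma described above.
# All cost weights are made up for illustration; a real system
# would use calibrated risk estimates, not fixed constants.

COLLISION_COST = {          # harm if the car does NOT avoid the obstacle
    "kid": 1000.0,
    "dog": 50.0,
    "cardboard box": 1.0,
}

REAR_END_COST = 30.0        # harm from panic-stopping with a tailgater behind
SWERVE_COST = 80.0          # harm from swerving into an adjacent obstacle

def choose_action(obstacle: str, tailgater: bool) -> str:
    """Pick the action with the lowest expected cost."""
    hit_cost = COLLISION_COST.get(obstacle, 10.0)
    brake_cost = REAR_END_COST if tailgater else 0.0
    options = {
        "continue": hit_cost,       # drive through / over the obstacle
        "panic_stop": brake_cost,   # stop hard, risk the truck behind
        "swerve": SWERVE_COST,      # leave the lane, risk what's beside you
    }
    return min(options, key=options.get)

print(choose_action("kid", tailgater=True))            # panic_stop
print(choose_action("cardboard box", tailgater=True))  # continue
```

The point of the sketch is only that the answer flips with the obstacle: for a cardboard box the rear-end risk dominates, for a kid it never does.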
Just one of the many reasons why self-driving cars are a moronic idea
and will never happen if the human race has any actual common sense
left.
Ask Musk that~
Musk is an obnoxious idiot.
They'll find a niche: highways, delivering pizza
and such. But around town in rush hour traffic, forget it;
the technology is only useful as a safety feature
to help the driver, not replace the driver.
The one at the ring-fenced Paralympics site crashed into a
sportsperson. Even if you have a completely closed circuit, there are
still far too many random unknowns that a computer simply can't deal with.
But the effort has been producing all kinds of safety
features with the sensors coming along so fast.
Safety features like Automatic Braking Systems,
Blind Spot Detection, Road Sign Recognition,
Cross-Traffic Alert, Collision Warnings and
Pedestrian Detection.
Have you driven a car with active radar guided cruise
control or lane departure systems? They're fantastic.
Just more useless gimmickry for the terminally lazy and simply
something else to go wrong and expensive to repair.
With active cruise control you just set the maximum
speed you want to go and the following distance and
the car does the rest. The radar spots the car in
front and follows it with the programmed distance
regardless of whether it changes speed.
And the lane departure keeps the car centered in the
lane even in shallow turns. On the highway cars
already drive themselves under controlled conditions.
Doesn't work when the lane markings are too worn, obscured by heavy rain
/ bright sun, or simply non-existent on more rural roads.
If you want to drive a car, then drive the car. If you're too lazy, too tired, or simply can't be bothered, then get a taxi, bus, or train.
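The set-speed-plus-following-distance behaviour described upthread can be sketched as a very simple controller. This is a hedged toy model, not any manufacturer's actual algorithm; the gain, units, and function name are all made up:

```python
# Minimal sketch of adaptive cruise control as described upthread:
# hold a set speed, but slow down to keep a chosen gap behind a
# lead car, speeding back up when the road ahead is clear.

def acc_target_speed(set_speed: float, gap: float,
                     desired_gap: float, lead_speed: float) -> float:
    """Return the speed (m/s) the car should aim for this instant.

    set_speed   -- the driver's chosen cruise speed
    gap         -- current distance (m) to the car in front
    desired_gap -- the programmed following distance (m)
    lead_speed  -- the lead car's speed (m/s), from radar
    """
    if gap > 2 * desired_gap:
        return set_speed                 # road ahead clear: just cruise
    # Close the gap error proportionally, never exceeding set_speed
    # and never going negative.
    gain = 0.5                           # illustrative gain (1/s)
    target = lead_speed + gain * (gap - desired_gap)
    return max(0.0, min(set_speed, target))

# Lead car slower and closer than the programmed gap: ACC slows down.
print(acc_target_speed(set_speed=30.0, gap=25.0,
                       desired_gap=40.0, lead_speed=25.0))   # 17.5
```

So it doesn't "follow" the lead car anywhere; it just refuses to close the gap below the programmed distance, which is all the radar unit needs to decide.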
On 10/9/2021 5:11 PM, Jonathan wrote:
On 10/9/2021 2:37 PM, John W Kennedy wrote:
The one time I was in Miami the drivers were even worse, driving like
On 10/9/21 1:16 PM, Dimensional Traveler wrote:
On 10/9/2021 7:30 AM, Michael Trew wrote:
On 10/8/2021 10:11 PM, Lynn McGuire wrote:
Freefall: AI Driving Cars
http://freefall.purrsia.com/ff3700/fc03655.htm
So his theory is that once AI driving cars gets real good, people
will
just jaywalk everywhere, expecting the AI to avoid them. Yup,
probably
true.
That's basically already happening, before AI cars.. have you ever
driven through Chicago? People really are like the stereotype
goes; they just walk out in front of your car, and expect you to
stop.. almost daring you to hit them.
I suspect this is true of most major cities. In San Francisco
people will stand in the middle of the street with cameras for 10?
15? minutes waiting for the sunrise or a cable car to reach a
certain point. And you have to take it as a given that half the
population simply ignore the stop lights and pedestrian lights at
all intersections. Yesterday I watched some man in a business suit
towing a suitcase stop crossing in the middle of the street to pull
his phone out and spend a full minute standing there fiddling with it.
Years ago, some MIT students did a study that revealed that the
safest way to cross a street in Cambridge was to step right off the
curb with one’s nose buried in a book.
I've seen moms pushing baby carriages stroll into the
busy street without looking up one little bit.
In Miami I'm shocked at how often a pedestrian walks out into
a busy rush hour street without even taking a glance.
That's unthinkable to me. I can't count how many times
I've almost been rear ended because I had to stop
midway in a turn because of a pedestrian with ear buds
and a phone in front of him.
Not long ago this guy on a bicycle was pedaling down
the wrong way of a busy city street, in
the...middle, with his left hand on the bar and
his right arm cradling a toddler. When I tapped
the horn at him to ask 'are you nuts'? He lifted up
his left hand off the bar and gave me the finger.
Look ma, no hands and a baby.
they were the only one on a road with no traffic lights, stop signs or
other traffic control. These were people who didn't even slow down to
take a left turn across four lanes of heavy traffic doing 40+. I'm surprised any pedestrians could _survive_ pulling that shit in that kind
of traffic environment.
On Sat, 9 Oct 2021 19:51:42 -0400, Jonathan <Mailinstead@gmail.com>
wrote:
On 10/9/2021 5:15 PM, Your Name wrote:
On 2021-10-09 16:31:50 +0000, Jonathan said:
On 10/9/2021 3:46 AM, Charles Packer wrote:<snip>
On Fri, 08 Oct 2021 21:11:48 -0500, Lynn McGuire wrote:
Freefall: AI Driving Cars
http://freefall.purrsia.com/ff3700/fc03655.htm
So his theory is that once AI driving cars gets real good, people will
just jaywalk everywhere, expecting the AI to avoid them. Yup, probably
true.
Lynn
I can imagine the debate now: Should road rage be added to the AI?
Funny! But we should try to understand how self driving cars
work to understand their limitations.
For instance, you have a large truck with an attitude
tail gating you and a cardboard box flies out in front
of you, or a dog, or a kid.
Do you want the car to panic stop, or not?
It's not going to swerve, only stop.
Can a computer make such a judgement of
whether taking the hit from behind or
swerving into an obstacle is worth it
or not? For a kid yes, for a dog no right?
Just one of the many reasons why self-driving cars are a moronic idea
and will never happen if the human race has any actual common sense left.
Ask Musk that~
They'll find a niche, highways and delivering pizza
and such. But around town in rush hour traffic forget it
the technology is only useful as a safety feature
to help the driver, not replace the driver.
But the effort has been producing all kinds of safety
features with the sensors coming along so fast.
Safety features like Automatic Breaking Systems
Blind Spot Detection, Road Sign Recognition
Cross-Traffic Alert, Collision Warnings and
Pedestrian Detection.
Have you driven a car with active radar guided cruise
control or lane departure systems? They're fantastic.
With active cruise control you just set the maximum
speed you want to go and the following distance and
the car does the rest. The radar spots the car in
front and follows it with the programmed distance
regardless of whether it changes speed.
Follows it wherever it goes?
Even if you wanted to go elsewhere?
Robostalking?
And the lane departure keeps the car centered in the
lane even in shallow turns. On the highway cars
already drive themselves under controlled conditions.
On 10/9/21 3:18 PM, Dorothy J Heydt wrote:
In article <PbmdnZriFvJqXvz8nZ2dnUU7-dnNnZ2d@giganews.com>,
Jonathan <Mailinstead@gmail.com> wrote:
I never saw that movie (based chiefly on having read that they'd
Remember the great scene in the movie 'I Robot'
where the detective hated the robots because
it decided to save him instead of the drowning child
based on it's statistical analysis of the best
decision?
cast a luscious young babe as Dr. Calvin), and now I have another
reason.
"Luscious young babe" is a bit much for an actress who was 33 at the time
and who has spent most of her career playing a dogged NYC prosecutor.
--
A proper Asenion robot would break down with conflict paralysis
and be permanently nonfunctional.
(Rather like the telepathic robot in "Liar!".)
On 10/10/21 12:37 PM, Paul S Person wrote:
On Sat, 9 Oct 2021 19:51:42 -0400, Jonathan
<Mailinstead@gmail.com> wrote:
On 10/9/2021 5:15 PM, Your Name wrote:
On 2021-10-09 16:31:50 +0000, Jonathan said:
On 10/9/2021 3:46 AM, Charles Packer wrote:<snip>
On Fri, 08 Oct 2021 21:11:48 -0500, Lynn McGuire wrote:
Freefall: AI Driving Cars
http://freefall.purrsia.com/ff3700/fc03655.htm
So his theory is that once AI driving cars gets real good,
people will just jaywalk everywhere, expecting the AI to
avoid them. Yup, probably true.
Lynn
I can imagine the debate now: Should road rage be added to
the AI?
Funny! But we should try to understand how self driving cars
work to understand their limitations.
For instance, you have a large truck with an attitude
tail gating you and a cardboard box flies out in front
of you, or a dog, or a kid.
Do you want the car to panic stop, or not?
It's not going to swerve, only stop.
Can a computer make such a judgement of
whether taking the hit from behind or
swerving into an obstacle is worth it
or not? For a kid yes, for a dog no right?
Just one of the many reasons why self-driving cars are a
moronic idea and will never happen if the human race has any
actual common sense left.
Ask Musk that~
They'll find a niche, highways and delivering pizza
and such. But around town in rush hour traffic forget it
the technology is only useful as a safety feature
to help the driver, not replace the driver.
But the effort has been producing all kinds of safety
features with the sensors coming along so fast.
Safety features like Automatic Breaking Systems
Blind Spot Detection, Road Sign Recognition
Cross-Traffic Alert, Collision Warnings and
Pedestrian Detection.
Have you driven a car with active radar guided cruise
control or lane departure systems? They're fantastic.
With active cruise control you just set the maximum
speed you want to go and the following distance and
the car does the rest. The radar spots the car in
front and follows it with the programmed distance
regardless of whether it changes speed.
Follows it wherever it goes?
Even if you wanted to go elsewhere?
Robostalking?
Oh for heaven’s sake, it’s standard equipment on my 2020 VW.
The point is that, if you turn on cruise control, it will slow
down at need to keep you from rear-ending the slow guy in front
of you, and speed back up once you’re clear in front.
On 10/10/2021 12:34 PM, Paul S Person wrote:
On Sat, 9 Oct 2021 19:18:50 GMT, djheydt@kithrup.com (Dorothy J Heydt)
wrote:
In article <PbmdnZriFvJqXvz8nZ2dnUU7-dnNnZ2d@giganews.com>,
Jonathan <Mailinstead@gmail.com> wrote:
I never saw that movie (based chiefly on having read that they'd
Remember the great scene in the movie 'I Robot'
where the detective hated the robots because
it decided to save him instead of the drowning child
based on it's statistical analysis of the best
decision?
cast a luscious young babe as Dr. Calvin), and now I have another
reason.
A proper Asenion robot would break down with conflict paralysis
and be permanently nonfunctional.
You may be right, but not saving /either/ of them is not acceptable.
And saving /both/ was not possible. This exposes a problem with the
Three Laws: how does the robot handle moral dilemmas?
Indeed, the detective's hate is based on the fact that a human would
have focused on saving the child.
It's actually a fine action film. Its relation to the book is, as
others have pointed out ... remote. I would have preferred one focused
on Dr. Calvin and the two engineers (mechanics?). It comes off, at
best, as the situation toward the end of the book, where the really
large brains have quietly taken control of the world -- except, in the
film, it's not so quiet.
And Dr. Calvin works both as a robot psychologist and as the romantic
interest.
She's presented as a geek, not a sex symbol at all.
And the Will Smith movie I, Robot is an original
script only using elements from Asimov, not a
remake of the original.
On 10/9/2021 9:20 PM, Your Name wrote:
On 2021-10-09 23:51:42 +0000, Jonathan said:
On 10/9/2021 5:15 PM, Your Name wrote:
On 2021-10-09 16:31:50 +0000, Jonathan said:
On 10/9/2021 3:46 AM, Charles Packer wrote:<snip>
On Fri, 08 Oct 2021 21:11:48 -0500, Lynn McGuire wrote:
Freefall: AI Driving Cars
http://freefall.purrsia.com/ff3700/fc03655.htm
So his theory is that once AI driving cars gets real good, people will
just jaywalk everywhere, expecting the AI to avoid them. Yup,
probably true.
Lynn
I can imagine the debate now: Should road rage be added to the AI?
Funny! But we should try to understand how self driving cars work to
understand their limitations.
For instance, you have a large truck with an attitude tail gating you
and a cardboard box flies out in front of you, or a dog, or a kid.
Do you want the car to panic stop, or not? It's not going to swerve,
only stop. Can a computer make such a judgement of whether taking the
hit from behind or swerving into an obstacle is worth it or not? For a
kid yes, for a dog no right?
Just one of the many reasons why self-driving cars are a moronic idea
and will never happen if the human race has any actual common sense
left.
Ask Musk that~
Musk is an obnoxious idiot.
They'll find a niche, highways and delivering pizza and such. But
around town in rush hour traffic forget it the technology is only
useful as a safety feature to help the driver, not replace the driver.
The one at the ring-fenced Paraolympics site crashed into a
sportsperson. Even if you have a completely closed ciruit, there are
still far too many random unknowns that a computer simply can't deal
with.
But the effort has been producing all kinds of safety features with the
sensors coming along so fast.
Safety features like Automatic Breaking Systems Blind Spot Detection,
Road Sign Recognition Cross-Traffic Alert, Collision Warnings and
Pedestrian Detection.
Have you driven a car with active radar guided cruise control or lane
departure systems? They're fantastic.
Just more useless gimmickry for ther terminally lazy and simply
something else to go wrong and expensive to repair.
They're significant improvements in safety though. The lane departure
systems are a great aid in reducing accidents from distracted driving
which is a big problem. So are the various collision warning systems
against people speeding and
running red lights and so on.
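Those collision warnings mostly come down to one piece of physics: time-to-collision, the gap to the car ahead divided by the closing speed. A minimal sketch, with the 2.5-second threshold and the names chosen purely for illustration:

```python
# Forward-collision warning reduced to its core: warn when the
# time-to-collision (gap / closing speed) drops below a threshold.
# The 2.5 s threshold is illustrative, not any real system's value.

def ttc_warning(gap_m: float, own_speed: float, lead_speed: float,
                threshold_s: float = 2.5) -> bool:
    """True if we'd reach the lead car within threshold_s seconds.

    gap_m      -- distance to the car ahead, in metres
    own_speed  -- our speed, m/s
    lead_speed -- lead car's speed, m/s
    """
    closing = own_speed - lead_speed     # m/s; positive means closing in
    if closing <= 0:
        return False                     # holding steady or pulling away
    return gap_m / closing < threshold_s

# 20 m gap, closing at 10 m/s: 2 s to impact, so warn.
print(ttc_warning(gap_m=20.0, own_speed=30.0, lead_speed=20.0))  # True
```

The same quantity, with a tighter threshold, is what triggers the automatic braking rather than just the warning chime.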
With active cruise control you just set the maximum speed you want to
go and the following distance and the car does the rest. The radar
spots the car in front and follows it with the programmed distance
regardless of whether it changes speed.
And the lane departure keeps the car centered in the lane even in
shallow turns. On the highway cars already drive themselves under
controlled conditions.
Doesn't work when the lane markings are too worn, obscured by heavy
rain / bright sun, or simply non-existant on more rual roads.
That's probably the biggest hurdle: using cameras to spot the painted
stripes is, like you say, not very reliable at all. They'll have to
put sensors in the roads, and that might not happen anytime soon.
If you want to drive a car, then drive the car. If you're too lazy, too
tired, or simply can't be bothered, then get a taxi, bus, or train.
Would you say the same about seat belts, air bags or anti-lock brakes?
All huge safety improvements, just like the recent trend to smart cars.
Normally a home PC has one main motherboard that runs a couple of
cards, sound and video and so on. A modern top-of-the-line BMW
has two main computers controlling as many as 47 cards, each with
its own software package, communicating with each other along half a
dozen bus lines. And Alexa too!
A modern car makes a home PC look like a child's toy in comparison.
That's a lot of capability and it's been a sea change in safety and
even performance. You can get 750 hp off the showroom floor these days
pretty easily.
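Those half-dozen bus lines the modules talk over are typically CAN buses: every module broadcasts small fixed-format frames and everyone else picks out the IDs it cares about. A toy decoder for one invented frame, just to show the flavour; the frame ID and scaling here are hypothetical, not BMW's actual signal definitions:

```python
# Toy decoder for one invented CAN frame, showing how dozens of
# modules in a modern car share data over a broadcast bus.
# The frame ID (0x1A0) and 0.01 km/h scaling are hypothetical,
# not taken from any real vehicle's signal database.
import struct
from typing import Optional

def decode_speed_frame(frame_id: int, data: bytes) -> Optional[float]:
    """Decode a hypothetical 'vehicle speed' frame:
    bytes 0-1 = speed in units of 0.01 km/h, big-endian."""
    if frame_id != 0x1A0 or len(data) < 2:
        return None                      # not our frame, or malformed
    (raw,) = struct.unpack_from(">H", data)
    return raw / 100.0                   # back to km/h

# 0x2710 = 10000 raw units = 100.00 km/h.
print(decode_speed_frame(0x1A0, bytes([0x27, 0x10])))  # 100.0
```

Multiply that by a few dozen frame types per bus and several buses per car, and the "two main computers and 47 cards" picture above is roughly what you get.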
On Sun, 10 Oct 2021 13:31:07 -0400, John W Kennedy
<john.w.kennedy@gmail.com> wrote:
On 10/9/21 3:18 PM, Dorothy J Heydt wrote:
In article <PbmdnZriFvJqXvz8nZ2dnUU7-dnNnZ2d@giganews.com>,
Jonathan <Mailinstead@gmail.com> wrote:
I never saw that movie (based chiefly on having read that they'd
Remember the great scene in the movie 'I Robot'
where the detective hated the robots because
it decided to save him instead of the drowning child
based on it's statistical analysis of the best
decision?
cast a luscious young babe as Dr. Calvin), and now I have another
reason.
“Luscious young babe” is a bit much for an actress who was 33 at the time
and who has spent most of her career playing a dogged NYC prosecutor.
Actually, I think Dr. Calvin in the book goes from not-so-old to very
much older over the course of the book.
And this is a man's version of a female professional scientist. A
version very common at the time.
A proper Asenion robot would break down with conflict paralysis
and be permanently nonfunctional.
(Rather like the telepathic robot in "Liar!".)
On Sun, 10 Oct 2021 20:31:27 -0400, Jonathan <Mailinstead@gmail.com>
wrote:
On 10/10/2021 12:34 PM, Paul S Person wrote:
On Sat, 9 Oct 2021 19:18:50 GMT, djheydt@kithrup.com (Dorothy J Heydt)
wrote:
In article <PbmdnZriFvJqXvz8nZ2dnUU7-dnNnZ2d@giganews.com>,
Jonathan <Mailinstead@gmail.com> wrote:
I never saw that movie (based chiefly on having read that they'd
Remember the great scene in the movie 'I Robot'
where the detective hated the robots because
it decided to save him instead of the drowning child
based on it's statistical analysis of the best
decision?
cast a luscious young babe as Dr. Calvin), and now I have another
reason.
A proper Asenion robot would break down with conflict paralysis
and be permanently nonfunctional.
You may be right, but not saving /either/ of them is not acceptable.
And saving /both/ was not possible. This exposes a problem with the
Three Laws: how does the robot handle moral dilemmas?
Indeed, the detective's hate is based on that fact that a human would
have focused on saving the child.
It's actually a fine action film. It's relation to the book is, as
others have pointed out ... remote. I would have preferred one focused
on Dr. Calvin and the two engineers (mechanics?). It comes off, at
best, as the situation toward the end of the book, where the really
large brains have quietly taken control of the world -- except, in the
film, its not so quiet.
And Dr. Calvin works both as a robot psychologist and as the romantic
interest.
She's presented as a geek, not a sex symbol at all.
And the Will Smith movie I, Robot is an original
script only using elements from Asimov, not a
remake of the original.
Romantic interest, nonetheless.
Who says geeks can't be romantic interests? Did Congress pass a law
making it illegal?
As to "original script only using elements from Asimov", a few
observations:
-- What do you think most adaptations /are/ but "original scripts only
using elements from <who/whatever>"? Is it not a constantly renewed observation that "based on X" means "has no relation to X except
perhaps the name"? As adaptations go, /I, Robot/ has a lot more than
the name!
-- Consider a work of Art, such as Aristophanes /Lysistrata/. Unless
the story of how it was written has been preserved (and I'm fairly
sure it has not), we know /nothing at all/ about how it came about or
what the author intended. /All/ of our appreciation of it is based on
/the work itself/.
And the same goes for, say, the Mona Lisa.
/All/ works of art for which we do not know the actual details of how
and why they were done are evaluated entirely based on the work itself
(and similar works, of course).
-- So, statements like "original script with additions" are not
relevant to the film considered as Art. Considered as Art, it is the
film itself that must be considered. And the film itself is a
juiced-up version of the final parts of the book. If it, and
everything about it, vanishes for 1000 years and only the film is rediscovered, /that/ is how it will be evaluated: entirely based on
the work itself (and the book, and similar works).
In a sense, that is what makes Art Art: it has a meaning in and of
itself.
Well, the good stuff does, anyway. The rest (estimated at 90%, IIRC)
is "crap". And pretty much nobody [1] cares about the "crap".
[1] I suppose there could be, or someday be, or have been in the past,
a scholar interested in documenting, say, how really badly done films
got made. Scholars, after all, are always looking for topics that can
be mined to produce published papers.
On Sun, 10 Oct 2021 14:20:59 -0400, John W Kennedy
<john.w.kennedy@gmail.com> wrote:
On 10/10/21 12:37 PM, Paul S Person wrote:
On Sat, 9 Oct 2021 19:51:42 -0400, Jonathan <Mailinstead@gmail.com>
wrote:
On 10/9/2021 5:15 PM, Your Name wrote:
On 2021-10-09 16:31:50 +0000, Jonathan said:
On 10/9/2021 3:46 AM, Charles Packer wrote:<snip>
On Fri, 08 Oct 2021 21:11:48 -0500, Lynn McGuire wrote:
Freefall: AI Driving Cars
http://freefall.purrsia.com/ff3700/fc03655.htm
So his theory is that once AI driving cars gets real good, people will
just jaywalk everywhere, expecting the AI to avoid them. Yup, probably
true.
Lynn
I can imagine the debate now: Should road rage be added to the AI?
Funny! But we should try to understand how self driving cars
work to understand their limitations.
For instance, you have a large truck with an attitude
tail gating you and a cardboard box flies out in front
of you, or a dog, or a kid.
Do you want the car to panic stop, or not?
It's not going to swerve, only stop.
Can a computer make such a judgement of
whether taking the hit from behind or
swerving into an obstacle is worth it
or not? For a kid yes, for a dog no right?
Just one of the many reasons why self-driving cars are a moronic idea
and will never happen if the human race has any actual common sense left.
Ask Musk that~
They'll find a niche, highways and delivering pizza
and such. But around town in rush hour traffic forget it
the technology is only useful as a safety feature
to help the driver, not replace the driver.
But the effort has been producing all kinds of safety
features with the sensors coming along so fast.
Safety features like Automatic Breaking Systems
Blind Spot Detection, Road Sign Recognition
Cross-Traffic Alert, Collision Warnings and
Pedestrian Detection.
Have you driven a car with active radar guided cruise
control or lane departure systems? They're fantastic.
With active cruise control you just set the maximum
speed you want to go and the following distance and
the car does the rest. The radar spots the car in
front and follows it with the programmed distance
regardless of whether it changes speed.
Follows it wherever it goes?
Even if you wanted to go elsewhere?
Robostalking?
Oh for heaven’s sake, it’s standard equipment on my 2020 VW. The point
is that, if you turn on cruise control, it will slow down at need to
keep you from rear-ending the slow guy in front of you, and speed back
up once you’re clear in front.
So, it doesn't actually /follow/ the car.
It just practices social distancing.
On 10/10/21 12:37 PM, Paul S Person wrote:
On Sat, 9 Oct 2021 19:51:42 -0400, Jonathan <Mailinstead@gmail.com>
wrote:
On 10/9/2021 5:15 PM, Your Name wrote:
On 2021-10-09 16:31:50 +0000, Jonathan said:
On 10/9/2021 3:46 AM, Charles Packer wrote:<snip>
On Fri, 08 Oct 2021 21:11:48 -0500, Lynn McGuire wrote:
Freefall: AI Driving Cars
http://freefall.purrsia.com/ff3700/fc03655.htm
So his theory is that once AI driving cars gets real good, people >>>>>>> will
just jaywalk everywhere, expecting the AI to avoid them. Yup,
probably
true.
Lynn
I can imagine the debate now: Should road rage be added to the AI?
Funny! But we should try to understand how self driving cars
work to understand their limitations.
For instance, you have a large truck with an attitude
tail gating you and a cardboard box flies out in front
of you, or a dog, or a kid.
Do you want the car to panic stop, or not?
It's not going to swerve, only stop.
Can a computer make such a judgement of
whether taking the hit from behind or
swerving into an obstacle is worth it
or not? For a kid yes, for a dog no right?
Just one of the many reasons why self-driving cars are a moronic idea
and will never happen if the human race has any actual common sense
left.
Ask Musk that~
They'll find a niche, highways and delivering pizza
and such. But around town in rush hour traffic forget it
the technology is only useful as a safety feature
to help the driver, not replace the driver.
But the effort has been producing all kinds of safety
features with the sensors coming along so fast.
Safety features like Automatic Braking Systems,
Blind Spot Detection, Road Sign Recognition,
Cross-Traffic Alert, Collision Warnings and
Pedestrian Detection.
Have you driven a car with active radar guided cruise
control or lane departure systems? They're fantastic.
With active cruise control you just set the maximum
speed you want to go and the following distance and
the car does the rest. The radar spots the car in
front and follows it with the programmed distance
regardless of whether it changes speed.
Follows it wherever it goes?
Even if you wanted to go elsewhere?
Robostalking?
Oh for heaven’s sake, it’s standard equipment on my 2020 VW. The point
is that, if you turn on cruise control, it will slow down at need to
keep you from rear-ending the slow guy in front of you, and speed back
up once you’re clear in front.
And the lane departure keeps the car centered in the
lane even in shallow turns. On the highway cars
already drive themselves under controlled conditions.
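For the curious, the behaviour described above can be sketched in a few
lines. This is only an illustrative toy controller (the function name,
constants, and units are made up, not any manufacturer's actual control
law), but it captures the idea: hold the set speed until the radar
reports a car ahead inside the chosen gap, then track that car's speed
instead.

```python
# Toy sketch of adaptive cruise control: hold the driver's set speed
# unless the radar sees a car ahead closer than the chosen gap, in
# which case track the lead car's speed, slowing a bit extra while
# the gap is still too short. Illustrative only.

def cruise_target_speed(set_speed, lead_speed, gap, desired_gap):
    """Return the speed (m/s) the controller should aim for.

    set_speed   -- driver-selected maximum speed
    lead_speed  -- radar-measured speed of the car ahead, or None if clear
    gap         -- measured distance to the car ahead (m)
    desired_gap -- driver-selected following distance (m)
    """
    if lead_speed is None or gap > desired_gap:
        return set_speed  # lane clear (or gap big enough): resume set speed
    # Too close: match the lead car, minus a correction proportional
    # to how far inside the desired gap we are.
    correction = 0.5 * (gap - desired_gap) / desired_gap * lead_speed
    return max(0.0, min(set_speed, lead_speed + correction))

# Set 30 m/s with a 50 m gap; a car ahead doing 25 m/s at 40 m:
print(round(cruise_target_speed(30.0, 25.0, 40.0, 50.0), 2))  # prints 22.5
```

Real systems layer a lot on top of this (smooth acceleration limits,
sensor fusion, cut-in detection), which is exactly why they work well as
driver aids rather than replacements.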
On 2021-10-10 14:21:22 +0000, Jonathan said:
On 10/9/2021 9:20 PM, Your Name wrote:<snip>
On 2021-10-09 23:51:42 +0000, Jonathan said:
On 10/9/2021 5:15 PM, Your Name wrote:
On 2021-10-09 16:31:50 +0000, Jonathan said:
On 10/9/2021 3:46 AM, Charles Packer wrote:<snip>
On Fri, 08 Oct 2021 21:11:48 -0500, Lynn McGuire wrote:
Freefall: AI Driving Cars
http://freefall.purrsia.com/ff3700/fc03655.htm
So his theory is that once AI driving cars gets real
good, people will just jaywalk everywhere, expecting the
AI to avoid them. Yup, probably true.
Lynn
I can imagine the debate now: Should road rage be added to
the AI?
Funny! But we should try to understand how self driving
cars work to understand their limitations.
For instance, you have a large truck with an attitude tail
gating you and a cardboard box flies out in front of you,
or a dog, or a kid.
Do you want the car to panic stop, or not? It's not going
to swerve, only stop. Can a computer make such a judgement
of whether taking the hit from behind or swerving into an
obstacle is worth it or not? For a kid yes, for a dog no
right?
Just one of the many reasons why self-driving cars are a
moronic idea and will never happen if the human race has any
actual common sense left.
Ask Musk that~
Musk is an obnoxious idiot.
They'll find a niche, highways and delivering pizza and such.
But around town in rush hour traffic forget it the technology
is only useful as a safety feature to help the driver, not
replace the driver.
The one at the ring-fenced Paralympics site crashed into a
sportsperson. Even if you have a completely closed circuit,
there are still far too many random unknowns that a computer
simply can't deal with.
But the effort has been producing all kinds of safety
features with the sensors coming along so fast.
Safety features like Automatic Braking Systems, Blind Spot
Detection, Road Sign Recognition, Cross-Traffic Alert,
Collision Warnings and Pedestrian Detection.
Have you driven a car with active radar guided cruise control
or lane departure systems? They're fantastic.
Just more useless gimmickry for the terminally lazy and
simply something else to go wrong and expensive to repair.
They're significant improvements in safety though. The lane
departure systems are a great aid in reducing accidents from
distracted driving which is a big problem. So are the various
collision warning systems from people speeding and
running red lights and so on.
With active cruise control you just set the maximum speed you
want to go and the following distance and the car does the
rest. The radar spots the car in front and follows it with
the programmed distance regardless of whether it changes
speed.
And the lane departure keeps the car centered in the lane
even in shallow turns. On the highway cars already drive
themselves under controlled conditions.
Doesn't work when the lane markings are too worn, obscured by
heavy rain / bright sun, or simply non-existent on more rural
roads.
That's probably the biggest hurdle, using cameras to spot the
painted stripes, like you say that's not very reliable at all.
They'll have to put sensors in the roads and that might not be
happening anytime soon.
If you want to drive a car, then drive the car. If you're too
lazy, too tired, or simply can't be bothered, then get a taxi,
bus, or train.
Would you say the same about seat belts, air bags or anti lock
brakes? All huge safety improvements just like the recent trend
to smart cars.
Those are actual safety features.
These latest gimmicks simply encourage drivers to be less
attentive and more lazy
(as proven by the number of morons who
believe Tesla cars can drive themselves!)
That's a lot of capability and it's been a sea change in safety
and even performance. You can get 750 hp off the showroom floor
these days pretty easily.
Nobody actually needs that though and the ridiculous top speeds
are simply legally unusable in most countries. Cars for public
consumption should be physically limited and unbypassable ...
*that* would be an extra safety feature!
On 10/10/2021 2:20 PM, John W Kennedy wrote:
On 10/10/21 12:37 PM, Paul S Person wrote:
On Sat, 9 Oct 2021 19:51:42 -0400, Jonathan <Mailinstead@gmail.com>
wrote:
On 10/9/2021 5:15 PM, Your Name wrote:
On 2021-10-09 16:31:50 +0000, Jonathan said:
On 10/9/2021 3:46 AM, Charles Packer wrote:<snip>
On Fri, 08 Oct 2021 21:11:48 -0500, Lynn McGuire wrote:
Freefall: AI Driving Cars
http://freefall.purrsia.com/ff3700/fc03655.htm
So his theory is that once AI driving cars gets real good, people will
just jaywalk everywhere, expecting the AI to avoid them. Yup, probably true.
Lynn
I can imagine the debate now: Should road rage be added to the AI?
Funny! But we should try to understand how self driving cars
work to understand their limitations.
For instance, you have a large truck with an attitude
tail gating you and a cardboard box flies out in front
of you, or a dog, or a kid.
Do you want the car to panic stop, or not?
It's not going to swerve, only stop.
Can a computer make such a judgement of
whether taking the hit from behind or
swerving into an obstacle is worth it
or not? For a kid yes, for a dog no right?
Just one of the many reasons why self-driving cars are a moronic idea
and will never happen if the human race has any actual common sense left.
Ask Musk that~
They'll find a niche, highways and delivering pizza
and such. But around town in rush hour traffic forget it
the technology is only useful as a safety feature
to help the driver, not replace the driver.
But the effort has been producing all kinds of safety
features with the sensors coming along so fast.
Safety features like Automatic Braking Systems,
Blind Spot Detection, Road Sign Recognition,
Cross-Traffic Alert, Collision Warnings and
Pedestrian Detection.
Have you driven a car with active radar guided cruise
control or lane departure systems? They're fantastic.
With active cruise control you just set the maximum
speed you want to go and the following distance and
the car does the rest. The radar spots the car in
front and follows it with the programmed distance
regardless of whether it changes speed.
Follows it wherever it goes?
Even if you wanted to go elsewhere?
Robostalking?
Oh for heaven’s sake, it’s standard equipment on my 2020 VW. The
point is that, if you turn on cruise control, it will slow down at need
to keep you from rear-ending the slow guy in front of you, and speed
back up once you’re clear in front.
Combined with a lane departure system it's great for
long trips; you can be talking on the phone and
eating lunch at the same time, and let the car
do the driving.
On 10/8/2021 11:35 PM, Joy Beeson wrote:
On Fri, 8 Oct 2021 21:11:48 -0500, Lynn McGuire
<lynnmcguire5@gmail.com> wrote:
Freefall: AI Driving Cars
http://freefall.purrsia.com/ff3700/fc03655.htm
So his theory is that once AI driving cars gets real good, people will
just jaywalk everywhere, expecting the AI to avoid them. Yup, probably
true.
People are already doing that to human drivers, and often up the
challenge by changing to all-black clothing at sunset.
In Miami I'm shocked daily by watching people walk
into a busy rush hour downtown street without
taking a glance first. Not a peep.
* Jonathan:
On 10/8/2021 11:35 PM, Joy Beeson wrote:
On Fri, 8 Oct 2021 21:11:48 -0500, Lynn McGuire
<lynnmcguire5@gmail.com> wrote:
Freefall: AI Driving Cars
http://freefall.purrsia.com/ff3700/fc03655.htm
So his theory is that once AI driving cars gets real good, people will
just jaywalk everywhere, expecting the AI to avoid them. Yup, probably
true.
People are already doing that to human drivers, and often up the
challenge by changing to all-black clothing at sunset.
In Miami I'm shocked daily by watching people walk
into a busy rush hour downtown street without
taking a glance first. Not a peep.
Of course. Not (appearing to be) looking is a strategy, putting the onus squarely on the drivers not to cause an accident.
I see a bit of a similar strategy in reverse here where I live, with
cars turning at intersections ignoring pedestrians in the crosswalk. Insisting on keeping walking would be risky for me, but - in case of
left turns - even more so, risk causing chaos in the intersection, so I
often grudgingly let them pass with just a big frown.
More and more intersections get, if not entirely separate green phases
for cars and pedestrians, then at least a few seconds where pedestrians
can start crossing while cars have red in all directions, to alleviate
that problem. Sure, it's more relaxed this way, but this would not be an issue at all with AI driving, and in this case, just reliably following
the rules.
On 10/11/21 12:17 PM, Paul S Person wrote:
On Sun, 10 Oct 2021 13:31:07 -0400, John W Kennedy
<john.w.kennedy@gmail.com> wrote:
On 10/9/21 3:18 PM, Dorothy J Heydt wrote:
In article <PbmdnZriFvJqXvz8nZ2dnUU7-dnNnZ2d@giganews.com>,
Jonathan <Mailinstead@gmail.com> wrote:
I never saw that movie (based chiefly on having read that they'd
Remember the great scene in the movie 'I Robot'
where the detective hated the robots because
it decided to save him instead of the drowning child
based on its statistical analysis of the best
decision?
cast a luscious young babe as Dr. Calvin), and now I have another
reason.
"Luscious young babe" is a bit much for an actress who was 33 at the time
and who has spent most of her career playing a dogged NYC prosecutor.
Actually, I think Dr. Calvin in the book goes from not-so-old to very
much older over the course of the book.
She's just a student when she cameos in "Robbie". (There's another
mistake in Asimov's early robots. He thought that it would be easy for AIs
to understand spoken natural language, but hard for them to speak. In
reality, IBM had speaking devices off the shelf (well, off the
industrial-size pallet at least) by the mid-60s, but it was not until
the 2010s that you could get speech recognition with both generalized
vocabulary and no need for training to the particular speaker.)
--And this is a man's version of a female professional scientist. A
version very common at the time.
A proper Asenion robot would break down with conflict paralysis
and be permanently nonfunctional.
(Rather like the telepathic robot in "Liar!".)
On 2021-10-12 04:22:33 +0000, Quinn C said:
* Jonathan:
On 10/8/2021 11:35 PM, Joy Beeson wrote:
On Fri, 8 Oct 2021 21:11:48 -0500, Lynn McGuire
<lynnmcguire5@gmail.com> wrote:
Freefall: AI Driving Cars
http://freefall.purrsia.com/ff3700/fc03655.htm
So his theory is that once AI driving cars gets real good, people will
just jaywalk everywhere, expecting the AI to avoid them. Yup, probably
true.
People are already doing that to human drivers, and often up the
challenge by changing to all-black clothing at sunset.
In Miami I'm shocked daily by watching people walk
into a busy rush hour downtown street without
taking a glance first. Not a peep.
Of course. Not (appearing to be) looking is a strategy, putting the onus
squarely on the drivers not to cause an accident.
I see a bit of a similar strategy in reverse here where I live, with
cars turning at intersections ignoring pedestrians in the crosswalk.
Insisting on keeping walking would be risky for me, but - in case of
left turns - even more so, risk causing chaos in the intersection, so I
often grudgingly let them pass with just a big frown.
More and more intersections get, if not entirely separate green phases
for cars and pedestrians, then at least a few seconds where pedestrians
can start crossing while cars have red in all directions, to alleviate
that problem. Sure, it's more relaxed this way, but this would not be an
issue at all with AI driving, and in this case, just reliably following
the rules.
Separating pedestrians and traffic is the entire point of the
dual-system traffic light controlled intersections (there are of course
some traffic light intersections that aren't meant for pedestrians) ...
the problem is that too many idiots don't think the rules apply to
them, so try to walk across even when the pedestrian signal is telling
them to wait, or drive across when the light has turned red. :-\
On 10/11/2021 12:15 PM, Paul S Person wrote:
On Sun, 10 Oct 2021 20:31:27 -0400, Jonathan <Mailinstead@gmail.com>
wrote:
On 10/10/2021 12:34 PM, Paul S Person wrote:
On Sat, 9 Oct 2021 19:18:50 GMT, djheydt@kithrup.com (Dorothy J Heydt)
wrote:
In article <PbmdnZriFvJqXvz8nZ2dnUU7-dnNnZ2d@giganews.com>,
Jonathan <Mailinstead@gmail.com> wrote:
I never saw that movie (based chiefly on having read that they'd
Remember the great scene in the movie 'I Robot'
where the detective hated the robots because
it decided to save him instead of the drowning child
based on its statistical analysis of the best
decision?
cast a luscious young babe as Dr. Calvin), and now I have another
reason.
A proper Asenion robot would break down with conflict paralysis
and be permanently nonfunctional.
You may be right, but not saving /either/ of them is not acceptable.
And saving /both/ was not possible. This exposes a problem with the
Three Laws: how does the robot handle moral dilemmas?
Indeed, the detective's hate is based on the fact that a human would
have focused on saving the child.
It's actually a fine action film. Its relation to the book is, as
others have pointed out ... remote. I would have preferred one focused
on Dr. Calvin and the two engineers (mechanics?). It comes off, at
best, as the situation toward the end of the book, where the really
large brains have quietly taken control of the world -- except, in the
film, it's not so quiet.
And Dr. Calvin works both as a robot psychologist and as the romantic
interest.
She's presented as a geek, not a sex symbol at all.
And the Will Smith movie I, Robot is an original
script only using elements from Asimov, not a
remake of the original.
Romantic interest, nonetheless.
Who says geeks can't be romantic interests? Did Congress pass a law
making it illegal?
Right, and portraying a young and attractive woman
as a dedicated and brilliant scientist is a positive
role model, entirely 'woke'.
As to "original script only using elements from Asimov", a few
observations:
-- What do you think most adaptations /are/ but "original scripts only
using elements from <who/whatever>"? Is it not a constantly renewed
observation that "based on X" means "has no relation to X except
perhaps the name"? As adaptations go, /I, Robot/ has a lot more than
the name!
I agree completely with your sentiment below that
what matters is the work, not the details of
who and why etc. I'm only responding to the
criticism that some aspects of the robots
aren't faithful to original Asimov ideas.
While looking up the evolution of how his robots respond
to moral dilemmas in the three Asimov books I found this
passage about Will Smith's I, Robot, which helps explain
the differences from Asimov's ideas in the movie.
"The robot series has led to film adaptations. With
Asimov's collaboration, in about 1977, Harlan Ellison
wrote a screenplay of I, Robot that Asimov hoped would lead
to "the first really adult, complex, worthwhile science fiction
film ever made". The screenplay has never been filmed and
was eventually published in book form in 1994.
The 2004 movie I, Robot, starring Will Smith, was based
on an unrelated script by Jeff Vintar titled Hardwired,
with Asimov's ideas incorporated later after the rights
to Asimov's title were acquired."
https://en.wikipedia.org/wiki/Isaac_Asimov
-- Consider a work of Art, such as Aristophanes /Lysistrata/. Unless
the story of how it was written has been preserved (and I'm fairly
sure it has not), we know /nothing at all/ about how it came about or
what the author intended. /All/ of our appreciation of it is based on
/the work itself/.
On 2021-10-10 14:21:22 +0000, Jonathan said:
On 10/9/2021 9:20 PM, Your Name wrote:<snip>
On 2021-10-09 23:51:42 +0000, Jonathan said:
On 10/9/2021 5:15 PM, Your Name wrote:
On 2021-10-09 16:31:50 +0000, Jonathan said:
On 10/9/2021 3:46 AM, Charles Packer wrote:<snip>
On Fri, 08 Oct 2021 21:11:48 -0500, Lynn McGuire wrote:
Freefall: AI Driving Cars
http://freefall.purrsia.com/ff3700/fc03655.htm
So his theory is that once AI driving cars gets real good, people will
just jaywalk everywhere, expecting the AI to avoid them. Yup, probably
true.
Lynn
I can imagine the debate now: Should road rage be added to the AI?
Funny! But we should try to understand how self driving cars work to
understand their limitations.
For instance, you have a large truck with an attitude tail gating you
and a cardboard box flies out in front of you, or a dog, or a kid.
Do you want the car to panic stop, or not? It's not going to swerve,
only stop. Can a computer make such a judgement of whether taking the
hit from behind or swerving into an obstacle is worth it or not? For a
kid yes, for a dog no right?
Just one of the many reasons why self-driving cars are a moronic idea
and will never happen if the human race has any actual common sense
left.
Ask Musk that~
Musk is an obnoxious idiot.
They'll find a niche, highways and delivering pizza and such. But
around town in rush hour traffic forget it the technology is only
useful as a safety feature to help the driver, not replace the driver.
The one at the ring-fenced Paralympics site crashed into a
sportsperson. Even if you have a completely closed circuit, there are
still far too many random unknowns that a computer simply can't deal
with.
But the effort has been producing all kinds of safety features with the
sensors coming along so fast.
Safety features like Automatic Braking Systems, Blind Spot Detection,
Road Sign Recognition, Cross-Traffic Alert, Collision Warnings and
Pedestrian Detection.
Have you driven a car with active radar guided cruise control or lane
departure systems? They're fantastic.
Just more useless gimmickry for the terminally lazy and simply
something else to go wrong and expensive to repair.
They're significant improvements in safety though. The lane departure
systems are a great aid in reducing accidents from distracted driving
which is a big problem. So are the various collision warning systems
from people speeding and
running red lights and so on.
With active cruise control you just set the maximum speed you want to
go and the following distance and the car does the rest. The radar
spots the car in front and follows it with the programmed distance
regardless of whether it changes speed.
And the lane departure keeps the car centered in the lane even in
shallow turns. On the highway cars already drive themselves under
controlled conditions.
Doesn't work when the lane markings are too worn, obscured by heavy
rain / bright sun, or simply non-existent on more rural roads.
That's probably the biggest hurdle, using cameras to spot the painted
stripes, like you say that's not very reliable at all. They'll have to
put sensors in the roads and that might not be happening anytime soon.
If you want to drive a car, then drive the car. If you're too lazy, too
tired, or simply can't be bothered, then get a taxi, bus, or train.
Would you say the same about seat belts, air bags or anti lock brakes?
All huge safety improvements just like the recent trend to smart cars.
Those are actual safety features.
These latest gimmicks simply encourage drivers to be less attentive and
more lazy (as proven by the number of morons who believe Tesla cars can
drive themselves!) ... which is the complete opposite of a safety
feature. Plus it's simply something else to go wrong and need expensive
repairs.
My mother's car has silly tyre pressure monitoring, which keeps saying
there is something wrong with one of the tyres despite the fact that
there isn't. It's supposedly been repaired twice, at great expense, yet
still starts complaining soon afterwards.
Normally a home pc has one main motherboard that runs a couple of
cards, a sound and video card and so on. A modern top-of-the-line BMW
has 2 main computers controlling as many as 47 cards, each with their
own software packages, communicating with each other along half a dozen
bus lines. And Alexa too!
A modern car makes a home pc look like a toy children play with in
comparison.
All of which simply prove the point that there's a lot more that can go
wrong and require expensive repairs.
That's a lot of capability and it's been a sea change in safety and
even performance. You can get 750 hp off the showroom floor these days
pretty easily.
Nobody actually needs that though and the ridiculous top speeds are
simply legally unusable in most countries. Cars for public consumption
should be physically limited and unbypassable ... *that* would be an
extra safety feature!
My mother's car has silly tyre pressure monitoring, which keeps saying
there is something wrong with one of the tyres despite the fact that
there isn't. It's supposedly been repaired twice, at great expense, yet
still starts complaining soon afterwards.
Of course, then you would have to check the pressure manually,
from time to time.
On 10/11/2021 4:24 PM, Your Name wrote:
My mother's car has silly tyre pressure monitoring, which keeps saying
there is something wrong with one of the tyres desite the fact that
there isn't. It's supposedly been repaired twice, at great expense, yet
still starts complaiing soon afterwards.
I've driven several vehicles with tire pressure monitoring and it has
worked properly on exactly none of them.
The worst was on an Opel Vivaro (van) I rented to drive family around
on a visit to Ireland. My first experience of driving on the left (and
of regularly using a manual shift since the early 1980s) and I didn't
need the worry of having the tire warning come on whenever I'd been
driving at highway speeds for a while. Since it would go off (and then
back on) after a while, and never came on at city speeds (when the tire
temperatures, hence pressures, were lower), I decided it was safe to
ignore.
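The highway-only warnings are at least consistent with basic physics:
at constant volume, tyre pressure scales with absolute temperature
(Gay-Lussac's law), so tyres warmed by sustained highway speeds read
noticeably higher than cold ones, and a monitor with a fixed threshold
can trip in one regime but not the other. A rough back-of-the-envelope
sketch, with illustrative numbers rather than any particular car's
thresholds:

```python
# Rough ideal-gas estimate of tyre pressure rise with temperature:
# absolute pressure scales with absolute temperature at constant
# volume. Numbers are illustrative only.

def hot_pressure_kpa(cold_gauge_kpa, cold_c, hot_c, atm_kpa=101.3):
    """Gauge pressure after warming from cold_c to hot_c (Celsius)."""
    cold_abs = cold_gauge_kpa + atm_kpa          # convert to absolute
    hot_abs = cold_abs * (hot_c + 273.15) / (cold_c + 273.15)
    return hot_abs - atm_kpa                     # back to gauge

# 220 kPa (about 32 psi) set at 20 C, tyre warmed to 45 C on the highway:
print(round(hot_pressure_kpa(220.0, 20.0, 45.0), 1))  # prints 247.4
```

That's a rise of roughly 12%, which is plenty to cross a fixed alarm
threshold that the same tyre sits safely clear of around town.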
On Mon, 11 Oct 2021 16:39:36 -0400, John W Kennedy
<john.w.kennedy@gmail.com> wrote:
On 10/11/21 12:17 PM, Paul S Person wrote:
On Sun, 10 Oct 2021 13:31:07 -0400, John W Kennedy
<john.w.kennedy@gmail.com> wrote:
On 10/9/21 3:18 PM, Dorothy J Heydt wrote:
In article <PbmdnZriFvJqXvz8nZ2dnUU7-dnNnZ2d@giganews.com>,
Jonathan <Mailinstead@gmail.com> wrote:
I never saw that movie (based chiefly on having read that they'd
Remember the great scene in the movie 'I Robot'
where the detective hated the robots because
it decided to save him instead of the drowning child
based on its statistical analysis of the best
decision?
cast a luscious young babe as Dr. Calvin), and now I have another
reason.
“Luscious young babe” is a bit much for an actress who was 33 at the time
and who has spent most of her career playing a dogged NYC prosecutor.
Actually, I think Dr. Calvin in the book goes from not-so-old to very
much older over the course of the book.
She’s just a student when she cameos in “Robbie”. (There’s another
mistake in Asimov’s early robots. He thought that it would be easy for AIs
to understand spoken natural language, but hard for them to speak. In
reality, IBM had speaking devices off the shelf (well, off the
industrial-size pallet at least) by the mid-60s, but it was not until
the 2010s that you could get speech recognition with both generalized
vocabulary and no need for training to the particular speaker.)
But that was the result of not having Positronic Brains to work with.
A failure that, no doubt, limits our efforts to this day.
On 2021-10-12 18:05:20 +0000, Mark Jackson said:
On 10/11/2021 4:24 PM, Your Name wrote:
My mother's car has silly tyre pressure monitoring, which
keeps saying there is something wrong with one of the tyres
despite the fact that there isn't. It's supposedly been
repaired twice, at great expense, yet still starts complaining
soon afterwards.
I've driven several vehicles with tire pressure monitoring and
it has worked properly on exactly none of them.
The worst was on an Opel Vivaro (van) I rented to drive family
around on a visit to Ireland. My first experience of driving
on the left (and of regularly using a manual shift since the
early 1980s) and I didn't need the worry of having the tire
warning come on whenever I'd been driving at highway speeds for
a while. Since it would go off (and then back on) after a
while, and never came on at city speeds (when the tire
temperatures, hence pressures) were lower, I decided it was
safe to ignore.
Even after a service with tyre rotation, the stupid pressure
monitor gimmick still starts complaining about that same tyre
*location*, so it's obviously the system
that is faulty
rather than the actual tyre ... of course the dealership's
service centre won't acknowledge there's anything wrong with
their "perfect" vehicle. :-\
On Mon, 11 Oct 2021 19:41:05 -0400, Jonathan <Mailinstead@gmail.com>
wrote:
On 10/11/2021 12:15 PM, Paul S Person wrote:
On Sun, 10 Oct 2021 20:31:27 -0400, Jonathan <Mailinstead@gmail.com>
wrote:
On 10/10/2021 12:34 PM, Paul S Person wrote:
On Sat, 9 Oct 2021 19:18:50 GMT, djheydt@kithrup.com (Dorothy J Heydt)
wrote:
In article <PbmdnZriFvJqXvz8nZ2dnUU7-dnNnZ2d@giganews.com>,
Jonathan <Mailinstead@gmail.com> wrote:
I never saw that movie (based chiefly on having read that they'd
Remember the great scene in the movie 'I Robot'
where the detective hated the robots because
it decided to save him instead of the drowning child
based on its statistical analysis of the best
decision?
cast a luscious young babe as Dr. Calvin), and now I have another
reason.
A proper Asenion robot would break down with conflict paralysis
and be permanently nonfunctional.
You may be right, but not saving /either/ of them is not acceptable.
And saving /both/ was not possible. This exposes a problem with the
Three Laws: how does the robot handle moral dilemmas?
Indeed, the detective's hate is based on the fact that a human would
have focused on saving the child.
It's actually a fine action film. Its relation to the book is, as
others have pointed out ... remote. I would have preferred one focused
on Dr. Calvin and the two engineers (mechanics?). It comes off, at
best, as the situation toward the end of the book, where the really
large brains have quietly taken control of the world -- except, in the
film, it's not so quiet.
And Dr. Calvin works both as a robot psychologist and as the romantic
interest.
She's presented as a geek, not a sex symbol at all.
And the Will Smith movie I, Robot is an original
script only using elements from Asimov, not a
remake of the original.
Romantic interest, nonetheless.
Who says geeks can't be romantic interests? Did Congress pass a law
making it illegal?
Right, and portraying a young and attractive woman
as a dedicated and brilliant scientist is a positive
role model, entirely 'woke'.
And there we differ: I would say "entirely realistic".
As to "original script only using elements from Asimov", a few
observations:
-- What do you think most adaptations /are/ but "original scripts only
using elements from <who/whatever>"? Is it not a constantly renewed
observation that "based on X" means "has no relation to X except
perhaps the name"? As adaptations go, /I, Robot/ has a lot more than
the name!
I agree completely with your sentiment below that
what matters is the work, not the details of
who and why etc. I'm only responding to the
criticism that some aspects of the robots
aren't faithful to original Asimov ideas.
While looking up the evolution of how his robots respond
to moral dilemmas in the three Asimov books I found this
passage about Will Smith's I, Robot, which helps explain
the differences from Asimov's ideas in the movie.
"The robot series has led to film adaptations. With
Asimov's collaboration, in about 1977, Harlan Ellison
wrote a screenplay of I, Robot that Asimov hoped would lead
to "the first really adult, complex, worthwhile science fiction
film ever made". The screenplay has never been filmed and
was eventually published in book form in 1994.
It was serialized in a magazine as well. I read it.
It starts with the funeral of the President who may or may not have
been a robot.
It is attended by several /alien races/.
Which, of course, means it had /nothing whatsoever/ to do with
Asimov's universe.
But what else can you expect from Ellison?
The 2004 movie I, Robot, starring Will Smith, was based
on an unrelated script by Jeff Vintar titled Hardwired,
with Asimov's ideas incorporated later after the rights
to Asimov's title were acquired."
https://en.wikipedia.org/wiki/Isaac_Asimov
My point isn't that it wasn't. My point is that that /isn't relevant/.
What matters is what the film /is/, not how it was made.
And, if you watch enough allegedly-special "features" on DVD/BD, you
will find many examples where learning /how and why a film was made/
does nothing to enhance the experience of viewing it. Quite often it resembles making sausages more than anything else.
/The African Queen/ had its ending replaced when the director, viewing
the dailies, realized he was making a comedy (that's right, the idea
had never occurred to him before) and so the principals had to
survive.
/Ready or Not/ had its ending replaced when the directors asked to try something different from what was scripted -- and it worked better.
Note that that too is a comedy.
What the filmmakers /intend/ and what they /produce/ are not the same
thing. What they /intended/ (and the script is part of that intent) is
simply not relevant to what it /is/.
-- Consider a work of Art, such as Aristophanes /Lysistrata/. Unless
the story of how it was written has been preserved (and I'm fairly
sure it has not), we know /nothing at all/ about how it came about or
what the author intended. /All/ of our appreciation of it is based on
/the work itself/.
<snippo agreement -- nice as it is -- which is then used for the
inevitable descent into emergence and similar stuff>
On 2021-10-12 04:22:33 +0000, Quinn C said:
* Jonathan:
On 10/8/2021 11:35 PM, Joy Beeson wrote:
On Fri, 8 Oct 2021 21:11:48 -0500, Lynn McGuire
<lynnmcguire5@gmail.com> wrote:
Freefall: AI Driving Cars
http://freefall.purrsia.com/ff3700/fc03655.htm
So his theory is that once AI driving cars gets real good, people will
just jaywalk everywhere, expecting the AI to avoid them. Yup, probably
true.
People are already doing that to human drivers, and often up the
challenge by changing to all-black clothing at sunset.
In Miami I'm shocked daily by watching people walk
into a busy rush hour downtown street without
taking a glance first. Not a peep.
Of course. Not (appearing to be) looking is a strategy, putting the onus
squarely on the drivers not to cause an accident.
I see a bit of a similar strategy in reverse here where I live, with
cars turning at intersections ignoring pedestrians in the crosswalk.
Insisting on keeping walking would be risky for me, but - in case of
left turns - even more so, risk causing chaos in the intersection, so I
often grudgingly let them pass with just a big frown.
More and more intersections get, if not entirely separate green phases
for cars and pedestrians, then at least a few seconds where pedestrians
can start crossing while cars have red in all directions, to alleviate
that problem. Sure, it's more relaxed this way, but this would not be an
issue at all with AI driving, and in this case, just reliably following
the rules.
Separating pedestrians and traffic is the entire point of the
dual-system traffic light controlled intersections (there are of course
some traffic light intersections that aren't meant for pedestrians) ...
the problem is that too many idiots don't think the rules apply to them,
so try to walk across even when the pedestrian signal is telling them to wait, or drive across when the light has turned red. :-\
* Jonathan:
On 10/8/2021 11:35 PM, Joy Beeson wrote:
On Fri, 8 Oct 2021 21:11:48 -0500, Lynn McGuire
<lynnmcguire5@gmail.com> wrote:
Freefall: AI Driving Cars
http://freefall.purrsia.com/ff3700/fc03655.htm
So his theory is that once AI driving cars gets real good, people will
just jaywalk everywhere, expecting the AI to avoid them. Yup, probably
true.
People are already doing that to human drivers, and often up the
challenge by changing to all-black clothing at sunset.
In Miami I'm shocked daily by watching people walk
into a busy rush hour downtown street without
taking a glance first. Not a peep.
Of course. Not (appearing to be) looking is a strategy, putting the onus squarely on the drivers not to cause an accident.
I see a bit of a similar strategy in reverse here where I live, with
cars turning at intersections ignoring pedestrians in the crosswalk. Insisting on keeping walking would be risky for me, but - in case of
left turns - even more so, risk causing chaos in the intersection, so I
often grudgingly let them pass with just a big frown.
More and more intersections get, if not entirely separate green phases
for cars and pedestrians, then at least a few seconds where pedestrians
can start crossing while cars have red in all directions, to alleviate
that problem. Sure, it's more relaxed this way, but this would not be an issue at all with AI driving, and in this case, just reliably following
the rules.
On 10/12/2021 12:22 AM, Quinn C wrote:
* Jonathan:
On 10/8/2021 11:35 PM, Joy Beeson wrote:
On Fri, 8 Oct 2021 21:11:48 -0500, Lynn McGuire
<lynnmcguire5@gmail.com> wrote:
Freefall: AI Driving Cars
http://freefall.purrsia.com/ff3700/fc03655.htm
So his theory is that once AI driving cars gets real good, people will
just jaywalk everywhere, expecting the AI to avoid them. Yup, probably
true.
People are already doing that to human drivers, and often up the
challenge by changing to all-black clothing at sunset.
In Miami I'm shocked daily by watching people walk
into a busy rush hour downtown street without
taking a glance first. Not a peep.
Of course. Not (appearing to be) looking is a strategy, putting the onus
squarely on the drivers not to cause an accident.
I hadn't thought of that, I bet many do.
I see a bit of a similar strategy in reverse here where I live, with
cars turning at intersections ignoring pedestrians in the crosswalk.
Insisting on keeping walking would be risky for me, but - in case of
left turns - even more so, risk causing chaos in the intersection, so I
often grudgingly let them pass with just a big frown.
I've had a couple near misses on my motorcycle making a right turn and
having to jam on my brakes in front of a car for someone stepping out
without looking. In a car it's a fender bender but on a motorcycle it
could be catastrophic.
I wouldn't think of stepping out without looking both ways and making
sure I'm not going to cause an accident.
More and more intersections get, if not entirely separate green phases
for cars and pedestrians, then at least a few seconds where pedestrians
can start crossing while cars have red in all directions, to alleviate
that problem. Sure, it's more relaxed this way, but this would not be an
issue at all with AI driving, and in this case, just reliably following
the rules.
In downtown Miami third-world rules are followed: first come, first
served, regardless of signs or lights. For instance, a common tactic is
when the light turns green the guy in the opposing left turn lane
speeds out to beat the oncoming traffic and everyone in the turn lane
behind him forms a train until they've all turned, happened just today.
On 10/12/2021 11:42 AM, Paul S Person wrote:
On Mon, 11 Oct 2021 19:41:05 -0400, Jonathan
<Mailinstead@gmail.com> wrote:
On 10/11/2021 12:15 PM, Paul S Person wrote:
On Sun, 10 Oct 2021 20:31:27 -0400, Jonathan
<Mailinstead@gmail.com> wrote:
On 10/10/2021 12:34 PM, Paul S Person wrote:
On Sat, 9 Oct 2021 19:18:50 GMT, djheydt@kithrup.com
(Dorothy J Heydt) wrote:
In article
<PbmdnZriFvJqXvz8nZ2dnUU7-dnNnZ2d@giganews.com>, Jonathan
<Mailinstead@gmail.com> wrote:
I never saw that movie (based chiefly on having read that
Remember the great scene in the movie 'I Robot'
where the detective hated the robots because
it decided to save him instead of the drowning child
based on its statistical analysis of the best
decision?
they'd cast a luscious young babe as Dr. Calvin), and now
I have another reason.
A proper Asenion robot would break down with conflict
paralysis and be permanently nonfunctional.
You may be right, but not saving /either/ of them is not
acceptable. And saving /both/ was not possible. This
exposes a problem with the Three Laws: how does the robot
handle moral dilemmas?
Indeed, the detective's hate is based on the fact that a
human would have focused on saving the child.
It's actually a fine action film. Its relation to the book
is, as others have pointed out ... remote. I would have
preferred one focused on Dr. Calvin and the two engineers
(mechanics?). It comes off, at best, as the situation
toward the end of the book, where the really large brains
have quietly taken control of the world -- except, in the
film, it's not so quiet.
And Dr. Calvin works both as a robot psychologist and as
the romantic interest.
She's presented as a geek, not a sex symbol at all.
And the Will Smith movie I, Robot is an original
script only using elements from Asimov, not a
remake of the original.
Romantic interest, nonetheless.
Who says geeks can't be romantic interests? Did Congress pass
a law making it illegal?
Right, and portraying a young and attractive woman
as a dedicated and brilliant scientist is a positive
role model, entirely 'woke'.
And there we differ: I would say "entirely realistic".
As to "original script only using elements from Asimov", a
few observations:
-- What do you think most adaptations /are/ but "original
scripts only using elements from <who/whatever>"? Is it not a
constantly renewed observation that "based on X" means "has
no relation to X except perhaps the name"? As adaptations go,
/I, Robot/ has a lot more than the name!
I agree completely with your sentiment below that
what matters is the work, not the details of
who and why etc. I'm only responding to the
criticism that some aspects of the robots
aren't faithful to original Asimov ideas.
While looking up the evolution of how his robots respond
to moral dilemmas in the three Asimov books I found this
passage about Will Smith's I, Robot, which helps explain
the differences from Asimov's ideas in the movie.
"The robot series has led to film adaptations. With
Asimov's collaboration, in about 1977, Harlan Ellison
wrote a screenplay of I, Robot that Asimov hoped would lead
to "the first really adult, complex, worthwhile science
fiction film ever made". The screenplay has never been filmed
and was eventually published in book form in 1994.
It was serialized in a magazine as well. I read it.
It starts with the funeral of the President who may or may not
have been a robot.
It is attended by several /alien races/.
Which, of course, means it had /nothing whatsoever/ to do with
Asimov's universe.
But what else can you expect from Ellison?
The 2004 movie I, Robot, starring Will Smith, was based
on an unrelated script by Jeff Vintar titled Hardwired,
with Asimov's ideas incorporated later after the rights
to Asimov's title were acquired."
https://en.wikipedia.org/wiki/Isaac_Asimov
My point isn't that it wasn't. My point is that that /isn't
relevant/. What matters is what the film /is/, not how it was
made.
And, if you watch enough allegedly-special "features" on
DVD/BD, you will find many examples where learning /how and why
a film was made/ does nothing to enhance the experience of
viewing it. Quite often it resembles making sausages more than
anything else.
I used to know someone who, before watching a movie,
would research it, even checking the ending before
going. Why do that? Why not let the artist present
the work as they meant it to be seen without
prejudging it?
On 10/12/2021 11:42 AM, Paul S Person wrote:
On Mon, 11 Oct 2021 19:41:05 -0400, Jonathan <Mailinstead@gmail.com>
wrote:
On 10/11/2021 12:15 PM, Paul S Person wrote:
-- Consider a work of Art, such as Aristophanes' /Lysistrata/. Unless
the story of how it was written has been preserved (and I'm fairly
sure it has not), we know /nothing at all/ about how it came about or
what the author intended. /All/ of our appreciation of it is based on
/the work itself/.
I did a paper on Lysistrata in college for extra credit.
When I watched the play presented by the college drama
department I had no idea what the play was about, I went
in cold. I think that's the best way.
But I think it's safe to say the original play didn't
have the male actors wearing three foot long rubber
phalluses strapped to their waists <g>.
I guess the drama dept was looking for extra credit too.
<snippo agreement -- nice as it is -- which is then used for the
inevitable descent into emergence and similar stuff>
To quote Austin Powers...
It's my bag baby!
But isn't science at its heart all about unraveling
the secrets of nature? Well, that secret can be seen
in the Mona Lisa smile or even a passing cloud.
I think that's a big discovery.
On 10/12/2021 4:35 PM, Jonathan wrote:
On 10/12/2021 11:42 AM, Paul S Person wrote:
On Mon, 11 Oct 2021 19:41:05 -0400, Jonathan <Mailinstead@gmail.com>
wrote:
On 10/11/2021 12:15 PM, Paul S Person wrote:
<snip--go ahead and restore from Jonathan's last if you want>
-- Consider a work of Art, such as Aristophanes' /Lysistrata/. Unless
the story of how it was written has been preserved (and I'm fairly
sure it has not), we know /nothing at all/ about how it came about or
what the author intended. /All/ of our appreciation of it is based on
/the work itself/.
I did a paper on Lysistrata in college for extra credit.
When I watched the play presented by the college drama
department I had no idea what the play was about, I went
in cold. I think that's the best way.
But I think it's safe to say the original play didn't
have the male actors wearing three foot long rubber
phalluses strapped to their waists <g>.
I guess the drama dept was looking for extra credit too.
Well, not RUBBER, no, but only because it hadn't been discovered yet.
I *am* pretty sure they were wearing 3-foot long LEATHER phalli in the original production, because Lysistrata is NOTORIOUS for showing the
priapism caused by the sex strike.
And you once called yourself "history_major". Bah.
<snippo agreement -- nice as it is -- which is then used for the
inevitable descent into emergence and similar stuff>
To quote Austin Powers...
It's my bag baby!
But isn't science at its heart all about unraveling
the secrets of nature? Well, that secret can be seen
in the Mona Lisa smile or even a passing cloud.
I think that's a big discovery.
Paul S Person <psperson1@ix.netcom.invalid> wrote in
news:0obbmg9v5p2ulqfgp9kkp6uudm2mdp0r0h@4ax.com:
Of course, then you would have to check the pressure manually,
from time to time.
Sounds like they have to do that *now*.
Your Name <YourName@YourISP.com> wrote in
news:sk4pp2$1vle$1@gioia.aioe.org:
On 2021-10-12 18:05:20 +0000, Mark Jackson said:
On 10/11/2021 4:24 PM, Your Name wrote:
My mother's car has silly tyre pressure monitoring, which
keeps saying there is something wrong with one of the tyres
despite the fact that there isn't. It's supposedly been
repaired twice, at great expense, yet still starts complaining
soon afterwards.
I've driven several vehicles with tire pressure monitoring and
it has worked properly on exactly none of them.
The worst was on an Opel Vivaro (van) I rented to drive family
around on a visit to Ireland. My first experience of driving
on the left (and of regularly using a manual shift since the
early 1980s) and I didn't need the worry of having the tire
warning come on whenever I'd been driving at highway speeds for
a while. Since it would go off (and then back on) after a
while, and never came on at city speeds (when the tire
temperatures, hence pressures, were lower), I decided it was
safe to ignore.
Even after a service with tyre rotation, the stupid pressure
monitor gimmick still starts complaining about that same tyre
*location*, so it's obviously a useless system
A defective (or broken) sensor is not - except in your idiotic
fantasy land - the same thing as a useless system.
that is faulty
rather than the actual tyre ... of course the dealership's
service centre won't acknowledge there's anything wrong with
their "perfect" vehicle. :-\
That's not a problem with the system, that's a problem with the
dealership's mechanics.
If you try to solve something else, the problem remains. They
failed to solve the sensor problem. You failed to solve the
mechanic problem.
On Tue, 12 Oct 2021 15:22:03 -0700, Jibini Kula Tumbili
Kujisalimisha <taustinca@gmail.com> wrote:
Your Name <YourName@YourISP.com> wrote in
news:sk4pp2$1vle$1@gioia.aioe.org:
On 2021-10-12 18:05:20 +0000, Mark Jackson said:
On 10/11/2021 4:24 PM, Your Name wrote:
My mother's car has silly tyre pressure monitoring, which
keeps saying there is something wrong with one of the tyres
despite the fact that there isn't. It's supposedly been
repaired twice, at great expense, yet still starts
complaining soon afterwards.
I've driven several vehicles with tire pressure monitoring
and it has worked properly on exactly none of them.
The worst was on an Opel Vivaro (van) I rented to drive
family around on a visit to Ireland. My first experience of
driving on the left (and of regularly using a manual shift
since the early 1980s) and I didn't need the worry of having
the tire warning come on whenever I'd been driving at highway
speeds for a while. Since it would go off (and then back on)
after a while, and never came on at city speeds (when the
tire temperatures, hence pressures) were lower, I decided it
was safe to ignore.
Even after a service with tyre rotation, the stupid pressure
monitor gimmick still starts complaining about that same tyre
*location*, so it's obviously a useless system
A defective (or broken) sensor is not - except in your idiotic
fantasy land - the same thing as a useless system.
that is faulty
rather than the actual tyre ... of course the dealership's
service centre won't acknowledge there's anything wrong with
their "perfect" vehicle. :-\
That's not a problem with the system, that's a problem with the
dealership's mechanics.
If you try to solve something else, the problem remains. They
failed to solve the sensor problem. You failed to solve the
mechanic problem.
But perhaps there is no sensor problem.
Perhaps there is no mechanic problem.
Perhaps everything is working as it was intended to.
Gotta make money /somehow/.
On 10/8/2021 10:35 PM, Joy Beeson wrote:
On Fri, 8 Oct 2021 21:11:48 -0500, Lynn McGuire
<lynnmcguire5@gmail.com> wrote:
Freefall: AI Driving Cars
http://freefall.purrsia.com/ff3700/fc03655.htm
So his theory is that once AI driving cars gets real good, people will
just jaywalk everywhere, expecting the AI to avoid them. Yup, probably
true.
People are already doing that to human drivers, and often up the
challenge by changing to all-black clothing at sunset.
Since my small subdivision out in the county does not have sidewalks or streetlights or shoulders on the country roads, I put on a white shirt
when going for my daily mile or two walk at dusk.
Lynn
In article <PbmdnZriFvJqXvz8nZ2dnUU7-dnNnZ2d@giganews.com>,
Jonathan <Mailinstead@gmail.com> wrote:
I never saw that movie (based chiefly on having read that they'd
Remember the great scene in the movie 'I Robot'
where the detective hated the robots because
it decided to save him instead of the drowning child
based on its statistical analysis of the best
decision?
cast a luscious young babe as Dr. Calvin), and now I have another
reason.
A proper Asenion robot would break down with conflict paralysis
and be permanently nonfunctional.
(Rather like the telepathic robot in "Liar!".)
On Mon, 11 Oct 2021 19:41:05 -0400, Jonathan <Mailinstead@gmail.com>
wrote:
On 10/11/2021 12:15 PM, Paul S Person wrote:
On Sun, 10 Oct 2021 20:31:27 -0400, Jonathan <Mailinstead@gmail.com>
wrote:
On 10/10/2021 12:34 PM, Paul S Person wrote:
On Sat, 9 Oct 2021 19:18:50 GMT, djheydt@kithrup.com (Dorothy J Heydt)
wrote:
In article <PbmdnZriFvJqXvz8nZ2dnUU7-dnNnZ2d@giganews.com>,
Jonathan <Mailinstead@gmail.com> wrote:
I never saw that movie (based chiefly on having read that they'd
Remember the great scene in the movie 'I Robot'
where the detective hated the robots because
it decided to save him instead of the drowning child
based on its statistical analysis of the best
decision?
cast a luscious young babe as Dr. Calvin), and now I have another
reason.
A proper Asenion robot would break down with conflict paralysis
and be permanently nonfunctional.
You may be right, but not saving /either/ of them is not acceptable.
And saving /both/ was not possible. This exposes a problem with the
Three Laws: how does the robot handle moral dilemmas?
Indeed, the detective's hate is based on the fact that a human would
have focused on saving the child.
It's actually a fine action film. Its relation to the book is, as
others have pointed out ... remote. I would have preferred one focused
on Dr. Calvin and the two engineers (mechanics?). It comes off, at
best, as the situation toward the end of the book, where the really
large brains have quietly taken control of the world -- except, in the
film, it's not so quiet.
And Dr. Calvin works both as a robot psychologist and as the romantic
interest.
She's presented as a geek, not a sex symbol at all.
And the Will Smith movie I, Robot is an original
script only using elements from Asimov, not a
remake of the original.
Romantic interest, nonetheless.
Who says geeks can't be romantic interests? Did Congress pass a law
making it illegal?
Right, and portraying a young and attractive woman
as a dedicated and brilliant scientist is a positive
role model, entirely 'woke'.
And there we differ: I would say "entirely realistic".
As to "original script only using elements from Asimov", a few
observations:
-- What do you think most adaptations /are/ but "original scripts only
using elements from <who/whatever>"? Is it not a constantly renewed
observation that "based on X" means "has no relation to X except
perhaps the name"? As adaptations go, /I, Robot/ has a lot more than
the name!
I agree completely with your sentiment below that
what matters is the work, not the details of
who and why etc. I'm only responding to the
criticism that some aspects of the robots
aren't faithful to original Asimov ideas.
While looking up the evolution of how his robots respond
to moral dilemmas in the three Asimov books I found this
passage about Will Smith's I, Robot, which helps explain
the differences from Asimov's ideas in the movie.
"The robot series has led to film adaptations. With
Asimov's collaboration, in about 1977, Harlan Ellison
wrote a screenplay of I, Robot that Asimov hoped would lead
to "the first really adult, complex, worthwhile science fiction
film ever made". The screenplay has never been filmed and
was eventually published in book form in 1994.
It was serialized in a magazine as well. I read it.
It starts with the funeral of the President who may or may not have
been a robot.
It is attended by several /alien races/.
Which, of course, means it had /nothing whatsoever/ to do with
Asimov's universe.
But what else can you expect from Ellison?
The 2004 movie I, Robot, starring Will Smith, was based
on an unrelated script by Jeff Vintar titled Hardwired,
with Asimov's ideas incorporated later after the rights
to Asimov's title were acquired."
https://en.wikipedia.org/wiki/Isaac_Asimov
My point isn't that it wasn't. My point is that that /isn't relevant/.
What matters is what the film /is/, not how it was made.
On 2021-10-09 1:18 p.m., Dorothy J Heydt wrote:
In article <PbmdnZriFvJqXvz8nZ2dnUU7-dnNnZ2d@giganews.com>,
Jonathan <Mailinstead@gmail.com> wrote:
I never saw that movie (based chiefly on having read that they'd
Remember the great scene in the movie 'I Robot'
where the detective hated the robots because
it decided to save him instead of the drowning child
based on its statistical analysis of the best
decision?
cast a luscious young babe as Dr. Calvin), and now I have another
reason.
A proper Asenion robot would break down with conflict paralysis
and be permanently nonfunctional.
One of several reasons why three laws based robots are crap. Just
imagine all those robots immediately frying uselessly whenever more than
one person happens to be in danger. (However, the reason why the robot
saved him first over his objections instead of ignoring him was
because he was closer and it looked to the robot like she was probably
already dead. His orders to it to save the girl were what actually gave
him a higher priority because they told the robot that he was still very
much alive.)
--
(Rather like the telepathic robot in "Liar!".)
On 2021-10-12 9:42 a.m., Paul S Person wrote:
On Mon, 11 Oct 2021 19:41:05 -0400, Jonathan <Mailinstead@gmail.com>
wrote:
On 10/11/2021 12:15 PM, Paul S Person wrote:
On Sun, 10 Oct 2021 20:31:27 -0400, Jonathan <Mailinstead@gmail.com>
wrote:
On 10/10/2021 12:34 PM, Paul S Person wrote:
On Sat, 9 Oct 2021 19:18:50 GMT, djheydt@kithrup.com (Dorothy J Heydt)
wrote:
In article <PbmdnZriFvJqXvz8nZ2dnUU7-dnNnZ2d@giganews.com>,
Jonathan <Mailinstead@gmail.com> wrote:
I never saw that movie (based chiefly on having read that they'd
cast a luscious young babe as Dr. Calvin), and now I have another
reason.
Remember the great scene in the movie 'I Robot'
where the detective hated the robots because
it decided to save him instead of the drowning child
based on its statistical analysis of the best
decision?
A proper Asenion robot would break down with conflict paralysis
and be permanently nonfunctional.
You may be right, but not saving /either/ of them is not acceptable.
And saving /both/ was not possible. This exposes a problem with the
Three Laws: how does the robot handle moral dilemmas?
Indeed, the detective's hate is based on the fact that a human would
have focused on saving the child.
It's actually a fine action film. Its relation to the book is, as
others have pointed out ... remote. I would have preferred one focused
on Dr. Calvin and the two engineers (mechanics?). It comes off, at
best, as the situation toward the end of the book, where the really
large brains have quietly taken control of the world -- except, in the
film, it's not so quiet.
And Dr. Calvin works both as a robot psychologist and as the romantic
interest.
She's presented as a geek, not a sex symbol at all.
And the Will Smith movie I, Robot is an original
script only using elements from Asimov, not a
remake of the original.
Romantic interest, nonetheless.
Who says geeks can't be romantic interests? Did Congress pass a law
making it illegal?
Right, and portraying a young and attractive woman
as a dedicated and brilliant scientist is a positive
role model, entirely 'woke'.
And there we differ: I would say "entirely realistic".
As to "original script only using elements from Asimov", a few
observations:
-- What do you think most adaptations /are/ but "original scripts only
using elements from <who/whatever>"? Is it not a constantly renewed
observation that "based on X" means "has no relation to X except
perhaps the name"? As adaptations go, /I, Robot/ has a lot more than
the name!
I agree completely with your sentiment below that
what matters is the work, not the details of
who and why etc. I'm only responding to the
criticism that some aspects of the robots
aren't faithful to original Asimov ideas.
While looking up the evolution of how his robots respond
to moral dilemmas in the three Asimov books I found this
passage about Will Smith's I, Robot, which helps explain
the differences from Asimov's ideas in the movie.
"The robot series has led to film adaptations. With
Asimov's collaboration, in about 1977, Harlan Ellison
wrote a screenplay of I, Robot that Asimov hoped would lead
to "the first really adult, complex, worthwhile science fiction
film ever made". The screenplay has never been filmed and
was eventually published in book form in 1994.
It was serialized in a magazine as well. I read it.
It starts with the funeral of the President who may or may not have
been a robot.
It is attended by several /alien races/.
Which, of course, means it had /nothing whatsoever/ to do with
Asimov's universe.
But what else can you expect from Ellison?
The 2004 movie I, Robot, starring Will Smith, was based
on an unrelated script by Jeff Vintar titled Hardwired,
with Asimov's ideas incorporated later after the rights
to Asimov's title were acquired."
https://en.wikipedia.org/wiki/Isaac_Asimov
My point isn't that it wasn't. My point is that that /isn't relevant/.
What matters is what the film /is/, not how it was made.
But what the film is, is "not an adaptation of either /I, Robot/".
Nobody else seemed to have noticed this, though.
Paul S Person <psperson1@ix.netcom.invalid> wrote in
news:5b3emg1hoddl53no873a732r07328uq2ur@4ax.com:
On Tue, 12 Oct 2021 15:22:03 -0700, Jibini Kula Tumbili
Kujisalimisha <taustinca@gmail.com> wrote:
Your Name <YourName@YourISP.com> wrote in
news:sk4pp2$1vle$1@gioia.aioe.org:
On 2021-10-12 18:05:20 +0000, Mark Jackson said:
On 10/11/2021 4:24 PM, Your Name wrote:
My mother's car has silly tyre pressure monitoring, which
keeps saying there is something wrong with one of the tyres
despite the fact that there isn't. It's supposedly been
repaired twice, at great expense, yet still starts
complaining soon afterwards.
I've driven several vehicles with tire pressure monitoring
and it has worked properly on exactly none of them.
The worst was on an Opel Vivaro (van) I rented to drive
family around on a visit to Ireland. My first experience of
driving on the left (and of regularly using a manual shift
since the early 1980s) and I didn't need the worry of having
the tire warning come on whenever I'd been driving at highway
speeds for a while. Since it would go off (and then back on)
after a while, and never came on at city speeds (when the
tire temperatures, hence pressures, were lower), I decided it
was safe to ignore.
Even after a service with tyre rotation, the stupid pressure
monitor gimmick still starts complaining about that same tyre
*location*, so it's obviously a useless system
A defective (or broken) sensor is not - except in your idiotic
fantasy land - the same thing as a useless system.
that is faulty
rather than the actual tyre ... of course the dealership's
service centre won't acknowledge there's anything wrong with
their "perfect" vehicle. :-\
That's not a problem with the system, that's a problem with the
dealership's mechanics.
If you try to solve something else, the problem remains. They
failed to solve the sensor problem. You failed to solve the
mechanic problem.
But perhaps there is no sensor problem.
Perhaps there is no mechanic problem.
Perhaps everything is working as it was intended to.
Gotta make money /somehow/.
Shiny side in, or out?
On 2021-10-12 9:42 a.m., Paul S Person wrote:
On Mon, 11 Oct 2021 19:41:05 -0400, Jonathan <Mailinstead@gmail.com>
wrote:
On 10/11/2021 12:15 PM, Paul S Person wrote:
On Sun, 10 Oct 2021 20:31:27 -0400, Jonathan <Mailinstead@gmail.com>
wrote:
On 10/10/2021 12:34 PM, Paul S Person wrote:
On Sat, 9 Oct 2021 19:18:50 GMT, djheydt@kithrup.com (Dorothy J
Heydt)
wrote:
In article <PbmdnZriFvJqXvz8nZ2dnUU7-dnNnZ2d@giganews.com>,
Jonathan <Mailinstead@gmail.com> wrote:
I never saw that movie (based chiefly on having read that they'd
cast a luscious young babe as Dr. Calvin), and now I have another
reason.
Remember the great scene in the movie 'I Robot'
where the detective hated the robots because
it decided to save him instead of the drowning child
based on its statistical analysis of the best
decision?
A proper Asenion robot would break down with conflict paralysis
and be permanently nonfunctional.
You may be right, but not saving /either/ of them is not acceptable.
And saving /both/ was not possible. This exposes a problem with the
Three Laws: how does the robot handle moral dilemmas?
Indeed, the detective's hate is based on the fact that a human would
have focused on saving the child.
It's actually a fine action film. Its relation to the book is, as
others have pointed out ... remote. I would have preferred one focused
on Dr. Calvin and the two engineers (mechanics?). It comes off, at
best, as the situation toward the end of the book, where the really
large brains have quietly taken control of the world -- except, in the
film, it's not so quiet.
And Dr. Calvin works both as a robot psychologist and as the romantic
interest.
She's presented as a geek, not a sex symbol at all.
And the Will Smith movie I, Robot is an original
script only using elements from Asimov, not a
remake of the original.
Romantic interest, nonetheless.
Who says geeks can't be romantic interests? Did Congress pass a law
making it illegal?
Right, and portraying a young and attractive woman
as a dedicated and brilliant scientist is a positive
role model, entirely 'woke'.
And there we differ: I would say "entirely realistic".
As to "original script only using elements from Asimov", a few
observations:
-- What do you think most adaptations /are/ but "original scripts only
using elements from <who/whatever>"? Is it not a constantly renewed
observation that "based on X" means "has no relation to X except
perhaps the name"? As adaptations go, /I, Robot/ has a lot more than
the name!
I agree completely with your sentiment below that
what matters is the work, not the details of
who and why etc. I'm only responding to the
criticism that some aspects of the robots
aren't faithful to original Asimov ideas.
While looking up the evolution of how his robots respond
to moral dilemmas in the three Asimov books I found this
passage about Will Smith's I, Robot, which helps explain
the differences from Asimov's ideas in the movie.
"The robot series has led to film adaptations. With
Asimov's collaboration, in about 1977, Harlan Ellison
wrote a screenplay of I, Robot that Asimov hoped would lead
to "the first really adult, complex, worthwhile science fiction
film ever made". The screenplay has never been filmed and
was eventually published in book form in 1994.
It was serialized in a magazine as well. I read it.
It starts with the funeral of the President who may or may not have
been a robot.
It is attended by several /alien races/.
Which, of course, means it had /nothing whatsoever/ to do with
Asimov's universe.
But what else can you expect from Ellison?
The 2004 movie I, Robot, starring Will Smith, was based
on an unrelated script by Jeff Vintar titled Hardwired,
with Asimov's ideas incorporated later after the rights
to Asimov's title were acquired."
https://en.wikipedia.org/wiki/Isaac_Asimov
My point isn't that it wasn't. My point is that that /isn't relevant/.
What matters is what the film /is/, not how it was made.
But what the film is, is "not an adaptation of either I Robot".
On 10/14/21 5:09 AM, David Johnston wrote:
But what the film is, is "not an adaptation of either I Robot".
But both books are fix-ups, anyway. If I were briefed to adapt either
one to the screen, my first decision would be to make it a TV
miniseries. (Same thing for the Silmarillion or the Lensmen.)
On Thu, 14 Oct 2021 03:09:32 -0600, David Johnston
<davidjohnston29@yahoo.com> wrote:
But what the film is, is "not an adaptation of either I Robot".
"With Folded Hands" is in its ancestry somewhere.
On Thu, 14 Oct 2021 15:06:17 -0400, John W Kennedy
<john.w.kennedy@gmail.com> wrote:
But both books are fix-ups, anyway. If I were briefed to adapt
either one to the screen, my first decision would be to make it a
TV miniseries. (Same thing for the Silmarillion or the Lensmen.)
A miniseries, hopefully, that reproduces each of the short
stories, and does so without "updating" them to match their
creators' idea of what they should have been.
Now, /that/ might be worth watching!
On Thu, 14 Oct 2021 20:19:35 -0400, Joy Beeson
<jbeeson@invalid.net.invalid> wrote:
On Thu, 14 Oct 2021 03:09:32 -0600, David Johnston
<davidjohnston29@yahoo.com> wrote:
But what the film is, is "not an adaptation of either I Robot".
"With Folded Hands" is in its ancestry somewhere.
You are talking to someone who appears to believe that /how something
was produced/ fully determines /what it is/.
Paul S Person <psperson1@ix.netcom.invalid> wrote in
news:109jmgtc8eqjohj8i01vlmhk47o7k3fkvi@4ax.com:
A miniseries, hopefully, that reproduces each of the short
stories, and does so without "updating" them to match their
creators' idea of what they should have been.
Now, /that/ might be worth watching!
Yeah, that'll happen. And Biden will choose Trump as his running
mate in 2024, too.
On 10/15/21 11:54 AM, Paul S Person wrote:
A miniseries, hopefully, that reproduces each of the short stories,
and does so without "updating" them to match their creators' idea of
what they should have been.
Now, /that/ might be worth watching!
So you think that Nahum Tate should be praised for restoring King
Lear's happy ending as found in Geoffrey of Monmouth and Holinshed?
On 2021-10-15 9:57 a.m., Paul S Person wrote:
You are talking to someone who appears to believe that /how something
was produced/ fully determines /what it is/.
No. She isn't. But she is talking to someone who believes that it
takes more than simply applying the title of a previous work to make the
new work an adaptation. Blade Runner is not an adaptation of The
Bladerunner, and Asimov's I, Robot is not an adaptation of Eando
Binder's I, Robot.
On Fri, 15 Oct 2021 09:12:09 -0700, Jibini Kula Tumbili
Kujisalimisha <taustinca@gmail.com> wrote:
Paul S Person <psperson1@ix.netcom.invalid> wrote in >>news:109jmgtc8eqjohj8i01vlmhk47o7k3fkvi@4ax.com:
On Thu, 14 Oct 2021 15:06:17 -0400, John W Kennedy
<john.w.kennedy@gmail.com> wrote:
On 10/14/21 5:09 AM, David Johnston wrote:
On 2021-10-12 9:42 a.m., Paul S Person wrote:
On Mon, 11 Oct 2021 19:41:05 -0400, Jonathan
<Mailinstead@gmail.com> wrote:
On 10/11/2021 12:15 PM, Paul S Person wrote:
On Sun, 10 Oct 2021 20:31:27 -0400, Jonathan
<Mailinstead@gmail.com> wrote:
On 10/10/2021 12:34 PM, Paul S Person wrote:
On Sat, 9 Oct 2021 19:18:50 GMT, djheydt@kithrup.com
(Dorothy J Heydt)
wrote:
In article
<PbmdnZriFvJqXvz8nZ2dnUU7-dnNnZ2d@giganews.com>,
Jonathan <Mailinstead@gmail.com> wrote:
I never saw that movie (based chiefly on having read
Remember the great scene in the movie 'I Robot'
where the detective hated the robots because
it decided to save him instead of the drowning child
based on it's statistical analysis of the best
decision?
that they'd cast a luscious young babe as Dr. Calvin),
and now I have another reason.
A proper Asenion robot would break down with conflict
paralysis and be permanently nonfunctional.
You may be right, but not saving /either/ of them is
not acceptable. And saving /both/ was not possible.
This exposes a problem with the Three Laws: how does
the robot handle moral dilemmas?
Indeed, the detective's hate is based on that fact that
a human would have focused on saving the child.
It's actually a fine action film. It's relation to the
book is, as others have pointed out ... remote. I would
have preferred one focused
on Dr. Calvin and the two engineers (mechanics?). It
comes off, at best, as the situation toward the end of
the book, where the really large brains have quietly
taken control of the world -- except, in the
film, its not so quiet.
And Dr. Calvin works both as a robot psychologist and
as the romantic interest.
She's presented as a geek, not a sex symbol at all.
And the Will Smith movie I, Robot is an original
script only using elements from Asimov, not a
remake of the original.
Romantic interest, nonetheless.
Who says geeks can't be romantic interests? Did Congress
pass a law making it illegal?
Right, and portraying a young and attractive woman
as a dedicated and brilliant scientist is a positive
role model, entirely 'woke'.
And there we differ: I would say "entirely realistic".
As to "original script only using elements from Asimov",
a few observations:
-- What do you think most adaptations /are/ but "original
scripts only using elements from <who/whatever>"? Is it
not a constantly renewed observation that "based on X"
means "has no relation to X except perhaps the name"? As
adaptations go, /I, Robot/ has a lot more than the name!
I agree completely with your sentiment below that
what matters is the work, not the details of
who and why etc. I'm only responding to the
criticism that some aspects of the robots
aren't faithful to original Asimov ideas.
While looking up the evolution of how his robots respond
to moral dilemmas in the three Asimov books I found this
passage about Will Smith's I, Robot, which helps explain
the differences from Asimov's ideas in the movie.
"The robot series has led to film adaptations. With
Asimov's collaboration, in about 1977, Harlan Ellison
wrote a screenplay of I, Robot that Asimov hoped would
lead to "the first really adult, complex, worthwhile
science fiction film ever made". The screenplay has never
been filmed and was eventually published in book form in
1994.
It was serialized in a magazine as well. I read it.
It starts with the funeral of the President who may or may
not have been a robot.
It is attended by several /alien races/.
Which, of course, means it had /nothing whatsoever/ to do
with Asimov's universe.
But what else can you expect from Ellison?
The 2004 movie I, Robot, starring Will Smith, was based
on an unrelated script by Jeff Vintar titled Hardwired,
with Asimov's ideas incorporated later after the rights
to Asimov's title were acquired."
https://en.wikipedia.org/wiki/Isaac_Asimov
My point isn't that it wasn't. My point is that that /isn't
relevant/. What matters is what the film /is/, not how it
was made.
But what the film is, is "not an adaptation of either I
Robot".
But both books are fix-ups, anyway. If I were briefed to adapt
either one to the screen, my first decision would be to make it
a TV miniseries. (Same thing for the Silmarillion or the
Lensmen.)
A miniseries, hopefully, that reproduces each of the short
stories, and does so without "updating" them to match their
creators' idea of what they should have been.
Now, /that/ might be worth watching!
Yeah, that'll happen. And Biden will choose Trump as his running
mate in 2024, too.
Well, Biden /is/ a politician.
Whatever it takes to win.
On Fri, 15 Oct 2021 14:35:05 -0600, David Johnston <davidjohnston29@yahoo.com> wrote:
On 2021-10-15 9:57 a.m., Paul S Person wrote:
On Thu, 14 Oct 2021 20:19:35 -0400, Joy Beeson
<jbeeson@invalid.net.invalid> wrote:
On Thu, 14 Oct 2021 03:09:32 -0600, David Johnston
<davidjohnston29@yahoo.com> wrote:
But what the film is, is "not an adaptation of either I Robot".
"With Folded Hands" is in its ancestry somewhere.
You are talking to someone who appears to believe that /how something
was produced/ fully determines /what it is/.
No. She isn't. But she is talking to someone who believes that it
takes more than simply applying the title of a previous work to make the
new work an adaptation. Bladerunner is not an adaptation of The
Bladerunner, and Asimov's I, Robot is not an adaptation of Eando
Binder's I, Robot.
I merely note that that is totally incoherent.
And, BTW, I gave /reasons/ why the film /I, Robot/ adapts the book.
You have read the book, I suppose? Particularly the last few chapters?
You know, the part where large positronic brains take over the world?
Paul S Person <psperson1@ix.netcom.invalid> wrote in
Well, Biden /is/ a politician.
Biden *was* a politician, before the dementia set in. Now he's a sock
puppet.
Whatever it takes to win.
And his puppeteers already know, for *certain*, that they can't
control Trump, so no, not what it takes to win. What it takes to
control.
On 2021-10-16 9:50 a.m., Paul S Person wrote:
I merely note that that is totally incoherent.
And, BTW, I gave /reasons/ why the film /I, Robot/ adapts the book.
You have read the book, I suppose? Particularly the last few chapters?
You know, the part where large positronic brains take over the world?
No. I read the part where humanity deliberately and willingly hands
responsibility for administering the world over to large positronic
brains, who then decide that them being in charge is bad for humanity
and abdicate from their role. This bears no resemblance to what the
large positronic brain in the cinematic I, Robot does. In fact the plot
of I, Robot (cinematic) has more resemblance to what happens in The
Humanoids. So perhaps it is an adaptation of that book. Even though
the scriptwriter very likely never read it.
On Sat, 16 Oct 2021 18:08:33 GMT, Ninapenda Jibini
<taustinca@gmail.com> wrote:
Biden *was* a politician, before the dementia set in. Now he's a
sock puppet.
Ah, projection.
Trump was (is?) Putin's sock-puppet.
Whatever it takes to win.
And his puppeteers already know, for *certain*, that they can't
control Trump, so no, not what it takes to win. What it takes to
control.
Oh, I don't know.
Putin seems to have handled him quite well.
I suspect you just have to have enough dirt on him.
Like any other petty criminal.
On Sat, 16 Oct 2021 14:55:00 -0600, David Johnston <davidjohnston29@yahoo.com> wrote:
No. I read the part where humanity deliberately and willingly hands
responsibility for administering the world over to large positronic
brains, who then decide that them being in charge is bad for humanity
and abdicate from their role. This bears no resemblance to what the
large positronic brain in the cinematic I, Robot does. In fact the plot
of I, Robot (cinematic) has more resemblance to what happens in The
Humanoids. So perhaps it is an adaptation of that book. Even though
the scriptwriter very likely never read it.
Apparently, you read a different version of the book than I did.
Are you sure you aren't confusing it with something else?
Paul S Person <psperson1@ix.netcom.invalid> wrote in news:ujiomgt5r8vq226klf9ho40im864rm5fs1@4ax.com:
Oh, I don't know.
Putin seems to have handled him quite well.
I suspect you just have to have enough dirt on him.
Like any other petty criminal.
Biden's dementia is well documented, and getting worse.
https://en.wikipedia.org/wiki/Trump_derangement_syndrome
On 2021-10-17 10:32 a.m., Ninapenda Jibini wrote:
Biden's dementia is well documented, and getting worse.
https://en.wikipedia.org/wiki/Trump_derangement_syndrome
How does a Wikipedia article about something that doesn't exist document
anything about Biden?
Paul S Person <psperson1@ix.netcom.invalid> wrote in news:ujiomgt5r8vq226klf9ho40im864rm5fs1@4ax.com:
On Sat, 16 Oct 2021 18:08:33 GMT, Ninapenda Jibini
<taustinca@gmail.com> wrote:
Paul S Person <psperson1@ix.netcom.invalid> wrote in
news:6bslmg1sek8mkehegsjjf827j417285riu@4ax.com:
On Fri, 15 Oct 2021 09:12:09 -0700, Jibini Kula Tumbili
Kujisalimisha <taustinca@gmail.com> wrote:
Paul S Person <psperson1@ix.netcom.invalid> wrote in
news:109jmgtc8eqjohj8i01vlmhk47o7k3fkvi@4ax.com:
On Thu, 14 Oct 2021 15:06:17 -0400, John W Kennedy
<john.w.kennedy@gmail.com> wrote:
On 10/14/21 5:09 AM, David Johnston wrote:
On 2021-10-12 9:42 a.m., Paul S Person wrote:
On Mon, 11 Oct 2021 19:41:05 -0400, Jonathan
<Mailinstead@gmail.com> wrote:
On 10/11/2021 12:15 PM, Paul S Person wrote:
On Sun, 10 Oct 2021 20:31:27 -0400, Jonathan
<Mailinstead@gmail.com> wrote:
On 10/10/2021 12:34 PM, Paul S Person wrote:
On Sat, 9 Oct 2021 19:18:50 GMT, djheydt@kithrup.com >>>>>>>>>>>>> (Dorothy J Heydt)
wrote:
In articleYou may be right, but not saving /either/ of them is >>>>>>>>>>>>> not acceptable. And saving /both/ was not possible.
<PbmdnZriFvJqXvz8nZ2dnUU7-dnNnZ2d@giganews.com>,
Jonathan� <Mailinstead@gmail.com> wrote:
I never saw that movie (based chiefly on having read >>>>>>>>>>>>>> that they'd cast a luscious young babe as Dr.
Remember the great scene in the movie 'I Robot'
where the detective hated the robots because
it decided to save him instead of the drowning
child based on it's statistical analysis of the
best decision?
Calvin), and now I have another reason.
A proper Asenion robot would break down with
conflict paralysis and be permanently nonfunctional. >>>>>>>>>>>>>
This exposes a problem with the Three Laws: how does >>>>>>>>>>>>> the robot handle moral dilemmas?
Indeed, the detective's hate is based on that fact
that a human would have focused on saving the child. >>>>>>>>>>>>>
It's actually a fine action film. It's relation to
the book is, as others have pointed out ... remote. I >>>>>>>>>>>>> would have preferred one focused
on Dr. Calvin and the two engineers (mechanics?). It >>>>>>>>>>>>> comes off, at best, as the situation toward the end
of the book, where the really large brains have
quietly taken control of the world -- except, in the >>>>>>>>>>>>> film, its not so quiet.
And Dr. Calvin works both as a robot psychologist and >>>>>>>>>>>>> as the romantic interest.
She's presented as a geek, not a sex symbol at all.
And the Will Smith movie I, Robot is an original
script only using elements from Asimov, not a
remake of the original.
Romantic interest, nonetheless.
Who says geeks can't be romantic interests? Did
Congress pass a law making it illegal?
Right, and portraying a young and attractive woman
as a dedicated and brilliant scientist is a positive
role model, entirely 'woke'.
And there we differ: I would say "entirely realistic".
As to "original script only using elements from
Asimov", a few observations:
-- What do you think most adaptations /are/ but
"original scripts only using elements from
<who/whatever>"? Is it not a constantly renewed
observation that "based on X" means "has no relation to
X except perhaps the name"? As adaptations go, /I,
Robot/ has a lot more than the name!
I agree completely with your sentiment below that
what matters is the work, not the details of
who and why etc. I'm only responding to the
criticism that some aspects of the robots
aren't faithful to original Asimov ideas.
While looking up the evolution of how his robots respond
to moral dilemmas in the three Asimov books I found this
passage about Will Smith's I, Robot, which helps
explain the differences from Asimov's ideas in the
movie.
"The robot series has led to film adaptations. With
Asimov's collaboration, in about 1977, Harlan Ellison
wrote a screenplay of I, Robot that Asimov hoped would
lead to "the first really adult, complex, worthwhile
science fiction film ever made". The screenplay has
never been filmed and was eventually published in book
form in 1994.
It was serialized in a magazine as well. I read it.
It starts with the funeral of the President who may or
may not have been a robot.
It is attended by several /alien races/.
Which, of course, means it had /nothing whatsoever/ to do
with Asimov's universe.
But what else can you expect from Ellison?
The 2004 movie I, Robot, starring Will Smith, was based
on an unrelated script by Jeff Vintar titled Hardwired,
with Asimov's ideas incorporated later after the rights
to Asimov's title were acquired."
https://en.wikipedia.org/wiki/Isaac_Asimov
My point isn't that it wasn't. My point is that that
/isn't relevant/. What matters is what the film /is/, not
how it was made.
But what the film is, is "not an adaptation of either I
Robot".
But both books are fix-ups, anyway. If I were briefed to
adapt either one to the screen, my first decision would be to
make it a TV miniseries. (Same thing for the "Silmarillion"
or the "Lensmen".)
A miniseries, hopefully, that reproduces each of the short
stories, and does so without "updating" them to match their
creators' idea of what they should have been.
Now, /that/ might be worth watching!
Yeah, that'll happen. And Biden will choose Trump as his
running mate in 2024, too.
Well, Biden /is/ a politician.
Biden *was* a politician, before the dementia set in. Now he's a
sock puppet.
Ah, projection.
Trump was (is?) Putin's sock-puppet.
Whatever it takes to win.
And his puppeteers already know, for *certain*, that they can't
control Trump, so no, not what it takes to win. What it takes to
control.
Oh, I don't know.
Putin seems to have handled him quite well.
I suspect you just have to have enough dirt on him.
Like any other petty criminal.
Biden's dementia is well documented, and getting worse.
https://en.wikipedia.org/wiki/Trump_derangement_syndrome
On 10/17/21 1:32 PM, Ninapenda Jibini wrote:
Paul S Person <psperson1@ix.netcom.invalid> wrote in
news:ujiomgt5r8vq226klf9ho40im864rm5fs1@4ax.com:
On Sat, 16 Oct 2021 18:08:33 GMT, Ninapenda Jibini
<taustinca@gmail.com> wrote:
Paul S Person <psperson1@ix.netcom.invalid> wrote in
news:6bslmg1sek8mkehegsjjf827j417285riu@4ax.com:
On Fri, 15 Oct 2021 09:12:09 -0700, Jibini Kula Tumbili
Kujisalimisha <taustinca@gmail.com> wrote:
Paul S Person <psperson1@ix.netcom.invalid> wrote in
news:109jmgtc8eqjohj8i01vlmhk47o7k3fkvi@4ax.com:
[...]
Biden's dementia is well documented, and getting worse.
https://en.wikipedia.org/wiki/Trump_derangement_syndrome
That article (last updated in July) has no reference whatever to
President Biden, apart from a simple pro-forma note that Trump
came between Obama and Biden.
John W Kennedy <john.w.kennedy@gmail.com> wrote in news:HK6dnVTByaYhJfD8nZ2dnUU7-WfNnZ2d@giganews.com:
On 10/17/21 1:32 PM, Ninapenda Jibini wrote:
[...]
Biden's dementia is well documented, and getting worse.
https://en.wikipedia.org/wiki/Trump_derangement_syndrome
That article (last updated in July) has no reference whatever to
President Biden, apart from a simple pro-forma note that Trump
came between Obama and Biden.
The subject at hand, still quoted above, was a joke about a
miniseries based on the Silmarillion being faithful to the book
being as likely as Biden choosing Trump as his running mate in
2024. That *immediately* degenerated into Paul's hallucinations
about Trump.
Try to pay attention, or have a grown up explain it to you.
On Sun, 17 Oct 2021 20:02:00 -0700, Alan <nope@nope.com> wrote:
On 2021-10-17 10:32 a.m., Ninapenda Jibini wrote:
[...]
Biden's dementia is well documented, and getting worse.
https://en.wikipedia.org/wiki/Trump_derangement_syndrome
How does a Wikipedia article about something that doesn't exist
document anything about Biden?
I suspect Trump's derangement is all too real.
IOW, just another example of projection.