
Autonomous Buses and Other Autonomous Vehicles


BusHunter


  On 3/19/2018 at 6:16 PM, BusHunter said:

Existing human drivers, if they kill someone, are liable for a lawsuit, but a good attorney knows the big tuna is Uber. Uber's attorneys are probably better, though, due to their deep pockets.


It's probably the other way around, legally speaking. One would have to prove that a human driver was negligent, but in this case all one would have to prove is that Uber unleashed a dangerous instrumentality into the public. Not much different from the usual dynamite blasting damage case.

This one also seems distinct from the Tesla case, as the Tesla driver was still not supposed to be asleep at the wheel.


  On 3/19/2018 at 6:24 PM, Busjack said:

It's probably the other way around, legally speaking. One would have to prove that a human driver was negligent, but in this case all one would have to prove is that Uber unleashed a dangerous instrumentality into the public. Not much different from the usual dynamite blasting damage case.

This one also seems distinct from the Tesla case, as the Tesla driver was still not supposed to be asleep at the wheel.


Probably what is bad here is that the car is not manufactured to be autonomous but is made autonomous by a third party. Can't really go after the auto manufacturer here.


Putting this together with the comments on whether a bus can stop on a dime is this Tribune article reporting that the Uber had on-board cameras (of course), showing that the pedestrian suddenly stepped in front of the car, raising the question of whether either a human driver or the technology could have stopped in time. So maybe the argument that Uber unleashed a dangerous instrumentality is not so clear.


  On 3/22/2018 at 7:53 PM, Busjack said:

Putting this together with the comments on whether a bus can stop on a dime is this Tribune article reporting that the Uber had on-board cameras (of course), showing that the pedestrian suddenly stepped in front of the car, raising the question of whether either a human driver or the technology could have stopped in time. So maybe the argument that Uber unleashed a dangerous instrumentality is not so clear.


The article makes a great point when it says, to paraphrase, that it's possible the Uber system didn't detect or identify the eventual victim as a pedestrian or potential hazard. My question is: what is the purpose of the backup driver? Shouldn't he be alert enough to have noticed her? What steps can a backup driver take, if any, should a situation arise? I think a human driver might have avoided that tragedy. With it being at night, perhaps, perhaps not.


  On 3/24/2018 at 3:22 PM, artthouwill said:

The article makes a great point when it says, to paraphrase, that it's possible the Uber system didn't detect or identify the eventual victim as a pedestrian or potential hazard. My question is: what is the purpose of the backup driver? Shouldn't he be alert enough to have noticed her? What steps can a backup driver take, if any, should a situation arise? I think a human driver might have avoided that tragedy. With it being at night, perhaps, perhaps not.


While he's there, the implication I got was that the pedestrian stepped out so fast that nobody could have stopped. But that is for the investigators to determine.


  On 3/24/2018 at 3:22 PM, artthouwill said:

With it being at night, perhaps, perhaps not.


The video of the collision (embedded below) shows that the pedestrian did step out quickly. The time between the pedestrian coming within range of the headlights and the impact is not more than two seconds. To give a conservative estimate, a human could have reacted in 1 second then started braking to at least reduce the impact. But reaction time is usually longer than 1 second and often longer than 2 seconds, so it is definitely questionable whether an "average" driver could have prevented this. I would guess that one's reaction time would be slower if accustomed to the vehicle operating autonomously.

The driver appeared to be looking down at something immediately before the collision and did not look up until it was too late, but it is uncertain whether this made a difference in preventing the accident.

The pedestrian was not in a marked crosswalk, was wearing dark clothing, and had no visible lights or reflectors. I don't know whether she could have been visible to a driver earlier than the aforementioned two seconds, as the camera blurriness may have obscured her initially.

The radar and lidar systems should have at least been able to identify the presence of the pedestrian, but this is where the limitations of the technology are showcased. How does one program a vehicle to identify what a moving hazard is, predict how it will move, and respond appropriately? I'll have to wait to see what the investigators determine before I can comment further on this, specifically.
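
For what it's worth, here is a deliberately crude Python sketch of one small piece of that problem, a constant-velocity time-to-collision check; every name, threshold, and number in it is invented for illustration and bears no relation to Uber's actual software.

```python
# Hypothetical illustration only: a constant-velocity time-to-collision check.
# All names, thresholds, and the simplified 1-D geometry are invented; a real
# perception/planning stack is vastly more complex than this.

def time_to_collision(gap_m: float, closing_speed_mps: float) -> float:
    """Seconds until impact if neither party changes speed; infinite if not closing."""
    if closing_speed_mps <= 0:
        return float("inf")
    return gap_m / closing_speed_mps


def should_brake(gap_m: float, car_speed_mps: float, hazard_in_path: bool,
                 reaction_s: float = 0.5, decel_mps2: float = 7.0) -> bool:
    """Brake if an in-path hazard would be reached before the car could stop."""
    if not hazard_in_path:
        return False
    time_needed_to_stop = reaction_s + car_speed_mps / decel_mps2
    return time_to_collision(gap_m, car_speed_mps) <= time_needed_to_stop


# Example: hazard classified as in-path 30 m ahead, car doing ~40 mph (17.9 m/s)
print(should_brake(gap_m=30.0, car_speed_mps=17.9, hazard_in_path=True))  # True
```

Even in this toy version, everything hinges on the hazard_in_path flag, that is, on the system having already decided that the detected object is a person who will cross the car's path, which is exactly the step being questioned here.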

I don't know the exact speed the vehicle was traveling or the speed limit on this road. When driving at night, one should not drive at a speed at which one is unable to stop within the range of the headlights. It is highly uncertain whether this vehicle was traveling too fast for conditions, given the constraints about reaction time and stopping distance. This also raises the question of whether it can be theoretically safe for an autonomous vehicle to "outdrive the headlights" because it can supposedly detect hazards that would not be visible in darkness.
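
To put rough numbers on the reaction-time and stopping-distance point, here is a back-of-the-envelope calculation; the 40 mph speed, 1.5-second reaction time, and braking deceleration are assumptions picked for illustration, not figures from the investigation.

```python
# Back-of-the-envelope stopping distance; every input below is assumed, not measured.
MPH_TO_MPS = 0.44704

speed_mps = 40 * MPH_TO_MPS       # assumed travel speed, ~17.9 m/s
reaction_s = 1.5                  # assumed driver reaction time
decel_mps2 = 7.0                  # assumed hard braking on dry pavement

reaction_dist = speed_mps * reaction_s             # distance covered before braking starts
braking_dist = speed_mps ** 2 / (2 * decel_mps2)   # distance covered while braking
total_dist = reaction_dist + braking_dist

print(f"reaction {reaction_dist:.0f} m + braking {braking_dist:.0f} m = {total_dist:.0f} m")
# reaction 27 m + braking 23 m = 50 m (roughly 160 ft)
```

At that assumed speed the car covers almost 18 m every second, so two seconds of visibility leaves very little margin once reaction time is subtracted.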

 

 


  On 3/25/2018 at 2:25 AM, Pace831 said:

The video of the collision (embedded below) shows that the pedestrian did step out quickly. The time between the pedestrian coming within range of the headlights and the impact is not more than two seconds. To give a conservative estimate, a human could have reacted in 1 second then started braking to at least reduce the impact. But reaction time is usually longer than 1 second and often longer than 2 seconds, so it is definitely questionable whether an "average" driver could have prevented this. I would guess that one's reaction time would be slower if accustomed to the vehicle operating autonomously.

The driver appeared to be looking down at something immediately before the collision and did not look up until it was too late, but it is uncertain whether this made a difference in preventing the accident.

The pedestrian was not in a marked crosswalk, was wearing dark clothing, and had no visible lights or reflectors. I don't know whether she could have been visible to a driver earlier than the aforementioned two seconds, as the camera blurriness may have obscured her initially.

The radar and lidar systems should have at least been able to identify the presence of the pedestrian, but this is where the limitations of the technology are showcased. How does one program a vehicle to identify what a moving hazard is, predict how it will move, and respond appropriately? I'll have to wait to see what the investigators determine before I can comment further on this, specifically.

I don't know the exact speed the vehicle was traveling or the speed limit on this road. When driving at night, one should not drive at a speed at which one is unable to stop within the range of the headlights. It is highly uncertain whether this vehicle was traveling too fast for conditions, given the constraints about reaction time and stopping distance. This also raises the question of whether it can be theoretically safe for an autonomous vehicle to "outdrive the headlights" because it can supposedly detect hazards that would not be visible in darkness.

 

 


Based on the video, completely unavoidable. I don't know if any algorithms could've calculated that happening.


  On 3/25/2018 at 12:53 PM, Busjack said:

I don't know. She got to the middle of the car.


I watched it several times trying to see if she could've been detected in peripheral vision. I never saw her until the last second, and she was in the middle of the lane. I had initially expected a median, possibly with shrubbery or something that could obstruct the driver's or camera's view, but it looks like the car is in the right lane, which is more baffling. Somehow the biker misjudged the distance and speed of the Uber car.


  On 3/25/2018 at 10:51 PM, artthouwill said:

In this case they didn't.  There's a reason for having a backup driver.   Could a human who was actually driving or alert have seen her and possibly taken action?  


You have, for instance, Ford advertising that the woman who is asking Alexa for Starbucks has the car stop automatically when some other car goes by while she is backing out of the parking space. Hence, someone is advertising that you can trust the radar, even though maybe you shouldn't. The only difference is that in the ad the car is going backwards at 5 mph instead of forwards at 40.

Note all the disclaimers (including only a text disclaimer about pedestrian detection not working at night) in this Ford video:

 


  On 3/26/2018 at 8:09 PM, BusHunter said:

If anything, I would question the car's headlights. Why do we not see the bike and the person walking it until the last few seconds?


If there's any argument, it would be that lights are more for the car to be seen than to see, maybe thereby putting less fault on the pedestrian.


  On 4/2/2018 at 6:41 PM, Busjack said:

Another Tesla in autopilot mode hit a concrete barrier (in the middle of the Daily Herald article).


Interesting that the article  noted the "driver" didn't have  his hands on the  wheel for six seconds and failed to  take action to prevent the car from hitting the  concrete barrier.

So it seems to me the purpose of the "driver" in these vehicles is to monitor the vehicle's performance and  take action when necessary.  Apparently  some of these guys are sleeping (figuratively) at the wheel.

What was unclear to me is who got killed, the driver or the passenger?


  On 4/2/2018 at 7:17 PM, artthouwill said:

What was unclear to me is who got killed, the driver or the passenger?


Sources say the driver. This doesn't seem to be a test, but another Tesla driver being inattentive.

The other thing mentioned in this earlier article is that the crash attenuator (the metal structure at the point of the barrier) didn't stop the car, although what is not mentioned is why the radar or whatever didn't see the barrier, since it was not crossing the road.


  On 4/2/2018 at 7:59 PM, artthouwill said:

And to think pilotless planes could be 20 years away!!


Commercial aircraft are usually run in autopilot mode, although the pilot is still supposed to be alert. There was a Nova episode on an apparently inexplicable crash, and the explanation was that some sensors froze over and the systems set off alarms so quickly that nobody could do anything about it.


  On 4/2/2018 at 7:17 PM, artthouwill said:

So it seems to me the purpose of the "driver" in these vehicles is to monitor the vehicle's performance and  take action when necessary.  Apparently  some of these guys are sleeping (figuratively) at the wheel.


To add to what I said earlier about the pedestrian hit by the Uber car, someone's reaction time would be significantly longer if they are not actively "driving" immediately prior to the event. If they don't have hands on the wheel and/or foot on the pedals, add another second or so.
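
As a rough sense of scale for that extra second, again assuming 40 mph purely for illustration:

```python
# Extra distance covered during one additional second of reaction time (assumed 40 mph).
MPH_TO_MPS = 0.44704
speed_mps = 40 * MPH_TO_MPS      # ~17.9 m/s
extra_reaction_s = 1.0           # the "another second or so" above
print(f"{speed_mps * extra_reaction_s:.1f} m of extra travel")  # ~17.9 m, about 59 ft
```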

Another factor is one's perception of what (if anything) the car will do to prevent a crash, even if the "driver" is aware of the hazard.


  On 4/2/2018 at 8:26 PM, Pace831 said:

Another factor is one's perception of what (if anything) the car will do to prevent a crash, even if the "driver" is aware of the hazard.


That perception seems reinforced by, for instance, the Ford ad I mentioned. Too much fine print, and as the two Tesla accidents point out, the official guidance is not to rely on the autopilot, but the drivers are not getting that message.

