On Oct 2 in San Francisco, a pedestrian on the street was tragically struck by a human-driven vehicle. The unfortunate pedestrian was flung into the path of a Cruise robotaxi in the lane to the right. The robotaxi stopped hard, but the pedestrian ended up under its wheels. Emergency crews had to free the pedestrian, who was taken to hospital with severe injuries. The human driver reportedly fled the scene.
The pedestrian was in critical condition as of Tuesday. The hit-and-run driver has not been reported apprehended.
Both vehicles had been stopped at a red light and were moving forward when it turned green. The pedestrian was reportedly crossing in the opposite crosswalk from right to left, after the light had changed (and against a “don’t walk” sign, sometimes called “jaywalking”), and was hit by the driver of a sedan in the left lane. The situation was precarious, with traffic moving in both directions on the new green, and not one in which a typical pedestrian would cross, based on the video recorded by the Cruise vehicle. She was thrown back into the right lane and struck by the Cruise, which came to a stop with the pedestrian’s leg or legs under the rear axle.
I was given the opportunity to watch the video by Cruise. The pedestrian crossed at a very unwise time, while the street was full of traffic. She probably expected vehicles to stop for her. Indeed, the display of the Cruise perception system shows it detected her and would presumably have stopped had she been in the robotaxi’s lane of travel, but she moved into the left lane and was attempting to stand on the median line when struck.
Based on this information, the Cruise vehicle (which had nobody on board) stopped abruptly, probably as abruptly as it could. With somebody under the car, it would be risky to attempt to move the vehicle without a full understanding of the situation, as this could cause more injury. On the other hand, the vehicle should be able to know that something—somebody—is under the rear wheels due to the tilt of the vehicle, and know that it would be good to get off of them, but it’s a very challenging decision to make. A human driver might get out of the vehicle to assess whether moving forward or backwards was the fastest way off the victim, but it’s unclear if there are any hard-and-fast rules on whether such moves—or remaining in place—are better or worse.
In this case, emergency crews that arrived on the scene instructed Cruise not to move the vehicle, and they used the famous “jaws of life” to extract the victim. Nonetheless, this meant a long delay before she could be freed and taken to hospital, and witnesses report she showed signs of great pain.
In a strange irony, just four days ago I brought up this nightmare question of a vehicle being aware or unaware of what it had done in an online thread. Robocars do not have the same awareness of their world that humans do. In many ways it is superior, but there are ways in which it is not, and so it is possible for a vehicle to hit somebody or something without full awareness of it, and also possible to drag a victim without awareness. Human drivers are sometimes also unaware, and many drivers claim they left the scene because they did not know they had hit anything.
It may well be that this happens to robocars less than to human drivers. There is little evidence either way, as such events are extremely rare. On the other hand, they will play an outsized role in affecting public impressions of the technology, so teams are motivated to go above and beyond.
It’s not clear what to do, though. Sensors could be placed on the bottoms of cars, but in the case of cameras and lidars, it’s almost impossible to keep those clean. Ultrasonic sensors are cheap and more robust against dirt, but they only reveal a limited amount. As noted, the car should already be able to tell from its tilt if any of its wheels are on something other than flat road surface, but that may not reveal what to do.
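To make the tilt idea concrete, here is a minimal sketch of how a vehicle might flag that a wheel is resting on something other than flat road, using body attitude from an IMU and per-corner suspension ride heights. The function name, inputs, and thresholds are all invented for illustration; real values would come from vehicle calibration, not from anything Cruise has described.

```python
# Hypothetical thresholds; real values would come from vehicle calibration.
TILT_THRESHOLD_DEG = 1.5      # pitch/roll beyond normal road crown
SUSPENSION_DELTA_M = 0.04     # one corner raised/compressed vs. the others

def wheel_obstruction_suspected(pitch_deg, roll_deg, corner_heights_m):
    """Flag that a wheel may be resting on something other than flat road.

    pitch_deg / roll_deg: body attitude from the IMU, relative to level.
    corner_heights_m: suspension ride height at each of the four corners.
    """
    # Whole-body tilt beyond what road crown explains is suspicious.
    if abs(pitch_deg) > TILT_THRESHOLD_DEG or abs(roll_deg) > TILT_THRESHOLD_DEG:
        return True
    # A single corner far from the mean suggests one wheel is raised.
    mean_h = sum(corner_heights_m) / 4
    return any(abs(h - mean_h) > SUSPENSION_DELTA_M for h in corner_heights_m)
```

Even a crude detector like this could only say that something is wrong, not what is under the car or which way to move, which is the harder problem discussed above.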
One futuristic approach would be to have the car able to deploy a drone, under the direction of human remote operators. Some prototypes have been built where a drone can deploy from a vehicle. Such a drone could inspect anywhere—under the car or to the side of it. It could also go above to get the big picture, which could be useful in getting out of many sticky situations. In an empty vehicle, the drone could just fly back in the window to be remounted later, as long as use is rare. If rare enough, the drone could even be disposable, a not unreasonable approach in life-or-death situations.
In the city, though, a drone could be dispatched from a depot and get anywhere in the city in minutes. This requires a drone licensed to fly beyond line of sight, but this is worth doing for emergencies—not just for robocars. And it’s not out of the question that, because in the city there will always be passers-by with phones, you could ask one to join a video call where a doctor is watching and can direct the phone to let the doctor see and talk to the patient and make a decision.
One reader suggested cars might come with built-in jacks. Such jacks could lift a car off somebody (as long as it is known they are not right above the victim) and also be used for faster maintenance in the field, or even when stuck. These exist but would add cost.
There has also been discussion of placing airbags on the front of vehicles. Robocars, with their advanced sensors, can trigger an airbag in advance of an impact, while traditional airbags are only triggered after one. Nuro, the delivery company, has placed an airbag on the front of their vehicle to soften impacts with a pedestrian. Autoliv and Volvo made an airbag for the window pillars to protect a cyclist hit by a car, which could work because the impact of the cyclist’s legs/bicycle could trigger the airbag before their head reached the windshield pillars. This would not help with a person going under a car, though a bag going all the way down could cause a victim to be pushed rather than run over—crash tests would be needed to figure out if there is a safe course.
Single-person robotaxis, which are the right size for 80% of trips, will be better able to avoid hitting a pedestrian, and also have lower mass.
Predicting the problem
Robocars are constantly scanning all other parties on the road and making predictions about where they will go next. In particular, they alert when there is a decent chance that somebody or something on the road will intersect the planned path of the robocar. For the Cruise, that only became true when the pedestrian was thrown into its path, and it hit the brakes hard as soon as that was seen to be the case.
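The intersection check described above can be illustrated with a toy version: project the pedestrian forward under a constant-velocity assumption and test whether she enters a corridor around the robotaxi’s planned path. All names, the corridor width, and the time horizon here are illustrative assumptions, not Cruise’s actual system.

```python
def predict_conflict(ped_pos, ped_vel, path_points,
                     corridor_half_width=1.0, horizon_s=3.0, dt=0.1):
    """Return True if the pedestrian's extrapolated track comes within
    corridor_half_width metres of any planned path point inside horizon_s.

    ped_pos / ped_vel: (x, y) position in metres and velocity in m/s.
    path_points: sampled (x, y) points along the vehicle's planned path.
    """
    steps = int(horizon_s / dt)
    for i in range(steps + 1):
        t = i * dt
        # Constant-velocity extrapolation of the pedestrian.
        px = ped_pos[0] + ped_vel[0] * t
        py = ped_pos[1] + ped_vel[1] * t
        for (qx, qy) in path_points:
            # Compare squared distances to avoid a square root.
            if (px - qx) ** 2 + (py - qy) ** 2 <= corridor_half_width ** 2:
                return True
    return False
```

A real predictor uses learned, multi-hypothesis models rather than straight-line extrapolation, which is exactly why the sudden, unpredicted event here was so hard.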
One potential option would be to get better at predicting the bigger situation. Unfortunately, while jaywalking is technically illegal, it is also very common and in fact plays a role in making cities more pleasant. It is often said that streets are for people, not cars, and it’s true—but the cars are generally full of people, who want good traffic flow when they are in a car and easy street crossing when they are not, and must decide how to trade those off.
As such, it’s not easy to suggest a vehicle should proactively stop on detecting a pedestrian who is not going to intersect its path. Instead, pedestrians and drivers usually do a dance on lightly trafficked roads to keep everybody moving. Here, one wishes to detect that the other car will hit the pedestrian. Detecting it early requires knowing that the other driver is the sort who might run down a pedestrian in the street, or expecting that the pedestrian might be erratic, as can happen. Cars can’t brake every time they see a pedestrian who might be involved in a rare situation.
On the other hand, in this specific situation—no traffic behind and no passengers in the car—the car could have slowed down well in advance because of the pedestrian on the road with no impediment to anybody. That might be a good heuristic but does not solve the general problem.
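That heuristic can be written down in a few lines: slow preemptively for a pedestrian in the roadway only when doing so costs nobody anything, i.e. no traffic behind and no passengers aboard. The function and its inputs are invented for illustration; real planners weigh many more factors.

```python
def should_preemptively_slow(ped_in_roadway, ped_in_own_lane,
                             traffic_behind, passengers_aboard):
    """Decide whether to slow for a pedestrian outside our planned path.

    A pedestrian already in our lane is handled by the normal planner,
    so return True immediately in that case.
    """
    if ped_in_own_lane:
        return True
    # Outside our lane: slow only when it inconveniences nobody.
    return ped_in_roadway and not traffic_behind and not passengers_aboard
```

As the text notes, this covers only the easy case; with traffic behind or passengers aboard, the trade-off between caution and flow returns.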
Even doing this is not without issues. In the dance of pedestrians and drivers, pedestrians often prefer to cross the road behind a car, not in front of it. We will advance in the street but wait for the car to pass—and get annoyed when it slows so we can pass in front of it, as we don’t trust the driver, and passing behind is 100% safe if no other cars are coming.
This is an unusual, but hardly unheard of situation where a pedestrian is doing a dangerous dance with traffic. Drivers don’t stop every time this happens but it could make sense to try to measure how unusual or risky the situation looks, and slow in the worst cases, even if slowing for all of them would impede traffic too much.
In particular, the Cruise prediction engine, which tries to assess where pedestrians will go, possibly should have considered that, with such a poor crossing situation, the pedestrian might reverse course and go back into the Cruise’s lane. She did not do this, but if the vehicle had assigned a certain chance to it, it might have considered slowing to be ready in that event. Almost all the time in a situation like this, however, the other car would stop for the pedestrian in front of it—this is required by law when possible—and so the predictor would not assign a significant probability to the event that actually took place.
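One way to picture hedging against a low-probability course reversal is to keep several candidate pedestrian trajectories, each with a probability, and slow whenever the summed probability of the hypotheses that conflict with the planned path crosses a threshold. The threshold and data structure below are invented for illustration, not a description of Cruise’s predictor.

```python
# Invented threshold; tuning it would require real-world data on how often
# such hypotheses materialize versus how much slowing impedes traffic.
REVERSAL_SLOW_THRESHOLD = 0.05

def slow_for_hypotheses(hypotheses):
    """Decide whether to slow given multi-hypothesis pedestrian predictions.

    hypotheses: list of (probability, conflicts_with_our_path) pairs,
    one per candidate trajectory.
    """
    conflict_prob = sum(p for p, conflicts in hypotheses if conflicts)
    return conflict_prob >= REVERSAL_SLOW_THRESHOLD
```

The difficulty described in the text shows up directly here: if the predictor assigns the reversal hypothesis a probability below the threshold, as it plausibly would have in this case, no slowing occurs.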
There’s not a lot of data on these sorts of situations, but we can expect that Cruise and other teams will put more situations of this sort into their simulators to see if they can develop approaches which always improve the situation. The way liability works, companies are afraid to adopt approaches which sometimes improve the situation but may make it worse, as it is only when you make it worse that it shows up in the courts or in the news.