I spent several years working at Google during the early development and testing of its autonomous cars. It was a running joke among some employees that we wanted a warning whenever any of the fleet went out for testing so we could stay off the road. All kidding aside, the track record for Google's project has been excellent so far, but it can no longer be called spotless: the first accident attributable to the car itself (and not another driver) has finally occurred. Part of the issue, especially back then, was that the cars were far too passive in how they drove. Driving behind one of them was like driving behind someone's grandma. The car's software has since been tweaked to bring out a little more assertiveness, to better approximate a real-world driver.
Google has prided itself on the fact that its self-driving car fleet has never been responsible for any of its crashes — they’ve always been caused by another (decidedly more human) force — but that may have just changed. According to a California DMV filing first reported by writer Mark Harris, one of Google’s self-driving Lexus SUVs drove into the side of a bus at low speed.
Here’s the full description of the incident from the report:
A Google Lexus-model autonomous vehicle (“Google AV”) was traveling in autonomous mode eastbound on El Camino Real in Mountain View in the far right-hand lane approaching the Castro St. intersection. As the Google AV approached the intersection, it signaled its intent to make a right turn on red onto Castro St. The Google AV then moved to the right-hand side of the lane to pass traffic in the same lane that was stopped at the intersection and proceeding straight. However, the Google AV had to come to a stop and go around sandbags positioned around a storm drain that were blocking its path. When the light turned green, traffic in the lane continued past the Google AV. After a few cars had passed, the Google AV began to proceed back into the center of the lane to pass the sandbags. A public transit bus was approaching from behind. The Google AV test driver saw the bus approaching in the left side mirror but believed the bus would stop or slow to allow the Google AV to continue. Approximately three seconds later, as the Google AV was reentering the center of the lane it made contact with the side of the bus. The Google AV was operating in autonomous mode and traveling at less than 2 mph, and the bus was traveling at about 15 mph at the time of contact.
The Google AV sustained body damage to the left front fender, the left front wheel, and one of its driver’s-side sensors. There were no injuries reported at the scene.
The Verge has obtained an excerpt from Google’s next monthly self-driving report, which is due to be released tomorrow. In it, Google says the car assumed that the bus would yield when it attempted to merge back into traffic:
Our self-driving cars spend a lot of time on El Camino Real, a wide boulevard of three lanes in each direction that runs through Google’s hometown of Mountain View and up the peninsula along San Francisco Bay. With hundreds of sets of traffic lights and hundreds more intersections, this busy and historic artery has helped us learn a lot over the years. And on Valentine’s Day we ran into a tricky set of circumstances on El Camino that’s helped us improve an important skill for navigating similar roads.
El Camino has quite a few right-hand lanes wide enough to allow two lines of traffic. Most of the time it makes sense to drive in the middle of a lane. But when you’re teeing up a right-hand turn in a lane wide enough to handle two streams of traffic, annoyed traffic stacks up behind you. So several weeks ago we began giving the self-driving car the capabilities it needs to do what human drivers do: hug the rightmost side of the lane. This is the social norm because a turning vehicle often has to pause and wait for pedestrians; hugging the curb allows other drivers to continue on their way by passing on the left. It’s vital for us to develop advanced skills that respect not just the letter of the traffic code but the spirit of the road.
On February 14, our vehicle was driving autonomously and had pulled toward the right-hand curb to prepare for a right turn. It then detected sandbags near a storm drain blocking its path, so it needed to come to a stop. After waiting for some other vehicles to pass, our vehicle, still in autonomous mode, began angling back toward the center of the lane at around 2 mph – and made contact with the side of a passing bus traveling at 15 mph. Our car had detected the approaching bus, but predicted that it would yield to us because we were ahead of it. (You can read the details below in the report we submitted to the CA DMV.)
Our test driver, who had been watching the bus in the mirror, also expected the bus to slow or stop. And we can imagine the bus driver assumed we were going to stay put. Unfortunately, all these assumptions led us to the same spot in the lane at the same time. This type of misunderstanding happens between human drivers on the road every day.
This is a classic example of the negotiation that’s a normal part of driving – we’re all trying to predict each other’s movements. In this case, we clearly bear some responsibility, because if our car hadn’t moved there wouldn’t have been a collision. That said, our test driver believed the bus was going to slow or stop to allow us to merge into the traffic, and that there would be sufficient space to do that.
We’ve now reviewed this incident (and thousands of variations on it) in our simulator in detail and made refinements to our software. From now on, our cars will more deeply understand that buses (and other large vehicles) are less likely to yield to us than other types of vehicles, and we hope to handle situations like this more gracefully in the future.
And in fairness, Google is right: as long as not every car on the road is autonomous, there is some degree of negotiation involved, and false assumptions in those negotiations are where crashes can happen. We're many, many years away from a road free of human drivers, and until then, self-driving cars are occasionally going to hit things. It just so happens that this is the first crash directly attributable to the car rather than another driver on the road, and it comes down to a bit of (surprisingly human) bad judgment.
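Google's described fix — teaching the car that buses and other large vehicles are less likely to yield — can be pictured as a prediction model whose priors depend on vehicle type. The sketch below is purely illustrative and assumes nothing about Google's actual software; every name, probability, and threshold here is a made-up stand-in for what is surely a far more sophisticated system.

```python
# Hypothetical sketch (NOT Google's code): a merge decision that weights
# the estimated chance another vehicle yields by its type. All values
# and names here are illustrative assumptions, not real parameters.

YIELD_PRIOR = {
    "car": 0.80,
    "bus": 0.40,    # large vehicles assumed less likely to yield
    "truck": 0.45,
}

def should_merge(vehicle_type: str, gap_seconds: float,
                 threshold: float = 0.7) -> bool:
    """Merge only if the estimated yield probability clears a threshold.

    A larger time gap to the approaching vehicle raises confidence;
    a bus starts from a lower prior than a passenger car.
    """
    prior = YIELD_PRIOR.get(vehicle_type, 0.6)
    # Simple confidence bump for a bigger gap, capped at certainty.
    estimate = min(1.0, prior + 0.05 * gap_seconds)
    return estimate >= threshold

# With a bus roughly three seconds away, the low prior keeps the car
# from merging, while a regular car at the same gap would clear it.
print(should_merge("bus", 3.0))   # False
print(should_merge("car", 3.0))   # True
```

The point of the toy model matches Google's stated lesson: the bug was not in perception (the car saw the bus) but in the prior it assigned to the bus yielding.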
source: theverge.com, Google