Are Google's self-driving cars as advanced as we think they are? And if they aren't, do you trust them to do the right thing?
MIT Technology Review has a news item, written by Lee Gomes, that puts Google's self-driving car into a new perspective.
We are all impressed, or should that be overly impressed, that the experimental cars have driven over 700,000 miles, with the only incidents happening when a human took over the wheel. The new, smaller "urban" version of the self-driving car seems to be close to production, and the cars can be seen driving around the Googleplex, again incident free, without causing concern. The fact that they only go at a maximum of 25 mph might give you a small pause for thought that perhaps the technology isn't safe enough to zip along at, say, 40 mph, but you can put that down to Google's cautious approach.
What the article points out, however, is that there are a great many situations that the car cannot cope with and this rules out going to perhaps 90% of the places you might want to go.
This video indicates how Google would like you to think about its car:
My guess is that, like me, you assumed that the driverless car was a "smart" or "very smart" car, but it seems that it really isn't.
According to MIT Technology Review, and Google hasn't contradicted the main points even though it has clarified some details in the article, the car needs the roads it drives on to be mapped at a level of detail that goes well beyond Google Maps. We have known for some time that the car makes use of maps, but the degree of its reliance on the quality of those maps is something that hasn't really been emphasized before.
It needs a special sensor vehicle to make multiple passes, and then manual and computer processing of the resulting data. It is claimed that while some mapping omissions can be handled by the car, it could still be confused by road changes. The map is used not only to allow the car to locate itself accurately, since GPS isn't enough, but also so that it can be aware of local features. If a stop sign isn't on the map, for example, the onboard sensors might still detect it, but if they miss it then basic obstacle avoidance is relied upon to bring the car to a halt if necessary. However, a complex junction complete with multiple traffic signals might confuse it.
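To see why a high-precision prior map helps, consider a toy version of map-based localization. GPS gives a rough fix; matching landmarks the sensors detect against their precisely surveyed map positions pins the car down much more tightly. This is only an illustrative sketch, not Google's system: the landmark coordinates are invented, and it sidesteps the data-association problem by assuming we already know which sighting matches which mapped landmark.

```python
# Toy map-based localization: landmark positions come from a prior
# survey; sensor sightings of those landmarks refine a rough GPS fix.
# All coordinates here are made up for illustration.

# Prior map: precisely surveyed landmark positions (x, y in meters).
mapped_landmarks = [(10.0, 5.0), (12.0, 20.0), (30.0, 8.0)]

def refine_position(sensed_offsets):
    """Average the car positions implied by each landmark sighting.

    sensed_offsets: (dx, dy) of each detected landmark relative to the
    car, paired by index with mapped_landmarks. A real system would have
    to work out which sighting corresponds to which landmark.
    """
    implied = [(lx - dx, ly - dy)
               for (lx, ly), (dx, dy) in zip(mapped_landmarks, sensed_offsets)]
    x = sum(p[0] for p in implied) / len(implied)
    y = sum(p[1] for p in implied) / len(implied)
    return (x, y)

# GPS says roughly (4, 3); the landmark sightings all imply the car is
# actually very close to (5, 3) - a meter off the GPS estimate.
gps_estimate = (4.0, 3.0)
offsets = [(5.1, 2.0), (6.9, 17.1), (25.0, 5.0)]
position = refine_position(offsets)
```

The point of the sketch is that the refined position is only as good as the surveyed map, which is why road changes that aren't reflected in the map are a problem.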
The fact that the autonomous driving software is map-based, and only sensor-assisted, means that the whole country would have to be accurately mapped, and currently only a few thousand miles have been surveyed. The idea is that once the map has been constructed the cars themselves would keep it up-to-date by sending in data on changes and anything new.
Then there is the problem of the weather. Currently the cars don't cope with rain or snow. In addition, they haven't tackled big complex parking lots or multi-level garages. While the cameras can detect the color of a traffic signal, the problem of what to do when the sun dazzles the camera hasn't been solved. At the moment there is no distinction between a detected pedestrian and a police officer trying to direct traffic. Finally, potholes are an undetected problem waiting to jolt the car.
Given that the car is map-based and the onboard software doesn't use machine learning, you have to ask: what can the system cope with?
At its simplest, driving a car is a feedback control problem: keep the vehicle on the road and avoid obstacles, both moving and stationary. As you add common situations you need computer vision to help the car deal with intersections and other traffic hazards. You also have to add algorithms to cope with the impact, hopefully not literally, on other road users, such as braking rate and pulling out into the path of other vehicles. When you move up a notch and start to consider some of the less common scenarios, such as road crews, diversions, traffic accidents and other hazards, you start to suspect that real intelligence is needed.
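The "feedback control" layer really is simple, and that is part of the point. A minimal sketch of lane keeping as a PID controller, with made-up gains and a simplified error model, shows how little intelligence is involved at this level: the controller just steers against the car's lateral drift from the lane center.

```python
# Lane keeping sketched as a PID feedback controller. The gains and the
# single cross-track-error input are illustrative, not anything from
# Google's actual software.

class LaneKeepingPID:
    def __init__(self, kp=0.5, ki=0.01, kd=0.2):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def steering(self, cross_track_error, dt=0.1):
        """Return a steering correction from the lateral error (meters).

        Positive error means the car has drifted right of the lane
        center, so the correction comes out negative (steer left).
        """
        self.integral += cross_track_error * dt
        derivative = (cross_track_error - self.prev_error) / dt
        self.prev_error = cross_track_error
        return -(self.kp * cross_track_error
                 + self.ki * self.integral
                 + self.kd * derivative)

pid = LaneKeepingPID()
correction = pid.steering(0.5)  # drifting 0.5 m right: steer left
```

Everything above this loop, such as recognizing a police officer waving you through a red light, is exactly the part that a map plus simple sensing doesn't give you.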
It is tempting to think that an automatic car is just a train without rails and all we have to do is provide software-based rails. Deeper consideration reveals that a car is out there in the real world and can encounter a rich and complex range of situations that might well need instantaneous decision-making to solve - and this goes well beyond what a map-based sensor system can deal with.
It might well be that it is easy to get 90% of the way to a self-driving car, but the final 10% may take more ingenuity than we have at the moment.