Case Studies in Autonomous Vehicles, Part 3: Will Autonomous Vehicles Be Able to Handle All Driving Conditions?


Parts 1 and 2 of this series explored the practical challenges of adopting shared-use autonomous vehicles from a human-behavior standpoint. In this final piece, I explore the issues AVs might face in the most challenging driving conditions.

Autonomous vehicle technology is generally evaluated against a spectrum of six levels, ranging from Level 0, a completely human-controlled vehicle, to Level 5, which is fully autonomous with no human intervention save perhaps for directions. Development has generally focused on ultimately producing truly “driverless” Level 5 vehicles. For fully autonomous vehicles to succeed at this level, they must be able to perform in all driving conditions at all times. Otherwise, we risk such vehicles getting stranded, crashing, or stopping without a human driver able to course-correct.

Even if such situations occur only once in 100,000 miles, they could cause serious problems, including stranded traffic and passengers and threats to passenger safety. Many individuals will also feel “range anxiety” about riding in vehicles that can only go a certain distance or drive in certain conditions. Many of the trickiest driving situations, what I call “The Other 1 Percent,” require good human judgment, intuition, and non-verbal communication. Indeed, the industry acknowledges that this last, most challenging climb has not been easy, and it is not clear how autonomous vehicles will perform in these situations.

What situations will driverless vehicles face and how can we design the vehicles so that they can handle such situations?

Here are some examples to consider:

Two autonomous vehicles meet on a one-lane road

Two autonomous vehicles are driving toward each other on a one-lane road. When they meet, there is no room on the shoulder for either to yield, so they come to a stop. How will they extract themselves from the situation? Will one of them be programmed to back up to a point where the other can pass? Will the passenger be required to intervene and give a voice command? Will the vehicles communicate with each other, or will they act independently? This situation seems like a particularly tricky one that has traditionally been handled with human judgment: one of the drivers knows to yield, or to back up to a spot where the other car can pass. Will an autonomous vehicle know to do the same?
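One conceivable resolution, if the two vehicles can exchange state, is a deterministic rule that both cars evaluate independently and that always picks the same yielder. The sketch below is purely illustrative; the vehicle fields, the "shorter reverse to a pull-out yields" rule, and the ID tie-break are my assumptions, not any real V2V protocol.

```python
from dataclasses import dataclass

@dataclass
class Vehicle:
    vehicle_id: str
    position_m: float              # position along the road, in meters
    heading: int                   # +1 or -1 along the road axis
    passing_spots_m: list[float]   # known pull-outs wide enough to pass at

def reverse_distance(v: Vehicle) -> float:
    """Distance back to the nearest pull-out behind the vehicle."""
    # Spots "behind" the vehicle lie opposite its heading.
    if v.heading > 0:
        behind = [s for s in v.passing_spots_m if s <= v.position_m]
    else:
        behind = [s for s in v.passing_spots_m if s >= v.position_m]
    if not behind:
        return float("inf")
    nearest = max(behind) if v.heading > 0 else min(behind)
    return abs(v.position_m - nearest)

def choose_yielder(a: Vehicle, b: Vehicle) -> Vehicle:
    """Both cars run this identically and reach the same answer:
    the vehicle with the shorter reverse to a pull-out backs up;
    ties are broken by vehicle ID so the rule never deadlocks."""
    da, db = reverse_distance(a), reverse_distance(b)
    if da != db:
        return a if da < db else b
    return a if a.vehicle_id < b.vehicle_id else b
```

The point of the tie-break is exactly the property the scenario lacks: unlike two human drivers waving at each other, both machines must converge on the same decision without negotiation.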

Emergency vehicle blocking the road

An emergency vehicle such as a police car is blocking one side of the roadway due to an incident such as a crash or medical emergency. In such situations, vehicles on the blocked side of the road will often approach the incident and proceed around the emergency vehicle, temporarily entering the wrong side of the road as long as it is safe, and then returning to the right side of the road to continue their journey. What will autonomous vehicles do in such situations? Will they be programmed to enter the wrong side of the road when it is safe? How will they know whether it is safe? Or will they wait indefinitely for the emergency vehicle to leave?


Riots and civil unrest
An autonomous vehicle is driving several passengers to a reception. However, it approaches an unexpected riot, with people throwing rocks and bottles in the street, making it dangerous to proceed. In such situations, human drivers know either (1) to recognize the danger and avoid the area, or (2) if they do approach closely for whatever reason, to turn around and get away. What will an autonomous vehicle do? Will it be able to recognize such a dynamic and evolving situation? How? And what will such vehicles be programmed to do? In a dynamic, unstable situation in which human intuition is essential, a driverless vehicle may not know what to do.

Power outages

Due to a heat wave, the power goes out at traffic signals throughout a neighborhood. Traditionally, human drivers know to treat each darkened intersection as a four-way stop. Traffic officers are sometimes needed, along with human judgment and communication, to keep things moving safely, and even then these situations often result in long backups. Will autonomous vehicles know how to navigate streets that are without power? Will they be able to do so in a mixed environment, where human-driven vehicles are also negotiating such intersections?
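The four-way-stop convention human drivers fall back on is, at its core, a first-arrived, first-to-go queue. The toy model below sketches only that convention, under my own assumptions; a real AV planner would also have to handle simultaneous arrivals (yield to the right), pedestrians, and human drivers who ignore the rule.

```python
from collections import deque

class DarkSignalIntersection:
    """Toy model of a dark (unpowered) signal treated as an all-way stop."""

    def __init__(self):
        self.queue = deque()  # vehicle IDs in order of arrival at the stop line

    def arrive(self, vehicle_id: str) -> None:
        self.queue.append(vehicle_id)

    def next_to_proceed(self):
        # First to arrive goes first; returns None when the queue is empty.
        return self.queue.popleft() if self.queue else None
```

Even this trivial rule assumes every vehicle observes arrival order the same way, which is precisely what gets hard in mixed traffic with impatient human drivers.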

Natural disasters

A local river overflows its banks and suddenly begins inundating local roads. A human driver may be aware that certain streets tend to flood during heavy rains and will take an alternative route. Will autonomous vehicles have the same information and make the same decision? And will an autonomous vehicle be able to distinguish a typical rainy roadway from a flooded one that is too deep to safely traverse?

Inaccurate Mapping/Directions

Mapping apps such as Google Maps and Waze sometimes give inaccurate or unhelpful driving directions (e.g., directing you onto a street that is now closed, or telling you to make a left turn at a busy, uncontrolled intersection). In such situations, human drivers improvise: they immediately turn around, take a right turn instead of a left, or perform some other time-saving maneuver. What will autonomous vehicles be programmed to do? Will they stubbornly attempt to proceed to a street that is closed? These decisions could have significant consequences: if shared-use vehicles get stuck because of incorrect mapping and cannot quickly adjust, pickups and drop-offs will take longer, making the sharing of such vehicles less attractive to passengers.
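The improvisation a human performs here amounts to a fallback search: discard any route through a known-blocked street and take the best remaining option. A minimal sketch, with invented street names and the simplifying assumption that candidate routes are already ranked fastest-first:

```python
def pick_route(routes, closed_streets):
    """Return the first candidate route that avoids every closed street.

    `routes` is a list of routes ordered fastest to slowest, each a list
    of street names. Returns None if no candidate is viable, in which
    case the vehicle would need to stop safely and ask for help.
    """
    closed = set(closed_streets)
    for route in routes:
        if not closed.intersection(route):
            return route
    return None

routes = [
    ["Main St", "1st Ave", "Oak St"],  # fastest on the map
    ["Main St", "2nd Ave", "Oak St"],  # slower detour
]
```

The hard part, of course, is not this lookup but populating `closed_streets` in real time from sensors and live map data, which is exactly where today's mapping apps fall short.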

These and other challenging situations are complicated by the fact that driverless cars will likely be in a “mixed use” environment for many years where human-driven and fully autonomous vehicles will share the road.


The “one percent” of driving situations raises an interesting question: what if autonomous vehicles are never able to achieve truly driverless status, given the rare but real need for human judgment in some situations? This could have serious implications for various projections of the future, including the continued need for driver’s licenses, how fleets are owned and managed, and the design of vehicles and the urban landscape. For example, if cars still need some level of human attention far into the future, that might favor continued private car ownership.

These scenarios may also influence the design and progress of autonomous vehicles. For example, can we be sure that a steering wheel will never be needed in an autonomous vehicle?

For autonomous vehicles to reach their full potential, they need to operate in all environments and at all times, so that passengers know they can go anywhere and so that snafus on the roads are minimized. It appears this challenge will take considerable time to address.

Blair Schlecter is based in Los Angeles and writes about transportation policy and innovation. He can be reached at

Photo: Dllu (Own work) [CC BY-SA 4.0], via Wikimedia Commons



driving in remote areas

I'd add driving down a dirt track in a 4wd with a sheer drop off to one side. Are you going to trust that your map data is all updated?

I can envision a time when there might be road lanes where cars are required to be set on autopilot, and perhaps long-haul trucking as well, to the extent the vehicles remain on freeways (though some logging trucks will be going down those remote roads before hitting blacktop). But so long as there are also old manually controlled cars on the road, I don't see vehicles ever not having a manual override. I still see an old Model T around my neighborhood on some nice weekends, and there's no shortage of 40- to 50-year-old pickups.

One lane road?

What happens when two autonomous cars meet on a one-lane road?

Easy. The passengers will get out and trade vehicles, which will then go back the way they came.

More seriously, AI is such that cars can all learn from each other. None of the circumstances discussed by Mr. Schlecter require special programming. Yes, I agree that cars will need steering wheels and an override button for a few years after they become commonplace, but I think the need for that will soon disappear.

More serious yet is the privacy issue. To use any vehicle a person will need a credit card, and then their every move will be recorded in the cloud for all eternity. No more sneaking off away from the prying eyes of government.

Daniel Jelski