For those of us who live in areas with serious snowy winter conditions, you can't just hand-wave that requirement away (as we in industry do so often while developing software)! Winter driving is exactly when driver assistance systems should be helping drivers the most, especially young drivers. Tunnel vision while driving in snow is very real where I live, and there have been plenty of times when I had to drive home on roads with barely visible boundaries; you figure out where the road is from the ditch and the sound your tires make when they hit the edge of the pavement, combined with a copious reduction in speed to give yourself time to recover.

At this point it does not feel safe to trust self-driving cars or assistive systems designed, built, and tested primarily in California or the southern US, for one simple reason: they do not see the range of adverse weather conditions that drivers in the rest of the world have to deal with and adapt to safely on a regular basis. It's easy to make a self-driving car that "works" on California-style freeways, which are almost never under construction because they don't wear out as fast. In places like eastern Ontario we sometimes deal with temperature swings from -30C to +10C in 24 hours, we salt our roads heavily all winter, and we see a much wider range of typical weather. All of this takes a significant toll on road infrastructure, and it means that what are rare corner cases in California are regular events elsewhere. We have two seasons where I live: construction season and winter. Based on several published reports of self-driving cars hitting parked emergency vehicles or getting confused by lanes in construction zones, I simply do not trust that the currently available "self driving" vehicles are provably safe outside the near-ideal conditions found in California.
At least Waymo seems to be quite hesitant about rolling out to cities that have less favourable weather.
What I would like to see is for regulatory bodies with a safety-first approach to accidents (similar to how the NTSB investigates crashes and the FAA regulates commercial aviation) to be involved in setting the criteria for the design, testing, and regulation of self-driving cars and driver assistance systems. Reading reports and watching shows about the root-cause analysis of airplane crashes is fascinating, and it shows just how hard it is to learn how to make large, complex real-world systems safe. It has taken plenty of deaths to get us to the point where commercial flights are safer than the trip to the airport, and it will take many more deaths before self-driving cars are appreciably better than humans. Some of the most important lessons from aviation are about the interaction between pilot(s), crew, and automation, and how those systems fail.
Test cases and data for self-driving cars should be shared and made public. If we're trusting our lives to a piece of software, we should be able to see how well it does across standard test cases that the industry has encountered and developed, and be able to help add more. Capitalism does many things well, but making things safe for humans is not one of them.
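To make that idea a bit more concrete, here is a minimal sketch in Python of what one entry in a shared, machine-readable scenario library could look like, plus a crude coverage check for the kind of winter conditions described above. Every field name and value here is hypothetical; this is not any real regulator's or manufacturer's format, just an illustration of the sort of thing that could be published and extended by the public.

```python
from dataclasses import dataclass
from enum import Enum


class Surface(Enum):
    DRY = "dry"
    WET = "wet"
    SNOW_COVERED = "snow_covered"
    ICE = "ice"


@dataclass
class ScenarioCase:
    """One entry in a hypothetical shared scenario library."""
    case_id: str
    description: str
    surface: Surface
    visibility_m: float        # forward visibility in metres
    temperature_c: float
    lane_markings_visible: bool
    construction_zone: bool
    expected_behaviour: str    # what the system is required to do


# Example: the winter case described above, where lane boundaries
# are effectively invisible under blowing snow.
whiteout = ScenarioCase(
    case_id="winter-whiteout-001",
    description="Rural two-lane road, blowing snow, lane boundaries not visible",
    surface=Surface.SNOW_COVERED,
    visibility_m=50.0,
    temperature_c=-25.0,
    lane_markings_visible=False,
    construction_zone=False,
    expected_behaviour="slow to a safe speed and alert the driver; do not guess lane position",
)


def covers_winter(cases: list[ScenarioCase]) -> bool:
    """Crude coverage check: does the suite include any low-visibility,
    snow-covered case without visible lane markings?"""
    return any(
        c.surface is Surface.SNOW_COVERED
        and c.visibility_m < 100
        and not c.lane_markings_visible
        for c in cases
    )


print(covers_winter([whiteout]))  # True
```

The point is not this particular schema; it's that if scenarios like these were published in a common format, regulators, manufacturers, and the public could all run the same cases and see where a given system falls short.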