My son got married last weekend, and my sister drove in from out of town for the wedding. She drove her new Subaru Outback. I was with her when she purchased it, and at the time I told her that she hadn't bought a car; she'd bought technology.
She grumbled and complained about the car's auto-driving feature the whole time. In the DC area, if you don't move with traffic you are jeopardizing your life, speed limits be damned. The car wanted to go 10 to 15 mph slower than traffic was moving because of its database of posted speed limits. She eventually turned it off.
Now researchers have foiled Tesla's auto-driving system. And it's not even a software hack!
I keep a Case B210 lawn tractor in my back pocket, assuming that one day these autonomous driving systems will hiccup from a central database meltdown and leave us stranded. I may not get to my destination quickly or with lots of creature comforts, but I will get there ... eventually.
Your post regarding autonomous cars, or partially autonomous cars, brings up some interesting ethical points. First, traffic rules are intended to keep everyone safe: safe speeds, signaling before changing lanes, stopping at signs, yielding appropriately at intersections, etc. Here is the rub, though: a moving violation is usually issued when a driver breaks those rules, but it is more likely to be enforced when the driver is creating a danger for others.
A clear-cut case of this is the violation a person can receive for driving with snow on their car that might fly off at speed. That behavior creates a danger for others. Speed violations (too fast or too slow) create a hazard for oneself but, more importantly, for others. Failing to yield at a stop sign is a violation even when there is no other traffic, because it can endanger others.
Now, on to autonomous cars. In the scenario you describe, if ALL cars were autonomous, they would all drive the speed limit and there would be no danger. In a case where a few cars follow the "proper rules" while others violate them, the manual drivers are legally the ones creating the hazard, but in practice it is the slower vehicle that creates the danger, even though that is not in line with the legal definition of a moving violation.
Now to your actual question, though: imagine a scenario where the car's software malfunctions and the car rolls through a stop light or a stop sign, or fails to yield. Even if the car's sensors prevent an accident, it is still a violation. Who should receive the ticket? The driver wasn't in control and likely could not have regained control by the time they realized the car was in violation. How about the manufacturer? The maker of the software? Will the car tell on itself and report the incident?
The ethics around human safety and the autonomous machines that interact with us is a difficult topic, and one that, unfortunately, the law is not quite ready for. All said, it will be interesting to watch some of these issues come to light in the courts, and eventually before lawmakers.
Autonomous driving is a really bad idea. I'm sure there are those wondering why these are options that currently have to be paid for. I, for one, will drive my '03 Yukon Denali gas guzzler into the ground before I upgrade to newer technology. The vehicle has 273K miles on it and is still going strong.
My sister's vehicle already has two recalls on it, both for software issues. Not mechanical! LOL