I don't want to forgo a stimulating intellectual discourse, but isn't it obvious?
Replace 'autonomous machine' with 'child' and it is clear that a parent is responsible. For autonomous machines, that role falls to the owners. Machines may be autonomous, but if a real person is liable, then I am sure there will be plenty of checks and tests done to prevent any disasters.
In fact, the accounting scandals of not so long ago may be a good template. Investors were fully dependent on an autonomous, faceless financial machine to churn out profits. That blew up in their faces and they had no one to blame. One result was the enactment of Sarbanes-Oxley, which established, among other things, criminal penalties for individuals.
Interesting. However, you forget one thing: parents are not automatically liable for damage their children do. In Germany, for example, as long as parents do not violate their duty of obligatory supervision, neither they nor their children can be held liable for the children's acts.
Your example implicitly means that you do agree someone needs to be made liable for the child through 'obligatory supervision' - because if you don't do that, you will be accountable.
You say 'as long as parents do not violate'. Indeed, if you do what is required to ensure nothing goes wrong ("bonus pater familias") but an accident still happens, then society/German law accepts that no one is liable.
Your argument seems to point more to: How do you define someone as liable and how do you enforce it?
If we take the case of the robot surgeon, then the person or institution who made the decision to use the robot would presumably be responsible (in this case, the hospital). The 'obligatory supervision' here would mean running regular tests, getting it repaired, and so on.
I think in most cases we can come up with someone who is responsible for the robot. The person who owns it will generally be the person responsible, but there will be exceptions (e.g. if I borrow your robot, break the "don't attack people" software, and then use it).
If, after all the maintenance is done, the robot surgeon still kills someone, then you would need to investigate. Did something go wrong with the robot? If so, is it a design defect the manufacturers should have anticipated? If so, blame them. If not, well, no one is to blame (unless it happens again with the next model, by which point they should have learned from the past). I suspect we would also end up seeing recalls of robots when flaws are detected, as we occasionally do with cars.
> Your example implicitly means that you do agree someone needs to be made liable for the child through 'obligatory supervision' - because if you don't do that, you will be accountable.
Sorry, I was only trying to describe the legal situation in Germany. 'Aufsichtspflicht' is a legal term, and its translation is 'obligatory supervision'. The law says that parents are responsible for supervising their children in a reasonable way: your child can walk or bike to school unattended, but if you give her dangerous substances to play with and turn your back, you are held liable.
> Your argument seems to point more to: How do you define someone as liable and how do you enforce it?
That's what the courts do. As far as I know, they use common sense and social norms to determine how closely you have to guard your children.
> Replace 'autonomous machine' with 'child' and it is clear a parent is responsible. For autonomous machines these are the owners. Machines may be autonomous, but if a real person is liable, then I am sure there will be plenty of checks and tests done to prevent any disasters.

> In fact, the results of the accounting scandals not so long ago may be a good template. Investors were fully dependent on this autonomous, faceless, financial machine to churn out profits. That blew up in their face and they had no-one to blame. One result was the enactment of Sarbanes-Oxley which set, amongst other things, criminal penalties on individuals.
That is a good motivation: accountability.