As a famous engineer said, "and if my grandmother had wheels, she'd be a wagon."
Something to consider is that a "good" risk analysis takes into account the foreseeable use and misuse of a product based on the information available. Given that it is no secret that AutoPilot users have operated the system contrary to its directions, it is reasonable to demand that the risk analysis be updated to account for this reality (assuming it did not already) as part of the continuous PLM process.
The risk analysis certainly needs to account for misuse, but that doesn't necessarily mean the product must be designed to prevent it. If cars were designed that way, they'd all be governed to 85 MPH and would refer the driver for mandatory training whenever they failed to signal turns too often.