Following another fatal Tesla crash, accident investigators have announced that they have stopped working with the company. Self-driving cars urgently need ‘ethical black boxes’ so that we can all learn from their mistakes.
Fri 13 Apr 2018
Self-driving cars are learning to drive. The algorithms that control them need to be fed vast quantities of real-world data in order to improve. Cities and freeways, particularly in the US, are the laboratories in which they are being trained. Companies like Waymo, Uber and Tesla would argue that this real-world experience is vital for machine learning. Others would say that it creates an experiment in which other road users are unwitting test subjects. When technologies fail and people die, as happened with the Uber crash in Tempe last month, everyone, not just the self-driving car companies, needs to learn what happened and why. Social learning must take precedence over machine learning.