There is a real danger that there will be a class who use algorithms and a class who are used by algorithms.
The global health impact of the diesel emissions scandal is estimated to be a minimum of 38,000 premature deaths per year.
Lee McKnight, an associate professor at Syracuse University’s School of Information Studies, warns that ‘right now, literally, coders can get away with murder.’
Dieselgate illustrates why we urgently need a debate on how we ensure algorithmic literacy, transparency and oversight in government and amongst campaigners.
I have attended meetings on ‘Smart Cities’ where organisers refuse to answer even the most basic questions on the cyber security and algorithmic transparency of driverless vehicles, evading them on the grounds of ‘sensitivity’.
Do we want to repeat the Dieselgate scandal, where corporations deliberately concealed technological flaws for profit, at the expense of citizens’ health and the environment?
It was a little lab in West Virginia that caught Volkswagen’s big cheat.
The lab explained: “It’s both writing the code, but you also need to do validation. So someone had to take these vehicles out, test them on the standard test cycle, make sure that the emission controls are working when they’re supposed to be working.”
However, it took a year-long investigation by expert researchers to completely unravel and identify the code.
‘Researchers found code that allowed a car’s onboard computer to determine that the vehicle was undergoing an emissions test. The computer then activated the car’s emission-curbing systems, reducing the amount of pollutants emitted. Once the computer determined that the test was over, these systems were deactivated.
“The Volkswagen defeat device is arguably the most complex in automotive history,” said Kirill Levchenko, computer scientist at the University of California San Diego.
Researchers found a less sophisticated circumventing ploy for the Fiat 500X. That car’s onboard computer simply allows its emissions-curbing system to run for the first 26 minutes and 40 seconds after the engine starts – roughly the duration of many emissions tests.
They also noted that for both Volkswagen and Fiat, the vehicles’ Engine Control Unit is manufactured by automotive component giant Robert Bosch. Car manufacturers then enable the code by entering specific parameters.’
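The Fiat 500X ploy described above is simple enough to sketch in a few lines. The following is a purely illustrative sketch, not real ECU code: the function name, parameter and structure are invented for explanation, and only the timing behaviour (controls on for 26 minutes 40 seconds after engine start, off afterwards) comes from the researchers’ account.

```python
# Illustrative sketch only: hypothetical logic of the timer-based
# "defeat device" reported for the Fiat 500X. All names are invented;
# only the 26 min 40 s window is taken from the researchers' findings.

TEST_WINDOW_SECONDS = 26 * 60 + 40  # 1600 seconds after engine start

def emissions_controls_active(seconds_since_engine_start: float) -> bool:
    """Return True while the emissions-curbing system would run.

    The reported ploy: keep the controls on for roughly the length
    of a standard emissions test, then silently switch them off.
    """
    return seconds_since_engine_start <= TEST_WINDOW_SECONDS

# During a typical test drive (about 20 minutes) the car looks clean:
print(emissions_controls_active(20 * 60))   # True
# On a longer real-world journey the controls shut off:
print(emissions_controls_active(40 * 60))   # False
```

The point of the sketch is how little code is needed: a single timer comparison, buried among millions of lines of legitimate engine-control software, is enough to defeat a regulatory regime – which is why it took expert researchers a year to find.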
Scott McLeod, an associate professor of educational leadership at the University of Colorado, Denver warns:
‘Right now the technologies are far outpacing our individual and societal abilities to make sense of what’s happening, and corporate and government entities are taking advantage of these conceptual and control gaps.’
Meanwhile Justin Reich, executive director at the MIT Teaching Systems Lab, observed:
‘We also need new forms of code review and oversight, that respect company trade secrets but don’t allow corporations to invoke secrecy as a rationale for avoiding all forms of public oversight.’
Another professor, speaking anonymously, cautioned:
“[The challenge presented by algorithms] is the greatest challenge of all. Greatest because tackling it demands not only technical sophistication but an understanding of and interest in societal impacts. The ‘interest in’ is key. Not only does the corporate world have to be interested in effects, but consumers have to be informed, educated and, indeed, activist in their orientation toward something subtle. This is what computer literacy is about in the 21st century.”
And Robert Bell, co-founder of the Intelligent Community Forum added:
“Transparency is the great challenge. As these things exert more and more influence, we want to know how they work, what choices are being made and who is responsible. The irony is that, as the algorithms become more complex, the creators of them increasingly do not know what is going on inside the black box. How, then, can they improve transparency?”
The FBI warns driverless cars could be used as ‘lethal weapons’
‘Criminals might override safety features to ignore traffic lights and speed limits, or terrorists might program explosive-packed cars to become self-driving bombs.’
This ‘directly contradicts the message that many developers of self-driving vehicles are trying to communicate: that these cars – immune from road rage, tiredness and carelessness – can be even safer than human operators.’
Charlie Miller is a cyber security researcher who famously hacked a Jeep Cherokee, sparking a recall of 1.4 million vehicles. He has recently left Uber, citing a need to talk more openly about how securing autonomous cars from hackers is a very difficult problem.
Lack of algorithmic transparency means that commercial driverless vehicles, developed alongside the military, could have deeply embedded software that can be activated on the streets at any time, transforming vehicles into military weapons.
As with Dieselgate, we can never be sure what is hidden in the code. There will be a lot of ‘snake oil’ cyber security companies wanting to make mega bucks out of the impossibility of securing driverless vehicles. But it is a dead-end tech. Cyber security is impossible.