Last year, Google employees (as well as US senators from both parties) expressed concern about Google's Dragonfly project, which appeared to collude with the Chinese government in censorship and suppression of human rights. A secondary concern was that Dragonfly was conducted in secrecy, without involving Google's privacy team.
Google's official position (led by CEO Sundar Pichai) was that Dragonfly was "just an experiment". Jack Poulson, who left Google last year over this issue and has now started a nonprofit organization called Tech Inquiry, has also seen this pattern in other technology projects.
"I spoke to coworkers and they said 'don't worry, by the time the thing launches, we'll have had a thorough privacy review'. When you do R&D, there's this idea that you can cut corners and have the privacy team fix it later." (via Alex Hern)

A few years ago, Microsoft Research ran an experiment on "emotional eating", which involved four female employees wearing smart bras. The project showed "an almost shocking lack of sensitivity for gender stereotyping", wrote Sebastian Anthony. While I assume that the four subjects willingly volunteered to participate in this experiment, and I hope the privacy of their emotional data was properly protected, it does seem to reflect the same pattern: that you can get away with things in the R&D stage that would be highly problematic in a live product.
Poulson's position is that the engineers working on these projects bear some responsibility for the outcomes, and that they need to ensure that ethical principles are respected. He therefore demands transparency, to avoid workers being misled. He also notes that if ethical considerations are deferred to a late stage of a project, when the bulk of the development costs have already been incurred and many stakeholders are personally invested in its success, the pressure to proceed quickly to launch may be too strong to resist.
Sebastian Anthony, Microsoft's new smart bra stops you from emotionally overeating (Extreme Tech, 9 December 2013)

Erin Carroll et al, Food and Mood: Just-in-Time Support for Emotional Eating (Humaine Association Conference on Affective Computing and Intelligent Interaction, 2013)

Ryan Gallagher, Google's Secret China Project "Effectively Ended" After Internal Confrontation (The Intercept, 17 December 2018)

Alex Hern, Google whistleblower launches project to keep tech ethical (Guardian, 13 July 2019)

Casey Michel, Google's secret 'Dragonfly' project is a major threat to human rights (Think Progress, 11 December 2018)

Iain Thomson, Microsoft researchers build 'smart bra' to stop women's stress eating (The Register, 6 December 2013)
Related posts: Have you got Big Data in your Underwear? (December 2014), Affective Computing (March 2019)
The phrase "just an experiment" is so out of touch with the ethical conduct of scientific experiments that it's worrying. A lot of the scientific ethos (eg hypothesis testing for agile development) seems to have been imported by the private sector, but does this remark indicate that it's very much a case of "choose the favourable parts, ignore anything that is less favourable"?
At a university, you would expect all experiments (especially ones with live participants) to be run past an ethics board, with a procedure for considering harm. Obviously experiments from decades back lacked this insight - sadly, the results of these are only recently being questioned in many cases, and it will take decades more to reverse them in popular discourse outside of those discussions. Will the same be true of the tech we use in the second half of this century?
True, but academic ethics boards can sometimes be merely bureaucratic box-ticking. And Ben Goldacre talks about the ethical paradox in medical research, whereby the ethics committee blocks potentially useful initiatives. See my post Ethics committee raises alarm. Dr Goldacre has also campaigned against other ways experimental findings can be misrepresented and misused, including publication bias.