To make the ideas contained in the checklist more concrete, we've compiled examples of times when things have gone wrong. They're paired with the checklist questions to help illuminate where in the process ethics discussions may have helped provide a course correction.

Checklist Question | Examples of Ethical Issues
--- | ---
**Data Collection** |
**A.1 Informed consent**: If there are human subjects, have they given informed consent, where subjects affirmatively opt-in and have a clear understanding of the data uses to which they consent? | [Facebook uses phone numbers provided for two-factor authentication to target users with ads.](https://techcrunch.com/2018/09/27/yes-facebook-is-using-your-2fa-phone-number-to-target-you-with-ads/)
 | [African-American men were enrolled in the Tuskegee Study on the progression of syphilis without being told the true purpose of the study or that treatment for syphilis was being withheld.](https://en.wikipedia.org/wiki/Tuskegee_syphilis_experiment)
**A.2 Collection bias**: Have we considered sources of bias that could be introduced during data collection and survey design and taken steps to mitigate those? | [StreetBump, a smartphone app to passively detect potholes, may fail to direct public resources to areas where smartphone penetration is lower, such as lower-income areas or areas with a larger elderly population.](https://hbr.org/2013/04/the-hidden-biases-in-big-data)
 | [Facial recognition cameras used for passport control register Asians' eyes as closed.](http://content.time.com/time/business/article/0,8599,1954643,00.html)
**A.3 Limit PII exposure**: Have we considered ways to minimize exposure of personally identifiable information (PII), for example through anonymization or not collecting information that isn't relevant for analysis? | [Personal information on taxi drivers can be accessed in a poorly anonymized dataset of taxi trips released by New York City.](https://www.theguardian.com/technology/2014/jun/27/new-york-taxi-details-anonymised-data-researchers-warn)
 | [Netflix Prize dataset of movie ratings by 500,000 customers is easily de-anonymized through cross-referencing with other publicly available datasets.](https://www.wired.com/2007/12/why-anonymous-data-sometimes-isnt/)
**A.4 Downstream bias mitigation**: Have we considered ways to enable testing downstream results for biased outcomes (e.g., collecting data on protected group status like race or gender)? | [In six major cities, Amazon's same-day delivery service excludes many predominantly black neighborhoods.](https://www.bloomberg.com/graphics/2016-amazon-same-day/)
 | [Facial recognition software is significantly worse at identifying people with darker skin.](https://www.theregister.co.uk/2018/02/13/facial_recognition_software_is_better_at_white_men_than_black_women/)
 | [-- Related academic study.](http://proceedings.mlr.press/v81/buolamwini18a.html)
**Data Storage** |
**B.1 Data security**: Do we have a plan to protect and secure data (e.g., encryption at rest and in transit, access controls on internal users and third parties, access logs, and up-to-date software)? | [Personal and financial data for more than 146 million people was stolen in the Equifax data breach.](https://www.nbcnews.com/news/us-news/equifax-breaks-down-just-how-bad-last-year-s-data-n872496)
 | [Cambridge Analytica harvested private information from over 50 million Facebook profiles without users' permission.](https://www.nytimes.com/2018/03/17/us/politics/cambridge-analytica-trump-campaign.html)
 | [AOL accidentally released 20 million search queries from 658,000 customers.](https://www.wired.com/2006/08/faq-aols-search-gaffe-and-you/)
**B.2 Right to be forgotten**: Do we have a mechanism through which an individual can request their personal information be removed? | [The EU's General Data Protection Regulation (GDPR) includes the "right to be forgotten."](https://www.eugdpr.org/the-regulation.html)
**B.3 Data retention plan**: Is there a schedule or plan to delete the data after it is no longer needed? | [FedEx exposes the private information of thousands of customers after a legacy Amazon S3 server was left open without a password.](https://www.zdnet.com/article/unsecured-server-exposes-fedex-customer-records/)
**Analysis** |
**C.1 Missing perspectives**: Have we sought to address blindspots in the analysis through engagement with relevant stakeholders (e.g., checking assumptions and discussing implications with affected communities and subject matter experts)? | [When Apple's HealthKit came out in 2014, women couldn't track menstruation.](https://www.theverge.com/2014/9/25/6844021/apple-promised-an-expansive-health-app-so-why-cant-i-track)
**C.2 Dataset bias**: Have we examined the data for possible sources of bias and taken steps to mitigate or address these biases (e.g., stereotype perpetuation, confirmation bias, imbalanced classes, or omitted confounding variables)? | [A widely used commercial algorithm in the healthcare industry underestimates the care needs of black patients, assigning them lower risk scores than equivalently sick white patients.](https://www.nature.com/articles/d41586-019-03228-6)
 | [-- Related academic study.](https://science.sciencemag.org/content/366/6464/447)
 | [word2vec, trained on the Google News corpus, reinforces gender stereotypes.](https://www.technologyreview.com/s/602025/how-vector-space-mathematics-reveals-the-hidden-sexism-in-language/)
 | [-- Related academic study.](https://arxiv.org/abs/1607.06520)
 | [Women are more likely to be shown lower-paying jobs than men in Google ads.](https://www.theguardian.com/technology/2015/jul/08/women-less-likely-ads-high-paid-jobs-google-study)
**C.3 Honest representation**: Are our visualizations, summary statistics, and reports designed to honestly represent the underlying data? | [Misleading chart shown at Planned Parenthood hearing distorts the actual trends of abortions vs. cancer screenings and preventative services.](https://www.politifact.com/truth-o-meter/statements/2015/oct/01/jason-chaffetz/chart-shown-planned-parenthood-hearing-misleading-/)
 | [Georgia Dept. of Health graph of COVID-19 cases falsely suggests a steeper decline when dates are ordered by total cases rather than chronologically.](https://www.vox.com/covid-19-coronavirus-us-response-trump/2020/5/18/21262265/georgia-covid-19-cases-declining-reopening)
**C.4 Privacy in analysis**: Have we ensured that data with PII are not used or displayed unless necessary for the analysis? | [Strava heatmap of exercise routes reveals sensitive information on military bases and spy outposts.](https://www.theguardian.com/world/2018/jan/28/fitness-tracking-app-gives-away-location-of-secret-us-army-bases)
**C.5 Auditability**: Is the process of generating the analysis well documented and reproducible if we discover issues in the future? | [Excel error in a well-known economics paper undermines the justification of austerity measures.](https://www.bbc.com/news/magazine-22223190)
**Modeling** |
**D.1 Proxy discrimination**: Have we ensured that the model does not rely on variables, or proxies for variables, that are unfairly discriminatory? | [Variables used to predict child abuse and neglect are direct measurements of poverty, unfairly targeting low-income families for child welfare scrutiny.](https://www.wired.com/story/excerpt-from-automating-inequality/)
 | [Amazon scraps AI recruiting tool that showed bias against women.](https://www.reuters.com/article/us-amazon-com-jobs-automation-insight/amazon-scraps-secret-ai-recruiting-tool-that-showed-bias-against-women-idUSKCN1MK08G)
 | [Criminal sentencing risk assessments don't ask directly about race or income, but other demographic factors can end up being proxies.](https://www.themarshallproject.org/2015/08/04/the-new-science-of-sentencing)
 | [Creditworthiness algorithms based on nontraditional criteria such as grammatical habits, preferred grocery stores, and friends' credit scores can perpetuate systemic bias.](https://www.whitecase.com/publications/insight/algorithms-and-bias-what-lenders-need-know)
 | [Apple credit card offers smaller lines of credit to women than men.](https://www.wired.com/story/the-apple-card-didnt-see-genderand-thats-the-problem/)
**D.2 Fairness across groups**: Have we tested model results for fairness with respect to different affected groups (e.g., tested for disparate error rates)? (A minimal audit sketch follows this table.) | [Google Photos tags two African-Americans as gorillas.](https://www.forbes.com/sites/mzhang/2015/07/01/google-photos-tags-two-african-americans-as-gorillas-through-facial-recognition-software/#12bdb1fd713d)
 | [With COMPAS, a risk-assessment algorithm used in criminal sentencing, black defendants are almost twice as likely as white defendants to be mislabeled as likely to reoffend.](https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing)
 | [-- Northpointe's rebuttal to the ProPublica article.](https://www.documentcloud.org/documents/2998391-ProPublica-Commentary-Final-070616.html)
 | [-- Related academic study.](https://www.liebertpub.com/doi/pdf/10.1089/big.2016.0047)
 | [Google's speech recognition software doesn't recognize women's voices as well as men's.](https://www.dailydot.com/debug/google-voice-recognition-gender-bias/)
 | [Google searches involving black-sounding names are more likely to serve up ads suggestive of a criminal record than white-sounding names.](https://www.technologyreview.com/s/510646/racism-is-poisoning-online-ad-delivery-says-harvard-professor/)
 | [-- Related academic study.](https://arxiv.org/abs/1301.6822)
**D.3 Metric selection**: Have we considered the effects of optimizing for our defined metrics and considered additional metrics? | [Facebook seeks to optimize "time well spent", prioritizing interaction over popularity.](https://www.wired.com/story/facebook-tweaks-newsfeed-to-favor-content-from-friends-family/)
 | [YouTube's search autofill suggests pedophiliac phrases due to high viewership of related videos.](https://gizmodo.com/youtubes-creepy-kid-problem-was-worse-than-we-thought-1820763240)
**D.4 Explainability**: Can we explain in understandable terms a decision the model made in cases where a justification is needed? (A synthetic illustration of this pitfall follows this table.) | [Patients with pneumonia and a history of asthma are usually admitted to the intensive care unit, since they have a high risk of dying from pneumonia. Because of the success of that intensive care, neural networks predicted that asthmatics had a low risk of dying and could therefore be sent home. Without explanatory models to identify this issue, patients might have been sent home to die.](http://people.dbmi.columbia.edu/noemie/papers/15kdd.pdf)
 | [GDPR includes a "right to explanation," i.e., meaningful information on the logic underlying automated decisions.](https://academic.oup.com/idpl/article/7/4/233/4762325)
**D.5 Communicate bias**: Have we communicated the shortcomings, limitations, and biases of the model to relevant stakeholders in ways that can be generally understood? | [Google Flu Trends claims to accurately predict weekly influenza activity and then misses the 2009 swine flu pandemic.](https://www.forbes.com/sites/stevensalzberg/2014/03/23/why-google-flu-is-a-failure/#6fa6a1925535)
**Deployment** |
**E.1 Redress**: Have we discussed with our organization a plan for response if users are harmed by the results (e.g., how does the data science team evaluate these cases and update analysis and models to prevent future harm)? | [Software mistakes result in healthcare cuts for people with diabetes or cerebral palsy.](https://www.theverge.com/2018/3/21/17144260/healthcare-medicaid-algorithm-arkansas-cerebral-palsy)
- [Google "fixes" racist algorithm by removing gorillas from image-labeling technology.](https://www.theverge.com/2018/1/12/16882408/google-racist-gorillas-photo-recognition-algorithm-ai)
**E.3 Concept drift**: Have we tested and monitored for model drift to ensure the software remains up to date? | [Sending police officers to areas of high predicted crime skews future training data collection, as police are repeatedly sent back to the same neighborhoods regardless of the true crime rate.](https://www.smithsonianmag.com/innovation/artificial-intelligence-is-now-used-predict-crime-is-it-biased-180968337/)
 | [-- Related academic study.](https://arxiv.org/abs/1706.09847)
 | [Microsoft's Twitter chatbot Tay quickly becomes racist.](https://www.theguardian.com/technology/2016/mar/24/microsoft-scrambles-limit-pr-damage-over-abusive-ai-bot-tay)
**E.4 Unintended use**: Have we taken steps to identify and prevent unintended uses and abuse of the model, and do we have a plan to monitor these once the model is deployed? | [Deepfakes, realistic but fake videos generated with AI, run the gamut from celebrity porn to fabricated presidential statements.](http://theweek.com/articles/777592/rise-deepfakes)
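
Two of the questions above lend themselves to short, concrete illustrations. First, **D.2 Fairness across groups**: several of the modeling examples (COMPAS, facial recognition, speech recognition) come down to error rates that differ by group. The sketch below shows what a minimal audit of disparate error rates can look like. Everything in it is synthetic and hypothetical: the group labels, the error rates, and the "model" are invented for demonstration, not drawn from any real system.

```python
# Minimal per-group error-rate audit on synthetic data. The group labels,
# outcomes, and "model" below are hypothetical; in a real audit, y_true and
# y_pred would come from held-out labeled data.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
group = rng.choice(["a", "b"], size=n)              # protected attribute
y_true = rng.integers(0, 2, size=n)                 # true outcomes
# A deliberately skewed "model": it flips more answers for group "b".
flip = rng.random(n) < np.where(group == "b", 0.30, 0.10)
y_pred = np.where(flip, 1 - y_true, y_true)

for g in ["a", "b"]:
    m = group == g
    fpr = np.mean(y_pred[m][y_true[m] == 0] == 1)   # false positive rate
    fnr = np.mean(y_pred[m][y_true[m] == 1] == 0)   # false negative rate
    print(f"group {g}: FPR={fpr:.2f}  FNR={fnr:.2f}")
```

Which error rate matters most depends on the application: in a sentencing context, a disparate false positive rate is the headline harm, while in medical triage it may be the false negative rate.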
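Second, **D.4 Explainability**: the pneumonia example shows how a model can learn a dangerously misleading association from observational data. Here is a fully synthetic reconstruction of that pitfall, with invented variables and effect sizes, in which aggressive ICU care makes asthma look protective to a logistic regression:

```python
# Synthetic illustration of the pneumonia/asthma pitfall: asthma raises true
# risk, but asthmatics are routed to the ICU, which sharply lowers realized
# mortality, so the fitted model learns a negative (protective-looking)
# coefficient for asthma. All variables and effect sizes are invented.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
n = 20_000
asthma = rng.integers(0, 2, size=n)
severity = rng.normal(size=n) + asthma              # asthma raises true risk
icu = asthma == 1                                   # asthmatics get ICU care
logit = severity - 3.0 * icu                        # ICU care cuts mortality
died = rng.random(n) < 1 / (1 + np.exp(-logit))

model = LogisticRegression().fit(np.column_stack([asthma, severity]), died)
print("asthma coefficient:", round(model.coef_[0][0], 2))  # comes out negative
```

Inspecting the coefficient (or using an inherently interpretable model) surfaces the counterintuitive association so domain experts can catch it before deployment; a black-box model with good aggregate accuracy would quietly encode "asthma is safe."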