Gender Recognition or Gender Reductionism?: The Social Implications of Embedded Gender Recognition Systems

Morgan Klaus Scheuerman

CHI, 2018.

Cited by: 41
Keywords:
negative impact, autonomy, user-centered design, facial recognition, negative attitude
TL;DR:
We studied the perceptions and attitudes of transgender individuals towards automatic gender recognition, a technology that aims to classify a person’s gender based on their physical characteristics

Abstract:

Automatic Gender Recognition (AGR) refers to various computational methods that aim to identify an individual's gender by extracting and analyzing features from images, video, and/or audio. Applications of AGR are increasingly being explored in domains such as security, marketing, and social robotics. However, little is known about st…

Introduction
  • Gender is a significant social construct in human cultures; it permeates both our offline, physical worlds and, increasingly, our online, virtual spaces and digital devices [5,40].
  • Whether through social networks or video games, users increasingly encounter gender embedded in digital systems.
  • Sensitivity to one’s gender identity is crucially important: misgendering, whereby a person’s gender is incorrectly identified and incorrect gendered words are used as a result, is a form of “structural violence” that can have a significant negative impact on trans individuals [21,31,35]
Highlights
  • Gender is a significant social construct in human cultures; it permeates both our offline, physical worlds and, increasingly, our online, virtual spaces and digital devices [5,40]. Whether through social networks or video games, users increasingly encounter gender embedded in digital systems. Another outgrowth of this trend is the development of automatic gender recognition (AGR), a class of algorithms that use various techniques, including facial recognition [27,32] and body recognition [6,45], to classify an individual’s gender
  • We studied the perceptions and attitudes of transgender individuals towards automatic gender recognition (AGR), a technology that aims to classify a person’s gender based on their physical characteristics
  • We found that participants had overwhelmingly negative attitudes towards Automatic Gender Recognition and questioned whether it could offer any beneficial applications to end users
  • We presented several recommendations for incorporating gender in system design, including informing users whether their gender information will be used, giving them the option to opt out, and allowing them to communicate their own gender identity to systems (a minimal illustrative sketch follows this list)
  • With respect to Automatic Gender Recognition, we are not necessarily arguing for the elimination of gender recognition from technology, but for a careful consideration of the implications of incorporating it. These recommendations point towards an approach to gender that is more inclusive, collaborative, and sensitive to human autonomy and choice
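As a concrete illustration of these recommendations, the sketch below (not from the paper; all names and fields are hypothetical) treats gender as consent-gated, user-supplied data rather than an attribute the system infers:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class GenderPreference:
    """Hypothetical user-facing gender settings record."""
    informed_of_use: bool = False                  # user has been told how gender data would be used
    share_gender: bool = False                     # explicit opt-in; the default is opted out
    self_identified_gender: Optional[str] = None   # free text supplied by the user, never inferred
    pronouns: Optional[str] = None                 # e.g., "they/them", also user-supplied

def gender_for_personalization(pref: GenderPreference) -> Optional[str]:
    """Return gender information only if the user was informed and opted in.

    The value always comes from the user's own self-identification; when the
    user has not opted in, the system treats gender as unknown rather than
    falling back to automatic recognition.
    """
    if pref.informed_of_use and pref.share_gender:
        return pref.self_identified_gender
    return None

# Example: a user who opts in and self-identifies
pref = GenderPreference(informed_of_use=True, share_gender=True,
                        self_identified_gender="non-binary", pronouns="they/them")
print(gender_for_personalization(pref))  # -> non-binary
```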
Methods
  • Participants: In this study, the authors focused on understanding the perspectives of individuals who identify as transgender [47].
  • In addition to binary-identifying transgender individuals, the authors included people with non-binary and gender-nonconforming trans identities, all of whom are referred to as “transgender” and “trans” throughout this paper.
  • The authors recruited 13 trans-identifying participants, three of whom were technologists (professionals working in a field related to digital technology, such as software engineering).
  • See Table 1 for the participants’ demographic information
Results
  • Previous Experiences of Misgendering: “The Base Alienation that Comes with Transphobia”

    Misgendering in Physical Spaces: Participants discussed the negative impact of misgendering on their mental and emotional wellbeing.
  • Some participants (P1, P2, P3, P4, P7, T2) reported being misgendered more often offline.
  • Participants who identified as non-binary (P5, P8, P10) reported never being gendered correctly by strangers.
  • Others (P2, P6), who said they usually “pass” in person, reported instances where they were misgendered on the phone or through voice chat, where people cannot see them
Conclusion
  • ENGAGING GENDER DIVERSITY IN HCI: Results from the study show that transgender individuals have serious concerns about the possible negative impact of AGR and similar technologies that incorporate gender.
  • The authors found that participants had overwhelmingly negative attitudes towards AGR and questioned whether it could offer any beneficial applications to end users
  • They expressed doubt about whether AGR can accurately identify gender and described the harm of being misgendered by it.
  • The authors’ recommendations point towards an approach to gender that is more inclusive, collaborative, and sensitive to human autonomy and choice
Tables
  • Table 1: Participant information. Participant IDs beginning with 'T' represent a technology developer or researcher
Related work
  • Automatic Gender Recognition (AGR) and its Applications: Automatic Gender Recognition (AGR), also known as gender classification, refers to algorithmic methods, including automatic facial recognition [27,32] and body recognition [6,45] technologies, that extract features from images, video, or audio of one or more individuals in order to identify their gender. AGR often leverages computer vision algorithms and/or voice recognition modules. A common method is to extract features (e.g., facial hair) from an individual’s visual and/or audio data (e.g., a video showing their face) and compare them with ground-truth samples (e.g., videos of faces for which the gender is known) in an existing database. If the input features are found to be similar to those in the database, a match is declared (a minimal illustrative sketch of this matching step appears at the end of this section).

    AGR has been developed since at least the early 1990s [12]. Prior research has explored the technical capabilities of AGR and its applications, including gendered marketing, human-robot interaction, and security surveillance [32]. There are several motivations for using AGR: it is believed to improve user experience by providing a digital system with more information about the user so that it can better adapt to them [32,39,45]; it is also believed to enhance surveillance or marketing research by analyzing user data and providing results to marketers [33] or authorities (e.g., police) [18,41].
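As a rough, hypothetical sketch of the matching step described above (not taken from any cited system; the feature vectors and labels below are random stand-ins for features extracted from labeled media), a nearest-neighbor classifier compares a query against the labeled database and returns the majority label among the closest samples:

```python
import numpy as np

# Hypothetical "ground truth" database: each row is a feature vector extracted
# from labeled media (e.g., a face image); labels were assigned by the dataset
# curator. Random numbers stand in for a real feature-extraction pipeline.
rng = np.random.default_rng(0)
db_features = rng.normal(size=(100, 16))            # 100 labeled samples, 16-dim features
db_labels = rng.choice(["female", "male"], size=100)

def classify_gender(query_features: np.ndarray, k: int = 5) -> str:
    """Nearest-neighbor 'matching': return the majority label among the k most
    similar database samples (Euclidean distance)."""
    distances = np.linalg.norm(db_features - query_features, axis=1)
    nearest_labels = db_labels[np.argsort(distances)[:k]]
    values, counts = np.unique(nearest_labels, return_counts=True)
    return str(values[np.argmax(counts)])

# Classify a new (random) query vector against the database.
print(classify_gender(rng.normal(size=16)))
```

The structural point of the sketch is that whatever gender categories the curator assigned in the labeled database are the only outputs such a classifier can ever produce.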
Contributions
  • Interviewed 13 transgender individuals, including three transgender technology designers, about their perceptions and attitudes towards AGR
  • Found that transgender individuals have overwhelmingly negative attitudes towards AGR and fundamentally question whether it can accurately recognize such a subjective aspect of their identity
  • Presents a series of recommendations on how to accommodate gender diversity when designing new digital systems
  • Found that our participants had overwhelmingly negative impressions of AGR and had serious concerns about how it would impact their autonomy and privacy
  • Focuses on the experience of being misgendered by technology from the perspective of transgender individuals
References
  • 1. AJL - Algorithmic Justice League. Retrieved September 7, 2017 from https://www.ajlunited.org/
  • 3. Rena Bivens and Oliver L. Haimson. 2016. Baking Gender Into Social Media Design: How Platforms Shape Categories for Users and Advertisers. Social Media + Society 2, 4: 1–12.
  • 4. Lindsay Blackwell, Jean Hardy, Tawfiq Ammari, Tiffany Veinot, Cliff Lampe, and Sarita Schoenebeck. 2016. LGBT Parents and Social Media: Advocacy, Privacy, and Disclosure During Shifting Social Movements. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems (CHI ’16), 610–622.
  • 5. Judith Butler. 1988. Performative Acts and Gender Constitution: An Essay in Phenomenology and Feminist Theory. Theatre Journal 40, 4: 519.
  • 6. Liangliang Cao, Mert Dikmen, Yun Fu, and Thomas S. Huang. 2008. Gender Recognition from Body. In Proceedings of the 16th ACM International Conference on Multimedia (MM ’08), 725–728.
  • 7. John Christman. 2015. Autonomy in Moral and Political Philosophy. The Stanford Encyclopedia of Philosophy.
  • 8. Andrew R. Flores, Jody L. Herman, Gary J. Gates, and Taylor N. T. Brown. 2016. How Many Adults Identify As Transgender in the United States? Los Angeles, CA. Retrieved August 22, 2017 from https://williamsinstitute.law.ucla.edu/wpcontent/uploads/How-Many-Adults-Identify-asTransgender-in-the-United-States.pdf
  • 9. Batya Friedman and Peter H. Kahn. 2003. Human values, ethics, and design. Lawrence Erlbaum Associates.
  • 10. Batya Friedman. 1996. Value-sensitive design. Interactions 3, 6: 16–23.
  • 11. Batya Friedman, Eric Brok, Susan King Roth, and John Thomas. 1996. Minimizing bias in computer systems. ACM SIGCHI Bulletin 28, 1: 48–51.
  • 12. Beatrice A. Golomb, David T. Lawrence, and Terrence J. Sejnowski. 1990. Sexnet: A Neural Network Identifies Sex from Human Faces. Advances in Neural Information Processing Systems 3: 572–577.
  • 13. Jaime M. Grant, Lisa A. Mottet, Justin Tanis, Jack Harrison, Jody L. Herman, and Mara Keisling. 2011. Injustice at Every Turn: A Report of the National Transgender Discrimination Survey. Washington: National Center for Transgender Equality and National Gay and Lesbian Task Force.
  • 14. Y. Gavriel Ansara and Peter Hegarty. 2014. Methodologies of misgendering: Recommendations for reducing cisgenderism in psychological research. Feminism & Psychology 24, 2: 259–270.
  • 15. Oliver L. Haimson, Jed R. Brubaker, Lynn Dombrowski, and Gillian R. Hayes. 2015. Disclosure, Stress, and Support During Gender Transition on Facebook. In Proceedings of the 18th ACM Conference on Computer Supported Cooperative Work & Social Computing (CSCW ’15), 1176–1190.
  • 16. Oliver L. Haimson, Jed R. Brubaker, Lynn Dombrowski, and Gillian R. Hayes. 2016. Digital Footprints and Changing Networks During Online Identity Transitions. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems (CHI ’16), 2895–2907.
  • 17. David Hankerson, Andrea R. Marshall, Jennifer Booker, Houda El Mimouni, Imani Walker, and Jennifer A. Rode. 2016. Does Technology Have Race? In Proceedings of the 2016 CHI Conference Extended Abstracts on Human Factors in Computing Systems (CHI EA ’16), 473–486.
  • 18. Lucas D. Introna and David Wood. 2004. Picturing Algorithmic Surveillance: The Politics of Facial Recognition Systems. Surveillance & Society 2, 2/3: 177–198.
  • 19. Sandy E. James, Jody L. Herman, Susan Rankin, Mara Keisling, Lisa Mottet, and Ma’ayan Anafi. 2016. The Report of the 2015 U.S. Transgender Survey. Retrieved August 22, 2017 from http://www.transequality.org/sites/default/files/docs/usts/USTS Full Report - FINAL 1.6.17.pdf
  • 20. James Vincent. 2017. Transgender YouTubers had their videos grabbed to train facial recognition software. The Verge. Retrieved August 28, 2017 from https://www.theverge.com/2017/8/22/16180080/transgender-youtubers-ai-facial-recognition-dataset
  • 21. Haiyan Jia, Mu Wu, Eunhwa Jung, Alice Shapiro, and S. Shyam Sundar. 2012. Balancing Human Agency and Object Agency: An End-User Interview Study of the Internet of Things. In Proceedings of the 2012 ACM Conference on Ubiquitous Computing (UbiComp ’12), 1185–1188.
  • 22. Stephanie Julia Kapusta. 2016. Misgendering and Its Moral Contestability. Hypatia 31, 3: 502–519.
  • 23. Gopinaath Kannabiran and Marianne Graves Petersen. 2010. Politics at the Interface: A Foucauldian Power Analysis. In Proceedings of the 6th Nordic Conference on Human-Computer Interaction: Extending Boundaries (NordiCHI ’10), 695–698.
  • 24. Michal Kosinski, David Stillwell, and Thore Graepel. 2013. Private traits and attributes are predictable from digital records of human behavior. Proceedings of the National Academy of Sciences of the United States of America 110, 15: 5802–5805.
  • 25. Irwin Krieger. 2017. The Impact of Stigma on Transgender and Non-Binary Youth. In Counseling Transgender and Non-Binary Youth: The Essential Guide. Jessica Kingsley Publishers, 46–47.
  • 26. Vijay Kumar, R. Raghavendra, Anoop Namboodiri, and Christoph Busch. 2016. Robust transgender face recognition: Approach based on appearance and therapy factors. In IEEE Conference on Identity, Security and Behavior Analysis (ISBA 2016), 1–7.
  • 27. Chien-Cheng Lee and Chung-Shun Wei. 2013. Gender Recognition Based On Combining Facial and Hair Features. In Proceedings of the International Conference on Advances in Mobile Computing & Multimedia (MoMM ’13), 537–540.
  • 28. Gayathri Mahalingam and Karl Ricanek. HRT Transgender Dataset. Retrieved August 23, 2017 from http://www.faceaginggroup.com/hrt-transgender/
  • 29. Gayathri Mahalingam and Karl Ricanek. 2013. Is the eye region more reliable than the face? A preliminary study of face-based recognition on a transgender dataset. In IEEE Conference on Biometrics: Theory, Applications and Systems (BTAS 2013), 1–7.
  • 30. Erno Mäkinen and Roope Raisamo. 2008. Evaluation of gender classification methods with automatically detected and aligned faces. IEEE Transactions on Pattern Analysis and Machine Intelligence 30, 3: 541–547.
  • 31. Kevin A. McLemore. 2015. Experiences with Misgendering: Identity Misclassification of Transgender Spectrum Individuals. Self and Identity 14, 1: 51–74.
  • 32. Choon Boon Ng, Yong Haur Tay, and Bok Min Goi. 2015. A review of facial gender recognition. Pattern Analysis and Applications 18, 4: 739–755.
  • 33. Mei Ngan and Patrick Grother. 2015. Face Recognition Vendor Test (FRVT) - Performance of Automated Gender Classification Algorithms. NIST Interagency/Internal Report (NISTIR) 8052.
  • 34. Nick Whigham. 2017. Glitch in digital pizza advert goes viral, shows disturbing future of facial recognition tech. news.com.au. Retrieved September 18, 2017 from http://www.news.com.au/technology/innovation/design/glitch-in-digital-pizza-advert-goes-viral-showsdisturbing-future-of-facial-recognition-tech/newsstory/3b43904b6dd5444a279fd3cd6f8551db
  • 35. Y. Gavriel Ansara. 2012. Cisgenderism in medical settings: How collaborative partnerships can challenge structural violence. In Out of the Ordinary: LGBT Lives. Cambridge Scholars Publishing, Cambridge, 102–122.
  • 36. Cathy O’Neil. 2016. Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy. Crown Random House.
  • 37. Özlem Özbudak, Mürvet Kirci, Yüksel Çakir, and Ece Olcay Güneş. 2010. Effects of the Facial and Racial Features on Gender Classification. In Proceedings of the Mediterranean Electrotechnical Conference (MELECON), 26–29.
  • 38. Joyojeet Pal, Anandhi Viswanathan, Priyank Chandra, Anisha Nazareth, Vaishnav Kameswaran, Hariharan Subramonyam, Aditya Johri, Mark S. Ackerman, and Sile O’Modhrain. 2017. Agency in Assistive Technology Adoption: Visual Impairment and Smartphone Use in Bangalore. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems (CHI ’17), 5929–5940.
  • 39. Arnaud Ramey and Miguel A. Salichs. 2014. Morphological Gender Recognition by a Social Robot and Privacy Concerns. In Proceedings of the 2014 ACM/IEEE International Conference on Human-Robot Interaction (HRI ’14), 272–273.
  • 40. Jemima Repo. 2017. The Biopolitics of Gender. Oxford University Press.
  • 41. Bridget A. Sarpu. 2015. Google: The Endemic Threat to Privacy. Journal of High Technology Law XV. Retrieved September 7, 2017 from https://sites.suffolk.edu/jhtl/files/2014/12/SarpuGoogle-The-Endemic-Threat-to-Privacy.pdf
  • 42. Christine Satchell. 2010. Women are people too: The problem of designing for gender. Retrieved December 18, 2017 from https://www.cl.cam.ac.uk/events/experiencingcriticaltheory/Satchell-WomenArePeople.pdf
  • 43. Ari Schlesinger, W. Keith Edwards, and Rebecca E. Grinter. 2017. Intersectional HCI: Engaging Identity through Gender, Race, and Class. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems (CHI ’17), 5412–5427.
  • 44. Michael Skirpan and Micha Gorelick. 2017. The Authority of “Fair” in Machine Learning. Retrieved September 7, 2017 from https://arxiv.org/pdf/1706.09976.pdf
  • 45. Jinshan Tang, Xiaoming Liu, Huaining Cheng, and Kathleen M. Robinette. 2011. Gender recognition using 3-D human body shapes. IEEE Transactions on Systems, Man and Cybernetics Part C: Applications and Reviews 41, 6: 898–908.
  • 46. Jisha C. Thankappan and Sumam M. Idicula. 2010. Language Independent Voice-Based Gender Identification System. In Proceedings of the 1st Amrita ACM-W Celebration on Women in Computing in India (A2CWiC ’10), 1–6.
  • 47. Transgender (1974). In Merriam-Webster’s Dictionary. Retrieved September 3, 2017 from https://www.merriamwebster.com/dictionary/transgender
  • 48. Archana Vijayan, Shyma Kareem, and Jubilant J. Kizhakkethottam. 2016. Face Recognition Across Gender Transformation Using SVM Classifier. Procedia Technology 24: 1366–1373.
  • 49. Shiqi Yu, Tieniu Tan, Kaiqi Huang, Kui Jia, and Xinyu Wu. 2009. A Study on Gait-Based Gender Classification. IEEE Transactions on Image Processing 18, 8: 1905–1910.
Best Paper
Best Paper of CHI, 2018