Project Sidewalk: A Web-based Crowdsourcing Tool for Collecting Sidewalk Accessibility Data at Scale

Manaswi Saha
Michael Saugstad
Hanuma Teja Maddali
Aileen Zeng
Ryan Holland
Steven Bower
Aditya Dash
Sage Chen
Anthony Li

CHI 2019, Paper 62.

Keywords: accessibility, crowdsourcing, GIS, mobility impairments, urban informatics

Abstract:

We introduce Project Sidewalk, a new web-based tool that enables online crowdworkers to remotely label pedestrian-related accessibility problems by virtually walking through city streets in Google Street View. To train, engage, and sustain users, we apply basic game design principles such as interactive onboarding, mission-based tasks, and [...]

Introduction
  • Geographic Information Systems (GIS) such as Google Maps, Waze, and Yelp have transformed the way people travel and access information about the physical world.
  • While these systems contain terabytes of data about road networks and points of interest (POIs), their information about physical accessibility is comparatively sparse.
  • While local users who report data are likely to be reliable, the dependence on in situ reporting dramatically limits scalability—both who can supply data and how much data they can supply
Highlights
  • Geographic Information Systems (GIS) such as Google Maps, Waze, and Yelp have transformed the way people travel and access information about the physical world
  • We present background on sidewalk accessibility, survey existing methods for collecting street-level accessibility data, and review volunteer geographic information (VGI) systems
  • In Project Sidewalk, we focus on five high-priority areas that impact mobility impaired (MI) pedestrians, drawn from Americans with Disabilities Act (ADA) standards [51, 52, 55] and prior work [35, 37]: curb ramps, missing curb ramps, sidewalk obstacles, surface problems, and the lack of a sidewalk on a pedestrian pathway
  • If we examine only those users who passed our “good” user heuristic, we filter out 28.2% of paid, 23.7% of anonymous, and 22.6% of registered workers; relative user behaviors stay the same (a sketch of such a filter follows this list)
  • To begin exploring why users contribute to Project Sidewalk, we developed a 5-question survey shown to users after their second mission
  • Through a multi-methods approach, our results demonstrate the viability of virtually auditing urban accessibility at scale, highlight behavioral and labeling quality differences between user groups, and summarize how key stakeholders feel about Project Sidewalk and the crowdsourced data
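    A minimal sketch of how a labeling-frequency quality heuristic like the “good” user filter above could be implemented; the threshold value, field names, and sample data below are hypothetical illustrations, not the paper's exact rule:

```python
# Hypothetical sketch of a "good user" quality filter.
# Assumes each user record carries total labels placed and meters audited;
# the 3.0 labels-per-100m threshold is illustrative, not the paper's rule.

def is_good_user(labels: int, meters_audited: float,
                 min_labels_per_100m: float = 3.0) -> bool:
    """Flag users whose labeling frequency suggests genuine effort."""
    if meters_audited <= 0:
        return False
    return (labels / meters_audited) * 100 >= min_labels_per_100m

users = [
    {"id": "u1", "labels": 171, "meters": 5000},  # frequent labeler
    {"id": "u2", "labels": 2,   "meters": 4000},  # likely low-effort
]
good = [u["id"] for u in users if is_good_user(u["labels"], u["meters"])]
print(good)  # ['u1']
```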
Results
  • Project Sidewalk had 11,891 visitors to the landing page, of which 797 (627 volunteers; 170 turkers) completed the tutorial and audited at least one street segment in the first mission
  • These users contributed 205,385 labels and audited 2,941 miles of DC streets (Table 1).
  • On average, registered users completed more missions (5.8 vs 1.5), contributed more labels (171.1 vs 33.7), audited faster (1.93 mi/hr vs 1.22), and spent more time on Project Sidewalk (55.8 mins vs 18.3) than anonymous users (Table 2).
  • Expanding to the top 25% of users, contribution percentages rise to 77.4%, 93.6%, and 94.8% (see the sketch below)
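    To make the group-level statistics above concrete, here is a small pandas sketch of computing per-group averages and the share of labels contributed by each group's top quartile of users; the column names and sample data are hypothetical:

```python
import pandas as pd

# Hypothetical contribution log: one row per user.
df = pd.DataFrame({
    "user_group": ["registered"] * 4 + ["anonymous"] * 4,
    "labels":     [300, 120, 60, 20, 90, 30, 10, 4],
    "missions":   [9, 6, 3, 1, 3, 2, 1, 1],
})

# Per-group averages (cf. the Table 2-style numbers above).
print(df.groupby("user_group")[["labels", "missions"]].mean())

# Share of a group's labels contributed by its top 25% of users.
def top_quartile_share(labels: pd.Series) -> float:
    s = labels.sort_values(ascending=False)
    k = max(1, int(round(len(s) * 0.25)))  # size of the top quartile
    return s.iloc[:k].sum() / s.sum()

print(df.groupby("user_group")["labels"].apply(top_quartile_share))
```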
Conclusion
  • Through a multi-methods approach, the results demonstrate the viability of virtually auditing urban accessibility at scale, highlight behavioral and labeling quality differences between user groups, and summarize how key stakeholders feel about Project Sidewalk and the crowdsourced data.
  • The authors' data validation study found that, on average, users could find 63% of accessibility issues at 71% precision.
  • This is comparable to early streetscape labeling work by Hara et al. [25], where turkers labeled at 67.0% recall and 55.6% precision; however, Project Sidewalk's tasks are more complex, contain more label types, and are evaluated at a larger scale.
  • The authors believe the findings represent a lower bound on performance and provide a useful baseline for future work (the sketch below shows the definitions behind these figures)
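    For reference, the recall and precision figures quoted above follow the standard definitions; a minimal worked example with made-up counts that happen to reproduce the reported averages:

```python
# Standard precision/recall over validated labels.
# true positives: user labels confirmed correct by researchers;
# false positives: user labels judged wrong;
# false negatives: real issues the user missed.
# The counts below are made up for illustration.

def precision(tp: int, fp: int) -> float:
    return tp / (tp + fp)

def recall(tp: int, fn: int) -> float:
    return tp / (tp + fn)

tp, fp, fn = 71, 29, 42
print(f"precision={precision(tp, fp):.2f}, recall={recall(tp, fn):.2f}")
# precision=0.71, recall=0.63 -- matching the reported averages
```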
Tables
  • Table 1: The total amount of data collected during our deployment. *Total clusters refers to filtered data only; all other columns use the full dataset
  • Table 2: Per-user behavioral data from our deployment, by user group. Averages are per user. Avg. speed is in mi/hr, time is in mins, lbls/100m is median labels per 100m, and ‘avg desc’ is the average number of open-ended descriptions
  • Table 3: Accuracy by label type. All pairwise comparisons are significant
Related work
  • We present background on sidewalk accessibility, survey existing methods for collecting street-level accessibility data, and review volunteer geographic information (VGI) systems.

    Street-Level Accessibility

    Accessible infrastructure has a significant impact on the independence and mobility of citizens [1, 40]. In the U.S., the Americans with Disabilities Act (ADA) [53] and its revision, the 2010 ADA Standards for Accessible Design [52], mandate that new construction and renovations meet modern accessibility guidelines. Despite these regulations, pedestrian infrastructure remains inaccessible [18, 28]. The problem is not just inaccessible public rights-of-way but a lack of reliable, comprehensive, and open information. Unlike road networks, there are no widely accepted standards governing sidewalk data (though some recent initiatives are emerging, such as OpenSidewalks.com [41]). While accessible infrastructure is intended to benefit broad user populations, from those with unique sensory or physical needs to people with situational impairments [58], our current focus is supporting those with ambulatory disabilities. In Project Sidewalk, we focus on five high-priority areas that impact MI pedestrians, drawn from ADA standards [51, 52, 55] and prior work [35, 37]: curb ramps, missing curb ramps, sidewalk obstacles, surface problems, and the lack of a sidewalk on a pedestrian pathway.

    1 http://projectsidewalk.io/api
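    The collected data are exposed through a public API (footnote 1 above). The sketch below shows how one might query it for accessibility attributes within a bounding box; the endpoint path, parameter names, and response fields are assumptions for illustration, not documented by the paper:

```python
import requests

# Hypothetical query against the Project Sidewalk API (see footnote 1).
# The endpoint path and bounding-box parameter names are assumed for
# illustration; consult http://projectsidewalk.io/api for the real spec.
BASE = "http://projectsidewalk.io/api"
params = {
    "lat1": 38.889, "lng1": -77.049,  # southwest corner (Washington DC)
    "lat2": 38.909, "lng2": -77.009,  # northeast corner
}
resp = requests.get(f"{BASE}/attributes", params=params, timeout=30)
resp.raise_for_status()

# Assume a GeoJSON-style response with one feature per labeled problem.
for feature in resp.json().get("features", []):
    props = feature["properties"]
    print(props.get("label_type"), props.get("severity"))
```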
Funding
  • This work was supported by an NSF grant (IIS-1302338), a Singapore MOE AcRF Tier 1 Grant, and a Sloan Research Fellowship
References
  • [1] Court of Appeals 3rd Circuit. 1993. Kinney v. Yerusalim, No. 93-1168. Technical Report. Retrieved January 7, 2019 from https://www.leagle.com/decision/199310769f3d10671900
  • [2] Ahmed Loai Ali, Nuttha Sirilertworakul, Alexander Zipf, and Amin Mobasheri. 2016. Guided Classification System for Conceptual Overlapping Classes in OpenStreetMap. ISPRS International Journal of Geo-Information 5, 6 (Jun 2016), 87. https://doi.org/10.3390/ijgi5060087
  • [3] V Antoniou and A Skopeliti. 2015. Measures and Indicators of VGI Quality: An Overview. ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences II-3/W5 (2015), 345–351. https://www.isprs-ann-photogramm-remote-sens-spatial-inf-sci.net/II-3-W5/345/2015/
  • [4] Michael D.M. Bader, Stephen J. Mooney, Yeon Jin Lee, Daniel Sheehan, Kathryn M. Neckerman, Andrew G. Rundle, and Julien O. Teitler. 2015. Development and deployment of the Computer Assisted Neighborhood Visual Assessment System (CANVAS) to measure health-related neighborhood conditions. Health & Place 31 (Jan 2015), 163–172. https://doi.org/10.1016/J.HEALTHPLACE.2014.10.012
  • [5] Hannah M Badland, Simon Opit, Karen Witten, Robin A Kearns, and Suzanne Mavoa. 2010. Can virtual streetscape audits reliably replace physical streetscape audits? Journal of Urban Health: Bulletin of the New York Academy of Medicine 87, 6 (Dec 2010), 1007–1016.
  • [6] Michael S Bernstein, Greg Little, Robert C Miller, Björn Hartmann, Mark S Ackerman, David R Karger, David Crowell, and Katrina Panovich. 2010. Soylent: A Word Processor with a Crowd Inside. In Proceedings of the 23rd Annual ACM Symposium on User Interface Software and Technology (UIST '10). https://doi.org/10.1145/1866029.1866078
  • [7] John R Bethlehem, Joreintje D Mackenbach, Maher Ben-Rebah, Sofie [...]. 2014. The SPOTLIGHT virtual audit tool: a valid and reliable tool to assess obesogenic characteristics of the built environment. International Journal of Health Geographics 13, 1 (Dec 2014), 52.
  • [8] Carlos Cardonha, Diego Gallo, Priscilla Avegliano, Ricardo Herrmann, Fernando Koch, and Sergio Borger. 2013. A Crowdsourcing Platform for the Construction of Accessibility Maps. In Proceedings of the 10th International Cross-Disciplinary Conference on Web Accessibility (W4A '13). ACM, New York, NY, USA, 26:1–26:4.
  • [9] Philippa Clarke, Jennifer Ailshire, Robert Melendez, Michael Bader, and Jeffrey Morenoff. 2010. Using Google Earth to conduct a neighborhood audit: Reliability of a virtual audit instrument. Health & Place 16, 6 (Nov 2010), 1224–1229.
  • [10] John W. Creswell. 2013. Qualitative Inquiry and Research Design: Choosing Among Five Approaches (third ed.). Sage Publications, Inc.
  • [11] Igor Gomes Cruz and Claudio Campelo. 2017. Improving Accessibility [...]. In [...], Bertolotto and Padraig Corcoran (Eds.). 208–226.
  • [12] DC.gov. [n. d.]. OpenData DC Quadrants. Retrieved January 7, 2019 from http://opendata.dc.gov/datasets/
  • [13] DC.gov. 2010. OpenData DC Sidewalk Ramp Dataset. Retrieved January 7, 2019 from http://opendata.dc.gov/datasets/sidewalk-ramps-2010
  • [14] DC.gov. 2016. OpenData DC Zoning Regulations of 2016. Retrieved January 7, 2019 from http://opendata.dc.gov/datasets/
  • [15] Chaohai Ding, Mike Wald, and Gary Wills. 2014. A Survey of Open Accessibility Data. In Proceedings of the 11th Web for All Conference (W4A '14). ACM, New York, NY, USA, 37:1–37:4.
  • [16] Steffen Fritz, Linda See, and Maria Brovelli. 2017. Motivating and sustaining participation in VGI. In Mapping and the Citizen Sensor.
  • [17] Google Inc. 2018. Google Street View Service. Retrieved January 7, 2019 from https://developers.google.com/maps/documentation/
  • [18] The Seattle Times. [n. d.]. Retrieved January 7, 2019 from https://www.seattletimes.com/seattle-news/transportation/
  • [19] Richard Guy and Khai Truong. 2012. CrossingGuard: exploring information content in navigation aids for visually impaired pedestrians. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '12). https://doi.org/10.1145/2207676.2207733
  • [20] Mordechai Haklay. 2010. How Good is Volunteered Geographical Information? A Comparative Study of OpenStreetMap and Ordnance Survey Datasets. Environment and Planning B: Planning and Design 37, 4 (Aug 2010), 682–703. https://doi.org/10.1068/b35097
  • [21] Muki Haklay. 2013. Citizen Science and Volunteered Geographic Information: Overview and Typology of Participation. In Crowdsourcing Geographic Knowledge. Springer Netherlands, Dordrecht, 105–122. https://doi.org/10.1007/978-94-007-4587-2_7
  • [22] Mordechai (Muki) Haklay, Sofia Basiouka, Vyron Antoniou, and Aamer Ather. 2010. How Many Volunteers Does it Take to Map an Area Well? The Validity of Linus' Law to Volunteered Geographic Information. The Cartographic Journal 47, 4 (Nov 2010), 315–322. https://doi.org/10.1179/000870410X12911304958827
  • [23] Kotaro Hara, Shiri Azenkot, Megan Campbell, Cynthia L Bennett, Vicki Le, Sean Pannella, Robert Moore, Kelly Minckler, Rochelle H Ng, and Jon E Froehlich. 2015. Improving Public Transit Accessibility for Blind Riders by Crowdsourcing Bus Stop Landmark Locations with Google Street View: An Extended Analysis. ACM Transactions on Accessible Computing (TACCESS) 6, 2 (Mar 2015), 5:1–5:23. https://doi.org/10.1145/2717513
  • [24] Kotaro Hara, Christine Chan, and Jon E Froehlich. 2016. The Design of Assistive Location-based Technologies for People with Ambulatory Disabilities: A Formative Study. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems. 1757–1768. https://doi.org/10.1145/2858036.2858315
  • [25] Kotaro Hara, Vicki Le, and Jon Froehlich. 2013. Combining crowdsourcing and Google Street View to identify street-level accessibility problems. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '13). ACM, 631–640. https://doi.org/10.1145/2470654.2470744
  • [26] Kotaro Hara, Jin Sun, Robert Moore, David Jacobs, and Jon Froehlich. 2014. Tohme: Detecting Curb Ramps in Google Street View Using Crowdsourcing, Computer Vision, and Machine Learning. In Proceedings of the 27th Annual ACM Symposium on User Interface Software and Technology (UIST '14). ACM, New York, NY, USA, 189–204. https://doi.org/10.1145/2642918.2647403
  • [27] Daniel J. Hruschka, Deborah Schwartz, Daphne Cobb St.John, Erin Picone-Decaro, Richard A. Jenkins, and James W. Carey. 2004. Reliability in Coding Open-Ended Data: Lessons Learned from HIV Behavioral Research. Field Methods 16, 3 (Aug 2004), 307–331. https://doi.org/10.1177/1525822X04266540
  • [28] Winnie Hu. 2017. For the Disabled, New York's Sidewalks Are an Obstacle Course. The New York Times. Retrieved January 7, 2019 from https://www.nytimes.com/2017/10/08/nyregion/new-york-city-sidewalks-disabled-curb-ramps.html
  • [29] Yusuke Iwasawa, Kouya Nagamine, Ikuko Eguchi Yairi, and Yutaka Matsuo. 2015. Toward an Automatic Road Accessibility Information Collecting and Sharing Based on Human Behavior Sensing Technologies of Wheelchair Users. Procedia Computer Science 63 (Jan 2015), 74–81. https://doi.org/10.1016/J.PROCS.2015.08.314
  • [30] Levente Juhász and Hartwig H. Hochmair. 2016. User Contribution Patterns and Completeness Evaluation of Mapillary, a Crowdsourced Street Level Photo Service. Transactions in GIS 20, 6 (Dec 2016), 925–947. https://doi.org/10.1111/tgis.12190
  • [31] Reuben Kirkham, Romeo Ebassa, Kyle Montague, Kellie Morrissey, Vasilis Vlachokyriakos, Sebastian Weise, and Patrick Olivier. 2017. WheelieMap: An Exploratory System for Qualitative Reports of Inaccessibility in the Built Environment. In Proceedings of the 19th International Conference on Human-Computer Interaction with Mobile Devices and Services (MobileHCI '17). ACM, New York, NY, USA, 38:1–38:12. https://doi.org/10.1145/3098279.3098527
  • [32] Anthony Li, Manaswi Saha, Anupam Gupta, and Jon E. Froehlich. 2018. Interactively Modeling and Visualizing Neighborhood Accessibility at Scale: An Initial Study of Washington DC. In Poster Proceedings of ASSETS '18.
  • [33] Afra Mashhadi, Giovanni Quattrone, and Licia Capra. 2013. Putting Ubiquitous Crowd-sourcing into Context. In Proceedings of the 2013 Conference on Computer Supported Cooperative Work (CSCW '13). ACM, New York, NY, USA, 611–622. https://doi.org/10.1145/2441776.2441845
  • [34] Afra Mashhadi, Giovanni Quattrone, Licia Capra, and Peter Mooney. 2012. On the Accuracy of Urban Crowd-sourcing for Maintaining Large-scale Geospatial Databases. In Proceedings of the Eighth Annual International Symposium on Wikis and Open Collaboration (WikiSym '12). ACM, New York, NY, USA, 15:1–15:10. https://doi.org/10.1145/2462932.2462952
  • [35] Hugh Matthews, Linda Beale, Phil Picton, and David Briggs. 2003. Modelling Access with GIS in Urban Systems (MAGUS): capturing the experiences of wheelchair users. Area 35, 1 (Mar 2003), 34–45. https://doi.org/10.1111/1475-4762.00108
  • [36] Andrew May, Christopher J. Parker, Neil Taylor, and Tracy Ross. 2014. Evaluating a concept design of a crowd-sourced ‘mashup’ providing ease-of-access information for people with limited mobility. Transportation Research Part C: Emerging Technologies 49 (Dec 2014), 103–113. https://doi.org/10.1016/J.TRC.2014.10.007
  • [37] Allan R Meyers, Jennifer J Anderson, Donald R Miller, Kathy Shipp, and Helen Hoenig. 2002. Barriers, facilitators, and access for wheelchair users: substantive and methodologic lessons from a pilot study of environmental effects. Social Science & Medicine 55, 8 (Oct 2002), 1435–1446. https://doi.org/10.1016/S0277-9536(01)00269-6
  • [38] Amin Mobasheri, Jonas Deister, and Holger Dieterich. 2017. Wheelmap: the wheelchair accessibility crowdsourcing platform. Open Geospatial Data, Software and Standards 2, 1 (Dec 2017), 27. https://doi.org/10.1186/s40965-017-0040-5
  • [39] Ladan Najafizadeh and Jon E. Froehlich. 2018. A Feasibility Study of Using Google Street View and Computer Vision to Track the Evolution of Urban Accessibility. In Poster Proceedings of ASSETS '18.
  • [40] Andrea Nuernberger. 2008. Presenting Accessibility to Mobility-Impaired Travelers. Ph.D. Dissertation. University of California, Santa Barbara.
  • [41] OpenSidewalks.com. [n. d.]. OpenSidewalks. Retrieved January 7, 2019 from https://www.opensidewalks.com/
  • [42] Katherine Panciera, Reid Priedhorsky, Thomas Erickson, and Loren Terveen. 2010. Lurking? cyclopaths?: a quantitative lifecycle analysis of user behavior in a geowiki. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '10). ACM, New York, NY, USA, 1917–1926. https://doi.org/10.1145/1753326.1753615
  • [43] Falko Weigert Petersen, Line Ebdrup Thomsen, Pejman Mirza-Babaei, and Anders Drachen. 2017. Evaluating the Onboarding Phase of Free-to-Play Mobile Games: A Mixed-Method Approach. In Proceedings of the Annual Symposium on Computer-Human Interaction in Play (CHI PLAY '17). ACM, New York, NY, USA, 377–388. https://doi.org/10.1145/3116595.3125499
  • [44] Catia Prandi, Paola Salomoni, and Silvia Mirri. 2014. mPASS: Integrating people sensing and crowdsourcing to map urban accessibility. In 2014 IEEE 11th Consumer Communications and Networking Conference (CCNC). IEEE, 591–595. https://doi.org/10.1109/CCNC.2014.6940491
  • [45] Giovanni Quattrone, Afra Mashhadi, Daniele Quercia, Chris Smith-Clarke, and Licia Capra. 2014. Modelling Growth of Urban Crowdsourced Information. In Proceedings of the 7th ACM International Conference on Web Search and Data Mining (WSDM '14). ACM, New York, NY, USA, 563–572. https://doi.org/10.1145/2556195.2556244
  • [46] Alexander J Quinn and Benjamin B Bederson. 2011. Human Computation: A Survey and Taxonomy of a Growing Field. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '11). ACM, New York, NY, USA, 1403–1412. https://doi.org/10.1145/1978942.1979148
  • [47] Andrew G. Rundle, Michael D.M. Bader, Catherine A. Richards, Kathryn M. Neckerman, and Julien O. Teitler. 2011. Using Google Street View to Audit Neighborhood Environments. American Journal of Preventive Medicine 40, 1 (Jan 2011), 94–100.
  • [48] Daniel Sinkonde, Leonard Mselle, Nima Shidende, Sara Comai, and Matteo Matteucci. 2018. Developing an Intelligent PostGIS [...]. Urban Science 2, 3 (Jun 2018), 52. https://doi.org/10.3390/urbansci2030052
  • [49] Sharon Spall. 1998. Peer Debriefing in Qualitative Research: Emerging Operational Models. Qualitative Inquiry 4, 2 (Jun 1998), 280–292. https://doi.org/10.1177/107780049800400208
  • [50] Jin Sun and David W. Jacobs. 2017. Seeing What is Not There: Learning Context to Determine Where Objects are Missing. In 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR). IEEE, 1234–1242. https://doi.org/10.1109/CVPR.2017.136
  • [51] United States Access Board. [n. d.]. Retrieved January 7, 2019 from https://www.
  • [52] United States Department of Justice. 2010. 2010 ADA Standards for Accessible Design. Retrieved January 7, 2019 from https://www.ada.
  • [53] United States Department of Justice Civil Rights Division. 1990. Americans with Disabilities Act of 1990, Pub. L. No. 101-336, 104 Stat.
  • [54] U.S. Census Bureau. [n. d.]. U.S. Census QuickFacts: District of Columbia. Retrieved January 7, 2019 from https://www.census.gov/
  • [55] U.S. Department of Transportation Federal Highway Administration. [n. d.]. Safety. Retrieved January 7, 2019 from https://safety.fhwa.dot.gov/
  • [56] Washington.org. [n. d.]. Washington DC Visitor Research. Retrieved January 7, 2019 from https://washington.org/press/dc-information/
  • [57] Jeffrey S. Wilson, Cheryl M. Kelly, Mario Schootman, Elizabeth A. [...]. 2012. Assessing the Built Environment Using Omnidirectional Imagery. American Journal of Preventive Medicine 42, 2 (Feb 2012), 193–199. https://doi.org/10.1016/J.AMEPRE.2011.09.029
  • [58] J.O. Wobbrock, S.K. Kane, K.Z. Gajos, S. Harada, and J. Froehlich. 2011. Ability-Based Design: Concept, Principles and Examples. ACM Transactions on Accessible Computing (TACCESS) 3, 3 (2011), 9.
Best Paper, CHI 2019