Affordance++: Allowing Objects to Communicate Dynamic Use

CHI, pp. 2515-2524, 2015.

Keywords:
affordance, electrical muscle stimulation, input devices and strategies, interaction styles

Abstract:

We propose extending the affordance of objects by allowing them to communicate dynamic use, such as (1) motion (e.g., spray can shakes when touched), (2) multi-step processes (e.g., spray can sprays only after shaking), and (3) behaviors that change over time (e.g., empty spray can does not allow spraying anymore). Rather than enhancing objects…

Introduction
  • Affordance is a key concept in usability. When well-designed objects “suggest how to be used” [7], they avoid the necessity for training and enable walk-up use.
  • Well-designed objects, for example, use their visual and tactile cues to suggest the possible range of usages to the user [7].
  • Physical objects are limited in that they cannot communicate use that involves (1) motion, (2) multi-step processes, and (3) behaviors that change over time.
Highlights
  • Affordance is a key concept in usability
  • Physical objects are limited in that they cannot communicate use that involves (1) motion, (2) multi-step processes, and (3) behaviors that change over time
  • While animating objects allows implementing object behavior, we argue that affordance is about implementing user behavior
  • By storing information about objects’ states, affordance++ allows implementing not only motion, but multi-step processes and behaviors that change over time (see the sketch after this list)
  • We presented the concept of affordance++, i.e., an extension to the traditional notion of affordance that allows objects to communicate (1) motion, (2) multi-step processes, and (3) behaviors that change over time
  • The aforementioned observations reported in Task 1 suggest that, to some extent, users “believe” that a script with an intention is attached to the object
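The highlight about storing object state is the part of affordance++ that makes multi-step and time-varying use expressible: each object carries a small behavior script plus state. The following minimal Python sketch illustrates one way such a script could be structured for the spray-can example; the class names (`CanState`, `SprayCanScript`), the event hooks (`on_grasp`, `on_spray`), and the returned actuation strings are illustrative assumptions, not the authors' implementation.

```python
from enum import Enum, auto


class CanState(Enum):
    """Hypothetical states for the spray-can example from the paper."""
    UNSHAKEN = auto()   # touching it should trigger a shaking gesture first
    SHAKEN = auto()     # now spraying is allowed
    EMPTY = auto()      # behavior changed over time: no spraying anymore


class SprayCanScript:
    """Illustrative per-object behavior script (not the authors' code).

    affordance++ stores state with the object, which is what makes
    multi-step processes and time-varying behaviors expressible.
    """

    def __init__(self, fill_level: int = 3):
        self.state = CanState.UNSHAKEN
        self.fill_level = fill_level

    def on_grasp(self) -> str:
        # (1) motion: an unshaken can asks the user's arm to shake it
        if self.state is CanState.UNSHAKEN:
            self.state = CanState.SHAKEN
            return "actuate: shake gesture"
        if self.state is CanState.EMPTY:
            # (3) behavior that changes over time: refuse use once empty
            return "actuate: withdraw hand"
        return "allow: spraying"

    def on_spray(self) -> None:
        # (2) multi-step process: spraying consumes content and may
        # eventually flip the object into its EMPTY state
        if self.state is CanState.SHAKEN and self.fill_level > 0:
            self.fill_level -= 1
            if self.fill_level == 0:
                self.state = CanState.EMPTY


if __name__ == "__main__":
    can = SprayCanScript(fill_level=1)
    print(can.on_grasp())   # -> "actuate: shake gesture"
    can.on_spray()
    print(can.on_grasp())   # -> "actuate: withdraw hand" (can is now empty)
```

Keeping the state machine on the object side, rather than in the user-worn device, is what lets the same actuation hardware express different behaviors for different objects.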
Methods
  • The authors recruited 12 participants (2 female, 10 male) from their institution. All were right-handed.
Results
  • In 76% of all trials participants correctly named the behaviors the object had been designed to communicate.
  • In the remaining 24% of trials where participants did not name the exact behavior, they named some behaviors that were reasonably close, such as juggle instead of shake.
  • Overall, this result suggests that affordance++ allowed the blank cube to communicate identifiable object behaviors.
  • Six participants stated “it does not want to be touched,” one stated “I cannot touch it,” and one “I am not allowed to touch it.”
Conclusion
  • In this paper, the authors presented the concept of affordance++, i.e., an extension to the traditional notion of affordance that allows objects to communicate (1) motion, (2) multi-step processes, and (3) behaviors that change over time.
  • The authors observed that the agency (“who did what”) was redirected to the object: “it doesn’t want me to drink from it” (P10) is very different from “I should not drink from this cup”.
  • The latter would elicit a more user-centered understanding of affordance, rooted in “what can I do with this object”.
  • Comments such as “the tool told me to pull” (P11) suggest this is the case for affordance++, in which object-user dialogs do not happen on a verbal level but non-verbally, through the user’s body motion.
Related work
  • The proposed work builds on the theoretical foundations of affordance, animated objects, and haptic actuation of users.

    Affordance

    Affordance is a foundational concept in HCI, so much theoretical background can be invoked to explain it. Gibson was one of the earliest to seek a formal definition: “affordances (...) are what it offers, what it provides or furnishes” [7]. His definition roots affordance in the object’s ability to offer or provide use to the user. Digging deeper, his definition treats the visual channel as the main channel through which an affordance is perceived [7].

    Affordance, however, has been revisited and refined numerous times, most notably by Norman [4]: “affordances define what actions are possible” and “affordances are relationships between object and user”. Norman restricted affordance to the domain of physical objects: “affordances make sense for interacting with physical objects, but they are confusing when dealing with virtual ones”.
Contributions
  • Proposes extending the affordance of objects by allowing them to communicate dynamic use, such as motion, multi-step processes, and behaviors that change over time
  • Actuates users by controlling their arm poses using electrical muscle stimulation, i.e., users wear a device that drives their arm muscles through electrodes attached to the skin (see the sketch after this list)
  • Demonstrates the expressiveness of affordance++ using six unfamiliar household objects and tools as examples
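The second bullet above summarizes the actuation path: a worn stimulator contracts the user's arm muscles through surface electrodes so that the arm performs the gesture an object requests. The sketch below shows, in simplified form, how a requested gesture might be mapped to per-channel stimulation pulses; the channel assignments, pulse parameters, and the `EMSDriver` class are hypothetical placeholders, not the hardware interface used in the paper.

```python
from dataclasses import dataclass
from typing import Dict, List


@dataclass
class Pulse:
    """One stimulation burst on a single EMS channel (illustrative values)."""
    channel: int         # which electrode pair on the forearm/upper arm
    intensity_ma: float  # current in milliamperes, calibrated per user
    duration_ms: int     # how long the muscle is contracted


# Hypothetical mapping from gestures an object script can request to the
# pulse sequences that approximate them. Real calibration is per-user and
# per-electrode placement; these numbers are placeholders.
GESTURES: Dict[str, List[Pulse]] = {
    "shake": [Pulse(channel=0, intensity_ma=12.0, duration_ms=150),
              Pulse(channel=1, intensity_ma=12.0, duration_ms=150)] * 3,
    "withdraw": [Pulse(channel=2, intensity_ma=10.0, duration_ms=400)],
}


class EMSDriver:
    """Stand-in for the wearable stimulator; prints instead of stimulating."""

    def play(self, gesture: str) -> None:
        for pulse in GESTURES.get(gesture, []):
            print(f"channel {pulse.channel}: "
                  f"{pulse.intensity_ma} mA for {pulse.duration_ms} ms")


if __name__ == "__main__":
    # Alternating pulses on antagonistic channels would make the hand shake the can.
    EMSDriver().play("shake")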
References
  • Baker, N. Prism Nightlight. http://nicholasbaker.com/post/70963455726/prism-nightlight-thegoal-of-this, last accessed 9/9/2014.
  • Boxtel, V. Skin resistance during square-wave electrical pulses of 1 to 10 mA. Medical and Biological Engineering and Computing 15(6), 1977, 679-687.
  • Djajadiningrat, T., Matthews, B., and Stienstra, M. Easy doesn't do it: skill and expression in tangible aesthetics. Personal Ubiquitous Comput. 11(8), 2007, 657-676.
  • Norman, D. The Design of Everyday Things: Revised and Expanded Edition. Basic Books, 2013.
  • Follmer, S., Leithinger, D., Olwal, A., Hogge, A., and Ishii, H. inFORM: dynamic physical affordances and constraints through shape and object actuation. Proc. UIST '13, 417-426.
  • Forlizzi, J. How robotic products become social products: an ethnographic study of cleaning in the home. Proc. HRI '07, 129-136.
  • Gibson, J. The Ecological Approach to Visual Perception. Psychology Press, 1979, Chapter 8.
  • Grönvall, E., Kinch, S., Petersen, M., and Rasmussen, M. Causing commotion with a shape-changing bench: experiencing shape-changing interfaces in use. Proc. CHI '14, 2559-2568.
  • Ito, K., Takahiro S., and Toshiyuki K. Lower-limb joint torque and position controls by functional electrical stimulation (FES). Complex Medical Engineering. Springer Japan, 2007, 239-249.
  • Kim, D., Hilliges, O., Izadi, S., Butler, A., Chen, J., Oikonomidis, I., and Olivier, P. Digits: freehand 3D interactions anywhere using a wrist-worn gloveless sensor. Proc. UIST '12, 167-176.
  • Kruijff, E., Schmalstieg, D., and Beckhaus, S. Using neuromuscular electrical stimulation for pseudo-haptic feedback. Proc. VRST '06, 316-319.
  • Kwak, M., Hornbæk, K., Markopoulos, P., and Alonso, M. The design space of shape-changing interfaces: a repertory grid study. Proc. DIS '14, 181-190.
  • Latour, B. Where are the missing masses? The sociology of a few mundane artifacts. In Shaping Technology/Building Society, 1992, 225-258.
  • Lopes, P., and Baudisch, P. Muscle-propelled force feedback: bringing force feedback to mobile devices. Proc. CHI '13, 2577-2580.
  • Marshall, M., Carter, T., Alexander, J., and Subramanian, S. Ultra-tangibles: creating movable tangible objects on interactive tables. Proc. CHI '12, 2185-2188.
  • McGrenere, J., and Ho, W. Affordances: clarifying and evolving a concept. Proc. Graphics Interface '00, 179-186.
  • Murayama, J., Bouguila, L., Luo, Y., Akahane, K., Hasegawa, S., Hirsbrunner, B., and Sato, M. SPIDAR G&G: a two-handed haptic interface for bimanual VR interaction. Proc. EuroHaptics '04, 138-146.
  • Pangaro, G., Maynes-Aminzade, D., and Ishii, H. The actuated workbench: computer-controlled actuation in tabletop tangible interfaces. Proc. UIST '02, 181-190.
  • Pedersen, E., Subramanian, S., and Hornbæk, K. Is my phone alive? A large-scale study of shape change in handheld devices using videos. Proc. CHI '14, 2579-2588.
  • Pinhanez, C. S. The Everywhere Displays projector: a device to create ubiquitous graphical interfaces. Proc. UBICOMP '01, 315-331.
  • Rasmussen, M., Grönvall, E., Kinch, S., and Petersen, M. "It's alive, it's magic, it's in love with you": opportunities, challenges and open questions for actuated interfaces. Proc. OzCHI '13, 63-72.
  • Strojnik, P., Kralj, A., and Ursic, I. Programmed six-channel electrical stimulator for complex stimulation of leg muscles during walking. IEEE Trans. Biomed. Eng. 26, 112, 1979.
  • Tamaki, E., Miyaki, T., and Rekimoto, J. PossessedHand: techniques for controlling human hands using electrical muscles stimuli. Proc. CHI '11, 543-552.
  • Tocky. http://www.nandahome.com, last accessed 9/9/2014.
  • Togler, J., Hemmert, F., and Wettach, R. Living interfaces: the thrifty faucet. Proc. TEI '09, 43-44.
  • Tsetserukou, D., Sato, K., and Tachi, S. ExoInterfaces: novel exosceleton haptic interfaces for virtual reality, augmented sport and rehabilitation. Proc. AH '10, Article 1.
  • Verbeek, P. Materializing morality: design ethics and technological mediation. Science, Technology and Human Values 31(3), 2006, 361-380.
Best Paper of CHI 2015