Gesture-based Interactions

A main focus of my research has been exploring the use of gestures in various types of interactions. Below are some of my current projects in this area.

Hand motion -- pointing, gesturing, grasping, shaking, tapping -- is a rich channel of communication. We point and gesture while we talk; we grasp tools to extend our capabilities; we grasp, rotate, and shake items to explore them. Yet, this rich repertoire of hand motion is largely ignored in interfaces to mobile computation: the user of a modern smartphone generally holds the device stationary while tapping or swiping its surface. Why are so many possible affordances ignored? Certainly not for technical reasons, as smartphones contain an evolving set of sensors for recognizing movement of the phone, including accelerometers, gyroscopes, and cameras. However, beyond rotating the device to change screen orientation or shaking it to shuffle songs, little has been done to enable rich gestural input through device motion. This research project includes my ongoing work in the design, recognition, and characterization of motion gestures to control modern smartphones.
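As a concrete illustration of how device motion can serve as input, a shake gesture can be recognized from the accelerometer stream by counting high-magnitude spikes. The following is a minimal sketch only; the threshold, peak count, and sample data are illustrative assumptions, not values drawn from this research.

```python
# Sketch: detecting a shake gesture from accelerometer samples.
# Threshold and peak-count values are illustrative assumptions.

def detect_shake(samples, threshold=2.5, min_peaks=3):
    """Return True if acceleration magnitude (in g) rises above
    `threshold` at least `min_peaks` separate times."""
    peaks = 0
    above = False  # tracks whether we are currently in a spike
    for (x, y, z) in samples:
        magnitude = (x * x + y * y + z * z) ** 0.5
        if magnitude > threshold and not above:
            peaks += 1       # count each crossing into a spike once
            above = True
        elif magnitude <= threshold:
            above = False
    return peaks >= min_peaks

# A vigorous back-and-forth motion produces repeated spikes;
# a device at rest reads roughly 1 g (gravity) throughout.
shake = [(0, 0, 1.0), (3.0, 0, 1.0), (0, 0, 1.0),
         (-3.0, 0, 1.0), (0, 0, 1.0), (3.0, 0, 1.0)]
still = [(0, 0, 1.0)] * 6

print(detect_shake(shake))  # True
print(detect_shake(still))  # False
```

In practice, gesture recognizers on smartphones work with continuous sensor streams and must handle noise and orientation, but the core idea of mapping motion signatures to commands is the same.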
Currently, my graduate students and I are applying a methodology similar to that of my motion gesture work to examine users' current mental models of gestures as an additional input modality for mobile interaction. Our initial study examined back-of-device gestures (i.e., gestures performed on the back of a smartphone) as a means of addressing the limited thumb reachability problem: the difficulty a user faces when trying to reach an area of the smartphone screen with their thumb. This problem is exacerbated by the current trend of increasing smartphone screen sizes. The findings of this study highlight common user motivations, such as wanting to prevent accidental input, and a reliance on mimicking current touchscreen gestures.
We explored how to design non-touchscreen gestures to extend the input space of smartwatches. As in our prior work on motion gestures, we demonstrated that a consensus exists among participants on the mapping of gesture to command and used this consensus to define a user-defined gesture set. We also defined a taxonomy describing the mapping and physical characteristics of the gestures, and provided insights to inform the design of non-touchscreen gestures for smartwatch interaction. This work is currently under review for CHI 2016.
The use of mid-air gestures has become extremely popular with the release of the Microsoft Kinect and Nintendo Wii. Projects in this category examine how to use mid-air gestures to communicate effectively with computers.

Exploring User-Defined Back-Of-Device Gestures for Mobile Devices
Shaikh Shawon Arefin Shimon, Sarah Morrison-Smith, Noah John, Ghazal Fahimi, and Jaime Ruiz. 2015. In Proceedings of the 17th International Conference on Human-Computer Interaction with Mobile Devices and Services (MobileHCI '15). ACM, New York, NY, USA, 227-232.

Exploring the Use of Gesture in Collaborative Tasks
Isaac Wang, Pradyumna Narayana, Dhruva Patil, Gururaj Mulay, Rahul Bangar, Bruce Draper, Ross Beveridge, and Jaime Ruiz. 2017. In Proceedings of the 2017 CHI Conference Extended Abstracts on Human Factors in Computing Systems (CHI EA '17). ACM, New York, NY, USA, 2990-2997. DOI: https://doi.org/10.1145/3027063.3053239