Prior research has offered a plethora of wearables centred around sensing bodily actions, ranging from more explicit data, such as movement and physiological responses, to implicit information, such as ocular and brain activity. Bodily augmentations that physically extend the user’s body while altering body schema and image have also been proposed recently, motivated by factors such as accessibility and improved communication. However, these attempts have usually consisted of uncomfortable interfaces that either restrict the user’s movement or are intrusive in nature. In this work, we present Pneunocchio, a playful nose augmentation based on the lore of Pinocchio. Pneunocchio consists of a pneumatic inflatable that a user wears on their nose to play a game of two truths and a lie. With our work, we aim to explore expressive bodily augmentations that respond to a player’s physiological state and can alter the perception of their body while serving as an expressive match for that part of the body.
Fluito: Towards Understanding the Design of Playful Water Experiences through an Extended Reality Floatation Tank System
Maria F. Montoya, YuYang Ji, Ryan Wee, and 5 more authors
Proceedings of the ACM on Human-Computer Interaction, Oct 2023
Water’s pleasant nature and associated health benefits have captivated the interest of HCI researchers. Prior WaterHCI work mainly focused on advancing instrumental applications, such as improving swimming performance, and less on designing systems that support interacting with technology in water in more playful contexts. In this regard, we propose floatation tanks as research vehicles to investigate the design of playful interactive water experiences. Employing somaesthetic design, we developed a playful extended reality floatation tank experience: "Fluito". We conducted a 13-participant study to understand how specific design features amplified participants’ water experiences. We used a postphenomenological lens to articulate eight strategies useful for designers aiming to develop digital playful experiences in water, such as designing to call attention to the water and designing to encourage breathing and body awareness in water experiences. Ultimately, we hope that our work supports people in being playful and benefiting from the many advantages of being in water.
Auto-Paizo Games: Towards Understanding the Design of Games That Aim to Unify a Player’s Physical Body and the Virtual World
Rakesh Patibanda, Chris Hill, Aryan Saini, and 6 more authors
Proceedings of the ACM on Human-Computer Interaction, Oct 2023
Most digital bodily games focus on the body as they use movement as input. However, they also draw the player’s focus away from the body as the output occurs on visual displays, creating a divide between the physical body and the virtual world. We propose a novel approach – the "Body as a Play Material" – where a player uses their body as both input and output to unify the physical body and the virtual world. To showcase this approach, we designed three games where a player uses one of their hands (input) to play against the other hand (output) by loaning control over its movements to an Electrical Muscle Stimulation (EMS) system. We conducted a thematic analysis on the data obtained from a field study with 12 participants to articulate four player experience themes. We discuss our results, including how participants appreciated engaging with a variety of bodily movements for play and the ambiguity of using their body as a play material. Ultimately, our work aims to unify the physical body and the virtual world.
Fused Spectatorship: Designing Bodily Experiences Where Spectators Become Players
Rakesh Patibanda, Aryan Saini, Nathalie Overdevest, and 8 more authors
Proceedings of the ACM on Human-Computer Interaction, Oct 2023
Spectating digital games can be exciting. However, due to its vicarious nature, spectators often wish to engage in the gameplay beyond just watching and cheering. To blur the boundaries between spectators and players, we propose a novel approach called "Fused Spectatorship", where spectators watch their hands play games by loaning bodily control to a computational Electrical Muscle Stimulation (EMS) system. To showcase this concept, we designed three games where spectators loan control over both their hands to the EMS system and watch them play these competitive and collaborative games. A study with 12 participants suggested that they could not distinguish whether they were watching their hands play or playing the games themselves. We used our results to articulate four spectator experience themes and four fused spectator types, describe the behaviours these types elicited, and offer one design consideration to support each of these behaviours. We also discuss the ethical design considerations of our approach to help game designers create future fused spectatorship experiences.
Dancing Delicacies: Designing Computational Food for Dynamic Dining Trajectories
Jialin Deng, Humphrey Yang, Aryan Saini, and 4 more authors
In Proceedings of the 2023 ACM Designing Interactive Systems Conference, Jul 2023
Contemporary human-food interaction design is often a technology-driven endeavor in which food’s materiality has been largely underexplored. Building on the concept of “computational food”, this paper explores the design of food as a material realization of computation through a material-centered approach. We engaged with a “Research through Design” exploration by designing a computational food system called “Dancing Delicacies”, which enables food items to be “programmed” and “reconfigured” within dynamic trajectories. Our practice led to a design framework resulting in four original dish designs. Our dishes aim to illustrate the richness of this new design space for computational food. Furthermore, through engaging with expert practitioners from the hospitality industry, we provide a first account of understanding the design of computational food for dynamic dining trajectories and its speculative use contexts in the industry. With this work, we hope to inspire researchers and designers to envision a new future of human-food interaction.
Towards Designing for Everyday Embodied Remembering: Findings from a Diary Study
Nathalie Overdevest, Rakesh Patibanda, Aryan Saini, and 2 more authors
In Proceedings of the 2023 ACM Designing Interactive Systems Conference, Jul 2023
Our bodies play an important part in our remembering practices, for example when we can remember passwords by typing them, even if we cannot verbalise them. An increasing number of technologies are being developed to support remembering. However, so far they seem not to have taken the opportunity to support remembering through bodily movements. To better understand how to design for such embodied remembering, we conducted a diary study with 12 participants who recorded their embodied remembering experiences in everyday life over a three-week period. Our thematic analysis of the diaries and interviews led to the creation of a framework that helps understand embodied remembering experiences (ERXs) based on the level of skilled and conscious movements used. We describe how this ERX framework could help with the design of technologies to support embodied remembering.
DUMask: A Discrete and Unobtrusive Mask-Based Interface for Facial Gestures
Arpit Bhatia, Aryan Saini, Isha Kalra, and 2 more authors
In Proceedings of the Augmented Humans International Conference 2023, Mar 2023
Interactions using the face not only enable multi-tasking but also allow us to create hands-free applications. Previous works in HCI used sensors attached directly to the person’s face or placed inside their mouth. However, the face mask, which has now become a norm in everyday life and is socially acceptable, has rarely been used to explore facial interactions. We designed "DUMask", an interface that uses the face parts covered by a mask to discretely enable 14 (+1 default) interactions. DUMask uses an infrared camera embedded inside an off-the-shelf face mask to recognize the gestures, and we demonstrate the effectiveness of our interface through in-lab studies. We conducted two user studies evaluating the experience of both the wearer and the onlooker, which validated that the interface is indeed inconspicuous and unobtrusive.
2022
TouchMate: Understanding the Design of Body Actuating Games using Physical Touch
Shreyas Nisal, Rakesh Patibanda, Aryan Saini, and 2 more authors
In Extended Abstracts of the 2022 Annual Symposium on Computer-Human Interaction in Play, Nov 2022
Body-actuating technologies such as Electrical Muscle Stimulation (EMS) can actuate multiple players simultaneously via physical touch. To investigate this opportunity, we designed a game called "TouchMate". Here, one guesser and two suspects sit across from one another with their legs hidden under a table. The guesser attaches a ground electrode from one EMS channel, and each suspect attaches one active electrode from the same channel on their forearms. When a suspect touches the guesser’s leg, their bodies complete the electrical circuit, actuating both their hands involuntarily via the EMS. The guesser’s goal is to determine who touched their leg. In this paper, we present the results from our initial study and articulate three player experience themes. Ultimately, we hope our work inspires game designers to create physical touch games using body-actuating technologies.
SomaFlatables: Supporting Embodied Cognition through Pneumatic Bladders
Aryan Saini, Haotian Huang, Rakesh Patibanda, and 3 more authors
In Adjunct Proceedings of the 35th Annual ACM Symposium on User Interface Software and Technology, Oct 2022
Applying the theory of Embodied Cognition through design allows us to create computational interactions that engage our bodies by modifying our body schema. However, in HCI, most of these interactive experiences have centred on creating sensing-based systems that leverage our body’s position and movement to offer an experience, such as games using the Nintendo Wii and Xbox Kinect. In this work, we created two pneumatic inflatable-based prototypes that actuate our body to support embodied cognition in two scenarios by altering the user’s body schema. We call these "SomaFlatables" and demonstrate the design and implementation of these inflatable-based prototypes that can move and even extend our bodies, allowing for novel bodily experiences. Furthermore, we discuss future work and the limitations of the current implementation.
Motor movements are performed while playing hand-games such as Rock-paper-scissors or Thumb-war. These games are believed to benefit both physical and mental health and are considered cultural assets. Electrical Muscle Stimulation (EMS) is a technology that can actuate muscles, triggering motor movements, and hence offers an opportunity for novel play experiences based on these traditional hand-games. However, there is only limited understanding of the design of EMS games. We present the design and evaluation of two games inspired by traditional hand-games, "Slap-me-if-you-can" and "3-4-5", which incorporate EMS and can be played alone, unlike traditional games. A thematic analysis of the data collected revealed three themes: 1) Gameplay experiences and influence of EMS hardware, 2) Interaction with EMS and the calibration process, and 3) Shared control and its effect on playing EMS games. We hope that an enhanced understanding of the potential of EMS to support hand-games can aid the advancement of movement-based games as a whole.
2019
Aesop: Authoring Engaging Digital Storytelling Experiences
Aryan Saini, Kartik Mathur, Abhinav Thukral, and 2 more authors
In Adjunct Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology, Oct 2019
Traditional storytelling experiences are often one-dimensional, wherein they contain only a single channel of communication with the audience through narration. With advancements in technology, storytelling experiences have been augmented with the help of digital media to be more engaging and immersive. Authoring these scenarios, however, is complicated as it requires technical knowledge to interface the means of engagement. In this work, we present Aesop, a system that assists the narrator in authoring engaging storytelling experiences. Aesop provides a block-based interface like Scratch and manifests words of a story, Cues, and Visualizations (Outputs) as blocks that enable the user to create captivating stories. Our system also leverages physical actions performed by the user as Cues. These cues can trigger visualizations such as robot actions, animations, and environment simulations using sound and lighting effects.
Gehna: Exploring the Design Space of Jewelry as an Input Modality
Jatin Arora, Kartik Mathur, Aryan Saini, and 1 more author
In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, May 2019
Jewelry weaves into our everyday lives as no other wearable does. It comes in many wearable forms, is fashionable, and can adorn any part of the body. In this paper, through an exploratory, Research through Design (RtD) process, we tap into this vast potential space of input interaction that jewelry can enable. We do so by first identifying a small set of fundamental structural elements — called Jewelements — that any jewelry is composed of, and then defining their properties that enable interaction. We leverage this synthesis along with observational data and literature to formulate a design space of jewelry-enabled input techniques. This work encapsulates both extensions of common existing input methods (e.g., touch) and new ones inspired by jewelry. Furthermore, we discuss our prototypical sensor-based implementations. Through this work, we invite the community to engage in the conversation on how jewelry as a material can help shape wearable-based input.
VirtualBricks: Exploring a Scalable, Modular Toolkit for Enabling Physical Manipulation in VR
Jatin Arora, Aryan Saini, Nirmita Mehra, and 3 more authors
In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, May 2019
Virtual Reality (VR) experiences are often limited by the design of standard controllers. This work aims to liberate the VR developer from these limitations in the physical realm to provide an expressive match to the limitless possibilities of the virtual realm. VirtualBricks is a LEGO-based toolkit that enables the construction of a variety of physical-manipulation-enabled controllers for VR by offering a set of feature bricks that emulate as well as extend the capabilities of default controllers. Based on the LEGO platform, the toolkit provides a modular, scalable solution for enabling passive haptics in VR. We demonstrate the versatility of our designs through a rich set of applications, including re-implementations of artifacts from recent research. We share a VR integration package for the Unity VR IDE and the CAD models for the feature bricks, for easy deployment of VirtualBricks within the community.