[Figure: project teaser image]

GesVenture: Hand Gestures for Rhythmic Music Games

After struggling with different aspects of my M2.1 project, I chose to work on a theme that I am quite familiar with: rhythmic games. I am a great fan of the genre and have played many rhythmic games with different interaction techniques. However, I found that hand gestures remain largely unexplored in rhythmic games, and that the advantages of the Meta-gloves fit this application domain nicely at the current stage of technological development. The project proposes interaction paradigms for hand-gestural interaction in rhythmic games, covering both the game itself and the pre-game setup tasks. The iteration-based process gradually refined the design concept, and the evaluation with target users showed that they had a pleasurable experience.

Background

[Figure: Maestro]

Music games have been applied to a variety of domains, such as rhythmic training and rehabilitation. Rhythmic music games specifically ask players to react to the rhythm of the game. Most current rhythmic games focus on the rhythm itself, while musical expression is largely ignored. Hand gestures, which have rarely been applied to rhythmic games, could serve this well because they carry semantic information.

Benchmarking

An existing study on rhythmic music games surveyed a number of titles on the market and juxtaposed their peripherals (input hardware), recorded responses, and forms of output. However, in the five years since that paper was published, many new games have appeared in this domain, employing diverse forms of input and output; for example, several VR music games have been released that are not covered by that review. Therefore, based on the existing study and my own playing experience, I made a benchmark comparing the current rhythmic music games along the three aforementioned dimensions.

[Figure: benchmark of existing rhythmic music games]

Design Process

The project follows a design-thinking approach: it starts by studying how target users perceive music, particularly in relation to hand movement, and then defines the commonalities in these perceptions. Based on the paradigm established through ideation, prototypes were developed with the relevant hardware and software. The prototypes were then used in user tests to study player experience, and the test results fed the ideation of new design points that started the next iteration.

Gesture Elicitation Study

Aim: to draw inspiration from the intuitive connections between hand gestures and different musical expressions, and to form a gesture vocabulary for rhythmic music games.

Participants: 12 music lovers/game lovers in 6 pairs

Process:

  • Let them play a song that they currently love.

  • Let them listen to music excerpts with different musical expressions and build clay models that could be interacted with through certain hand gestures. The gestures were recorded on video.

[Photos from the gesture elicitation sessions]

Results

Initial Paradigm

Game Mechanics

The player uses different hand gestures to eliminate the different notes that come towards them, and the notes appear in time with the rhythm. The paradigm is as follows (A stands for the Angular style, F for the Flowing style):

  • A1: making a fist and punching a cube

  • A2: forming a flat palm and chopping a piece of wood

  • A3: keeping a relaxed hand and tapping a button

  • F1: forming a flat palm and stabbing a balloon

  • F2: keeping a relaxed hand and touching a cloud gently

  • F3: holding an oncoming streamlet with the palm

Each type of note has a distinct visual effect that plays when the note is triggered successfully. A minimal sketch of this triggering logic is given below.
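
This is a rough Python sketch of how a note could be judged against a recognized gesture within a timing window. The actual prototype was built in Unity; the gesture labels, the Note structure, and the 0.15-second window are assumptions made purely for illustration.

```python
from dataclasses import dataclass

# Gesture paradigm from the list above; the string labels are made up for this sketch.
NOTE_GESTURES = {
    "A1": "fist_punch",
    "A2": "palm_chop",
    "A3": "relaxed_tap",
    "F1": "palm_stab",
    "F2": "relaxed_touch",
    "F3": "palm_hold",
}

@dataclass
class Note:
    note_type: str    # e.g. "A1"
    beat_time: float  # when the note reaches the player, in seconds

def judge_hit(note, gesture, gesture_time, window=0.15):
    """A note is eliminated when the matching gesture lands within the timing window."""
    right_gesture = NOTE_GESTURES.get(note.note_type) == gesture
    in_time = abs(gesture_time - note.beat_time) <= window
    return right_gesture and in_time

# Example: an A1 cube arrives on the beat at 2.0 s and is punched at 2.05 s.
print(judge_hit(Note("A1", 2.0), "fist_punch", 2.05))  # True
print(judge_hit(Note("F2", 2.0), "fist_punch", 2.05))  # False: wrong gesture for this note
```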

Evaluation

[Photo from the first evaluation session]

Aim

To get insights into how the design concept contributes to player experience in the context of rhythmic music games in VR.

Method

Observation: observing players' actions through Unity

Semi-structured interview: interview questions were adapted from the Player Experience Inventory (PXI). A total of five participants were recruited.

Feedback & Employment

  • Showing the history score

  • Changing the appearance of A3

  • Changing the appearance of F2

  • Involving line interaction

  • Turning F3 into F2 aligned along lines

Pre-game Tasks

Tasks: selecting the song, switching the level of difficulty, and starting the game

Interaction Flow:

  • Sliding on the panel on the level side to switch the song

  • Tapping the panel with the thumb to confirm

  • Sliding on the panel on the level side to switch the level of difficulty

  • Tapping the panel with the thumb to confirm

  • Punching fists to start
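
The interaction flow above can be read as a small state machine. Below is a minimal Python sketch of that reading; the state names, events, and actions are hypothetical and do not come from the actual Unity implementation.

```python
# A minimal state machine for the pre-game flow; states, events and actions are hypothetical.
TRANSITIONS = {
    ("song_select", "slide"):           ("song_select", "next_song"),
    ("song_select", "thumb_tap"):       ("difficulty_select", "confirm_song"),
    ("difficulty_select", "slide"):     ("difficulty_select", "next_level"),
    ("difficulty_select", "thumb_tap"): ("ready", "confirm_level"),
    ("ready", "fist_punch"):            ("playing", "start_game"),
}

def step(state, event):
    """Return the next state and the triggered action; unknown input keeps the current state."""
    return TRANSITIONS.get((state, event), (state, None))

state = "song_select"
for event in ["slide", "thumb_tap", "slide", "thumb_tap", "fist_punch"]:
    state, action = step(state, event)
    print(event, "->", state, action)
```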

Evaluation

Aim

The evaluation of the second iteration aims to study how players experience the optimized rhythmic game prototype, as well as how they experience the newly established paradigm for the pre-game tasks. Accordingly, the evaluation was carried out in two sessions for each participant: one on the pre-game tasks and one on the game itself.

Method

Similar to the previous test sessions, the test for the second iteration recruited five participants in total, all of them music lovers with more or less experience in VR. Two of them had played rhythmic music games before, and two had experience with VR/MR technologies. None of these five participants had taken part in the previous tests.

Feedback & Employment

Rhythmic game: the mechanics and the dynamics work well in the design concept, bringing a pleasurable experience and even leading players to attach greater significance to playing the game. The aesthetics have been adapted to make the interaction clear, responsive, and intuitive.

Pre-game Tasks:

  • Rotating on the side of the headset to rotate the wheel menu for song selection

  • Tapping on the side with the palm to confirm

  • Sliding on the side of the headset to switch the level of difficulty

  • Tapping on the side with the palm to confirm

  • Punching fists to start

Final Design

[Figure: interaction flow of the final design]

Reflections

Design Process:

In this project I again took an iteration-based design process, resembling those of my M1.1 and M2.1 projects. Compared to the previous ones, however, I found better ways to give significance to the different iterations and to align them. The exploratory elicitation study served as a good starting point, and the qualitative methods I used greatly helped me optimize the design concept.

Competence:

  • T&R: In this project I gained a deep command of the Meta-gloves from a technical perspective. I exploited the advantages of the Meta-gloves to develop a fully functioning game, and incorporated miscellaneous relevant tools, such as the SDKs, the Dashboards, and data transmission tools like OSC, which I learned in previous electives (a small OSC sketch follows this list).

  • U&S: In this project I used user study methods similar to those I had used before, which helped me iterate the design concept well. I adapted previously established methods: I took the gesture elicitation study from the HCI field and used it in an inspirational way, and I adapted the Player Experience Inventory (PXI) to serve as a qualitative method. One thing I wanted to do but have not done is a quantitative study of the users' performance. The prototype could support this well by measuring players' reaction times, which could serve multiple purposes, such as the selection of interaction paradigms (a reaction-time sketch follows this list).

  • M,D&C: Building on my learning around AI and machine learning, I applied my M,D&C skills in this project through gesture recognition. Unlike my M1.2 project, I used the data in a dynamic way: collecting instant data from the hands, using it for recognition on the spot, and acting on the recognition results just as instantly (a recognition sketch follows this list). As mentioned in the U&S part, I would like to further study the users quantitatively by analyzing their in-game performance, which is similar to my M1.2 project in some sense.
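
To illustrate the OSC part of the toolchain mentioned in T&R, here is a small sketch of the general send/receive pattern using the python-osc package. The OSC address, port number, and finger-flexion values are made up; this shows only the general pattern, not the project's actual setup.

```python
# Illustrative OSC round trip with the python-osc package.
from pythonosc.udp_client import SimpleUDPClient
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

def send_hand_frame():
    client = SimpleUDPClient("127.0.0.1", 9000)   # wherever the game listens (made-up port)
    flexion = [0.12, 0.80, 0.76, 0.74, 0.70]      # one value per finger, 0 = open, 1 = bent
    client.send_message("/glove/right/flexion", flexion)

def handle_flexion(address, *values):
    print(address, values)                        # here the game logic would consume the frame

def receive_forever():
    dispatcher = Dispatcher()
    dispatcher.map("/glove/*", handle_flexion)    # route all glove messages to one handler
    server = BlockingOSCUDPServer(("127.0.0.1", 9000), dispatcher)
    server.serve_forever()
```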
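
As a hint of how the quantitative performance study mentioned in U&S could look, the sketch below derives reaction times from hypothetical logged timestamps of when a note became hittable and when the matching gesture was recognized. The log and all numbers are invented for illustration.

```python
from statistics import mean

# Hypothetical log: (note type, time the note became hittable, time the gesture was recognized).
hit_log = [
    ("A1", 2.00, 2.06),
    ("F2", 3.50, 3.61),
    ("A3", 5.25, 5.33),
]

# Reaction time per hit: recognition time minus the moment the note became hittable.
reaction_times = [(note, recognized - hittable) for note, hittable, recognized in hit_log]

print(reaction_times)
print("mean:", round(mean(rt for _, rt in reaction_times), 3))  # could be compared across paradigms
```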
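
To sketch the dynamic use of hand data described in M,D&C, below is a minimal nearest-template classifier over made-up finger-flexion vectors. It is not the recognition approach used in the prototype, only an illustration of classifying each incoming frame on the fly.

```python
import math

# Illustrative gesture templates: mean finger-flexion vectors per gesture (values are made up).
TEMPLATES = {
    "fist_punch":  [0.9, 0.9, 0.9, 0.9, 0.9],
    "palm_chop":   [0.1, 0.1, 0.1, 0.1, 0.1],
    "relaxed_tap": [0.4, 0.5, 0.5, 0.5, 0.5],
}

def classify(frame, threshold=0.6):
    """Nearest-template matching on a single incoming frame of flexion data."""
    best, best_dist = None, float("inf")
    for name, template in TEMPLATES.items():
        dist = math.dist(frame, template)   # Euclidean distance (Python 3.8+)
        if dist < best_dist:
            best, best_dist = name, dist
    return best if best_dist <= threshold else None

# Each incoming frame is classified immediately, and the result can drive the game at once.
print(classify([0.85, 0.92, 0.88, 0.90, 0.87]))  # "fist_punch"
```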

PI&V:

This project is a satisfying ending to my study here as a designer with a technological perspective. I integrated many of the technological skills I learned and used during my master's study, and used them seamlessly. Incorporating the MDA framework throughout the whole project helped me understand the gamification thoroughly. The establishment of the interaction paradigm closely corresponds with my interests and aspirations.

Besides, using the Meta-gloves in depth also gave me new insights into the development of interaction technologies. I believe that in the future there will be no universal solution or interaction paradigm for every domain. Different interaction technologies, such as VR controllers, game controllers like the PS5's, the wearable Meta-gloves, and the burden-free Leap Motion, will each serve the domains that match their particular features and advantages. It is the designers' responsibility to discover these features and to propose pleasurable experiences to users with them.

Demonstration Video
