The design for this week is to build and test a behavioral prototype for a gesture-controlled platform.
A gestural user interface for an Apple TV or similar system allows interaction through physical motions. An example prototype would control basic video functions (play, pause, stop, fast forward, rewind, etc.). The gestural UI can be built on a 2D (tablet touch) or a 3D (camera sensor, like Kinect) system.
Wizard of Oz System
Our Wizard of Oz system includes two Spotify accounts to control music streaming, each from a separate device. AirPods give participants audio feedback, and the TV serves as a visual display showing the effect of each gesture. The pink washi tape at the bottom of the AirPods represents the sensor we would develop further to capture the physical gestures.
During our first class, we brainstormed potential ideas together and decided to build a system that can control basic video functions. Initially, we thought it would be cool to include both voice and gesture control in our design. However, after some initial research and a conversation with Andy, we gave up on this ambitious idea due to the restricted time limit. In the end, we decided to build only a gestural system that allows physical interaction to control Spotify: play, pause, next song, previous song, etc. Once we had decided on our topic, we started to design the possible gesture motions. Since the gestures are designed for a music app, we wanted to make them more entertaining so that users would find them easy to remember and have fun while performing them. Listed below are the design requirements for the gestures:
- Allow users to wake the AirPods by shaking a hand near them
- Allow users to pause the song by making a high-five hand
- Allow users to resume the song by making a fist
- Allow users to turn the volume up or down by raising or lowering an arm
- Allow users to change songs by swiping in either direction
- Allow users to shuffle songs by making a little tornado motion
- Allow users to share songs by making a throwing motion
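The requirements above amount to a mapping from recognized gestures to playback commands, with the wake gesture acting as a gate on everything else. As a rough sketch only, here is how that mapping might look in code; the gesture names and the `PlaybackController` class are hypothetical stand-ins (in our prototype, a human Wizard performed these actions manually):

```python
# Hypothetical sketch of the gesture-to-command mapping described above.
# Gesture labels and this controller class are illustrative assumptions,
# not part of the actual prototype, which was operated by a human Wizard.

class PlaybackController:
    """Stands in for whatever service would actually drive Spotify playback."""

    def __init__(self):
        self.awake = False    # AirPods must be woken before other gestures work
        self.playing = True
        self.volume = 50      # percent

    def handle(self, gesture: str) -> str:
        if gesture == "shake":          # shake a hand near the AirPods to wake them
            self.awake = True
            return "wake"
        if not self.awake:              # every other gesture is ignored until woken
            return "ignored (asleep)"
        if gesture == "high_five":      # open palm pauses the song
            self.playing = False
            return "pause"
        if gesture == "fist":           # fist resumes the song
            self.playing = True
            return "resume"
        if gesture == "raise_arm":      # raise/lower arm adjusts volume
            self.volume = min(100, self.volume + 10)
            return "volume_up"
        if gesture == "lower_arm":
            self.volume = max(0, self.volume - 10)
            return "volume_down"
        if gesture == "swipe_left":     # swipe in either direction changes songs
            return "previous_song"
        if gesture == "swipe_right":
            return "next_song"
        if gesture == "tornado":        # little tornado motion shuffles
            return "shuffle"
        if gesture == "throw":          # throwing motion shares the song
            return "share"
        return "unknown"
```

This also makes the wake-gesture requirement concrete: since every command checks `self.awake` first, the user must shake near the AirPods before anything else registers, which is exactly the step our participants later flagged as redundant.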
As Facilitator, Brent helped the user understand the Wizard of Oz prototype system and communicated all of the tasks and scenarios.
As Wizard, Melissa “fooled” the user by manually performing the tasks, such as turning the volume up/down and pausing/resuming the song through her phone and computer. All of this was done behind the curtain.
As Scribe, Michelle took notes to record details and capture insights during the evaluation session.
As Documentarian, Augustina recorded video from different angles during the evaluation session and edited it afterward.
To make our participants more engaged, we designed a scenario for each task and asked our participants to perform the tasks. After each task, we asked the following follow-up questions:
- On a scale of 1–5, how comfortable are you with the gesture? (1 = least comfortable, 5 = most comfortable)
- What do you think of using gestures to control the AirPods?
- Do those gestures all make sense?
- Any suggestions to improve the gesture to make it more intuitive?
A more detailed Behavioral Prototype Plan can be found here.
What worked well:
- In addition to the AirPods, using the TV as a visual display worked well, allowing participants to see the feedback for each gesture.
- We liked showing the follow-up questions as captions at the bottom of the screen (from critique).
- The scenarios were clearly communicated and the tasks were simple enough to perform (from critique).
What needed improvement:
- Despite our participant suggesting that our system was intuitive and easy to use, there were still some flaws to improve upon. For starters, it was brought to our attention that having to wake up the AirPods before every gesture command became redundant and annoying.
- Secondly, we noticed the swiping gesture to skip to the previous or next song was initially confusing for participants; they weren’t sure which way to swipe.
- Although gesture control made the AirPods accessible when the phone was far away, participants pointed out that some of the gestures are quite large, so gesture control is not appropriate in every space. Hence, more inconspicuous gestures are needed.
Wizard of Oz prototyping was an extremely powerful way to test ideas that are not yet concrete or fully developed. This method of testing is a low-cost way to explore a potential product with users in a physical space. The most difficult part of this process was figuring out the logistics of incorporating a Wizard: we had to integrate many different systems into our design (AirPods, TV, Spotify, iPhone), and making sure all the functions aligned the way we wanted them to was extremely difficult. The most enjoyable aspect of this assignment was fooling our participants; it was funny to see their reactions after we showed them the backend of how our system worked. One thing we would change if we had the opportunity to redo this assignment would be to incorporate and test new features. We had an idea for a gesture that would allow users to share songs with their friends, but due to time constraints, we decided not to include it in our final design.
Overall, our design seemed to be quite effective. The user was able to complete every task with relatively little difficulty. Obviously, there was a small learning curve in the beginning, but the user quickly familiarized themselves with all the controls. We definitely had our participant convinced that “Gesturepods” was a real thing.