





Project Overview

Spring 2023


Team:
Jiaqi Wang, Eric Gan, Jeremia Lo, Adrian Ma, Audrey Renouf

My Role:
Design Lead, 3D Artist

Duration:
15 weeks

Tools: 
Unity, Blender, Figma, Leap Motion
Hands-on, lab-based science experiments are critical in STEM education, but they are often expensive to run and inaccessible to students with learning disabilities (LDs).

Our project aims to create an interactive virtual simulation of a biology lab experiment that is more affordable than a physical setup, with a focus on an accessible experience for students with LDs.

MotionLab walks students through an existing experiment developed by PBO that teaches how to use micropipettes in a biology lab context. Our final product is a Unity application, controlled through the Leap Motion sensor, that covers the main concepts of the micropipette experiment.




PROJECT GOAL



Design an accessible science experiment in mixed reality (XR) that facilitates transfer of learning to a physical lab





A Leap Motion sensor translates hand gestures into a 3D digital space.
A micropipette is an essential biology tool used for techniques like DNA extraction and pipetting cells.
How can we use the Leap Motion sensor to effectively teach students how to use a micropipette?







RESEARCH OVERVIEW


The focus of our initial research was to gain a comprehensive understanding of the affordances and limitations of existing science education tools and methods, as well as the potential improvements XR could bring to those tools. We chose the following methods to meet these goals:





Literature Review


Competitive Analysis


Expert Interviews


Contextual Inquiry




Contextual Inquiry

To understand how students experience learning biology through lab experiments, we observed 4 undergraduate biology students performing the PBO micropipette experiment with multisensory video instruction. This let us uncover low-level details of the experiment procedure and deepen our understanding of students' processes during science experiments.


Doing the Experiment

Observing Participants

Asking Follow-up Questions







RESEARCH INSIGHTS





Students switched attention back and forth between multiple areas during steps.


Most errors occurred from forgetting steps or improper use of the equipment.


Immediate feedback is crucial to error reduction and learning.



We identified the following design opportunities:




Use cues to direct focus and make sure only task-relevant items are present.

Reinforce gestural instructions and provide interactive instructions for proper use.

Provide immediate feedback when errors are made and options to rectify errors.






OUR SOLUTION






Our solution lets users interact with virtual lab tools using their own hands, without having to purchase the physical tools. Users with limited access to lab equipment can still take part in learning opportunities within lab environments.

Affordable


The Leap Motion sensor keeps our solution relatively affordable compared to micropipette kits, which are quite expensive, and to most other XR hardware, which costs far more.

Portable


Our solution is also highly portable, so it can easily be sent to schools. Because the tool can be shared, students and teachers can help each other while going through the experiment.

Accessible


Our solution also aims to be accessible to students with LDs. The XR experience allows for a degree of multisensory instruction as well as personalization, so students can receive instruction in the form they're most comfortable with.






Lo-Fi PROTOTYPE


We created a Figma Lo-Fi prototype to test our concept and receive feedback. You can interact with it!











Lo-Fi USER TESTING


We used the Wizard of Oz testing method to simulate an interactive virtual environment that reacts to participants’ hand motion. 





After the participants finished all the steps, they were given a real micropipette and ran through the procedure again, letting us evaluate how well their skills and knowledge transferred from the digital tool to the physical one.








Mid-Fi PROTOTYPE


To accomplish our goals for the mid-fi phase, we used Unity and the Leap Motion to prototype several key features of our digital experience. At this point these elements were not yet linked together into the full experiment, but they let us test the effectiveness and readability of specific design strategies on the Leap Motion before building the complete experience.





Grabbing Objects


Many steps in the experiment require users to pick up objects and move them around the table. By performing a grabbing gesture near a moveable object, users can lock an object to a set orientation in their hand, as sketched below.
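A minimal Unity C# sketch of how this grab-and-snap behavior could work; the `Grabbable` script, its fields, and the `IsGrabbing()` stand-in are illustrative assumptions, not the project's actual code:

```csharp
using UnityEngine;

// Illustrative sketch: a grabbable object that snaps to a fixed orientation
// in the hand while a grab gesture is held. `handAnchor` would follow the
// tracked palm; IsGrabbing() stands in for the grab-strength value reported
// by the hand-tracking plugin (hypothetical wiring).
public class Grabbable : MonoBehaviour
{
    public Transform handAnchor;          // follows the tracked palm
    public float grabRadius = 0.08f;      // metres: how close the hand must be
    public Vector3 heldLocalEuler;        // the "set orientation" while held

    bool held;

    void Update()
    {
        float distance = Vector3.Distance(transform.position, handAnchor.position);

        if (!held && IsGrabbing() && distance < grabRadius)
            held = true;                  // lock the object into the hand
        else if (held && !IsGrabbing())
            held = false;                 // release when the gesture ends

        if (held)
        {
            // Snap to the palm with a fixed, predictable orientation
            transform.position = handAnchor.position;
            transform.rotation = handAnchor.rotation * Quaternion.Euler(heldLocalEuler);
        }
    }

    bool IsGrabbing()
    {
        // Placeholder: in a real project this would read the tracking API
        // (e.g. a grab-strength value crossing a threshold).
        return Input.GetKey(KeyCode.G);   // stand-in for editor testing
    }
}
```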



Plunger Interactions


Simulating the usual thumb movement for pushing down a pipette plunger was not possible with the Leap Motion; instead, we let users push down the plunger with their non-dominant hand and provided feedback through a vertical bar on the left of the interface.
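One way this feedback bar could be driven, as a hedged sketch: the `PlungerFeedback` script, its thresholds, and the colour scheme are assumptions for illustration, with the push depth supplied by the non-dominant hand's tracked position.

```csharp
using UnityEngine;
using UnityEngine.UI;

// Illustrative sketch: maps how far the plunger has been pushed to a vertical
// fill bar, with colour changes at the first and second stops.
public class PlungerFeedback : MonoBehaviour
{
    public Image fillBar;            // vertical Filled-type UI bar on the left
    public float maxDepth = 0.03f;   // metres of plunger travel
    public float firstStop = 0.6f;   // normalised positions of the two stops
    public float secondStop = 1.0f;

    // Called with the current push depth from the hand-tracking input
    public void OnPlungerPushed(float pushDepth)
    {
        float t = Mathf.Clamp01(pushDepth / maxDepth);
        fillBar.fillAmount = t;      // bar height mirrors plunger depth

        // Colour shift exaggerates the feedback as each stop is reached
        if (t >= secondStop)      fillBar.color = Color.red;
        else if (t >= firstStop)  fillBar.color = Color.yellow;
        else                      fillBar.color = Color.white;
    }
}
```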



User Interface


Users can see labels and instructions for objects they may not be familiar with, and can reveal a menu by opening their non-dominant hand's palm to perform actions such as restarting the experiment or resetting the position of the micropipette.
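A sketch of how the palm-open menu trigger could be detected; the `PalmMenu` script, the dot-product threshold, and the `IsPalmOpen()` stand-in are illustrative assumptions rather than the project's implementation:

```csharp
using UnityEngine;

// Illustrative sketch: show the action menu while the non-dominant palm is
// open and facing the user.
public class PalmMenu : MonoBehaviour
{
    public Transform palm;       // non-dominant hand's tracked palm transform
    public GameObject menu;      // holds "restart experiment", "reset pipette", etc.
    public Camera headCamera;

    void Update()
    {
        // Palm normal pointing roughly toward the camera = palm opened to the user
        Vector3 toCamera = (headCamera.transform.position - palm.position).normalized;
        bool facingUser = Vector3.Dot(palm.up, toCamera) > 0.7f;

        menu.SetActive(facingUser && IsPalmOpen());
        if (menu.activeSelf)
            menu.transform.position = palm.position + palm.up * 0.1f; // hover above palm
    }

    bool IsPalmOpen()
    {
        // Placeholder for a real openness check from the tracking API
        // (e.g. all fingers extended).
        return true;
    }
}
```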


To evaluate our mid-fi prototype, we ran think-aloud usability tests alongside semi-structured interviews to assess the three core gesture interactions above.











High-Fi PROTOTYPE


Our final prototype combined the insights and development from the previous phases into a fully fleshed-out program that walks users through the lab experiment steps of drawing up fluid and dispensing it onto parafilm with a micropipette. The user follows instructions narrated in the virtual space, and because the prototype uses the Leap Motion, they can pick up and interact with objects in the 3D virtual space using their physical hands.

This prototype was developed in Unity, with the interaction scripts written in C#. We also worked with the Leap Motion development API, UltraLeap, to program the interactions that involved hand gestures.



Exaggerated Feedback


Our mid-fi tests showed the need for exaggerated feedback on physical interactions: signaling clearly when users can pick up an object helps them better perceive depth in the 3D space.
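A minimal sketch of one form this exaggerated feedback could take: a proximity highlight that tints any object while a hand is close enough to grab it. The `PickupHighlight` script, radius, and tint are assumptions for illustration.

```csharp
using UnityEngine;

// Illustrative sketch: highlight an object whenever a tracked hand is close
// enough to grab it, making the pickup affordance and depth easy to judge.
public class PickupHighlight : MonoBehaviour
{
    public Transform hand;                 // tracked hand position
    public float highlightRadius = 0.1f;   // metres
    public Color highlightColor = Color.cyan;

    Renderer rend;
    Color baseColor;

    void Start()
    {
        rend = GetComponent<Renderer>();
        baseColor = rend.material.color;
    }

    void Update()
    {
        bool inReach = Vector3.Distance(hand.position, transform.position) < highlightRadius;
        // Tint the object while it is grabbable; restore it otherwise
        rend.material.color = inReach ? highlightColor : baseColor;
    }
}
```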















Multisensory User Interface


The user interface and experiment instructions were delivered both as a text display and as audio, so the user could process instructions through whichever channel they prefer. In future iterations, more personalization of this feature, such as turning the narration off or adjusting its speed, would help users adapt the tool to their needs.
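A sketch of how text and narration could be delivered together, with a toggle gesturing at the personalization discussed above; the `InstructionPresenter` script and its fields are illustrative assumptions.

```csharp
using UnityEngine;
using UnityEngine.UI;

// Illustrative sketch: each instruction is shown as on-screen text and played
// as narration at the same time, so users can rely on either channel.
public class InstructionPresenter : MonoBehaviour
{
    public Text instructionLabel;         // on-screen text display
    public AudioSource narrator;          // plays recorded narration clips
    public bool narrationEnabled = true;  // future personalization hook

    public void Present(string text, AudioClip narration)
    {
        instructionLabel.text = text;
        if (narrationEnabled && narration != null)
        {
            narrator.Stop();              // cut off any previous instruction
            narrator.clip = narration;
            narrator.Play();
        }
    }
}
```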




Clear Substeps


The instructions were also split into clear substeps so the user doesn't have to process too much information at once, and the prototype does not move on to the next step until the user has finished their current task (see the sketch below). This matters because students can be confident they will be guided through every step of the experience, without fearing they have missed anything.
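One way this step gating could be structured, reusing the hypothetical `InstructionPresenter` from the sketch above; the `StepSequencer` script and its wiring are assumptions, with each interaction calling `CompleteCurrentStep()` when its success condition is met.

```csharp
using UnityEngine;

// Illustrative sketch: substeps are gated so the experiment never advances
// until the current task is complete.
public class StepSequencer : MonoBehaviour
{
    [TextArea] public string[] substeps;      // one short instruction each
    public AudioClip[] narrationClips;        // parallel to `substeps`
    public InstructionPresenter presenter;    // from the sketch above

    int current = -1;

    void Start() => Advance();                // present the first substep

    // Called by interaction scripts when the current task succeeds
    public void CompleteCurrentStep() => Advance();

    void Advance()
    {
        if (current + 1 >= substeps.Length) return;   // experiment finished
        current++;
        presenter.Present(substeps[current],
                          current < narrationClips.Length ? narrationClips[current] : null);
    }
}
```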




High-Fi USER TESTING

For hi-fi user testing, we made an effort to recruit students with LDs so we could assess our target audience's needs. To do this, we created a functional needs survey asking users whether they identified with needs commonly associated with LDs such as ADHD and dyslexia.

We had users walk through the entire experiment with the goal of assessing the extent of learning transfer, ease of use, and how well the experiment held users' attention.




Additionally, we were able to show our client our progress and get feedback from them before the final presentation. They validated the educational aspects of the prototype and gave us positive feedback:


“This is exactly what we’re looking for.”







THE RESULT




Users successfully transfer conceptual learning from the digital to the real micropipette.


A majority of our users were able to use the real micropipette correctly after going through the digital experiment. We focused on assessing concepts students typically struggle with when micropipetting, such as proper hold, selecting volume, and drawing up and dispensing liquid using the first and second stops.






   

Users utilize gestural learning to help them remember experiment steps.


We found that when users were asked to replicate the experiment with the real micropipette, gestures helped trigger their memory of what to do in each step. Users also moved their hands toward imaginary pieces of equipment when completing steps, despite having nothing in front of them while using the real pipette, which suggests gestures help users link each step with the equipment it requires.