Virtual Lab

A virtual lab simulation for students.

Grand Canyon Education • 2022

Project Overview
Role

Web Design

Team

1 design manager, 1 project analyst, 2 web designers, 2 developers, 2 videographers

Tools

Adobe XD, Adobe Illustrator, Adobe Photoshop, After Effects, Bodymovin, Lottie Animation

Challenge

The GCU Master of Forensic Science program is new and still growing. There have been requests to make it as hands-on as possible to stand out from similar programs on the market.

Solution

Build a new interactive simulation for one of the classes in the program. The simulation would give students an interactive way to learn and conduct presumptive drug testing.

Project Background

I was working as a web designer for a small department called Academic Web Services (AWS) under a much larger company, Grand Canyon Education (GCE). Much of our work was to support Grand Canyon University’s (GCU) course curriculum. 

About a year prior to this project kickoff, virtual lab research had been done to prepare for simulations like this, experimenting with tools like Lottie files to build out interactive components. Because of this, much of the mockup work and framework for the development phase was completed, which significantly reduced the scope of this project.

The Colorimetric Testing of Toxicants Virtual Lab was going to be our department’s first time taking this research and applying it to something real. The comps were handed off to me with the idea of creating this interactive lab for students in the new Master of Forensic Science program. With COVID-19 still on the rise, this would give students the ability to learn and conduct presumptive drug testing virtually and to complete an assignment that would be submitted and graded online.

As the primary web designer on the project, I was responsible for defining the overall experience — shaping how the lab would look and function, and translating complex interactions into something intuitive, while working within the available tools and resources. Features like Lottie animations introduced additional complexity, requiring ongoing collaboration with the developer to test what worked and refine the experience.

Success Criteria:
  1. Students are able to test multiple of the 10 different drugs against 10 different reagents

  2. Students are able to choose the drug to test and the reagent to use at random

  3. Students are able to see reactions between each drug and each reagent individually

  4. The activity provides "virtual lab" functionality that makes the simulation feel as real as possible to the student

  5. The activity by the student is tracked and provided at the end of the simulation as a PDF

  6. This virtual activity can be tied in with the standard operating procedures and a check for understanding

Chapter I

Research

Understanding the lab

Since I wasn’t a subject matter expert, I took time to fully understand the lab — what students were learning, the intended outcomes, and whether actions needed to follow a specific sequence. Accuracy was critical, as any inconsistencies in the design could lead to misinformation.

I reviewed the provided materials, familiarized myself with lab tools and substances, and made sure I could confidently communicate these details when presenting to the team and stakeholders.

A key challenge was translating a physical lab into a virtual, interactive experience that felt intuitive for students. This required research into both the lab process and interaction patterns like drag-and-drop.

The UX of interactive design

Interactive design is something users don’t think about much — we just know when it makes sense, and when it doesn’t. I had to think through things like:

  • How will the student know what to click, what to drag, and when?

  • Will there be limitations on where a user can drag an object? What does that look like?

  • How and when does an animation play out? Can we block surrounding elements from being interactive when that takes place?

  • What cues can we use to make sure the student is doing what we want them to do? How do we get students to perform these actions in their intended order?

To better understand these patterns, I looked at other interactive experiences — many of which are found in web-based games. This even brought me back to platforms like Webkinz, where simple drag-and-drop interactions felt intuitive, even as a kid. Revisiting these experiences helped me think through what makes interactions feel easy to follow.

My biggest takeaways from my research were:
  • The cursor alone can signal to the user what is going on; it should change on hover

  • Tap targets need to be large enough for users to easily select and drag objects

  • Animations, even when simple, reinforce that an action was completed successfully

  • Hover states help clarify which elements are interactive, reducing confusion and surprises

While research provided a foundation, much of this came down to testing and iteration. Prototyping played a key role in getting interactions as close as possible to the final experience — especially in simulating how animations should behave. This also served as a reference when creating animations that would later be converted into Lottie files.
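
To make the drag-and-drop constraints concrete, here is a minimal TypeScript sketch of the pattern I was designing around: a pointer-based drag with a restricted drop zone that snaps the object back when it is released somewhere invalid. The element IDs and snap-back behavior are illustrative assumptions, not the production code.

  // Minimal pointer-based drag with a constrained drop zone.
  // Assumes the draggable element is absolutely positioned within the scene;
  // the element IDs are hypothetical.
  const item = document.querySelector<HTMLElement>("#pipette")!;
  const dropZone = document.querySelector<HTMLElement>("#spot-plate")!;

  item.addEventListener("pointerdown", (down: PointerEvent) => {
    const startX = item.offsetLeft;
    const startY = item.offsetTop;
    // Capture the pointer so move/up events keep firing on the item.
    item.setPointerCapture(down.pointerId);

    const move = (e: PointerEvent) => {
      item.style.left = `${startX + e.clientX - down.clientX}px`;
      item.style.top = `${startY + e.clientY - down.clientY}px`;
    };

    const up = (e: PointerEvent) => {
      item.removeEventListener("pointermove", move);
      item.removeEventListener("pointerup", up);
      const zone = dropZone.getBoundingClientRect();
      const inside =
        e.clientX >= zone.left && e.clientX <= zone.right &&
        e.clientY >= zone.top && e.clientY <= zone.bottom;
      if (!inside) {
        // Constrain the interaction: snap back when dropped elsewhere.
        item.style.left = `${startX}px`;
        item.style.top = `${startY}px`;
      }
    };

    item.addEventListener("pointermove", move);
    item.addEventListener("pointerup", up);
  });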

Chapter II

Riff

Gathering the assets

A significant part of the early process involved sourcing the right assets for the lab experience. This required balancing accuracy with visual consistency — ensuring elements felt cohesive and didn’t look out of place within the scene. While perspective wasn’t perfectly uniform, I made sure tools aligned closely enough with the background to maintain a believable environment.

I also researched how different substances should appear and react, including expected color changes. Many of these materials were powder-based, which allowed me to reuse a single base asset and adjust color as needed, keeping the visuals consistent while improving efficiency.

Tools needed for this lab were:
  • Spot plate

  • 5 pipettes (one for each reagent)

  • Metal spatula

  • 5 reagent bottles

  • 10 unknown drug samples (in physical form)

Challenge: The Hand Interaction

The hand was part of the initial concept, used to visually guide interactions, but it quickly became clear it wasn’t working.

There were a few key issues:
  • Users may be left or right-handed, and forcing a single perspective felt unnatural and potentially distracting

  • Certain tasks required hand positions that didn’t make sense with the original asset

  • Adding more hand variations would require additional assets that weren’t readily available

I looped in other designers on the team to explore potential solutions, but finding or creating a consistent, usable set of hand assets proved challenging.

This led to a series of attempts:
  • Attempt 1: Manipulating the existing hand in Illustrator
    This wasn’t feasible due to the complexity of the graphic, including gradients and anatomy

  • Attempt 2: Sourcing alternative hand assets
    Finding a consistent set of hand positions from the same style proved difficult

  • Attempt 3: Creating custom photography
    I photographed real hands using gloves and edited them in Photoshop, but the results didn’t meet the visual standard needed

At that point, it was clear the hand just wasn’t the right solution.

💡 That’s when the idea came up: a custom cursor. This allowed the interaction to feel more natural and flexible, without forcing a specific hand orientation. It also aligned with patterns I had seen in other interactive experiences, where the cursor itself communicates interactivity.
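
As a rough illustration of the direction (cursor image paths, hotspot coordinates, and selectors here are hypothetical, not the shipped code), a state-based cursor swap can be wired up like this:

  // Swap the cursor image per interaction state, so no hand graphic
  // is needed in the scene itself. Asset paths are hypothetical.
  const lab = document.querySelector<HTMLElement>("#lab-scene");

  function setCursor(state: "default" | "grab" | "grabbing"): void {
    if (!lab) return;
    // Each value falls back to a built-in cursor if the image fails to load.
    const cursors: Record<string, string> = {
      default: "default",
      grab: "url('/cursors/open.png') 8 8, grab",
      grabbing: "url('/cursors/closed.png') 8 8, grabbing",
    };
    lab.style.cursor = cursors[state];
  }

  // Signal interactivity on hover and a "held" state while dragging.
  document.querySelectorAll<HTMLElement>(".draggable").forEach((el) => {
    el.addEventListener("mouseenter", () => setCursor("grab"));
    el.addEventListener("mouseleave", () => setCursor("default"));
    el.addEventListener("mousedown", () => setCursor("grabbing"));
    el.addEventListener("mouseup", () => setCursor("grab"));
  });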

The impact of hover states

While subtle, hover states were a key part of making this experience feel intuitive and interactive. For draggable objects, I introduced a bold white outline on hover. This provided immediate visual feedback — making it clear what had been selected and what action the user was about to take.

Similar to the Lottie animations, hover states were implemented by swapping between assets. To keep these transitions seamless, I ensured consistent sizing and positioning so the interaction felt like a single, continuous element rather than two separate images.
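
A minimal sketch of that swap, assuming the outlined version of each image is referenced through a data attribute (my own illustrative convention):

  // Hover feedback via asset swap. Both images share identical dimensions,
  // so the swap reads as one element gaining an outline, not two images.
  function addHoverOutline(img: HTMLImageElement): void {
    const baseSrc = img.src;
    const outlineSrc = img.dataset.outlineSrc; // e.g. data-outline-src="pipette-outline.png"
    if (!outlineSrc) return;

    // Preload the outlined version so the first hover doesn't flicker.
    new Image().src = outlineSrc;

    img.addEventListener("mouseenter", () => { img.src = outlineSrc; });
    img.addEventListener("mouseleave", () => { img.src = baseSrc; });
  }

  document
    .querySelectorAll<HTMLImageElement>("img.draggable")
    .forEach(addHoverOutline);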

Clearing the dish

One interaction that required additional thought was how users could reset the spot plate and start fresh.

My initial approach was to include a button beneath the plate, but this introduced visual clutter and didn’t align well with the perspective of the scene.

After collaborating and brainstorming with my manager, we landed on a more integrated solution: a “sink” option within the side navigation. From there, users could choose to clean the dish and reset the workspace.

This approach kept the main interface clean while still making the action easy to find and understand.

Chapter III

Refine

Hi-Fi Comps & Prototype

The final designs focused on a desktop experience, using Adobe XD’s Auto Animate feature to demonstrate how interactions and animations should behave.

Alongside the comps, I created two supporting files: one containing all UI assets (similar to a UI kit), and another outlining the different reactions within the lab. This helped clearly communicate how each interaction should function and what outcomes should be expected.

Keeping these files separate from the main designs made it easier for the developer to access what was needed during implementation.

Given the number of moving pieces in this project, organization was critical. Consistent naming conventions and structured asset management helped prevent confusion, reduce errors, and support a smoother handoff.

This prototype does not reflect the later update that replaced the hand with a custom cursor.

Lottie animations introduced a learning curve

Working with Lottie was new for both me and the developer.

After finalizing the animation concepts in the prototype, I moved on to building the actual Lottie files. Implementing them within the context of this lab required a lot of experimentation, with significant trial and error and frequent back-and-forth to ensure everything translated correctly into code.

Since animation isn’t my primary area of expertise, I focused on keeping interactions simple, clear, and effective.

Animations were created in Adobe After Effects and exported using the Bodymovin plugin.

A few key considerations during this process included:
  • File types and export settings:
    Assets needed to be set up correctly depending on whether they were raster or vector. This directly impacted export settings—otherwise, the JSON files would not render properly.

  • Reusable vector assets:
    The developer was able to dynamically adjust colors for vector-based elements in code. This allowed a single asset (such as a solution) to be reused across multiple scenarios, improving efficiency and reducing the need to recreate assets.

  • Animation-to-static transitions:
    Interactions were handled by playing a Lottie animation (JSON), then swapping to a static PNG. We ensured all assets maintained consistent dimensions so users wouldn’t notice the transition.
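
To make the animation-to-static handoff concrete, here is a sketch using lottie-web, the standard runtime for Bodymovin JSON exports. The function and file names are hypothetical; the key detail is that the PNG matches the animation's rendered dimensions exactly.

  import lottie from "lottie-web";

  // Play a Bodymovin-exported JSON animation once, then swap in a static
  // PNG of the final frame so the element can sit idle without a player.
  function playThenFreeze(
    container: HTMLElement,
    jsonPath: string,
    finalFramePng: string,
    onDone?: () => void
  ): void {
    const anim = lottie.loadAnimation({
      container,
      renderer: "svg",
      loop: false,
      autoplay: false,
      path: jsonPath,
    });

    anim.addEventListener("complete", () => {
      anim.destroy(); // remove the player's SVG from the DOM
      const still = document.createElement("img");
      still.src = finalFramePng; // same dimensions, so the swap is invisible
      container.appendChild(still);
      onDone?.();
    });

    anim.play();
  }

Recoloring vector elements, as the developer did for the reused solution asset, is typically handled by editing fill values in the animation JSON before handing it to the player, though I can't speak to the exact approach used in our build.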

Bottle Animation (animated with After Effects)

Dropper Animation (animated with After Effects)

Spatula Animation (animated with After Effects)

The feedback loop

Working closely with the developer required an ongoing feedback loop to ensure interactions were functioning as intended.

A few challenges came up during implementation. For example, certain areas of the screen needed to be temporarily non-interactive while animations were playing. In another case, I designed a hotspot interaction for the spatula element after it became clear the full component couldn’t be made clickable.
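
For illustration, both fixes can be approximated with small DOM techniques: a pointer-blocking overlay while an animation plays, and a transparent hotspot positioned over the clickable part of the spatula. The class names, coordinates, and wiring here are hypothetical, not the developer's actual implementation.

  // 1) Lock out surrounding elements during playback: an invisible overlay
  // that swallows pointer events. Assumes #lab-scene is position: relative.
  const overlay = document.createElement("div");
  overlay.style.cssText = "position:absolute; inset:0; z-index:999; display:none;";
  document.querySelector("#lab-scene")?.appendChild(overlay);

  function setSceneLocked(locked: boolean): void {
    overlay.style.display = locked ? "block" : "none";
  }

  // Usage with the playThenFreeze sketch above:
  // setSceneLocked(true);
  // playThenFreeze(well, "reaction.json", "reaction-end.png",
  //   () => setSceneLocked(false));

  // 2) The spatula hotspot: a transparent click target placed over just
  // the region of the graphic that should respond (coordinates made up).
  const hotspot = document.createElement("button");
  hotspot.setAttribute("aria-label", "Pick up spatula");
  hotspot.style.cssText =
    "position:absolute; left:120px; top:80px; width:48px; height:48px; " +
    "background:transparent; border:none; cursor:pointer;";
  document.querySelector("#lab-scene")?.appendChild(hotspot);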

Although it wasn’t formally part of the process, I also took time to review the build multiple times before final launch — providing feedback to ensure the experience aligned as closely as possible with the original designs, both visually and behaviorally.

Chapter IV

Reflect

Next Steps

The next step for this project would be evaluating how students interact with the lab — whether the experience is intuitive, and if it effectively supports learning outcomes.

Collecting usage data and feedback would help identify any areas of confusion, potential bugs, or opportunities for improvement.

If successful, this project could serve as a foundation for future interactive labs, providing a scalable approach that the Academic Web Services team could build on.

What I Learned
  1. Clear communication with developers goes a long way. Part of what made this project successful was the strong collaboration between me and the developer throughout implementation. Because we were both navigating new technical challenges, there was plenty of trial and error during development. Working through those challenges together helped us arrive at the strongest possible outcome.

  2. Design work doesn’t end when handoff begins. At GCE, traditional QA processes were in place, but designers weren’t typically involved once development was complete. Given the complexity of this project, I wanted to ensure the final experience aligned as closely as possible with the designs. I worked closely with the developer in a feedback loop, identifying small UI inconsistencies and interaction issues as the build progressed. Taking the time to QA the final product from a designer lens helped ensure the experience was the best that it could possibly be.

© 2026 Elizabeth Ardelt