

A mixed reality application that lets applied earth science engineers annotate 3D geographic models using the Microsoft HoloLens


The goal of this project was to research and develop new ways for applied earth science engineers to collaborate by annotating holographic models. The project was built for the HoloLens and was designed to explore new ways for users to mark up and interact with 3D geographic data. Documenting learnings and takeaways was an essential part of the project: we compiled our research, designs, and discoveries into comprehensive documentation that would serve as a useful asset beyond the project’s scope.

My role

  • Qualitative Research

  • Contextual Inquiry

  • Facilitating Design Sprints

  • Interaction Design

  • Storyboarding

  • UX Design

  • UI Design

  • User Flows

  • Wireframes

  • Prototyping

  • Usability Tests

The team

  • Product Designer (me)

  • Product Owner

  • Visual Designer

  • UX Designer

  • 3 Engineers


Five months for research, design, and development of the MVP


By conducting one-on-one interviews with engineers, we better understood their needs and identified common pain points.

2D drawings are very limiting when it comes to visualizing models. However, once there is a common understanding of the model amongst participants, then sketching and visualizing ideas is very quick and efficient.

At the time, the Ada Platform used voice commands to navigate and access the software’s features. Engineers reported that voice commands were inefficient in a meeting environment with multiple people, as ambient noise and background conversation degraded usability. They also noted that talking to a computer in front of other people felt awkward.

Identifying the annotation types and symbols that engineers use most often was essential to create a set of tools they could use to mark up the holographic models.


Engineers use mental models when collaborating on ideas, but often there’s a gap between these models when communicating with partners/clients.

Engineers traditionally work in 2D, either with pen and paper or on a computer/tablet, but when discussing projects with colleagues and clients, the information is often challenging to communicate. For that reason, BGC Engineering developed its proprietary software, the Ada Platform, enabling engineers to visualize 3D geographic data as holographic models using the Microsoft HoloLens. What it lacked at the time was the ability for engineers to draw and visualize ideas as they would over a technical drawing.


How do we interact with 3D data and make it a tool for collaboration?

By bringing geographic data to mixed reality with a variety of annotation tools, engineers can effectively and efficiently collaborate in real-time 3D space. From marking points of interest and adding standard symbols to highlighting a path for a pipeline, users can instantly visualize models, have a shared understanding of the project, share ideas, and collaborate on plans. All of these features and annotation tools needed a home, so we designed a spatial menu. This menu could either be locked to the holographic model or linked to the user’s position to follow them as they move around in the physical space.

An example of feature documentation, consisting of a user story, user flow, wireframe, and storyboard. We found this approach very useful for communicating and visualizing interactions in a spatial computing environment.

Designing the menu and moving up in fidelity from paper prototype to digital prototype, and finally to a spatial prototype for the HoloLens.


After conducting usability tests with engineers, we gained valuable insights into how users navigate and use features. The overall results of these tests and the general feedback from engineers were overwhelmingly positive. However, I believe there is always room for improvement.


In retrospect, something I would do differently is the orientation of the menu. When the menu was linked to the user’s position, it followed them as they moved around the space. However, the orientation was relative to world space.

This behavior initially caused some confusion, especially when users circled a table to view the model from different perspectives, though they grasped the logic after a few minutes.
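The difference between the two behaviors can be sketched as simple 2D position-and-heading math. The sketch below is a hypothetical illustration of the concept, not the Ada Platform's actual implementation: in the first function the menu follows the user's position but keeps a fixed world-space orientation (the behavior that caused confusion), while in the second it also rotates with the user so it always faces them.

```python
import math

def follow_world_oriented(user_pos, user_yaw, offset):
    """Menu follows the user's position at a fixed world-space offset,
    but its orientation stays fixed in world space (yaw never changes)."""
    menu_pos = (user_pos[0] + offset[0], user_pos[1] + offset[1])
    menu_yaw = 0.0  # always faces the same world direction
    return menu_pos, menu_yaw

def follow_user_oriented(user_pos, user_yaw, distance):
    """Alternative: menu stays a fixed distance in front of the user
    and rotates with them, so it always faces them (billboarding)."""
    menu_pos = (user_pos[0] + distance * math.sin(user_yaw),
                user_pos[1] + distance * math.cos(user_yaw))
    menu_yaw = user_yaw + math.pi  # turned back toward the user
    return menu_pos, menu_yaw
```

With the first strategy, a user who walks around the table ends up viewing the menu from its side or back; the second keeps the menu readable from any viewpoint, at the cost of it moving whenever the user turns.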


This project was a real challenge. It refined my understanding of designing for a spatial computing environment within the technical constraints of the medium (mixed reality). I gained experience in facilitating innovative design processes that used lean experimentation and rapid iteration within a fast-paced agile environment.


The resulting product exceeded the client’s expectations and was later featured at SIGGRAPH 2019, one of the most highly respected conferences for new computer graphics technology and research.


© 2020 Adam Peregovits