
Product Strategy - Drone Companion App

Company
Transport Canada
Tools
Balsamiq Cloud, Sketch, InVision, Adobe Photoshop
Role
Product Designer, UX/UI
Date
September 2017

Problem statement: 

How might we enable recreational drone pilots to confidently conduct their first site survey?

What we started designing with, coming out of discovery.

Business goals:

  • Keep everybody safe by having users complete an adequate site survey
  • Have users understand the risks of drone flying that the site survey is meant to prevent
  • Normalize the site survey as part of a routine; the tool should encourage habit

User goals:

  • Pass inspection by conducting a proper site survey
  • Gain a basic understanding of complex aviation requirements (e.g. checking NOTAMs: what a NOTAM is and why it is important)
  • Find the site survey process manageable
  • Have a source they can trust, without needing to reference many others

Product goals:

  • Instruction - guide users through the process of conducting a site survey
  • Include the why - explain why each step is important (relating to user goals of avoiding fines and keeping drones/people safe)
  • Include success metrics - give users criteria for success. How will they know when they’ve done enough to pass inspection?
  • Be a trusted source - include liability information and awareness, and carry Transport Canada branding
  • Provide access offline - offline capability to serve the remote nature of drone flying

Desired product outcomes:

  • Motivation to follow regulations, through an understanding of why these aspects are important to safety and avoiding fines
  • Success in following regulations, through understanding instructions and receiving guidance

Design constraints:

  • Must be a digital tool
  • Must be scalable to accommodate other “doing” pieces of the regulations
  • Must allow users to conduct regulation activities to inspector standard, i.e. no simplifying processes to meet user needs. Work within the new regulations and their intentions for safety.


Initial concept work

We started with three possible product concepts to address the problem statement above. By looking at three different directions, we could get an early sense of technical feasibility and business appetite. Often in product design you pick a direction and refine the concept based on user and business input. However, we cycled through the concepts as we uncovered blockers that prevented the direction from being viable. Working with agile methodologies allowed us to make these pivots that would have otherwise caused our project to fail.


Concept #1: “Let’s do it”
Goal of solution: 

To facilitate conducting a site survey by pulling resources together and allowing users to input and organize their research work. Offline capability would enable users to have their site survey work offline, ready for inspection if needed.


Rationale:

By taking care of the leg-work for users, we could make the process easy, going beyond our goal of manageability. The goal of any product is to take on as much of the user’s work as possible. Starting by exploring the most facilitative option would allow us to sculpt a solution from the ideal state.


What we did:
  • Used low-fidelity prototyping to explore the technical constraints of the solution
  • Conducted further domain/organizational research into the ability to pull and work with NAV CANADA data; sought information from project stakeholders
  • Ran a diary study with two representative users completing a site survey for the first time under the new regulations

What we found:


1. We were unable to pull essential data from NAV Canada (NOTAMs and METARs) within the timeline of the project.

A key regulated step is checking “Notice to Airmen”, or NOTAMs. These are daily notices that inform pilots of anything that could impact their flight. An example would be a flight plan from a private citizen, or a new restricted airspace enacted for emergency responders in a natural disaster. 

NAV Canada is a private, non-government organization that manages all civil airspace in Canada. Selling airspace data is a key revenue generator, and the type of commercial licence we would need would cost tens of thousands of dollars.

Even if we justified the expense, the data is not user friendly. When we looked into how we could change that, we found there was no feasible way to “codify” the data to include only what is relevant and translate it for non-aviation folks.

Solving NOTAMs with our tool would be a massive undertaking, and one that seemed too risky to pull off in a 9-month project.

The same finding applies to METAR data, the standard weather data pilots are required to reference when assessing conditions. However, unlike NOTAM data, tools already exist to translate METARs into plain language.
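To make that concrete, here is a minimal sketch of what “translating METAR data into plain language” involves, limited to the wind group; the function, its name, and the sample report are illustrative and were never part of the product. Real decoders also handle visibility, cloud, temperature, and altimeter groups.

```typescript
// Hypothetical sketch of plain-language METAR translation, wind group only.
function decodeMetarWind(metar: string): string {
  // Wind group: 3-digit true direction (or "VRB" for variable),
  // 2-3 digit speed, optional gust ("G" + speed), all in knots ("KT").
  const match = metar.match(/\b(\d{3}|VRB)(\d{2,3})(?:G(\d{2,3}))?KT\b/);
  if (!match) return "wind group not found";
  const [, dir, speed, gust] = match;
  const direction = dir === "VRB" ? "variable direction" : `from ${parseInt(dir, 10)}°`;
  const gusting = gust ? `, gusting to ${parseInt(gust, 10)} knots` : "";
  return `Wind ${direction} at ${parseInt(speed, 10)} knots${gusting}`;
}

// An example raw report (values invented for illustration):
console.log(decodeMetarWind("METAR CYOW 241800Z 27010G18KT 15SM FEW040 22/10 A3001"));
// → "Wind from 270° at 10 knots, gusting to 18 knots"
```

Even this toy shows the problem: every group needs its own parsing rules and aviation vocabulary, which is why a full translation layer was out of scope.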

When asking whether it’s essential for users to check METAR data versus, say, the Weather Network, we ran into the friction between legal liability and practicality for users. We talked through the scenarios for using an alternative data source. If a user caused an accident due to weather, they could be held liable for not checking the source regulated in the Canadian Aviation Regulations: NAV Canada’s METAR data. Therefore, if a tool provided by Transport Canada used alternative data, the liability would fall to Transport Canada. There could be no “bending” the rules with stakes this high.


2. Google Earth is essential and a predominant tool in the process… can we handle it?

Through our prototyping with the team, we uncovered that the majority of the research and planning for a site survey needs to happen on a geographic information system, or GIS. The most popular free option is Google Earth. If we were to facilitate the site survey, we would have to replicate Google Earth for one third of the product.

The technical complexity meant it would likely take the entire project to build. Having a minimum viable product released by the end of the Fellowship was important to us, and a map that simulated Google Earth would not be a true minimum viable product. Without the whole site survey facilitated to some degree, it would not meet users’ core needs.

Even if we were able to complete an MVP, with that sheer complexity, would it be realistic for Transport Canada to continue building and maintaining it? Working alongside a digital service developer, we got a picture of the likelihood of it being maintained and developed, and the outcome didn’t look good. Capacity is limited, and many of the developers use a different programming language.


3. With other tools handling these key steps, are we duplicating effort?

  • We can’t change NOTAMs, or pull NOTAM or METAR data
  • Google Earth already exists, does a great job, and is free
  • The Where to Fly tool tells you airspace data: the area of the airspace, the class, the name. (It is also owned by another agency, the National Research Council.)

Three of the core pieces of a site survey already exist. So what stops users from using them successfully? With the regulations published and the scope of a site survey defined, we could return to user research to find out what really challenges our users when conducting site surveys.


Diary Study

We had a hard deadline to define our MVP by the end of the sprint, which meant any research at this point needed to be quick and efficient. I reached out to past user interview participants to see if they could take part in a diary study in the timeline we outlined; two said yes. Although we understood the sample size was small, we had a very specific focus. We decided to run the diary study with the two primary users, knowing that anything they did would give us more insight into how real drone operators would attempt this new process. User research is the art of good enough.


Key findings:
  • Users used the wrong tools. For example, they used Google Maps instead of Google Earth.
  • Users believed they were doing critical parts adequately when, in reality, they were incorrect.
  • There was a lot of uncertainty in the process.
  • Inspectors are looking at rationale, and that outweighs precision.
“I consider that the participant failed mostly because they decided to eyeball the general dimensions of the area of operation. Topographic maps, scale diagram and satellite imagery all have the particularity of having a scale, thus estimating the distances are fairly straight forward from that point. Also, I would have appreciated some reasonable thinking relevant to why he chose the present dimensions.”

- Transport Canada Inspector


Key learnings to apply: 
  • We need a “dynamic” approach to guidance: the baseline of basic knowledge will shift as more users become educated and certified, and some users need explicit direction while others need a translation of how something influences their operation
  • Success is not about precision, but about decision-making rationale. How can our guidance support clear decision-making?
  • Users desire a “template”, something to help them “check the boxes”. Video content was mentioned. We can use these both to support explicit instruction and to provide a reference for sound decision-making.
  • We can measure our tool’s effectiveness against baseline data of self-reported adequacy and inspector evaluation.
  • Given our initial baseline is without the basic exam, we should next include participants both with and without the basic exam to get an encompassing result. In future, we will “weed out” those without the basic exam.


With these findings we decided to look at how we can be the template these users desperately need. We asked ourselves, “How can we provide a template for a site survey that enables good decision-making?” 


Concept #2: “Checklist”
Goal of solution: 

A structured template that provides guidance for the specific tasks of a site survey. This would let users know exactly what to do and how best to do it. We framed it as a “companion app” that ties resources together and provides instructions and guidance on all aspects of the regulations. It would also work offline, so users have what they need in the field no matter where they are.


Key activities:
  • Created basic wireframes to make testing more accurate 
  • Prototyped in InVision 
  • Tested it with a full usability study
  • Looked at findings and tried to evaluate how best to push forward


Wireframing digitally gave me the ability to imagine possible interactions within a typical checklist format. I used standard Material Design components to keep the design at a medium fidelity, and used InVision to stitch screens together to illustrate functionality. We showed this to our stakeholders for initial feedback. The response was mostly positive, but there were concerns about the signals given by checking something off. The fear was that it could lead to false positives of compliance, and lawsuits.

The usability study design influenced which parts of the prototype to keep, and what to add to augment the experience of navigating to a different tool.

The usability study tested the tool’s ability to guide users to a correct decision about where it is legal to fly. Users were tasked to:

  1. Complete the section “Determine where to fly” on the checklist prototype,
  2. Determine where (if at all) it is legal to fly their drone.

The new Where to Fly tool had yet to be released, but we had early access to see what was to come. I pulled screenshots from the tool so that we could simulate the experience of using it without users having to use it themselves. We also created a document we could share if users wanted to use the Canadian Flight Supplement or Designated Airspace handbooks. Read all test design details here.


Top findings:
  1. Users wanted the app to hold their work.

The first checklist item in the tool was “Create a file where you will keep all the details of your research and surveying.” All users were thrown off in some way by this: they were confused, checked it off thinking they didn’t need to do it, or thought that checking the box enabled the functionality. When users reflected on their experience, they wanted the tool itself to be the file we were asking them to create.


2. Participants had a completion mindset, not a learning mindset. 


The order of information was very important to the success of the actions taken. Participants typically skimmed headers and first lines, and didn’t fully read until they were stuck. They all used tools in the order they appeared. In the initial prototype, the Canadian Flight Supplement was listed as the first resource; instead of reading both options, participants used the first one, even though it described the CFS as an advanced aviation document.


3.  Participants checked boxes even if they hadn't done what was explicitly being asked of them. 


They used their own judgement to determine completion, rather than following the app’s direction. The more experienced participant maintained his own process, using a tool called Airmap to determine airspace. There was a discrepancy in the map, and the participant missed identifying the caution airspace.


This made us weigh Transport Canada’s false-positive/liability concerns even more heavily. If users weren’t doing the things they were checking off, how could we argue that the tool showed they had tried?


Iteration:

Before entirely giving up on the checklist concept, we investigated how to better accommodate storing users’ work. If we could do that, and put more focus on the information architecture of the guidance, perhaps we could address all of the findings.


When we looked at the offline storage capability, we found a limitation: we were restricted to storing 50 MB of data offline. That meant that if you wanted to take your work out into the field, you would need reliable internet to maintain access to all of it.
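For illustration, the arithmetic behind that constraint can be sketched as follows. The 50 MB figure matches the limitation we hit, but the helper function and file sizes are hypothetical; in a browser, actual usage and quota would come from navigator.storage.estimate().

```typescript
// Illustrative only: a 50 MB offline budget, the limit we ran into.
const OFFLINE_QUOTA_BYTES = 50 * 1024 * 1024;

// Hypothetical helper: would a set of site-survey files (sizes in bytes)
// fit in the offline budget, given what is already stored?
function fitsOfflineQuota(fileSizesBytes: number[], usedBytes: number = 0): boolean {
  const total = fileSizesBytes.reduce((sum, size) => sum + size, 0);
  return usedBytes + total <= OFFLINE_QUOTA_BYTES;
}

// Two 20 MB satellite-imagery exports fit; a third pushes past 50 MB.
console.log(fitsOfflineQuota([20e6, 20e6]));       // true
console.log(fitsOfflineQuota([20e6, 20e6, 20e6])); // false
```

A single site survey with satellite imagery, diagrams, and reference documents can exceed the budget on its own, which is why the storage idea didn’t survive.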


We also didn’t think we could solve the mindset of users. We saw that the real need is education, yet a checklist prioritizes completion. Even with a redesign of the architecture, we would fundamentally miss the main need of users.


Outcome:

Given the solution is educational in nature, we decided to focus on an educational experience, and abandon the potential liability of a checklist.


Concept #3: “Guidance”
Goal of the solution: 

Focus the user-flow and information architecture around learning how to do a site survey. This way we could enable users to use the tools they wanted, and avoid missed expectations of tool functionality.


What we did:
  • Used BOPPPS model to inform architecture
  • Ideated on educational experiences as a team
  • Internal usability testing - “cafeteria test”


About BOPPPS model

The BOPPPS model is used to design lessons for experiential education. If you google the BOPPPS model, you’ll see almost every Canadian university lists it as a resource for its instructional staff. It stands for:


B - Bridge-in (Introducing the why behind the lesson.)

O - Objective (By the end of the lesson, the learner should have learned this.)

P - Pre-test (Assesses the participant’s current knowledge.)

P - Participatory learning (Activity-based, group-based are strong choices for learning mechanisms here.)

P - Post-test (Assesses how the learner has met the objective.)

S - Summary (Concludes learning experience.)


The BOPPPS model enables clear planning and measurement of active learning, with studies to support it. Its primary use is in teaching trades, as learning practical skills demands active learning. This is why “train the trainer” courses, like the Instructional Skills Workshop, preach this lesson-planning model.

Given our users need to learn practical skills in an engaging way, we approached our site survey lessons with the BOPPPS model. 

Study conducted on the effectiveness of BOPPPS model versus traditional approaches in general surgery. 


Internal usability - “cafeteria test”

We didn’t have much time to test our educational pivot, so we designed a way to test with people in the building. We walked around the cafeteria and provided people with a link to a prototype and a survey that asked:

  1. How would you describe the tool’s purpose?
  2. How would you rate the tool’s performance in achieving the purpose you’ve defined above?
  3. How would you rate the tool’s headings and labels as they influenced your understanding of what you were consuming or completing?
  4. Indicate your agreement to the following statement: The tool’s progress through the guidance material made learning feel manageable.

We received 18 responses.


Findings:

  • The majority of participants felt the tool's purpose was clear, and stated it as "educational"
  • The majority of participants felt the headings and labels were clear. Only 2 participants found the labels confusing. Only 3 participants made statements about the progress being unclear for them. 
  • The majority of participants either felt that it was manageable, or were unable to adequately assess based on the lack of content. Only 1 participant felt it was challenging to digest.
  • Participants desired an onboarding experience to set their expectations better


Iterations:
  • Redesigned the progress bar to cover everything in the task-completion flow
  • Took away the tasks page, and just navigated from the phases. This was a duplicate step that wasn’t needed.
  • Created onboarding introduction page with images
  • Explored different taxonomies (labels, etc.) as a team and made slight adjustments. For consistency amongst the team, we still refer to everything by its name in the database.


Visual Design

After iterating, we were able to add a visual layer to the product and update the UI to ensure accessibility. We took the Drone Safety branding and created a style tile. The branding was intended for print material, so I needed to adapt and select brand elements for accessibility on the web. I explored many versions before settling on a tertiary colour scheme, using the purple-blue at the top of the Drone Safety gradient as the primary colour. The combination of purple, lime green, and Nunito Sans gave an energetic, youthful vibe. Considering our tool could be used by youth as young as 16, we thought this style would help with feelings of manageability and motivation.

Style tile for digital.
Style guide for the Drone Companion App. Font used is Nunito Sans.


Working with government standards

We built a progressive web application, which is a native-seeming web application. So do we follow native application standards, or web application standards? And which Canada.ca standards must we follow? Navigating these three sets of standards was difficult. At first we explored bottom navigation to more closely match a native application experience, but once we looked at the Canada.ca standards, we realized we would need the standard header and footer. This pushed us to design more like a web application. Although we would love to further explore how to leverage the native experiences a progressive web application allows, for now the product goals are achieved by following the web standards applied to web applications.
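For context, the “native-seeming” behaviour of a progressive web application comes largely from its web app manifest. The fragment below is a hypothetical manifest for a tool like ours, not the app’s actual configuration, and every value is a placeholder; "display": "standalone" is what makes the installed app open without browser chrome, while the Canada.ca header and footer remain part of the page content itself.

```json
{
  "name": "Drone Companion App",
  "short_name": "Drone Companion",
  "start_url": "/",
  "display": "standalone",
  "background_color": "#ffffff",
  "theme_color": "#4b3f9e",
  "icons": [
    { "src": "icons/icon-192.png", "sizes": "192x192", "type": "image/png" }
  ]
}
```

This is part of what made the standards question tricky: the manifest pushes the experience toward native conventions, while the page inside it still answers to web and Canada.ca conventions.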

To read further into the visual design and interface process, visit this case study.

Build

We all worked hard to build a product in a few months. I contributed the styling of the components, our government partners did quality assurance, and our developers coded. It was complete just in time for the Code for Canada showcase!

Watch the presentation here!

Our project can be accessed on GitHub here.

