Drone Inspection Software UXR
In 2017 I was leading a strategic research team supporting both Intel’s Chief Strategy Officer and the company’s office of New Business Initiatives. For the latter, I supported each of the VP/GMs running the incubation businesses, with work ranging from market segmentation and product/market fit assessments to UX research. Below is an example of a typical consulting engagement.
01 Background and Goals
In 2016 Intel purchased Ascending Technologies, one of the top manufacturers of drones for commercial applications. In addition to providing drone show services for events like the Super Bowl and the Olympics, Intel became one of the top three manufacturers in the nascent commercial drone market. This was part of a larger strategy to expand into AI-enabled “edge devices,” which also included the acquisition of Mobileye (the world’s largest maker of components for driver assistance and self-driving cars) and components for robotics and smart cameras.
In this case the drone business was developing software for inspection applications such as oil rigs, refineries, bridges, and wind farms. I had already been heavily involved in their initial efforts to segment the commercial drone market in 2017. However, in 2018 the team brought me back in on this project only after development was well under way, so we scoped the study to address basic questions of market viability and the user journey of inspection software adoption on top of testing the service for usability. Given how far along the product was, negative results would have set back the launch considerably.
02 Methodology
We recruited 16 representative aerial inspection customers who had volunteered for our beta test. Testing followed our team’s standard UX assessment protocol, with the exception that the in-person UX evaluation was conducted at each respondent’s worksite.
Up-front survey capturing firmographics, category experience, and product expectations (90 min, phone)
Onboarding/data upload experience (60 min, phone)
Usage testing (3–4 hours, in person)
Final assessment survey (60 min, phone)
03 Insights
Prior expectations of the kind of “smart” software Intel might develop to facilitate analysis of drone inspection photos were sky-high: Intel was the only other large firm in the industry besides DJI and had a reputation for technology leadership, thanks both to its high-profile drone shows and its overall position in the technology market. Beta participants were therefore eager to see what advanced analytics, AI, and machine learning Intel would bring to automate the labor-intensive process of aerial-photography-based inspection.
The beta product we tested did not meet those expectations at all. Participants expected a far more analytics-rich feature set than what we tested. Moreover, the 3D models our service composed from inspection photos were of far worse quality than those of competitors, although rendering time was excellent. Instead, the team had focused most of its effort on improving team collaboration and workflow optimization, based on earlier user-journey findings. While these features were valued, the experience as a whole underwhelmed testers given their prior expectations and the need to meet certain thresholds of visual quality. In addition, alternative analytics engine plugins from two leading photo-analytics vendors were available, but they were so buried in the UI that no beta tester found them.
Beyond the mismatch between expectations and product scope, we identified a number of usability issues for the team to tackle in its next beta release.
04 Actionability
When we reported findings to the product team and business unit GM, they were largely unaware of the weight of expectations the industry had placed on their upcoming release: the assumption that an analytics product from Intel would be an AI-enhanced, fully featured “Photoshop” for inspection imagery, offering huge productivity gains through superior performance. What they had actually produced was a lighter-weight cloud application focused on enhanced collaboration features.
This set off a two-pronged response by the product team. Confident in the potential to enhance users’ productivity through better cloud-based collaboration tools, they doubled down on improving those features and developing onboarding tools that would better integrate remote collaboration into a company’s business processes. At the same time, they made the third-party photogrammetry and analytics engines more prominent in the UI to compensate for the perceived poor performance of the internally developed engine, and they changed their marketing plans to emphasize the productivity benefits of collaboration.
05 My Learnings
Brand expectations matter in shaping user perceptions of experience quality.
There is real value in embedding researchers in a product team from the start rather than bringing them in late in development.