Drone Inspection Software UXR
In 2012 I rejoined Intel’s central market research group, where another colleague and I were asked to build out a formalized UX Research function. Our first task was to develop a repeatable methodology for pre-launch user experience testing based on best practices. We used a request for competitive analysis of Google Cloud and Apple iCloud to test an overall framework for UX assessment, as well as to evaluate specific physical-response testing tools (e.g., eye tracking, microexpression analysis, emotional response sorts) that we might incorporate.
01 Background and Goals
In 2016 Intel purchased Ascending Technologies, one of the top manufacturers of drones for commercial applications. In addition to providing drone show services for events like the Super Bowl and the Olympics, Ascending Technologies was one of the top three manufacturers in the nascent commercial drone market. The acquisition was part of a larger strategy to expand into AI-enabled “edge devices” that also included the purchase of Mobileye (the world’s largest maker of components for driver assistance and self-driving cars) and businesses building components for robotics and smart cameras. The team I managed supported all these businesses with research spanning product/market fit, segmentation, and user experience.
In this case the drone business was developing software for inspection applications such as oil rigs, refineries, bridges, and wind farms. When the team brought me in, in 2018, development was already well underway, so we scoped the project to address basic questions of market viability and the user journey of inspection software adoption on top of testing the service for usability. Given how far along the product was, negative results would set the launch back considerably.
02 Methodology
We recruited 16 representative aerial inspection customers who had volunteered for our beta test. Testing followed our team’s standard UX assessment protocol, with the exception that the in-person UX evaluation was conducted at each respondent’s worksite:
Up-front survey capturing firmographics, category experience, and expectations of the product (90 min, phone)
Onboarding/data upload experience (60 min, phone)
Usage testing (3-4 hours, in person)
Final assessment (60 min, phone)
03 Insights
Prior expectations for the type of “smart” software Intel might develop to facilitate analysis of inspection photos taken by drone were sky-high: we were the only other large firm in the industry aside from DJI, and our high-profile drone shows had earned us a reputation for technology leadership. Beta participants were therefore eager to see what advanced analytics, AI, and machine learning Intel would bring to automate the labor-intensive process of aerial photography-based inspection.
The beta product we tested did not meet those expectations at all. Participants expected a far more analytics-rich feature set than what we tested. Moreover, the 3D models our service composited from inspection photos were of far worse quality than competitors’, although rendering time was excellent. The team had instead focused most of its effort on improving team collaboration and workflow optimization based on earlier user journey findings; while these features were valued, the experience as a whole underwhelmed testers given their prior expectations and the need to meet certain thresholds of visual quality. Alternative analytics engine plugins from two leading photo analytics vendors were available, but they were so buried in the UI that no beta tester successfully found them.
Aside from the dissonance of expectations regarding product scope, we identified a number of usability issues with the product for the team to tackle in its next beta release.
04 Actionability
When we reported findings to the product team and business unit GM, they were largely unaware of the weight of expectations the industry had placed on their upcoming release: that an analytics product from Intel would be an AI-enhanced, fully featured “Photoshop” offering huge productivity gains through superior performance. What they had produced instead was a lighter-weight cloud application focused on enhanced collaboration features.
This set off a two-pronged response from the product team. Confident in the potential to enhance users’ productivity through better cloud-based collaboration tools, they doubled down on improving these features and on developing onboarding tools that would better integrate remote collaboration into a company’s business processes. At the same time, they made the third-party photogrammetry and analytics engines more prominent to compensate for the perceived poor performance of the internally developed engine, and changed their marketing plans to emphasize the productivity benefits.
05 My Learnings
Brand expectations matter in shaping user perceptions of experience quality.
There is real value in embedding researchers in a product team.