Platfora’s Design team uses a mix of data-gathering techniques to assess workflow and user experience issues. We then test multiple potential design solutions, working to identify the optimal approach for each scenario. Once we have found the optimal solution, we integrate it into the product, where it becomes available to all of our users.
Our latest refresh of the visual language – that is, the core visual design – for the Vizboard section of the Platfora application provides a good example of how this process works. You might expect UI (user interface) design to involve extensive A/B testing and a painstaking selection process from among “41 shades of blue.” But our approach was different, relying on a total of seven data-gathering techniques.
The Design team’s initial user experience research for the Vizboards included:
- A large UI benchmarking exercise covering 52 tasks performed by more than 20 users. This user data provided us with a baseline against which to assess proposed changes and also helped uncover pain points.
- A Think-Out Loud exercise on Platfora’s existing visual language with 14 users.
- Related desirability tests (to gather first impressions) on the existing visual language with 14 users.
Synthesizing this rich user feedback, which came from multiple sources and which included both qualitative and quantitative research findings, we crafted a visual strategy that resulted in three different visual languages. The next step was for us to evaluate these new visual languages, which we did via the following tools:
- Expert task modeling and simulation for UI controls that were undergoing functional change in addition to the visual language refresh.
- A Think-Out Loud exercise in which 14 users interacted with and evaluated the three languages.
- Related desirability tests in which 14 users provided first impressions of the three languages.
While we expected users’ first impressions of the three newly proposed visual languages to differ from those of the existing one, the magnitude of the change in perception was quite surprising! The desirability metrics showed that initial feedback for the existing visual language was 69% negative, while the same user group had an 81% positive first impression of one of the new visual languages. The Product team welcomed this major improvement in user response and was glad to incorporate these changes into the product.
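Desirability tests of this kind are typically scored by tallying the adjectives participants select from a reaction-card deck and reporting the share of positive versus negative picks. The sketch below illustrates that scoring approach only; the word lists and responses are invented, not our actual study data.

```python
# Illustrative scoring of a desirability test: each participant picks
# adjectives from a reaction-card deck; we report the percentage of all
# selections that were positive vs. negative. The card lists and sample
# responses below are hypothetical.

POSITIVE = {"clean", "intuitive", "modern", "trustworthy", "engaging"}
NEGATIVE = {"cluttered", "confusing", "dated", "frustrating", "busy"}

def desirability_scores(responses):
    """responses: list of per-user adjective selections.
    Returns (positive %, negative %) rounded to whole numbers."""
    picks = [word for user in responses for word in user]
    total = len(picks)
    pos = sum(1 for w in picks if w in POSITIVE)
    neg = sum(1 for w in picks if w in NEGATIVE)
    return round(100 * pos / total), round(100 * neg / total)

# Example: three (hypothetical) users reacting to a candidate design.
sample = [
    ["clean", "modern"],
    ["intuitive", "busy"],
    ["modern", "engaging", "confusing"],
]
pos_pct, neg_pct = desirability_scores(sample)
print(f"{pos_pct}% positive, {neg_pct}% negative")  # → 71% positive, 29% negative
```

Reporting a single positive/negative percentage per design makes it easy to compare candidate visual languages against the existing baseline.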
This visual refresh shipped as part of the 3.5 GA release. We can now track the impact of these changes via user telemetry, which shows us how the new UI is driving feature adoption and usage. The telemetry data shows an increase in usage of up to 63% for some controls, in particular several that were difficult to distinguish in the old design. Such an increase in feature usage is significantly higher than the conversion-rate uptick we would typically expect from, for example, tweaking the shade of a “Call-to-Action” button or other general UI elements.
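A per-control usage lift like the one above can be derived from telemetry event counts captured before and after a release. The control names and counts in this sketch are hypothetical, used only to show the percent-change calculation:

```python
# Hypothetical computation of per-control usage lift from telemetry
# event counts before and after a UI refresh. Control names and counts
# are invented for illustration.

def usage_lift(before: int, after: int) -> float:
    """Percent change in event count from the 'before' to 'after' period."""
    return 100.0 * (after - before) / before

# (before, after) weekly event counts per UI control -- made-up data.
weekly_events = {
    "filter_control": (1200, 1956),
    "color_picker": (800, 1100),
}

for control, (before, after) in weekly_events.items():
    print(f"{control}: {usage_lift(before, after):+.1f}%")
# → filter_control: +63.0%
# → color_picker: +37.5%
```

Comparing equal-length periods before and after release keeps the lift figure from being skewed by seasonal usage patterns.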
These findings confirm that the general user experience is greatly enhanced by the visual language refresh. By using a data-driven approach we are able to provide new levels of flexibility and consistency to help business analysts derive insights faster and share these same insights within their own data-driven organizations.