
Surf Analytics: A marketing data dashboard

Project

Surf Analytics

My role

Lead UX Designer

Timeframe

November 19, 2020 → March 20, 2021

Key metrics

• ~85% MoM customer retention rate (stabilized)
• +20% increase in desired behaviour

The problem

Users were cancelling their Surf subscriptions due to in-product friction.

The solution

We conducted user research sessions with enterprise clients to identify their friction-points. Analyzing that data, we developed solutions for usability, content, and data oversight.

Collaborators

• Account Management
• Front-end Engineering & CTO

When I joined the team at Trufan, now known as Surf Analytics, brands used our web app to analyze and segment Twitter and Instagram audience data.

Surf Analytics has changed shape since, but at the time the marketing value it provided had users overlooking, at least initially, a clunky interface and experience. We suspected this was changing after noticing a small but steady decline in our admittedly very high 90% MoM retention rate. Through a bootstrapped approach to user research, we quickly discovered that current users were considering leaving due to product frustrations. We followed up by designing solutions for usability, content, and data oversight to keep users satisfied and subscribed long term.

My challenges and successes with high-stakes user interviews

1) Getting support across departments

As the first and only UXer at Surf at the time, I knew we needed to understand our current users, the bulk of whom were enterprise-level clients. These folks were influencers, data analysts, marketing managers, and agents juggling campaigns for Fortune 500 companies. After I worked out a research plan and had it approved, I needed support recruiting these users. As professionals working in high-pressure jobs, I knew they would be busy. Not only that, but I wouldn’t have the freedom to reach out to them myself. Account managers (AMs) were their internal gatekeepers, and for good reason. Relationships with these clients were coveted, and their contracts represented thousands in monthly revenue.

To get AMs invested and interested, I chose to invite them into the research process, asking if they’d like to participate in proposed sessions as observers and/or note-takers. This way, when I made the case for the project’s potential learnings, the case was our case, and the potential learnings were our potential learnings. I scheduled a one-on-one meeting with the account management team leader to share goals and educate. With her green light, I followed up to do the same with willing Account Management team members. Any hesitance on their part was quickly dispelled as we connected over our shared empathy for users, as well as the gaps we shared in our understanding of their experience. The AMs wanted to help. They wanted to recruit, and they wanted to participate.

With their help reaching out and recruiting participants, and with the help of additional gift card incentives, we were able to conduct research sessions with six enterprise-level clients. An account manager and I met with one client at a time: I moderated and took key notes on my observations, while the account manager observed and took notes for cross-referencing later. During sessions I asked open questions, watched users explore the product, and asked them to complete core tasks in Surf Analytics.

2) Maximizing the value of each session

We so rarely (literally never before) had the chance to conduct user research with enterprise clients that I ambitiously planned both generative and evaluative research methods into each session. I immediately discovered it can be challenging to combine interviews with usability tests. Starting with open interview questions like “what is your relationship to marketing?” and “what information are you looking for?” generated key insights, but monopolized the session, leaving little time for product exploration. Vice versa, beginning the session with product testing could colour a participant’s responses in the following interview.

In the first session, however, I’d noticed that the interviewee was emphatic about continuing with user testing, even enthusiastically walking us through ways our product worked (or didn’t work) with other marketing tools in their workflow. I had a hunch this same drive could appear in upcoming sessions, which could help solve the problem of balancing research methods. I thankfully had the foresight to get the budget approved for additional participant incentives/rewards just in case my hunch was correct. In part, I was ethically motivated to compensate people for sharing their thoughts, feelings, and time with us. And I especially wanted to keep our clients happy; an incentive could mitigate the chances that someone might regret going over time, even if they advocated for it in the moment. As I suspected, the following five participants were each intent on continuing an extra 30 minutes, or even more! We did have to let them live their lives though. Their passion about the product’s value and the product’s frustrations was remarkable.

3) Conducting user research with new (or non-)researchers

I’d worked previously with salespeople to conduct testing, so I was aware of the potential challenges. A researcher’s relationship to a participant can skew results, so it was important that AMs didn’t create any distractions until their opportunity to ask clarifying questions towards the end. We shared trust in one another’s professionalism, which was critical to the success of each session.

It turned out that the familiarity of their assigned AM on each call, even as a silent observer, made each participant more open to airing their grievances and demonstrating their use of the product. Sessions went smoothly. There were occasional exceptions with what I refer to as user ‘talking-points’: rehearsed topics or concerns a client continuously lobbies the company about. It was sometimes worthwhile following these thoughts to their natural conclusion, especially if a motivation hadn’t been uncovered. But if a talking point was too divergent, I would validate their concern, thank them, and return us to the conversation.

Turning research data into design solutions

The interview data we generated would inform future features, user profiles/personas, and especially the design decisions to come for usability and content strategy. From our usability data, we discovered some friction points were caused by a combination of technical issues like endless loading spinners, unclear error messages, and dead ends. We resolved these issues quickly with troubleshooting options, informative error messaging, and progress indicators. More significantly, we discovered pain-points associated with core flows. For example, if a user forgot to update a big follower report (50k+ users), they might miss a deadline waiting hours for the synchronization process to complete. As well, locating reports and interpreting content could be unintuitive.

We had to develop a strategic plan to smooth out the experience with minimal support from development. Engineers were busy reviewing bulky data processing requests, and were strapped with the integration challenges presented by Twitter and Instagram’s constantly changing systems. Make no mistake— web optimization, load times, and bugs are a part of the UX, but our efforts would be limited to creating front-end, UI-focused solutions.

One of my biggest asks from development was a dashboard for reports. Our engineering team found the capacity to develop it because the effort was primarily on the front-end. The designs were planned to leverage back-end data and existing product functionality.

I created a list of possible design solutions, each paired with research findings, and prioritized it with the CTO based on UX impact, business impact, and, critically, technical feasibility. This functioned as an ‘epic’ (or project) for my upcoming sprints.

I would go on to collaborate with developers to improve content strategy, usability, and core actions, even improving once-time-consuming internal processes between account managers and our development team. Our efforts slowed losses in our retention, stabilizing the KPI at 85% month-over-month retention. As time went on, feature outages and technical issues increased with restrictions to Twitter and Instagram. However, our later surveys and tests would indicate that increased satisfaction with usability was helping to balance technical concerns (which we also worked to address, UX-wise, as they came up). And no doubt our retention rate was helped by ongoing improvements to voice, tone, and consistency of messaging between the customer success team and enterprise clients.

Content strategy

One significant content strategy issue was a lack of actionable information when opening the product. When a user logged in, they were greeted with an arrow pointing to the search bar/navigation, asking them to “select a report to continue.” The search bar worked great, suggesting popular and recent reports, and with an improved understanding of our users, we had a sense of what information they wanted surfaced: specifically, an overview of their recent or favourite reports, follower metrics, and noteworthy demographic changes.

We nearly forgot to include a refresh button for reports in the new dashboard wireframes! Prototypes are always a testament to mistakes unmade. Thankfully we caught it quickly and moved forward.

Throughout dashboard testing, users had no qualms with wireframe content, and took particular satisfaction in the “noteworthy changes” data. However, they weren’t clear on the logic of how I had visualized changes to metrics. If they saw an 11% increase in followers, was that a monthly change, a weekly change, or was it tracking the difference between the current report and its last update? Because reports contained too much data to automatically refresh them weekly, I chose to resolve the confusion by averaging the data to per-week differences between report updates, outlining this with helper text in the section heading. We’d learned in our interviews that users engage with the product 1–3 times per week, so a weekly average kept the data relevant to our users’ oversight.
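As a rough illustration of the math behind that choice, here is a minimal sketch of how a per-week change could be computed between two report updates. The types and names are hypothetical for this sketch and are not Surf’s actual code.

```ts
// Hypothetical sketch: average the change between two report updates into a
// per-week figure, so "noteworthy changes" always reads as a weekly rate no
// matter how long a user waited between refreshes.
interface ReportSnapshot {
  followers: number;   // follower count captured at refresh time
  refreshedAt: Date;   // when the report was refreshed
}

const MS_PER_WEEK = 7 * 24 * 60 * 60 * 1000;

function weeklyAverageChange(previous: ReportSnapshot, current: ReportSnapshot): number {
  const totalChangePct =
    ((current.followers - previous.followers) / previous.followers) * 100;
  const weeksBetween =
    (current.refreshedAt.getTime() - previous.refreshedAt.getTime()) / MS_PER_WEEK;
  // Treat anything under a week as one week so the rate isn't inflated.
  return totalChangePct / Math.max(weeksBetween, 1);
}
```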

The final mockups of the dashboard were ready for production. We later iterated to improve our empty and error states for metrics.

Improving a core action with new context

Often you want one or two places a user can perform an action. This keeps a product learnable and avoids clutter. But exceptions can be made for a product’s core action. If your product is dedicated to setting alarms, generally speaking, wherever a user sees an alarm they will try to edit it. The content both affords and signifies the action regardless of context.

Discovering how cumbersome it could be for users to perform the product-essential, social-report “refresh,” I decided to implement the feature in the key places reports could be found, to make it more accessible and add value product-wide. The way it worked was that each refresh imported a collection of user data from Twitter or Instagram, specific to that moment, which could then be compared to report updates before and after. This time-stamped data would help marketers discover trends in audiences over time. Additionally, reports and their updates would be available to all enterprise-level users, so if Microsoft updated a report on @logitech’s Twitter followers, Sony would gain access to that new information as well.
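To make those mechanics concrete, here is a loose, hypothetical sketch of how a report and its refreshes could be modelled based on the behaviour described above; it is not Surf’s actual schema.

```ts
// Hypothetical model: every refresh is kept as a time-stamped snapshot, so
// updates can be compared before and after to surface audience trends.
interface ReportRefresh {
  refreshedAt: Date;                     // moment the data was imported
  followers: number;                     // audience size at that moment
  demographics: Record<string, number>;  // audience breakdown at that moment
}

interface Report {
  handle: string;                        // e.g. "@logitech"
  platform: "twitter" | "instagram";
  refreshes: ReportRefresh[];            // shared across all enterprise clients
}

// Compare the two most recent refreshes of a report, if both exist.
function latestDelta(report: Report): number | undefined {
  const [prev, curr] = report.refreshes.slice(-2);
  if (!prev || !curr) return undefined;
  return curr.followers - prev.followers;
}
```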

Before: The top navigation, in the context of a report, read “Last refreshed X days ago”.

The top bar was a powerful navigation tool for creating and switching between reports. One small but impactful improvement was turning our “last refreshed X days ago” indicator into a button that could refresh the report itself.

After: The top navigation now read, “Last refreshed X days ago. Click to refresh.” Revisiting this for mobile, we replaced “click” with “press” to make the language device-agnostic.

Where before users had to open a report before refreshing it, they could now refresh reports directly from the navigation dropdown and their dashboard. After implementing these changes, there was a 20% increase in report refreshes per user each month. This added the product-wide value we’d hoped for, and users had 20% more relevant and up-to-date Twitter and Instagram data to browse.

In the navigation, we had the refresh button appear in the “refresh-status” area on hover, an ideal way to keep a tight layout on desktop before mobile access was possible. As we improved device and breakpoint support, we had a chance to reconsider the layout for mobile views.

Improving a core action with a new feature

I designed a feature for users to schedule report refreshes. This feature worked to the benefit of not just users (who often forgot to press the button), but also our development team, which could now foresee upcoming high-data-traffic times and plan accordingly.
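As a loose illustration, a scheduled refresh could be represented with something as simple as the following; the names are made up for this sketch and do not reflect Surf’s implementation.

```ts
// Hypothetical sketch: a scheduled refresh entry that both drives the report
// update and lets engineering forecast upcoming data-heavy windows.
interface ScheduledRefresh {
  reportId: string;
  cadence: "daily" | "weekly" | "monthly";
  nextRunAt: Date;
}

// Engineering could scan the schedule to see how many heavy refreshes are due
// before a given cutoff and plan capacity accordingly.
function refreshesDueBefore(schedule: ScheduledRefresh[], cutoff: Date): ScheduledRefresh[] {
  return schedule.filter((entry) => entry.nextRunAt.getTime() <= cutoff.getTime());
}
```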

For simple features with already-designed patterns, we worked fast, often moving straight from a wireflow to code in the same day and skipping the time-consuming mockup step entirely.
Smoothly handing off low-fidelity designs to development required maturity and communication. I had to do what I could to empathize and communicate with developers, predicting their questions and answering them with clear documentation. It also required a front-end developer, like Surf’s very own Rod Miles, with experience and a strong understanding of existing style/pattern guides. I would skip mockups when a senior front-end developer was in charge of implementing the designs, and I had a good sense our lines of communication would be open and speedy, just in case we needed to resolve any confusion in the moment. It was a “don’t do this at home” but startup-friendly design process, and I was lucky to get away with it so frequently. It cut design costs and enabled me to design sprints of work in just days.

The key to our results

Looking back on the Surf Analytics project, it's clear that a deep dive into user research and a commitment to seamless collaboration across teams made way for enhancements in both usability and content strategy. This journey was also an opportunity to showcase the pivotal role of UX design in not just solving fundamental user pain-points, but stabilizing our business’ 85–90% retention rate, a KPI critical to Surf’s success.

It further instilled in me the invaluable impact of empathy in design, the power of cross-functional teamwork, and the importance of research and iteration in crafting solutions that truly resonate with our users. Our endeavours to improve Surf Analytics stood as a testament to what can be achieved when we place the user at the heart of a design process— satisfied users, impactful changes, and features that are not only intuitive but a joy to use.
