Surf Analytics
Lead UX Designer
• ~85% MoM customer retention rate (stabilized) • +20% increase in desired behaviour
Users were cancelling their Surf subscriptions due to in-product friction.
We conducted user research sessions with enterprise clients to identify their friction points. Analyzing that data, we developed solutions for usability, content, and data oversight.
• Account Management • Front-end Engineering & CTO
Surf Analytics: A marketing data dashboard
When I joined the team at Trufan, now known as Surf Analytics, brands used our web app to analyze and segment Twitter and Instagram audience data.
Surf Analytics has changed shape since, but the marketing value it provided had users overlooking, at least initially, a clunky interface and experience. We suspected this was changing after noticing a small but steady decline in our (albeit crazy-high) 90% MoM retention rate. Through a bootstrapped approach to user research, we quickly discovered that current users were considering leaving due to product frustrations. We followed up by designing solutions for usability, content, and data oversight to keep users satisfied and subscribed long term.
My challenges and successes with high-stakes user interviews
1) Getting support across departments
As the first and only UXer at Surf at the time, I knew we needed to understand our current users, the bulk of whom were enterprise-level clients. These folks were influencers, data analysts, marketing managers, and agents juggling campaigns for Fortune 500 companies. After I worked out a research plan and had it approved, I needed support recruiting these users. As professionals working in high-pressure jobs, I knew they would be busy. And not only that, but I wouldn't have the freedom to reach out to them myself. Account managers (AMs) were their internal gatekeepers, and for good reason: relationships with these clients were coveted, and their contracts represented thousands in monthly revenue.
To get AMs invested and interested, I chose to invite them into the research process, asking if they'd like to participate in proposed sessions as observers and/or note-takers. This way, when I made the case for the project's potential learnings, the case was our case, and the potential learnings were our potential learnings. I scheduled a one-on-one meeting with the account management team leader to share goals and educate. With her green light, I followed up to do the same with willing account management team members. Any hesitance on their part was quickly dispelled as we connected over our shared empathy for users, as well as the gaps we shared in our understanding of their experience. The AMs wanted to help. They wanted to recruit, and they wanted to participate.
With their help reaching out and recruiting participants, and with the help of additional gift card incentives, we were able to conduct research sessions with six enterprise-level clients. An account manager and I met with one client at a time: I moderated and took key notes on my observations, while the account manager observed and took notes for cross-referencing later. During sessions I asked open questions, watched users explore the product, and asked them to complete core tasks in Surf Analytics.
2) Maximizing the value of each session
We so rarely (literally never before) had the chance to conduct user research with enterprise clients that I ambitiously planned both generative and evaluative research methods into each session. I immediately discovered it can be challenging to combine interviews with usability tests. Starting with open interview questions like "What is your relationship to marketing?" and "What information are you looking for?" generated key insights, but monopolized the session, leaving little time for product exploration. Vice versa, beginning the session with product testing could colour a participant's responses in the following interview.
In the first session, however, I'd noticed that the interviewee was emphatic about continuing with user testing, even enthusiastically walking us through ways our product worked (or didn't work) with other marketing tools in their workflow. I had a hunch this same drive could appear in upcoming sessions, which could help solve the problem of balancing research methods. I thankfully had the foresight to get the budget approved for additional participant incentives/rewards just in case my hunch was correct. In part, I was ethically motivated to compensate people for sharing their thoughts, feelings, and time with us. And I especially wanted to keep our clients happy; an incentive could mitigate the chances that someone might regret going over time, even if they advocated for it in the moment. As I suspected, the following five participants were each intent on continuing an extra 30 minutes, or even more! We did have to let them live their lives though. Their passion about the product's value and the product's frustrations was remarkable.
3) Conducting user research with new (or non-)researchers
I'd worked previously with salespeople to conduct testing, so I was aware of the potential challenges. A researcher's relationship to a participant can skew results, so it was important that AMs didn't create any distractions until their opportunity to ask clarifying questions towards the end. We shared trust in one another's professionalism, which was critical to the success of each session.
It turned out that the familiarity of their assigned AM on each call, even as a silent observer, made each participant more open to airing their grievances and demonstrating their use of the product. Sessions went smoothly. There were occasional exceptions with what I refer to as user "talking points": rehearsed topics or concerns a client continuously lobbies the company about. It was sometimes worthwhile following these thoughts to their natural conclusion, especially if a motivation hadn't been uncovered. But if a talking point was too divergent, I would validate their concern, thank them, and return us to the conversation.
Turning research data into design solutions
The interview data we generated would inform future features, user profiles/personas, and especially the design decisions to come for usability and content strategy. From our usability data, we discovered some friction points were caused by technical issues like endless loading spinners, unclear error messages, and dead ends. We resolved these quickly with troubleshooting options, informative error messaging, and progress indicators. But more significantly, we discovered pain points associated with core flows: for example, if a user forgot to update a big follower report (50k+ users), they might miss a deadline waiting hours for the synchronization process to complete. As well, locating reports and interpreting content could be unintuitive.
We had to develop a strategic plan to smooth out the experience with minimal support from development. Engineers were busy reviewing bulky data-processing requests, and were strapped with the integration challenges presented by Twitter and Instagram's constantly changing systems. Make no mistake: web optimization, load times, and bugs are part of the UX, but our efforts would be limited to front-end, UI-focused solutions.
I created a list of possible design solutions, each paired with research findings, and prioritized it with the CTO based on UX impact, business impact, and, critically, technical feasibility. This functioned as an "epic" (or project) for my upcoming sprints.
I would go on to collaborate with developers to improve content strategy, usability, and core actions, even streamlining once time-consuming internal processes between account managers and our development team. Our efforts slowed the losses in retention, stabilizing the KPI at 85% month-over-month. As time went on, feature outages and technical issues increased alongside Twitter and Instagram's tightening restrictions. However, our later surveys and tests would indicate that increased satisfaction with usability was helping to balance technical concerns (which we also worked to address, UX-wise, as they came up). And no doubt our retention rate was helped by ongoing improvements to the voice, tone, and consistency of messaging between the customer success team and enterprise clients.
Content strategy
One significant content strategy issue was a lack of actionable information when opening the product. When a user logged in, they were greeted with an arrow pointing to the search bar/navigation, asking them to "select a report to continue." The search bar worked great, suggesting popular and recent reports, but with an improved understanding of our users, we now had a sense of what information they wanted surfaced: an overview of their recent or favourite reports, follower metrics, and noteworthy demographic changes.
Throughout dashboard testing, users had no qualms with the wireframe content, and took particular satisfaction in the "noteworthy changes" data. However, they weren't clear on the logic of how I had visualized changes to metrics. If they saw an 11% increase in followers, was that a monthly change, a weekly change, or was it tracking the difference between the current report and its last update? Because reports contained too much data to refresh automatically every week, I chose to resolve the confusion by averaging the data into per-week differences between report updates, outlining this with helper text in the section heading. We'd learned in our interviews that users engage with the product 1–3 times per week, so a weekly average kept the data relevant to our users' oversight.
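To make that averaging concrete, here's a minimal sketch of the logic, assuming a simple update record; the type and function names are hypothetical, not Surf's actual code:

```typescript
// Sketch: average a metric's total change into a per-week rate between
// two report updates. All names here are illustrative.
interface ReportUpdate {
  refreshedAt: Date;      // when this refresh was run
  followerCount: number;
}

const MS_PER_WEEK = 7 * 24 * 60 * 60 * 1000;

function perWeekChangePct(previous: ReportUpdate, current: ReportUpdate): number {
  const weeksElapsed =
    (current.refreshedAt.getTime() - previous.refreshedAt.getTime()) / MS_PER_WEEK;
  const totalChangePct =
    ((current.followerCount - previous.followerCount) / previous.followerCount) * 100;
  // Spread the total change evenly across the weeks between refreshes,
  // so "+11%" always reads as a weekly rate regardless of refresh cadence.
  return totalChangePct / weeksElapsed;
}
```

So, for example, a total 11% gain across a two-week gap between refreshes would surface as roughly +5.5% per week.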
Improving a core action with new context
Often you want just one or two places where a user can perform an action. This keeps a product learnable and avoids clutter. But exceptions can be made for a product's core action. If your product is dedicated to setting alarms, then generally speaking, wherever a user sees an alarm, they will try to edit it. The content both affords and signifies the action regardless of context.
Discovering how cumbersome it could be for users to perform the product-essential social-report "refresh," I decided to implement the feature in the key places reports could be found, to make it more accessible and add value product-wide. Each refresh imported a collection of user data from Twitter or Instagram, specific to that moment, which could then be compared to the report updates before and after it. This time-stamped data would help marketers discover trends in audiences over time. Additionally, reports and their updates were available to all enterprise-level users, so if Microsoft updated a report on @logitech's Twitter followers, Sony would gain access to that new information as well.
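A rough sketch of the shared-report shape this describes, with hypothetical types (an illustration of the behaviour, not Surf's real schema):

```typescript
// Illustrative model of shared, time-stamped report refreshes.
// Names are hypothetical and stand in for whatever Surf actually used.
interface AudienceSnapshot {
  refreshedAt: Date;                      // the moment the data was imported
  followerCount: number;
  demographics: Record<string, number>;   // e.g. audience segment sizes
}

interface Report {
  handle: string;                         // e.g. "@logitech"
  platform: "twitter" | "instagram";
  updates: AudienceSnapshot[];            // append-only history, oldest first
}

// Reports were shared across enterprise accounts: any client's refresh
// appends a snapshot that every other client can then compare against.
function refresh(report: Report, snapshot: AudienceSnapshot): void {
  report.updates.push(snapshot);
}
```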
The top bar was a powerful navigation tool for creating and switching between reports. One small but impactful improvement was turning our "last refreshed X days ago" indicator into a button that could refresh the report itself.
Where before users had to open a report before refreshing it, they could now refresh reports directly from the navigation dropdown and their dashboard. After implementing these changes, there was a 20% increase in report refreshes per user each month. This added the product-wide value weâd hoped for, and users had 20% more relevant and up-to-date Twitter and Instagram data to browse.
Improving a core action with a new feature
I designed a feature for users to schedule report refreshes. This worked to the benefit of not just users (who often forgot to press the button), but also our development team, which could now foresee upcoming high-data-traffic periods and plan accordingly.
The key to our results
Looking back on the Surf Analytics project, it's clear that a deep dive into user research and a commitment to seamless collaboration across teams made way for enhancements in both usability and content strategy. This journey was also an opportunity to showcase the pivotal role of UX design in not just solving fundamental user pain points, but stabilizing our business's 85–90% retention rate, a KPI critical to Surf's success.
It further instilled in me the invaluable impact of empathy in design, the power of cross-functional teamwork, and the importance of research and iteration in crafting solutions that truly resonate with our users. Our endeavours to improve Surf Analytics stood as a testament to what can be achieved when we place the user at the heart of a design process: satisfied users, impactful changes, and features that are not only intuitive but a joy to use.