FreshGrade Customer Care
FreshGrade Inc. — With mentorship from Nathan Karsgaard (Sr. Designer)
UX Designer
For the longest time, FreshGrade's care team knew their help site needed work. It was hard enough to keep up to date, let alone organized. It wasn't until they pulled up some metrics that they understood the terrifying scope of the problem: 50% of help-site visitors contacted customer care, either because they couldn't find what they were looking for or because they simply needed to reach out. That was a big cost to the customer support department, and a big cost to the company.
Aiming for 80% success
80% success is an attainable goal for information architecture projects. It's an industry rule of thumb thanks to content strategist Gerry McGovern: 40% of users will expect to find something one way, 40% will expect to find it another way, and 20% won't know what the hell they're doing.
TL;DR: thanks to a card sort and content audit, FreshGrade now receives almost half as many support requests. We went from 40% success to 85–95% success, meaning that after the redesign, more than twice as many people could find the article they were looking for without needing to file a help ticket. And we didn't have to hide the "Contact Us" button to do it. Please don't do that.
Defining the steps
Identifying some of the weak points of the prior care site wasn't difficult.
There were obvious navigation issues for teachers. As a teacher, where would I go to find how to invite parents? It might be a frequently asked question, it might be something to do with getting started, and it could be either under "Teacher" or "Parent" because it applies to both users. That's four categories of overlap for one common question.
We started by creating new categories based on competitor analysis and feedback from the customer care team.
The steps to organize, consolidate, and optimize 170+ articles needed to be sweeping and effective. I had time-boxed the project to three weeks, between other tasks and projects.
Find people to sort your mess
If you've come to the FreshGrade help site, you're probably at least a little familiar with FreshGrade. Maybe it's your first time using it, or maybe you're a veteran that found yourself in a tricky spot. Our test participants needed to have a range of familiarity, teach a variety of subjects and grades, and be from a variety of locations. We sent out a survey in our bi-weekly newsletter, using the results to screen participants.
We recruited for a card sort. A card sort takes 5+ users, and about 30 minutes each. Participants group content "cards" into categories. It's great in person because you're able to ask specific and open questions like, "why did you feel that belonged there?" and take note of any confusion participants might experience.
But if you need the data quickly and don't mind being hands-off, you can set up an online, unmoderated card sort with OptimalSort. We went this route because of two limitations: a short timeline and limited geographical access.
Over the next week or so, 50+ participants completed the exercise. It was exceptional! Really, 15 is more than enough. People often question this because we're accustomed to hearing scientific statistics with fancy considerations like "statistical significance". When designing a product, you don't need statistical accuracy or 100% representation. You do, however, need enough unbiased data to identify patterns. If you've asked six users to find a sign-up page and half of them went to the log-in page, you can bet you'd find the same result (give or take) with sixty participants. And think of the cost of refining that finding! As Erika Hall titled her book: Just Enough Research.
Big data and the bias alarm
Filtering through the participants, some names sounded familiar. That's a bias alarm in any test. Do I know these people? Digging into it, more than half of participants were power users. Champions of our product!
Thankfully we had huge turnout, and I could deactivate enough power user data to get a proportionate balance of familiarity level among our users.
Then came the big question. What goes where? There were significant trends for most answers. When 60–90% of users picked a category for a piece of content, it was a clear winner.
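If you're tallying results yourself rather than leaning on OptimalSort's built-in reports, the "clear winner" check is simple to sketch. This is a minimal illustration with made-up card titles, categories, and vote counts (none of these are FreshGrade's actual data), using a 60% agreement threshold as described above:

```python
from collections import Counter

# Hypothetical card-sort placements: card title -> the category each
# participant sorted it into (one entry per participant).
placements = {
    "Invite parents": ["Getting Started"] * 7 + ["For Teachers"] * 2 + ["For Parents"] * 1,
    "Reset password": ["Account"] * 9 + ["FAQ"] * 1,
}

for card, votes in placements.items():
    winner, count = Counter(votes).most_common(1)[0]
    agreement = count / len(votes) * 100
    verdict = "clear winner" if agreement >= 60 else "split sentiment: revisit categories"
    print(f"{card}: {winner} ({agreement:.0f}%) -> {verdict}")
```

Cards that fall below the threshold are the ones worth a closer look for the divided-sentiment patterns discussed next.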
But there are always outliers. In a few cases, users' opinions were split between two categories.
If there are consistent patterns of divided sentiment, your categories aren't clear. You'll need to make changes until you find what's right for your users. Add categories, rephrase them, or as a last ditch effort, change the content itself.
Thankfully, users only encountered this in a few cases. We took a shortcut: we put those articles into multiple categories. Shhhh... 🤫 If users can easily find what they're looking for, they won't need to know it's not perfect!
Clarity and the search for meaning; a story of massive edits
We decided we would finish with a content audit. We identified bad content, and made improvements. Sounds simple, right? It is actually simple. But it's a lot of work.
The quickest way to identify nasty content was a ROT analysis: flagging content that was Redundant, Out of date, or Trivial. Redundant? Remove it. Out of date? Update it. Trivial? Edit it, consolidate it, or cut it.
After plugging all 170+ article titles and links into a spreadsheet, I could take a quick look at each article and identify ROT. The care team also provided view counts for each article, which often helped confirm whether a topic was trivial.
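A first pass over that kind of spreadsheet export can even be automated. Here's a minimal sketch, with invented article titles, column names, and thresholds (the staleness cutoff and "trivial" view count are assumptions for illustration, not the values we used):

```python
import datetime

# Hypothetical spreadsheet rows: title, last-updated date, 90-day view count.
rows = [
    {"title": "Create a class", "last_updated": "2018-01-10", "views_90d": "1450"},
    {"title": "Create a class (old)", "last_updated": "2015-06-02", "views_90d": "12"},
]

STALE_BEFORE = datetime.date(2017, 1, 1)  # "out of date" cutoff (assumed)
TRIVIAL_VIEWS = 25                        # low-traffic threshold (assumed)

for row in rows:
    flags = []
    if datetime.date.fromisoformat(row["last_updated"]) < STALE_BEFORE:
        flags.append("out of date?")
    if int(row["views_90d"]) < TRIVIAL_VIEWS:
        flags.append("trivial?")
    print(row["title"], "->", ", ".join(flags) or "keep")
```

The flags are just prompts for a human review pass; view counts suggest triviality, they don't prove it.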
Once we had stakeholder approval, the fun process of updating articles began. A big step in this was improving headlines. The original headlines had a lot of inconsistency, so we looked for help on Facebook's ultra-consistent help site. Referencing their at-the-time success, we found they phrased their headlines as questions, and we found that helpful. It wouldn't improve SEO much, or searchability as I like to call it, but it would start some good new habits among our content creators. Natural language helps writers phrase things the way a user would, ultimately making content more recognizable. "How do I create a class from a list of student names?" is much clearer than "Class creation through student lists". It also gave us a significant opportunity to call out user specificity. For a feature only an administrator could use, the help article would begin, "As an administrator, how can I... ?" Like a user story.
Rolling out, with some other key improvements
Where before each category was an endless scrolling list of tagged articles, there were now pages sorted into subcategories by feature or topic.
There was originally a popular articles section in the right-hand column. Often articles would make the front page simply because other, better articles were harder to find. Like with many things, popular doesn't always mean best.
We solved the popularity issue with a recommended articles section. In it, customer care selected articles addressing the most common reasons users contacted support.
Hitting 85% success
Since the redesign, FreshGrade's care site has had an 85–95% success rate, fluctuating a bit month to month. That's half as many support requests. It even freed up a full-time customer care employee to change roles: she quickly took charge of online training and now hosts webinars with thousands of attendees. What a boss!
Content strategy is exhausting, but it's cheap to work with users, and profitable to make sense of messes.