🏫

FreshGrade Next: A next-generation app for schools

Project

FreshGrade Next

My role

UX Designer

Timeframe
January 8, 2019 → September 1, 2019
Key metrics

• 40+ critical research findings • ~2–4 months saved in product-planning time • +100–500% usability success rates for key features

The problem

Teachers found it hard to organize and view class content. They also needed Gradebook and lesson-planning tools.

The solution

We invested in user research and team collaboration to make a scalable new FreshGrade product. Thanks to new processes and a growing team, we could align the product with the needs and expectations of teachers.

Collaborators

• Product Director • UX Design team (Summer 2019) • Sales, Customer Success, and Engineering

The FreshGrade promise(s)

With a five-year-old piece of software, changes in leadership, big clients, and a big sales team, promises get made at an EdTech startup like FreshGrade. These promises were for the coming school year. Thankfully, FreshGrade's market research guided our small product team to the right problems to solve for. We built a next-generation product for schools by reimagining the classic app's solutions with the findings from user research conducted across North America.


FreshGrade's key feature was collaborative student portfolios. Students and teachers could share learning, and parents could engage too. With a window into the classroom, parents could actively participate in their child's learning. Teachers could assess student work at a bird's-eye view in their Gradebook.

We wanted to keep all these benefits, get rid of the slow FreshGrade Classic platform, and solve pre-existing problems in the user experience. With a six-month timeline, we needed to build fast.

  • Teachers and students needed a way to better organize and view class content. Year over year, content piles up and it becomes difficult to find things
  • Teachers didn't need a one-size-fits-all Gradebook, but instead one tailored to their individual assessment style
  • Likewise, teachers needed lesson-planning tools tailored to their individual assessment style

Here are some things we learned along the way:

We tested early, high-level solutions

At the time I was brought on as a UX designer, FreshGrade had a short-term UX designer concluding a contract and handing over their work. The documentation was high-level and conceptual. Working alongside the then head of product, the designer had clearly invested in a great deal of complex modelling: creating app permissions that extended from School Administrator through Teacher to Student and Parent.

When developing a research plan, we decided the first step was to test the early wireframes with 5–7 users across Canada and the United States. We wanted to see how far we were from something clear and usable.

We tried early exercises to translate high-level "tags" into user-centred language and patterns.

FreshGrade was new to testing, and I was stepping in as the sole UXer. Short on time and resources, we asked salespeople to help recruit and moderate off-site tests. The sales team was enthusiastic. I trained four on the basics of moderation and supplied them with paper prototypes, instructions, and a script. They hosted tests in Texas, Ontario, and BC, three key markets for our new product. I recorded and took notes remotely via video, and jumped in with questions or requests only when necessary.

It was unconventional to ask salespeople to be unbiased, fly-on-the-wall observers when their primary job is to sell! But they were respectful, attentive, and incredibly helpful moderators. Occasionally one posed a leading question, but that's to be expected from anyone new to user research.


The tests surfaced three major patterns.

  • Teachers wanted to organize lessons and grades in a few ways: by subject, standard, or category
  • Teachers didn't understand how to sort, so we would need to make design improvements
  • Much of the language we used was too high-level and confusing to teachers

There were dozens of findings. But the biggest take-aways were the actionable design revisions. We needed to revisit gradebook- and activity-creation flows with our new understanding of how teachers planned to use them. Our biggest challenge would be to make the previously high-level designs clear and usable.

We followed the steps and trusted the process

As our new product manager ramped up, we went back to retrace steps that were skipped previously. With each feature we defined scope through user stories and acceptance criteria. And we documented.

✏️
A user story is a requirement framed in the user's language. It's a helpful tool for getting business, development, and design to collectively understand a requirement. Here's an example:

  • As a teacher, I need to filter my gradebook by subject so I can find activities related to an area of study

Acceptance criteria help design, development, and quality assurance understand the full scope of a feature or improvement. For example:

  • I can see a new class when I create it
  • When I archive a label, I can no longer select it
  • I can edit my student's name

We defined scope in short, collaborative meetings. A technical stakeholder clarified technical feasibility. A UX stakeholder and a customer care person advocated for users and their experience. And a business leader addressed business and market requirements.

A relatively simple feature flow for a user adding media to a post

Once we reached consensus on user stories, we moved each feature on to user flows. We used flows to describe the path(s) a user would take to complete a task or a portion of one. We aimed to consult developers on each flow. Though programmers aren't user experts, they provide incredible feedback for building a technically consistent product.

Clear documentation fed into a clear product

We had to iterate to get the documentation right. We wanted a single way to share acceptance criteria (AC), stories, user flows, wireframes, high-fidelity mockups, and prototypes with developers. At the time we were using Confluence (a wiki) for stories, whimsical.io for flows and wireframes, and InVision for mockups. Not to mention our developers had been using Google Docs for their technical documentation. It was a messy start.

Our now team-of-three UX people iterated our way to a good system. We kept business requirements on feature-specific wiki pages. Each page linked to one or more user flows and/or wireframes, and under each wireframe were links to mockups, with different states depending on user type, inputs, assessment types, errors, and so on.

Teachers could create activities with many types of assessment criteria. It was important developers knew how each variation of a feature should look and interact.
We used Sketch + Invision for prototyping. Prototypes weren't just important for testing, but sharing with stakeholders. Adding learning media is the core experience in FreshGrade, so it was a big help to let business and development click through that flow themselves.

We tried a new collaborative process

Our design team was new, and our development team was too. We didn't want to make the mistake of creating silos, so design handoffs became conversational meetings: far more effective and inclusive than throwing documentation over a cubicle wall.

My favourite part was working with developers when technical challenges came up. With a tight September deadline, shortcuts were welcome. And no design was set in stone if it meant we could get things out the door on time. We made critical revisions along the path to completion thanks to open and honest communication.


We tested the beta to make refinements before launch

We quietly launched the product's beta, and set out to test the riskiest parts of it. I assembled a plan to test the first-time user experience (FTUE). We wanted to know the obstacles in signing up, setting up, and attempting the first core tasks.

We hosted remote, moderated tests with 5 teachers across subjects and grades. FreshGrade's market had expanded since the first round of testing, so we conducted new tests with users from Alberta, Ontario, BC, New York, and Texas.

I collated and presented 40+ findings using a new tool called Dovetail. Our product team prioritized findings by business and UX importance. I then came up with design suggestions for each finding, working with our UX Lead to refine them. Later we turned the suggestions into actions, prioritizing them as design and development tasks in Jira, our company's product-management software.

Fortunately we had the chance to refine some features with test feedback, before the launch. There were some big wins with very little work. It's what we had time for, and we had a happy backlog for post-launch.

We made layout improvements that made it easier for teachers to create classrooms. We improved language across the product so users could better understand features. And we were able to validate many design improvements we made from earlier tests.

Each finding was prioritized and given action items. Some actions were small improvements, some were validation of current work, and some were new features entirely.
I collated the findings in Dovetail. It meant we could see the project holistically, as well as dive into the details of each finding. We were able to improve usability quickly in these early stages, for example, solving interaction issues for adding students to a classroom. This would have been far more expensive to solve after it hit production.

After the rush, there was time to thoroughly research what competitors were doing (better late than never)

Once the dust had settled and the product had launched, I had time to do a deep dive into how three leading competitors were approaching the problems we were trying to solve. If the timeline sounds backwards, you're right. It's always best to research competitors at the beginning of a project. But because we had little to no time in the early stages, our glances at competitors had been cursory.

There were surprising trends even among just three competitors. But a dozen-or-so findings validated ongoing and past feature work (pictured not blurred). A nice comfort that only research can bring.

The competitor report was more of a retrospective, with 40+ noteworthy findings to evaluate our positioning. I performed feature-usability tests with each competitor, documenting the process with screenshots and content audits. For example, we had added a submissions feature, so I tried Google Classroom's submissions feature, observing layouts, patterns, and content along the way.

Just like with our usability tests, I was able to collate findings, prioritize, and develop suggestions.

I was happy with what I was able to contribute to FreshGrade's product pipeline: valuable insights, a prioritized backlog for development, and an exciting, usable new product.

FreshGrade Next launched in September 2019, ready for the school year, packed with usable new features for its students, parents, and teachers.

← Back to Projects