Tony Goins
UX Professional / Content Manager

Resources for Students (Cont.)

A "chalkmark" test as an A/B test

Our No. 4 page ... but no metrics

Problem: We want to afford access to student tasks, but designing a directory of tasks and services is tricky. Worse yet, there are no obvious metrics to define success.

Solution: A/B test two versions, using a "chalkmark" technique.

Tools: Wireframes, unmoderated testing, Chalkmark by Optimal Workshop.

Time to get creative

Our Resources for Students page is the No. 4 most-visited page on the site, so it's worth putting in the time to improve it.

As we saw in our original user test of the Resources for Students page, creating a directory of student tasks is a tricky problem. We emphasized the main student information system and help resources, added a search box, and de-emphasized the directory section.

But what about the directory section? I did wireframes of a few different versions of this page, with the tasks split into categories. We were settling on a design that listed the categories in simple accordions, but in response to stakeholder feedback, I tried a version that exposed each category's text description. Our team was split on which version was better.

Sounds like an A/B test, right? But this page has more than 120 links, so there was no obvious metric (bounce rate, search exit, destination page, etc.) we could use to determine a winner.

Time to get creative. I used a "chalkmark" test, where users are given images of the page and asked where they'd click to find certain things. This technique is usually used for information architecture, but it seemed ideal for a "where would you find ___?" test.

I first used the Chalkmark tool, from Optimal Workshop, in grad school. I tested with 40 students, 20 for each version.
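
To make the success rates below concrete: a chalkmark-style test boils down to checking whether each participant's first click lands inside the target region for a task. Here's a minimal scoring sketch in Python; Optimal Workshop's tool does this scoring for you, and the task name, coordinates, and clicks here are made up for illustration.

```python
# Minimal scoring sketch for a first-click ("chalkmark") test.
# The target region and clicks are illustrative, not from the actual study.

def click_hits_target(click, target):
    """Return True if an (x, y) click lands inside a target rectangle."""
    x, y = click
    left, top, width, height = target
    return left <= x <= left + width and top <= y <= top + height

# Hypothetical target region for one task, in page-image pixels: left, top, width, height.
degree_audit_link = (120, 840, 260, 24)

# Hypothetical first clicks from five participants.
clicks = [(150, 850), (900, 300), (130, 845), (200, 855), (640, 1200)]

hits = sum(click_hits_target(c, degree_audit_link) for c in clicks)
print(f"Success rate: {hits / len(clicks):.0%}")  # 3 of 5 hits -> 60%
```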

Here's the Text-Exposed Version

This version offers more context about each category.

And here's the Accordion Version

This one shows the categories at a glance, with less scrolling.

I won't keep you in suspense. The text-exposed version won, but it was surprisingly close.

  • Success rate: 76% (text-exposed) vs. 70% (accordions)

  • Test time: 4:38 (text-exposed) vs. 3:33 (accordions)
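
With success rates that close, it's worth a quick check on whether the gap could just be noise. Here's a minimal two-proportion z-test sketch in Python; the 76% and 70% figures come from the results above, but the attempt counts are an assumption (five tasks for each of the 20 participants per version), since the exact number of task attempts isn't listed here.

```python
from math import erfc, sqrt

def two_proportion_z(successes_a, n_a, successes_b, n_b):
    """Two-sided two-proportion z-test; returns (z statistic, p-value)."""
    p_a, p_b = successes_a / n_a, successes_b / n_b
    pooled = (successes_a + successes_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    return z, erfc(abs(z) / sqrt(2))  # two-sided p via the normal tail

# Assumed attempt counts: 20 participants x 5 tasks = 100 attempts per version.
z, p = two_proportion_z(76, 100, 70, 100)
print(f"z = {z:.2f}, p = {p:.2f}")  # roughly z = 0.96, p = 0.34
```

At those assumed sample sizes the difference wouldn't reach statistical significance, which fits how close the result felt.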

Recommendations

  • Stick with the text-exposed version.

  • Use the accordion version for mobile users.

  • Several students looked for "Degree Audit" under Student Records, so I cross-linked it there. This kind of test is useful for exploring users' mental models.

  • Change the label to "View Resources" instead of "View More."

An odd finding

It wouldn't be a user test if you didn't find a weirdie. I asked students how they search our website:

  • Browse using the site menus.

  • Search using the site's search box.

  • Search using Google.

  • Just call someone.

Respondents were allowed to choose more than one. Surprisingly, "browse using the site menus" was the big winner with 34 responses, compared with 21 for the search box, 6 for Google and 1 for "just call."
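
With only 40 respondents, a confidence interval helps show how solid that "browse" preference is. Below is a rough Wilson-interval sketch in Python; the 34-of-40 figure comes from the responses above, and treating "chose browse" as a simple yes/no per respondent is a simplification of the multi-select question.

```python
from math import sqrt

def wilson_interval(successes, n, z=1.96):
    """95% Wilson score interval for a binomial proportion."""
    p_hat = successes / n
    denom = 1 + z**2 / n
    center = (p_hat + z**2 / (2 * n)) / denom
    margin = z * sqrt(p_hat * (1 - p_hat) / n + z**2 / (4 * n**2)) / denom
    return center - margin, center + margin

# 34 of the 40 respondents chose "browse using the site menus."
low, high = wilson_interval(34, 40)
print(f"Point estimate 85%, 95% CI about {low:.0%} to {high:.0%}")  # ~71% to ~93%
```

Even at the low end of that interval, a clear majority of these students start by browsing the menus.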

These findings emphasize how important it is to keep working on your IA.

See the previous Resources for Students user test.