Recently, I've had a few experiences of my own in this regard. This weekend, my dryer broke, so I found a repair shop with an online booking system. I felt lucky, because in the world of repair and home improvement, tech solutions are sketchy at best. Unfortunately, it booked me an appointment two days out that didn't really exist – despite sending me a confirmation! When I called to confirm, the representative saw no such appointment; in reality, I was booked a week later. I gave up and found another vendor.
Then I had another frustrating experience with an online portal that had some of the oddest password rules I've ever seen – it wouldn't even allow the same character twice in a row. And this was just to book a tee time at a golf course: excessive for the platform, and the requirements weren't stated ahead of time. Lastly, I ran into a UI that was utterly broken on mobile – for a convenience store, whose customers would almost all be using it on their phones!
In each case, commonsense QA would have averted these frustrations. But what is commonsense QA and how can you start to engage in such testing? This week we have a few ideas you should consider when testing your software to make sure you aren't scaring people away.
First: Developers Are Not Your Users
Developers aren't your end users, and in many cases they think about things from the opposite direction. So never trust your developers alone to do the right thing regarding user interface and user experience, and certainly not to identify how your customers actually do the things they need to do. Solving usability issues calls for a different skill set: user interface and design experts, and product managers.
Likewise, don't fall into the habit of describing your problems to developers in broad terms and expecting the proper fixes to follow. Developers should implement solutions identified by the people who can architect product solutions. Leaving developers to do this work alone often results in either over-engineered features that are unusable for end users, or budget wasted on round after round of revisions that may not make sense.
Identify Common Tasks
The first thing to do before any testing is to identify the most common and essential workflows, use cases, or tasks you expect users to perform. This should be relatively easy – use these questions as guidelines for finding those use cases:
- When you initially built the software, what did you build it to do?
- What is the core set of tasks a user must complete to accomplish anything within your platform?
- What roadblocks or required steps stand between users and success with your software?
- What are the most common tasks performed, based on analytics or anecdotal evidence?
- What are areas where you may have received negative feedback?
Most of these are common sense, which is why it's odd that so few companies regularly test their core functionality. You don't need a list of a hundred scenarios – just a short set of the critical tasks your company relies on day to day, the ones that hurt your bottom line when they aren't working properly.
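There's no need for fancy tooling here – a plain spreadsheet works fine – but if you like keeping such a list alongside your project, here is a minimal sketch of what it might look like. The tasks, fields, and "success" descriptions below are hypothetical examples, loosely based on the frustrations I described above.

```python
# A simple catalog of critical workflows to test.
# Every task, URL, and description here is a hypothetical example --
# replace them with whatever your own platform actually does.

critical_tasks = [
    {
        "task": "Book an appointment",
        "entry_point": "/book",
        "success_looks_like": "Confirmation email matches what the back office actually sees",
        "why_it_matters": "Core purpose of the product",
    },
    {
        "task": "Create an account and set a password",
        "entry_point": "/signup",
        "success_looks_like": "Password rules are stated up front and accepted on the first try",
        "why_it_matters": "Source of negative feedback",
    },
    {
        "task": "Complete checkout on a phone",
        "entry_point": "/cart",
        "success_looks_like": "Every button is visible and tappable on a small screen",
        "why_it_matters": "Most traffic is mobile",
    },
]

# Print the checklist you will walk through when testing.
for t in critical_tasks:
    print(f"- {t['task']} (starts at {t['entry_point']}): {t['success_looks_like']}")
```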
Get "External"
As business owners, we spend all of our time inside our software. That familiarity often works against us, because we are too intimately aware of every aspect of our products. Commonsense testing usually comes down to stepping outside our own world – going "external" – and looking at things from an outsider's perspective.
There has been much talk about this in the world of mindfulness and in business management material. While it sounds easy, it's actually challenging. To get started, I suggest approaching your testing as a completely new user, in a new environment – perhaps on a new device. Act as if you've never seen the software before, and go through the everyday tasks you identified in the steps above. If that proves difficult, try watching other users work through the platform for the first time without intervening.
If you look at my examples above, these are issues you could identify very quickly without assuming much of an external perspective at all. You'll be amazed at what you uncover going through these steps, and over time the exercise gets easier and more manageable. The real secret is staying open to the fact that you will find problems. You don't need to justify them or panic – just observe and identify, and work toward resolutions afterward.
Start Testing
Now that you've identified what you want to test, it's time to actually test. Since the goal is to operate outside your usual usage patterns, change things up a bit: use a different browser, computer, or operating system. The idea is to step outside your day-to-day routine and view the experience from someone else's perspective. If you are always a mobile user, try desktop. Always macOS? Try Windows. Switch from Chrome to Edge. I'm always amazed, when I do this myself, at how different the experience is – even on this website.
When testing, try to notice the justifications you make along the way – "oh, that happens because of this or that." Just experience the software and spot any areas where a new user might be frustrated or confused. Again, this sounds easy, but it is genuinely challenging.
As you test, make a note of issues. Save screenshots and, if you can, take some video captures as well. We'll use all of this information in the following step to determine a path forward from where your system stands today.
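If you want the screenshot-gathering to be repeatable, a browser-automation tool such as Playwright can capture the same page across several browser engines in a few lines. This is only a minimal sketch under my own assumptions – the URL, viewport size, and file names are placeholders for whatever matters on your platform – and it's no substitute for actually using the software yourself.

```python
# Requires: pip install playwright && playwright install
from playwright.sync_api import sync_playwright

URL = "https://example.com/book"  # placeholder: one of your critical pages

with sync_playwright() as p:
    for engine_name in ("chromium", "firefox", "webkit"):
        browser = getattr(p, engine_name).launch()
        # Phone-sized viewport, since mobile is where my convenience-store example broke
        page = browser.new_page(viewport={"width": 390, "height": 844})
        page.goto(URL)
        page.screenshot(path=f"evidence-{engine_name}.png", full_page=True)
        browser.close()
```

Run it once per critical task and you have a folder of evidence to compare across browsers before you ever file an issue.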
Make a Game Plan
Now that you've changed your perspective, identified the areas that matter, and run the necessary tests, you're likely left with one of two reactions. One is complete disappointment – and we hope you don't feel that way. The other is some level of satisfaction. Either way, you're sure to have changes you want to make.
We all know that development takes time and money, so before reaching out to an agency or your in-house team, prioritize everything you found by severity. I like a simple scale for ranking issues:
- Benign: A nitpick or other minor issue.
- Concerning: Something that needs addressing, but it may not require urgent attention.
- Severe: Cancel your weekend plans because this needs fixing immediately!
Begin organizing the issues you found so you can weed through them and isolate the items that need immediate attention. From there, you're ready to enlist your production team, whoever that may be.
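If your notes live anywhere structured, that triage can be as simple as sorting by the scale above. Here is a minimal sketch, assuming the notes sit in a small script; the issues listed are hypothetical examples.

```python
# Sort and print found issues so the severe ones surface first.
SEVERITY_ORDER = {"severe": 0, "concerning": 1, "benign": 2}

issues = [
    {"severity": "benign", "note": "Footer logo looks slightly blurry on high-DPI screens"},
    {"severity": "severe", "note": "Booking confirmation sent for an appointment slot that does not exist"},
    {"severity": "concerning", "note": "Password rules only revealed after the form is rejected"},
]

for issue in sorted(issues, key=lambda i: SEVERITY_ORDER[i["severity"]]):
    print(f"[{issue['severity'].upper():<10}] {issue['note']}")
```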
Wrapping Up
In wrapping this up, I wanted to instill a sense of "can-do" about usability testing in every reader of this post. So much of QA falls into the commonsense category. It doesn't require technical knowledge to know whether something is working, or working as efficiently as it could. It just takes an understanding of your goals and whether the software, as it exists today, actually achieves them. In the examples that happened to me, almost any employee of those companies could have tried those workflows and seen they were broken. There is an entire skill set dedicated to in-depth technical QA, but let's not forget that common sense keeps companies in business, and trusting that technical people will always get things right is not a safe assumption.