How to Maintain a Compatible User Experience

[article]
Summary:

Tara Nicholson explains why it's important to take compatibility into account: the ability of a software system to function across a variety of client software (browsers), operating systems, and hardware combinations. In this article, Tara shares some helpful strategies to consider when maintaining a compatible user experience.

If you’ve ever heard a programmer say “I’m not fixing it, that browser is not supported” while knowing that a significant percentage of site traffic uses that browser, then you understand the impact of compatibility, and you are not alone. (If you haven’t, count yourself lucky, and read on to see what you’re missing.)

As a test manager, I support multiple projects at the same time, and I noticed some patterns across these projects that were not serving us well. For a long time, the quality team has tracked site traffic and planned our testing around what we saw. We based our decisions on the highest traffic sources, and, generally, if something dropped below 5 percent, we stopped supporting it, even if that meant millions of monthly pageviews. This was effective at keeping testing costs reasonable, and the majority of our users were happy.

Let’s back up a little before proceeding. By compatibility, I mean the ability of a software system to function according to a design on a determined set of client software (browsers), operating systems, and hardware combinations. Left unmanaged, compatibility can be a project wildcard with regard to time, cost, and resource needs. In my experience, it is neither feasible nor cost effective to include every possible combination. You will not ship bugs, but then again, you will not ship anything at all.

Figure 1. Potential compatibility combinations to consider when designing a system, depending on the system under development.
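To make that wildcard concrete, even a modest client matrix multiplies out quickly. Here is a minimal sketch; the browser, operating system, and device lists are illustrative assumptions, not the actual matrix behind figure 1.

```typescript
// Hypothetical client matrix; the entries are assumptions for illustration only.
const browsers = ["Chrome", "Firefox", "Safari", "IE 8", "IE 9", "IE 10", "IE 11"];
const operatingSystems = ["Windows 7", "Windows 8", "OS X", "iOS", "Android"];
const devices = ["desktop", "laptop", "tablet", "phone"];

// Every browser/OS/device combination, before accounting for versions,
// screen sizes, or network conditions.
const combinations = browsers.length * operatingSystems.length * devices.length;
console.log(`${combinations} combinations to consider`); // 140
```

Even this toy matrix produces 140 combinations, which is why the rest of this article is about choosing a subset deliberately rather than trying to test everything.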

We were seeing problems with our strategy, and in the last year we reworked our approach to address compatibility head-on. Some of the problems had to do with growing diversity in our site traffic. We saw growing device adoption as competition in the market increased, along with lingering older browser versions. Most combinations sat at around 5 percent of our traffic rather than a dominant set of browsers and a few mobile operating systems.

The blanket assumption that testers (and thus also developers) were able to cover these additional combinations was unrealistic, and high-cost bugs made it to market on combinations defined as “unsupported.” There were disputes among the delivery teams, sometimes from a vendor source, regarding the relevance of the compatibility bugs that were reported. And we would find out near the end of a project that the product owners had a different opinion on compatibility needs.

I wouldn’t call what we were experiencing “complete chaos,” but there was a disconnect. Risk tolerances differed, and that was causing the teams to slow down, argue, and grow frustrated with their ability to do good work. As a manager, I seek to alleviate this very thing so we can deliver our best.

To solve this, we addressed ownership, compatibility handling, and the test approaches that support a project.

Ownership
To cut to the chase, we started treating compatibility as a feature, and with that comes requirements. Imagine that at the conception of a new project, the product owner, a developer, and a tester are all sitting in a room together discussing the project’s features. In agile development this crew is sometimes referred to as the “three musketeers.” Of the features discussed, the trio identifies the conditions under which they think the system should be compatible and which conditions are most important. Resources and tools then align with these requirements. Testers should not spend time validating conditions in which the results will knowingly be disregarded; it’s expensive and burns out good talent. Avoid wasted effort by planning for compatibility up front.

Product owners have audience information, current or projected, that will help in understanding the demographics of the system’s traffic as well as the priority of the functionality being developed. Developers and testers have an understanding of the potential risks that might be encountered across browsers, operating systems, and devices. For example, it is predictable at this stage that HTML5 will provide differing user experiences depending on how it is accessed. Testers will be able to offer recommendations on how to target bugs efficiently across combinations within their test strategy. Bringing these perspectives together makes for a supportable strategy.

Compatibility Strategies
A one-size-fits-all compatibility policy across all projects fails to sufficiently support each project’s unique needs. Starting from the premise that each project is distinct, we found a few general strategies that supported most of them, and I’d like to share those with you.

First, a well-established public-facing system may thrive with a compatibility strategy such as “top five browsers + top two mobile devices,” based on the current traffic sources of the existing system. The right count of browsers and devices is negotiable. This strategy maintains a consistent experience on existing popular sources. However, it poorly serves a progressive market where users quickly upgrade to the latest gadgets; system behavior on emerging sources remains unknown until they become a significant portion of the traffic. It is important to know the site’s audience behaviors to use this type of approach effectively.
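As a rough sketch of how that selection might be derived from analytics data, the snippet below sorts traffic shares and takes the top entries. The traffic numbers, browser names, and the topN helper are hypothetical; any analytics export with a share per source would do.

```typescript
// Hypothetical traffic shares (percent of total sessions) from an analytics export.
const browserTraffic: Record<string, number> = {
  Chrome: 34.1, "IE 9": 12.4, Firefox: 11.8, Safari: 9.7,
  "IE 8": 6.2, "IE 11": 4.9, Opera: 1.1,
};
const deviceTraffic: Record<string, number> = {
  iPad: 8.3, "iPhone 5": 6.9, "Galaxy S4": 4.2, "Kindle Fire": 1.8,
};

// Return the n sources with the highest traffic share.
function topN(traffic: Record<string, number>, n: number): string[] {
  return Object.entries(traffic)
    .sort(([, a], [, b]) => b - a)
    .slice(0, n)
    .map(([name]) => name);
}

// "Top five browsers + top two mobile devices."
const supported = {
  browsers: topN(browserTraffic, 5),
  devices: topN(deviceTraffic, 2),
};
console.log(supported);
```

The point is not the code but the discipline: the supported list gets recomputed from real traffic on a schedule, not remembered from the last project.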

Second, imagine a public-facing system such as a new app like Watch Travel Channel, or relaunching a site like FoodNetwork.com: projects that take six to twelve months before launch. These types of projects may benefit from a proactive strategy like “top five traffic sources + top five emerging sources.” It requires about the same resources as the first strategy, but it accounts for changes in how users access the system. A poor prediction of the emerging sources is what could make this strategy weak.
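The “emerging sources” half can also be read straight from analytics by comparing two snapshots and flagging sources that are still small but growing quickly. The data, cutoff, and doubling rule below are assumptions for illustration, not a recommendation of specific thresholds.

```typescript
// Hypothetical traffic shares (percent) from two consecutive quarters.
const lastQuarter: Record<string, number> = {
  Chrome: 32.0, "IE 9": 14.1, "IE 11": 1.2, "Galaxy S5": 0.4,
};
const thisQuarter: Record<string, number> = {
  Chrome: 34.1, "IE 9": 12.4, "IE 11": 4.9, "Galaxy S5": 2.1,
};

// A source is "emerging" if it is still below the support cutoff but its share
// has at least doubled; sources absent last quarter are skipped here for simplicity.
function emergingSources(
  previous: Record<string, number>,
  current: Record<string, number>,
  cutoff = 5,
): string[] {
  return Object.keys(current).filter((name) => {
    const before = previous[name] ?? 0;
    return current[name] < cutoff && before > 0 && current[name] / before >= 2;
  });
}

console.log(emergingSources(lastQuarter, thisQuarter)); // ["IE 11", "Galaxy S5"]
```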

Third, internal tools or systems where access is limited, by corporate tech policy for example, call for a different compatibility strategy from the two I outlined above. Have you ever seen a warning message on a page indicating that the system only works in X browser? Internal systems like a custom CMS (content management system) effectively reduce their costs by limiting access.

A potential risk to this strategy is that the tools may run poorly, or not at all, once the browser ages. The legacy system would then need a compatibility upgrade or have to forfeit the latest tools. Reviewing the intended longevity and complexity of the internal system under development, and weighing the risks against the cost of the system becoming unstable because of an obsolete access requirement, will help you derive a cost-effective strategy.
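The “only works in X browser” warning mentioned above can be as small as a check on page load. This is a minimal, hypothetical sketch; the user-agent test and the corporate-standard browser are assumptions that would follow the actual policy.

```typescript
// Minimal client-side gate for an internal tool that is only validated
// against one corporate-standard browser (Internet Explorer is assumed here).
function warnIfUnsupportedBrowser(): void {
  const isSupported = /Trident|MSIE/.test(navigator.userAgent);
  if (isSupported) {
    return;
  }
  const banner = document.createElement("div");
  banner.textContent =
    "This system is only supported in Internet Explorer. " +
    "Some features may not work correctly in your current browser.";
  banner.setAttribute(
    "style",
    "background:#fff3cd;border:1px solid #e0c97f;padding:8px;text-align:center;",
  );
  document.body.insertBefore(banner, document.body.firstChild);
}

// window.onload works even in the oldest browsers the policy might target.
window.onload = warnIfUnsupportedBrowser;
```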

Would you like to share an alternative approach your projects use? Please leave a comment at the end of this article.

What about Partial Support?
An all-or-nothing approach is limiting. Just because a browser or device is not “tops” does not mean it has to get lost in the shuffle. To partially support a piece of software means to target development and testing at only the highest risks. To be effective, this requires intimate awareness of the functionality product owners or users deem most important. That means communication, breaking assumptions down into knowable bits and pieces, and avoiding testing in a vacuum. It also means becoming familiar with common or known compatibility risks, such as AJAX behavior, CSS rendering, and HTML5 support.
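One lightweight way to make those decisions visible to the whole team is to record them as data: which features get full, partial (highest-risk checks only), or no coverage per traffic source. The structure, feature names, and browsers below are assumptions for illustration, not a standard format.

```typescript
type SupportLevel = "full" | "partial" | "none";

// Hypothetical matrix: rows are features the product owner ranked, columns are
// traffic sources. "partial" means only the highest-risk checks are run
// (for example, AJAX-driven flows, CSS layout, HTML5 video).
const supportMatrix: Record<string, Record<string, SupportLevel>> = {
  checkout: { Chrome: "full", "IE 11": "full", "IE 8": "partial" },
  "video playback": { Chrome: "full", "IE 11": "full", "IE 8": "none" },
  "account settings": { Chrome: "full", "IE 11": "partial", "IE 8": "none" },
};

// A helper the test plan (or a bug-triage discussion) can lean on.
function supportFor(feature: string, browser: string): SupportLevel {
  return supportMatrix[feature]?.[browser] ?? "none";
}

console.log(supportFor("checkout", "IE 8")); // "partial"
```

Writing the decisions down this way also heads off the end-of-project surprise described earlier, where the product owner turns out to have had a different opinion on compatibility needs.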

Consider applying a partial support strategy to browsers in transition or to lingering traffic on a deprecated device. At the time of writing this article, much of corporate America remains on Internet Explorer 8 or 9, while IE 11 is already growing in circulation. If the CEO of the company will see the system on IE8, the compatibility priorities need to reflect this consideration without expanding the project cost unnecessarily.

Test Strategies for Partial Support
The test approach for partial support opens up interesting possibilities that differ from how we might test a fully supported browser or device. The goal is to build stakeholder confidence that the user experience is in an acceptable state.

  • Consider session-based testing. The premise of session-based testing is that it is time-boxed, propelled by the tester’s intellect and system familiarity, and results in a lightweight report of coverage, risks, and observations. These reports go to the product owners and the team, and any critical issues can be entered into a development-tracking tool for scheduling or immediate resolution; the report itself can be attached to the ticket. The tester avoids time spent entering bugs for combinations we “do not support,” and less time is spent fixing lower-priority issues.
  • The smoke test is also useful. For projects that use test cases, a subset of those tests can be used to target critical scope on partially supported devices. These tests give the business visibility into coverage and give testers guidance.
  • Use automation, leveraging scripts where they exist to test the requirements of the system. Tools like Mogo, SauceLabs, SOASTA, or BrowserLabs, paid or free, provide an array of features for testing compatibility combinations faster than manual tests, and they cut down on hardware costs by providing the instances needed. (See the sketch after this list.)
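As a sketch of the automation bullet above, the same smoke check can be driven across a list of browser/OS combinations with Selenium WebDriver pointed at a cloud grid such as the services just mentioned. The grid URL, capabilities, target page, and selector are illustrative assumptions; the vendor’s documentation defines the exact capability names and authentication it expects.

```typescript
// npm install selenium-webdriver
import { Builder, By, until, WebDriver } from "selenium-webdriver";

// Hypothetical partially supported combinations to smoke test.
const combos = [
  { browserName: "internet explorer", version: "8", platform: "Windows 7" },
  { browserName: "safari", version: "7", platform: "OS X 10.9" },
];

// Replace with your grid endpoint; credentials are usually embedded in the URL
// or passed as capabilities, depending on the vendor.
const GRID_URL = "https://ondemand.saucelabs.com/wd/hub";

async function smokeTest(capabilities: Record<string, string>): Promise<void> {
  const driver: WebDriver = await new Builder()
    .usingServer(GRID_URL)
    .withCapabilities(capabilities)
    .build();
  try {
    // One critical-path check per combination: the page loads and the
    // primary navigation renders.
    await driver.get("https://www.example.com/");
    await driver.wait(until.elementLocated(By.css("nav")), 10000);
    console.log(`PASS: ${capabilities.browserName} ${capabilities.version}`);
  } catch (err) {
    console.error(`FAIL: ${capabilities.browserName} ${capabilities.version}`, err);
  } finally {
    await driver.quit();
  }
}

(async () => {
  for (const combo of combos) {
    await smokeTest(combo);
  }
})();
```

Each failure here is a candidate for the session-based or smoke-test follow-up above, not an automatic must-fix.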

Can you think of other approaches that are right for your projects/teams? Are you already using an approach that is effective?

Responsible Compatibility
Product owners, developers, and testers are all responsible for producing effective software. Address compatibility requirements with each of these perspectives, make the requirements visible, and execute!

User Comments

Candee Gulick

Great article. While my customers typically glaze over when I mention the Compatibility section of my contracts, it is my job to 'educate' them as to why they should care. Esp. when it comes to the business software we build into some of our sites. In a perfect world, sites should always work everywhere. In reality, we have browser choices. Tara, I'll put this article in my client toolbox! Thx!!

March 24, 2014 - 9:36am
Kevin Dunne

Tara, 

Great article - I think this is something everyone struggles with and is top of mind, especially considering all of the legacy versions of IE that seem to be on the market. :)

I am assuming you use Google Analytics or other tools to determine the breakdown of usage across various environments.  I have been thinking about this recently and am wondering your thoughts - are you familiar with a Johari window (http://en.wikipedia.org/wiki/Johari_window)? 

In this type of model, I wonder how you might combat the Blind Spot and Unknown areas:

-Blind Spot being that the customer knows their configuration is incompatible but you don't - but you hope they'll log a bug

-Unknown being that the customer does not know they are incompatible and neither do you - maybe it just comes across as poor system performance, etc.  This could be that there is a variable in play like a firewall that you may not have considered before.

My question to you is - how do you combat these 2 areas?  Are you using any reports within Google Analytics or something similar to look at things like bounce rate, time on site, etc. to try to uncover compatibility issues that you are not yet aware of?

Kevin

August 20, 2014 - 2:28pm
Larry Wells

Very nice article on mobile compatibility. Earlier, only Apple mobiles were compatible, but these days most of the smartphones we get in the market are compatible and very user friendly. Mobile app development now plays a prominent role in almost any kind of service, and it is very helpful for users to navigate to the services they need.

August 19, 2015 - 5:09am
