Digital Cross-Channel Usability Heuristics: Improving the Digital Health Experience

Peer-reviewed Article

pp. 52-72


The number of ways consumers of health information access digital content has grown rapidly in recent years. People seek information using various hardware devices (e.g., computer, tablet, phone), which can support multiple digital media platforms (e.g., browser, apps, texting/short message service [SMS], email, social media). Users expect to be able to access information from one or more of these digital channels at any time and from any place. For this reason, public health organizations must create a unified experience across all their digital channels to support the on-demand needs of those seeking health information.

Only in the past few years have many organizations started to think about all of their channels together as a larger digital ecosystem. A cross-channel (CC) approach is needed to ensure that all channels are unified, seamless, and consistent. However, this approach is resource and time intensive, and because it is a relatively recent trend in the digital world, organizations often have not allocated resources for this need. Tools for developing, managing, and improving a CC experience are scarce. Most digital CC research has focused on e-commerce, leaving a gap in the literature relating to CC experiences with health communication and behavioral health interventions.

This paper proposes a process to identify and prioritize user task and channel relationships with a set of newly developed CC heuristics applied to these priority needs. This process is in the context of a digital public health program. Although this approach was developed from a public health perspective, it can be applicable to a wide range of public and commercial digital services or products.


Keywords: cross-channel, omni-channel, multi-channel, cross-device, user experience, UX, usability, heuristic evaluation, public health, smoking cessation, quit smoking, behavioral health, eHealth


Public health organizations use many digital media platforms to publish and disseminate information. As new digital channels[1] evolve, these organizations feel pressure to respond to new technology and develop new channels, often without thinking about pre-existing content, messaging, and context within the organization’s larger digital ecosystem. This development strategy is like building a neighborhood in a highly desirable location, one house at a time, without a community development plan. Improved access and high demand to live in the new community attract new residents. As new residents move in, they build their homes to suit their own needs, without regard for the overall community, so the community grows without any standards to follow. Growth density, lot size, street standards, and house style, color, size, placement, and material quality, among many other issues, will continue to impair the overall community and quality of life, and these problems surface only after residents have lived through them. Creating new individual digital channels without a plan to integrate them, and without regard for the overall program strategy or digital ecosystem, can create similar issues. Inconsistencies in each channel’s information design, visual design, branding, functionality, interactions, messaging, and tone of voice can affect how well a user “gets around” the digital ecosystem. A lack of planning can create an inconsistent, interrupted, and inefficient overall experience. This can negatively affect users’ satisfaction and confidence and cause them frustration, which in turn can lead them to abandon their task or discontinue use of an organization’s digital resources altogether. Public health organizations must do everything they can to reduce cognitive load and increase cognitive fluency (Roller, 2011) to enable easy decision making that results in successful behavior change.
This is particularly important in tobacco cessation because users seek help when they are stressed, craving cigarettes, or when they have had a slip, which can happen at any time of the day.

The National Cancer Institute’s Smokefree.gov Initiative (SFGI) is one of the largest federal smoking cessation programs in the digital space. At the time of this writing, SFGI spans 5 websites, 2 mobile apps, 9 public-facing text-based programs, and 10 social media accounts (Appendix A). SFGI proactively develops, tailors, monitors, and tests digital information and services through a myriad of channels. With an emphasis on self-management, SFGI provides comprehensive smoking cessation services and information for smokers at various points in their quit journey. SFGI publishes quit smoking information and resources to the public, with targeted resources available for adults, teens, Spanish-language speakers, pregnant women, active military, and veterans. Because of the constant state of change and the large number of digital channels, SFGI was an ideal candidate to develop and pilot the cross-channel (CC) heuristics.

Digital cross-channel design focuses on experiences across an entire digital ecosystem. An organization must think of its digital products as extensions of one another in order to ensure a familiar, seamless, and consistent customer experience. A proactive, deliberate, and ongoing CC design strategy is the ideal approach to avoid a “slap-on” growth of digital outlets. However, the scope is often substantial, and the time, planning, and resources are not always available. Nowadays, user experience (UX) work needs to be efficient and lean. Making a pitch to redesign the entire ecosystem is not always an option, so practitioners need to find innovative ways to approach this growing problem. Because of the scale of the full channel scope, we decided to focus on priority content and user tasks. To properly evaluate tasks, it was important to clearly define user needs and goals within and across channels, then analyze expected use, existing use, and anticipated future use. This prioritization uses the Pareto principle, also known as the 80/20 rule, to identify the most important tasks and channels to evaluate. The Pareto principle (Aldrich, 2016) states that, for many events, roughly 80% of the effects come from 20% of the causes. Applied to UX, 20% of the features on a website, app, or any other channel will drive 80% of the expected or desired actions or outcomes. By conducting user research, such as needs assessments, and by analyzing web analytics, you can learn about the most frequently used features, content, and functionality and the most common needs and tasks. Focusing on this top 20% will enable you to make meaningful improvements to the most important user experiences.
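As a rough illustration, this 80/20 triage can be sketched as a ranking over analytics counts; the task names and interaction counts below are invented for illustration, not SFGI data.

```python
# Hypothetical sketch of Pareto-style task prioritization; the task names
# and interaction counts are invented for illustration.
def pareto_tasks(task_counts, coverage=0.80):
    """Return the smallest set of top tasks covering ~`coverage` of activity.

    task_counts: dict mapping task name -> interaction count (from analytics).
    """
    total = sum(task_counts.values())
    ranked = sorted(task_counts.items(), key=lambda kv: kv[1], reverse=True)
    selected, cumulative = [], 0
    for task, count in ranked:
        selected.append(task)
        cumulative += count
        if cumulative / total >= coverage:
            break
    return selected

counts = {"find quit info": 5200, "sign up for texting program": 2600,
          "get support": 1300, "share stories": 400, "browse news": 200}
print(pareto_tasks(counts))  # the "vital few" tasks worth evaluating first
```

In practice the counts would come from each channel’s analytics, and the ranked output would feed the task list developed in Step 2.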

Our initial goal in developing the CC heuristics was to apply the heuristics to a limited set of the SFGI individual channels to determine how well they supported tasks, supported one another, and provided a cohesive and unified experience. However, this evolved into a process to address the complex nature of the CC experience. This paper proposes a tactical, structured, and focused process to first identify high-priority user tasks and channel relationships and then to apply a set of CC heuristics in the context of behavioral health.


Usability assesses how easy user interfaces are to use. UX specialists have many methods and approaches (for the top UX research methods, see Farrell, 2017) at their disposal to assess, test, and measure usability. No method is effective at identifying all usability issues on its own. No two projects are alike, and which method to use depends on many factors, including what you want to learn, business goals and needs, time and budget, resources, access to users, digital platform, and a project’s current stage of development and of UX acceptance, integration, and involvement (Woolrych, Hornbæk, Frøkjær, & Cockton, 2011). There is no prescriptive approach; it is best to use mixed methods and learn as you go. Each methodology will give you different insights. The key is to understand the problem, define the research goals and questions, then select the best method that will provide the data and insight to answer those questions. UX strategy and planning is the first, and perhaps most important, task, in which the UX specialist must evaluate the project needs and goals and then select and adapt methodologies, as needed, to provide the most user insight to inform design, content, and information architecture (IA; information hierarchy and organization structure) decisions. Heuristic evaluation and user testing are two commonly used UX methods. A heuristic evaluation is a usability inspection method, and user testing is a usability testing method (Nielsen, 1994). The primary difference between the two methods is that usability tests involve input from actual or representative users, whereas usability inspections rely on input from usability specialists.

Heuristic Evaluation

Usability heuristics are principles, guidelines, or “rules of thumb” that are applied to a website or digital interface. A conventional usability heuristic evaluation is thought of as a routine and low-cost method to uncover usability issues that can be applied at any phase of the software or web development process. There has been abundant debate over the value of heuristic evaluations as an effective method to assess usability. The major criticisms are that (a) there is no user input and (b) the findings are based on an evaluator’s limited and subjective judgment, which can result in inconsistent, incomplete, and unreliable findings. Research has found that individual evaluators may find as few as one third of all usability issues, so it is recommended that three to five evaluators conduct a heuristic evaluation to account for these gaps and inconsistencies (Nielsen, 1995a). In practice, however, heuristic evaluations are sometimes conducted with fewer than the recommended three to five evaluators.

Another drawback of relying on UX professionals rather than actual users is the potential for false positive issues identified by evaluators (Chattratichart & Brodie, 2002). Evaluators often use “representativeness[2]” (Kahneman & Tversky, 1972) to judge an interaction to be a problem simply because they have found it to be a problem many times before. However, the fact that it was previously judged a problem does not actually make it more likely to occur again (Paz [F.], Paz [F. A.], Villanueva, & Pow-Sang, 2015). In other words, when an evaluator draws on past decisions to make assumptions, he or she may find issues that might not surface in actual user testing or be relevant to the user (Paz et al., 2015).

Recently, researchers have begun exploring ways to improve and standardize the heuristic evaluation as a reliable methodology (Chattratichart & Lingaard, 2008). New and more specialized sets of heuristics are also being developed to meet the challenges of increasing and evolving technology, hardware, and user needs (de Lima Salgado & Freire, 2014; Herrmann, 2009; Jimenez, Lozada, & Rosas, 2016; Jimenez, Rusu, Roncagliolo, Inostroza, & Ruse, 2012; Korhonen & Koivisto, 2007; Paz et al., 2015). The CC heuristic evaluation is a similar effort to create a specialized expert review for CC experiences. While we acknowledge that heuristic evaluation as a methodology has been historically criticized, the methodological framework was an important foundation from which to build the CC heuristic evaluation. The goal of this new approach is not to produce comprehensive findings, but to identify the highest priority CC task and channel relationships, then use CC heuristics to evaluate the journeys that span those channels. This approach is another specialized activity for a UX professional’s toolbox that, when applied appropriately and as a supplement to other UX methodologies such as user testing and metrics, should have a significant impact on UX.

CC Heuristic Evaluation

Similar to conventional usability heuristics, the CC heuristics are general guidelines to be applied to the vast array of channels and CC tasks within an entire digital ecosystem. Applying usability heuristics to multiple channels is far more complex than applying them to an individual channel. Each device, channel, need, and the context of use must be considered. Just as a recipe requires a knowledgeable cook to turn ingredients into a delicious dinner, applying CC heuristics requires the sound knowledge and judgment of experienced UX specialists to turn the ingredients—a comprehensive understanding of organizational goals, user needs, and UX best practices across all channels—into quality findings.

There are many articles and discussions on CC UXs; however, we were unable to find examples of actionable guidelines that covered the multitude of channels represented by the SFGI, so they were of limited use. Most research has focused on (a) marketing and retail, (b) creating a broader, long-term CC strategy, or (c) only web platforms (e.g., web responsiveness across desktop and mobile devices), none of which could be applied across all channels. For example, CC evaluation methods that center on IA focus almost exclusively on websites across devices and physical locations (e.g., web, mobile, tablet, retail store; Fisher, Norris, & Buie, 2012). The web browser has been the primary media platform evaluated for these devices, and all other platforms receive only cursory mention. While a consistent IA is always crucial, some media platforms, such as those found in the SFGI channel lineup—including social media, texting programs, and apps—have different goals, features, and functionality that require unique IA content and structure. Even with these IA limitations, the experience still needs to be consistent, seamless, available, and context specific (Flaherty, 2017). It should feel familiar in design (which may include branding) and IA in order to enable uninterrupted mental processing.

Design and Development of a New Process Model

We began our research by conducting an informal, unstructured heuristic evaluation on a limited set of SFGI channels while focusing on the CC experience. We immediately found obvious inconsistencies, gaps, and opportunities for improvement. However, the large number of possible user tasks across all of the channels and the disparate issues made it difficult to focus. We needed a structured and systematic approach, so the CC heuristic evaluation evolved into a five-step process. The steps are as follows: (a) identify all organization channels, (b) identify and prioritize user tasks, (c) identify appropriate channels for each task, (d) assess and prioritize CC relationships, and (e) conduct a CC heuristic evaluation.

Step 1: Identify All Organization Channels

We took an inventory of all channels supported by the SFGI and mapped them to the digital media platforms that facilitated their use. We documented the channels in Table 1 (see Appendix B for template). Each “X” on the grid represents a potential channel to evaluate.

Table 1. Program Channel Inventory

[Grid not recoverable from the source: rows list devices (including Smart Phone and Mobile Phone) and columns list the Digital Media Platforms; each “X” on the grid marks a potential channel to evaluate.]

Each of the various SFGI digital media platforms has discrete native features and functionality, business goals, and user goals. Each platform delivers information in distinct ways. It is important to keep this in mind when evaluating channels. A channel should only be used when it can efficiently and effectively facilitate the user continuing or completing his or her goal or task, within its normal context of use.
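The inventory grid lends itself to a simple sketch. The device and platform lists below are assumptions drawn from the categories named in the text, not the actual SFGI lineup; each marked cell stands for one channel.

```python
# Illustrative sketch only: the device and platform lists are assumptions
# based on the categories named in the text, not the actual SFGI grid.
DEVICES = ["desktop", "tablet", "smartphone", "mobile phone"]
PLATFORMS = ["browser", "app", "text/SMS", "email", "social media"]

# Each marked (device, platform) cell is a potential channel to evaluate.
inventory = {
    ("desktop", "browser"), ("smartphone", "browser"),
    ("smartphone", "app"), ("smartphone", "text/SMS"),
    ("smartphone", "social media"), ("mobile phone", "text/SMS"),
}

def channels(inventory):
    """List channels in grid-reading order (device rows, platform columns)."""
    return [(d, p) for d in DEVICES for p in PLATFORMS if (d, p) in inventory]

for device, platform in channels(inventory):
    print(f"{platform} on {device}")
```

Representing the grid as explicit (device, platform) pairs makes it easy to enumerate exactly which channels an evaluation must cover.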

For our evaluation purposes, we selected three sample channels based on a cross-section of devices, digital media platforms, and the SFGI’s priorities. The channels selected for the initial CC heuristic evaluation are as follows:

  • Smokefree.gov website (desktop)
  • SmokefreeUS Twitter social media (smartphone)
  • SmokefreeTXT (SFTXT) texting program (standard mobile phone or smartphone)

Step 2: Identify and Prioritize User Tasks

The priority user tasks are one of the most important factors in channel development, so we felt it critical to assess how well each priority channel supported these tasks. Understanding the relationship between a task, device, digital media platform, and the context of use is critical to designing a great experience. We identified primary tasks based on the program goals, user needs, and the sample channels we selected. The SFGI user tasks, in priority order, for the initial evaluation are as follows:

  1. Find information to help quit smoking (for people getting ready to quit).
  2. Sign up for the SFTXT program.
  3. Find information and resources about how to quit smoking during the quit journey.
  4. Get support while in the process of quitting.
  5. Share stories.

Step 3: Identify Appropriate Channels for Each Task

Next, we mapped the task and channel relationships. The goal is to identify important user tasks within a channel or the ecosystem and assess how each channel can and should support that task. For each task, we documented the channel’s current support of the task and whether it was appropriate for that channel to support the task. To analyze the context of use and the channel’s strengths, it is necessary to understand how the device and media platforms are best used for a situation within the larger user journey (Flaherty, 2017). Flaherty (2017) summarized common device strengths, contexts of use, and assumptions. Recognizing, understanding, and predicting these channel and task relationships is crucial to a successful study. We initially charted the relationship between the channel and task as either a Yes or No. However, this binary measurement was not sufficient to capture the subtleties and nuances. For example, we found that each channel may or may not support a task, in part or in its entirety. We also found that while a channel may currently support a task, the frequency of use and/or the need for the channel to support that task could vary significantly. To account for this variance, we created a rating scale. Both the current support and appropriateness were measured on a scale of Always, Sometimes, Rarely, and Never.
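A minimal sketch of how these paired ratings might be recorded; the tasks, channels, and ratings below are placeholders, not actual SFGI findings.

```python
# Illustrative sketch: tasks, channels, and ratings below are placeholders,
# not actual SFGI findings. Each cell pairs two ratings on the
# Always/Sometimes/Rarely/Never scale.
matrix = {
    # (task, channel): (Currently Supports Task, Appropriate for Channel)
    ("find quit info during journey", "SFTXT"):   ("Rarely", "Always"),
    ("find quit info during journey", "website"): ("Always", "Always"),
    ("share stories",                 "SFTXT"):   ("Never",  "Never"),
}

def cells_for_task(matrix, task):
    """Collect every channel's rating pair for one task."""
    return {ch: ratings for (t, ch), ratings in matrix.items() if t == task}

print(cells_for_task(matrix, "find quit info during journey"))
```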

The SFGI texting program, SmokefreeTXT (SFTXT), delivers messages to users on their mobile phones to help them quit smoking. The texting program has been developed with very specific goals driven by clinical guidelines. The length of the program, number of messages, language, and support have been carefully constructed to facilitate and influence behavior change.

Table 2 illustrates an example where we assessed the current support and appropriateness of the SFTXT texting program channel with the task “Find information and resources about how to quit smoking during the quit journey.” We first looked at whether Channel 2, SFTXT, currently supported the task. We discovered that the text messages very rarely referenced any other channels for related information, support, and services. The website, in particular, has content that was strongly correlated to the messages. We immediately identified the opportunity to improve the SFTXT program messages by cross-linking specific information and resources on the website. The cross-linking also benefited the business goals by increasing website traffic to a specific targeted audience.

Table 2. Task Channel Matrix

Task 1: Find information and resources about how to quit smoking during the quit journey.

For each channel, two ratings were recorded: Currently Supports Task and Appropriate for Channel.

  • Channel 1
  • Channel 2: SmokefreeTXT (SFTXT; Text Program/Cell Phone)
  • Channel 3: SmokefreeUS Twitter (Social Media/Smartphone)

[Cell ratings are not recoverable from the source.]
The website has ongoing campaigns to support the overall website goals. Specific campaign pages and tailored content are often targeted to specific audiences, such as pregnant women. SFGI also has specific text message programs for specific audiences, such as SmokefreeMOM for pregnant women. Links to this tailored content would be a great benefit for users who sign up for the SmokefreeMOM texting program, but would be of limited benefit for the SFTXT text message program channel. Only a small fraction of those who sign up for this program are pregnant and so most users would not be interested in this content. In fact, if users of the SFTXT channel received a message specifically for pregnant women, this may confuse, upset, and distract many of the users, which may result in a loss of confidence in the program, or they may even choose to opt out. For this reason, it would not be appropriate.

Step 4: Assess and Prioritize CC Relationships

Next, we used the Task Channel Matrix to identify and prioritize the channels that best supported each task. For instance, if a channel is rated Never for Currently Supports Task and Never for Appropriate for Channel, then it is clear that the channel does not need to support the task and should not be evaluated. However, if a channel is rated Rarely for Currently Supports Task and Always for Appropriate for Channel, then the channel should likely be a priority and considered for the CC heuristic evaluation. Though results will vary, channels that were always appropriate tended to be the highest priority, particularly when a channel was always appropriate but never or rarely supported a task.
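One way to operationalize this triage is a simple scoring rule. The formula below is our own illustrative assumption (the article describes the logic only qualitatively): channels rated highly appropriate but poorly supported score highest.

```python
# The scoring formula here is our own assumption for illustration; the
# article describes the triage qualitatively, not numerically.
SCALE = {"Never": 0, "Rarely": 1, "Sometimes": 2, "Always": 3}

def cc_priority(currently_supports, appropriate_for_channel):
    """Score a channel for CC evaluation: appropriateness times support gap."""
    s, a = SCALE[currently_supports], SCALE[appropriate_for_channel]
    if a == 0:
        return 0                        # never appropriate: skip evaluation
    return a * (SCALE["Always"] - s + 1)

print(cc_priority("Rarely", "Always"))   # 9: strong evaluation candidate
print(cc_priority("Always", "Always"))   # 3: appropriate and well supported
print(cc_priority("Never", "Never"))     # 0: skip
```

Any monotone rule with the same ordering would serve; the point is to turn the matrix ratings into a ranked evaluation queue.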

Step 5: Conduct a Cross-Channel Heuristic Evaluation

After identifying the priority channels to evaluate, we evaluated the channels against an initial set of CC heuristic guidelines based on Nielsen’s 10 usability heuristics (Nielsen, 1995a) and CC principles repeatedly found in existing CC research (Fisher et al., 2012; Nielsen Norman Group, 2014; Tate, 2012), including consistency, continuity/seamlessness, availability, and optimized/context. During the evaluation, the CC heuristics continued to evolve as we assessed the channels and encountered obvious recurring CC violations. After several iterative improvements, three senior UX staff at ICF, each with more than 7 years of experience, reviewed the heuristics and provided feedback. Afterwards, we held two review meetings with the same three senior professionals and two additional junior-level professionals (each with less than 3 years of experience) to fully discuss and debate the feedback. After further development, we presented our process and heuristics for open discussion at the 2016 DC Mobile UX Camp, where about a dozen UX professionals attended and four provided direct input. The final digital CC heuristics are listed in Appendix C.

The CC heuristics are as follows:

  1. Consistent behavior: Provide equivalent ways to complete a given task. Where appropriate, however, give precedence to following heuristic #3 (Use of channel’s strengths: specific channels’ native technology, design, user interactions), and usability best practices while maintaining a coherent overall experience throughout the channels. The channels should support tasks, in part or in whole, or continue task processes, in similar or familiar ways across appropriate channels.
  2. Appropriate channels: Only use channels that add perceived value to the service or task. Perceived value is the difference between the benefits and costs of the experience, when compared with other experiences. The channels must efficiently and effectively facilitate the completion or continuance of a goal or task. Be sure each channel is appropriate to the content, user, context of use, service, and task.
  3. Use of channel’s strengths: Use each channel’s native features and best practices to optimize the user experience. The channel should engage the user and not make it difficult to perform any part of the task.
  4. Seamless experience: Determine whether a continuous and seamless experience is expected and necessary across channels. Design for an entire journey and integrated experience, not for an individual channel or individual task. If the task is continuous, when possible and if necessary, the channel should recognize the task context from the previous channel. All channels should integrate and link relevant and related content across channels.
  5. Information design and content: Use a single voice and message to provide a clear message. Use consistent terms and language. Information should maintain a consistent structure, organization, and hierarchy across channels.
  6. Branding: Apply branding and design in a familiar and consistent way. Build on brand familiarity, value, experience, and culture across all channels.
  7. Appropriate to audience: Select and design channels that are relevant and appropriate for the users. Understand the users’ context, domain and technical knowledge, technology, physical limitations, and preferences when deciding which channels to use.

We developed three distinct CC heuristic evaluation approaches. The first approach, task-based, evaluates priority tasks and how priority channels facilitate their completion. The second approach, channel-based, evaluates the CC heuristics of a single channel. The third approach, heuristic-based, assesses how well channels conform to a single CC heuristic.

The CC heuristic evaluation findings report is modeled after a standard heuristic evaluation findings and recommendations report (see Appendix D for examples of findings), which typically includes some or all of the following information: issue number, area of evaluation (categorization of issues), heuristic violated, issue description, screenshot with highlight and caption (optional), UX severity, and frequency.

The most noticeable difference between the standard heuristic report and the CC report is that the CC findings are grouped by task, channel, or heuristic, and each violation may be reported in the context of one or more channels. For our research, we documented the following data from a standard heuristic evaluation table: heuristic, issue, and severity. We also recorded the following data that are not on a standard heuristic evaluation: task and impacted channel.
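The finding record and the three grouping approaches can be sketched as follows; the field names and sample findings are illustrative, mirroring the data we recorded rather than a fixed schema.

```python
# Hypothetical sketch: field names and sample findings are illustrative,
# mirroring the data recorded in our tables rather than a fixed schema.
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class Finding:
    heuristic: str              # CC heuristic violated
    issue: str                  # description of the violation
    severity: int               # e.g., 1 (cosmetic) to 4 (blocker)
    task: str                   # affected user task
    impacted_channels: tuple    # channels where the violation appears

findings = [
    Finding("#5 Information design and content",
            "CTA label is 'Quit' on the website but 'Start' on Twitter",
            3, "Sign up for SFTXT", ("website", "Twitter")),
    Finding("#6 Branding", "Logo treatment differs across channels",
            2, "Find quit info", ("website", "Twitter")),
]

def group_by(findings, attr):
    """Group findings by "task", "heuristic", or any other record field."""
    groups = defaultdict(list)
    for f in findings:
        groups[getattr(f, attr)].append(f)
    return dict(groups)

print(sorted(group_by(findings, "heuristic")))
```

Grouping the same records by task, by channel, or by heuristic yields the three report views described in Steps 5a–5c.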

Step 5a: Task-based evaluation

Taking one task at a time, we used the CC heuristics to evaluate how well each of the channels facilitated the task, but only within the context of normal channel use. We documented our findings in Table 3. As we evaluated the heuristics related to the tasks, we came across many stray issues unrelated to the task. We realized that we needed another way to address the heuristics of these violations. We documented these issues, then refocused our attention on the task.

Table 3 illustrates a sample task: Sign up for the texting program. The example cites a violation related to a call-to-action (CTA) label. The labels used for this CTA on the website and social media channels were inconsistent and contradictory. The purpose of the CTA was to persuade users to text a specific word via SMS message; this action would automatically sign up users for a smoking cessation texting program. The website instructed the user to text the term “Quit,” referring to quitting smoking now; the Twitter communications asked the user to text the term “Start,” referring to starting the program. In the context of the individual channels, the terms made sense; however, as an integrated ecosystem, the two channels were not aligned. We determined that this was a high-priority issue because the inconsistent messaging across channels was very confusing. Best of all, the change required only a low level of effort.

Step 5b: Channel-based evaluation

We used the second approach to assess a single channel against all or select heuristics. The channel-based evaluation is a general, high-level assessment of the overall CC usability and consistency of an individual channel as it relates to information, messaging, tone of voice, IA, design, and branding across the entire ecosystem. To re-address the unrelated violations set aside during the task-based evaluation, we categorized those issues by channel. As expected, the greatest number of violations was found on the website because the scope, size, and complexity of the website far exceeded those of all other channels. We reviewed and prioritized the violations based on organizational and user priorities and the expected level of effort. Table 4 illustrates an example where we found violations of multiple heuristics on the website channel. The two primary issues in this example are (a) the lost opportunity for cross-promotion to other channels and (b) messaging and language inconsistency with the social media channels.

Step 5c: Heuristic-based evaluation

As we completed the task-based evaluation and then the channel-based evaluation, we again looked back at the unrelated violations to explore whether there were other ways to manage and address the disparate issues. What became clear was that a single heuristic was often consistently violated on multiple channels. After categorizing a small number of violations by individual heuristic, we determined that this would be another constructive way to focus the work and keep it manageable. This evaluation technique has not been fully tested and is more a proof of concept, as we conducted only cursory research. However, the limited violations we reviewed were easily grouped and evaluated, and the categorization allowed us to concentrate on a very narrow set of violations. Because they were grouped in a way similar to the channel-based evaluation, we were able to easily repurpose the channel-based table.

In all cases, we found that the key to success is to work on small segments of the overall ecosystem and follow each through to the end. This kept the work from growing out of control; it was very easy to wander off on tangents when unrelated issues surfaced that we wanted to solve.

Table 3. Findings and Recommendations | Task-based Evaluation Table

Task: Sign up for the TXT program.

Applies to: [channels not recoverable from the source.]

Heuristic violated: #5: Information Design and Content

Issue: Twitter and the website use different terms to enroll in the SmokefreeTXT texting program. Twitter uses “Start.” The website uses “Quit.” Using two different words for the same functionality can be confusing to the user.

Recommendation: The messaging and labels should be consistent for all channels.

Table 4. Findings and Recommendations | Channel-based Evaluation Table

Channel: website

Heuristics violated: #4: Seamless Experience; #6: Branding

All channels are represented on the website; however, the channels are not adequately or consistently promoted or explained. There is no clear way to learn about all of the channels available.

The social media programs do not have adequate exposure on the website. There is no mention of the social media channels on any webpages. The only exposure to the Smokefree social media channels is through a small icon on the footer. These icons do not adequately promote the social media programs for the value they provide to the users.

Here are some examples of how to provide more visibility for the social media programs:

· Create a webpage that lists all of the support channels to better integrate products and inform users about the benefits of all the products and services.

· Promote these programs on the homepage.

· Cross-promote the website more on each social media channel.


This process grew out of the need to assess the CC usability of a limited set of SFGI channels. No existing heuristics took into account the complexity and diversity of these channels, so we created our own. Our initial vision was to produce a set of guidelines; however, the model evolved into a process because of the complexity of the ecosystem and the difficulty we had prioritizing and managing so many variables and issues. This process forced us to narrow our scope to a manageable size. The final heuristics evolved through reviewing the available research, applying different techniques and learning from the outcomes, employing UX peer review, and discussing issues and options with our peers. This iterative process allowed us to review, refine, and test our concept many times. The process has been applied only a limited number of times, and, based on this limited application, we encourage flexibility when applying it.

The CC heuristic evaluation process provides a framework to prioritize and assess tasks and channel relationships across an entire ecosystem of channels, as well as guidance on CC best practices. This method can be used by UX practitioners in all industries to evaluate a wide array of digital channels. We must always think beyond an individual channel and instead consider the entire journey. Sound UX knowledge and judgment are critical when conducting a CC heuristic evaluation due to the multifaceted variables, complex technologies, and contexts of use surrounding CC experiences. Every situation must be carefully assessed and ideally tested with users to realize and validate a successful UX. This process has very specific goals and outcomes and should not be the only UX methodology used to determine single-channel or comprehensive CC usability. The approach is most effective when used in conjunction with other UX tools, methods, or activities. A step-by-step guide can be found in Appendix E.


This approach was tested with a limited number of channels and only within the context of behavioral health. Future research should apply these heuristics to other services and lines of business to learn more about the utility of the approach. Two usability experts conducted independent heuristic evaluations for the test case; however, there was no inter-rater reliability assessment, even though the quality and experience of coders can affect reliability. In our experience as a UX practice in a large general consulting firm, we are often asked to conduct lean expert reviews with only one or two usability experts, although research has shown that heuristic evaluation is far more reliable when conducted by multiple experts (Nielsen, 1995a). The evaluator’s skills, experience, judgment, and domain knowledge will also have a significant impact on the reliability and effectiveness of this method.

Heuristics are general best practices for the UX professional to use as guidelines for identifying possible problems. The issues identified are not user-informed; instead, they rely on the professional judgment, insight, and knowledge of the practitioner. Anyone can apply best design practices; however, to best apply these guidelines as UX practitioners, even without access to users, we must learn as much as we can about the users, their context, behaviors, needs, and goals. We must keep in mind that all solutions from these assessments are hypotheses and should be used along with other user research to make the best possible decisions. There are many exceptions to these rules, so when possible, all design decisions should be validated with user research to determine whether or not the changes improve the experience. Future research should also consider assessing the reliability of multiple coders (Botella, Alarcon, & Peñalver, 2014). In addition to having a solid UX foundation in all channels being evaluated, the evaluator must also understand the business, processes, products, and end users.


Successful CC UX in the behavioral health digital world is increasingly important. UX professionals must always consider how channels interact and facilitate tasks across the ecosystem. Most often, users control many of these variables and expect them to work on their terms. Users, device type, digital media, information, context, and technology are all variables that influence this experience. To enable the best UX, we must anticipate and simplify the process and align the journey with user expectations.

Due to the array of multifaceted variables surrounding CC experiences, a standardized and repeatable process to focus on priority tasks and channels is necessary before beginning a CC heuristic evaluation. This method—which attempts to focus on the top 20% of priority content to potentially reach 80% of the expected actions or outcomes—gives UX professionals the option to assess CC usability on a limited set of highly impactful experiences. Successful limited-scope CC design can be achieved and improved with subtle changes given targeted priorities. We acknowledge this approach is not a holistic solution that addresses the entire CC ecosystem. Instead, it is a way to focus on the most important tasks and channels in a given situation, as part of a broader UX strategy using other UX methodologies. The findings from this effort can be limited in scope but significant when looking at the overall benefits to the user’s experience. We found that this type of limited testing, especially when it focuses on priority interactions, channels, tasks, users, and content, does have practical real-world value and benefits. This method will continue to evolve over time along with technology, information, and services. Future research is needed to assess how reliable this approach is in other settings.

Tips for Usability Practitioners

The following are tips and recommendations from our experience in developing and applying the CC heuristics:

  • You will come across many tangential usability issues while assessing priority CC tasks, channels, and heuristics. While it is easy to become sidetracked, we recommend recording these issues in a backlog and moving on with the current analysis; otherwise, you can end up spending hours going down countless rabbit holes, which makes it difficult to complete the evaluation. Then, as time permits, go back and prioritize the backlog for major usability issues that need immediate attention.
  • While documenting the available channels it may also be helpful to gather the following additional information to help with understanding and context:
    • Primary use, purpose, and goals
    • Primary audience
    • Channel points of contact (content manager, developer, project manager)
  • We do not discuss the research that must go into clearly defining user needs, tasks, and journeys. There are many existing user-centered design research approaches and methods that can be used for this purpose (see Farrell, 2017).
  • While assessing tasks and journeys in specific channels, analyze whether there are opportunities for users to utilize the features or functionality in other channels. We have found this is an ideal time to identify gaps where channels do not currently support a task, but have the features, functionality, and user need to do so.
  • A channel’s unique features and native heuristics (heuristic #3) should almost always override consistent behaviors across channels (heuristic #1). These native heuristics may include interactions, functionality, layouts, and content. One example of native functionality can be found in a website photo gallery. Most galleries have right and left arrow (caret) affordances to indicate when there is a previous or next picture to view. In a desktop browser, a user needs to click on the caret to scroll to the next image, whereas the native behavior on a mobile device is to swipe or tap. The user’s expectation on a phone is therefore different than in a desktop browser, and you would want to enable swipe as an interactive element despite the inconsistency across channels.


We thank all the reviewers who provided valuable feedback to the process, the heuristics, as well as the paper, including Kisha Coa, Iva Stoyneva, Jasper Liu, Jane Robbins, Ben Harper, and Tim Gregory. We thank the management for supporting the work to make digital products user-centric and the best they can be, including Erik Augustson, Meredith Grady, Amy Sanders, and Mary Schwarz.


Aldrich, J. (2016, March 11). What is the Pareto principle? UX Magazine (Article No. 1587). Retrieved May 18, 2017, from

Botella, F., Alarcon, E., & Peñalver, A. (2014). How to classify to experts in usability evaluation (Article No. 25). Interacción ’14: Proceedings of the XV International Conference on Human Computer Interaction. Retrieved May 18, 2017, from

Chattratichart, J., & Brodie, J. (2002, September 1). Extending the heuristic evaluation method through contextualisation. Proceedings of the Human Factors and Ergonomics Society Annual Meeting, 46(5), 641–645. Retrieved May 18, 2017, from

Chattratichart, J., & Lindgaard, G. (2008). A comparative evaluation of heuristic-based usability inspection methods. CHI EA ’08: Extended Abstracts on Human Factors in Computing Systems (pp. 2213–2220). Retrieved May 18, 2017, from doi:10.1145/1358628.1358654

de Lima Salgado A., & Freire A. P. (2014). Heuristic evaluation of mobile usability: A mapping study. In M. Kurosu (Ed.), Human-computer interaction: Applications and services, HCI 2014. Lecture Notes in Computer Science: Vol. 8512 (pp. 178–188). Springer, Cham.

Farrell, S. (2017, February 12). UX research cheat sheet. Nielsen Norman Group. Retrieved May 18, 2017, from

Fisher, J., Norris, S., & Buie, E. (2012). Sense-making in cross-channel design. Journal of Information Architecture, 4(1-2). Retrieved May 18, 2017, from

Flaherty, K. (2013a, October 27). Consistency in the omnichannel experience. Nielsen Norman Group. Retrieved May 18, 2017, from

Flaherty, K. (2013b, November 24). Seamlessness in the omnichannel user experience. Nielsen Norman Group. Retrieved May 18, 2017, from

Flaherty, K. (2017, February 26). Optimizing for context in the omnichannel experience. Nielsen Norman Group. Retrieved May 18, 2017, from

Herrmann, T. (2009). Design heuristics for computer supported collaborative creativity. Proceedings of the 42nd Hawaii International Conference on System Sciences. Retrieved May 18, 2017, from

Jimenez, C., Lozada, P., & Rosas, P. (2016). Usability heuristics: A systematic review. 2016 IEEE 11th Colombian Computing Conference. Retrieved May 18, 2017, from

Jimenez, C., Rusu, C., Roncagliolo, S., Inostroza, R., & Rusu, V. (2012). Evaluating a methodology to establish usability heuristics. 2012 31st International Conference of the Chilean Computer Science Society. Retrieved May 18, 2017, from

Kahneman, D., & Tversky, A. (1972). Subjective probability: A judgment of representativeness. Cognitive Psychology, 3, 430–454. Retrieved May 18, 2017, from

Korhonen, H., & Koivisto, E. M. I. (2007). Playability heuristics for mobile multi-player games. DIMEA ’07: Proceedings of the 2nd International Conference on Digital Interactive Media in Entertainment and Arts. Retrieved May 18, 2017, from

Nielsen, J., & Mack, R. L. (Eds.). (1994). Usability inspection methods. New York, NY: John Wiley & Sons. Retrieved May 18, 2017, from

Nielsen, J. (1995a, January 1). How to conduct a heuristic evaluation. Nielsen Norman Group. Retrieved May 18, 2017, from

Nielsen, J. (1995b, January 1). 10 usability heuristics for user interface design. Nielsen Norman Group. Retrieved May 18, 2017, from

Nielsen Norman Group. (2014, March 16). Availability in the cross-channel user experience. Retrieved May 18, 2017, from

Paz, F., Paz, F. A., Villanueva, D., & Pow-Sang, J. A. (2015). Heuristic evaluation as a complement to usability testing: A case study in web domain. 2015 12th International Conference on Information Technology: New Generations. Retrieved May 18, 2017, from doi:10.1109/ITNG.2015.92

Roller, C. (2011, July 4). How cognitive fluency affects decision making. UXmatters. Retrieved May 18, 2017, from

Tate, T. (2012, December 21). Investigating cross-channel consistency: When designing for cross-channel experiences, it’s helpful to examine the realm and nature of consistency. UX Magazine (Article No. 926). Retrieved May 18, 2017, from

Woolrych, A., Hornbæk, K., Frøkjær, E., & Cockton, G. (2011). Ingredients and meals rather than recipes: A proposal for research that does not treat usability evaluation methods as indivisible wholes. International Journal of Human-Computer Interaction, 27(10), 940–970.

[1] A digital channel is the intersection of a digital device (computer, tablet, smartphone, cell phone) and a digital media platform (web browser, apps [web, phone, tablet], email, SMS texting, social media).

[2] The representativeness heuristic was first described by psychologists Amos Tversky and Daniel Kahneman during the 1970s. Like other heuristics, making judgments based on representativeness is intended to work as a type of mental shortcut, allowing us to make decisions quickly. However, it can also lead to errors.

Appendix A: National Cancer Institute’s Smokefree.gov Initiative (SFGI) Digital Channels

SMS/Text Programs:

· Spanish language

· Practice quit

· Daily challenges

· Veteran (VHA)

· Military (DoD)

Social Media Accounts:

· SmokefreeUS Facebook

· Smokefree Women Facebook

· SmokefreeVET Facebook

· SmokefreeUS Twitter

· SmokefreeUS Instagram

· SmokefreeUS Pinterest

· SmokefreeUS YouTube

· SmokefreeWomen YouTube

· SmokefreeTeen YouTube

· Springboard Beyond Cancer Pinterest

Appendix B: Program Channel Inventory

Inventory of Program Channels

Rows are digital devices and columns are digital media platforms; each intersection is a channel. Mark an “X” where the program supports the channel.

Device                  | Web Browser | Apps | SMS Text | Email | Social Media
Computer                |             |      |          |       |
Tablet                  |             |      |          |       |
Smart Mobile Phone      |             |      |          |       |
Non-Smart Mobile Phone  |             |      |          |       |
Appendix C: Seven Usability Heuristics for Digital Cross-Channel User Experiences






#1: Consistent behavior

All appropriate channels should provide equivalent ways to complete a given task. However, the user interface for each channel needs to support its typical usage to align with user expectations. Where appropriate, give precedence to specific channels’ native technology, design, user interactions, and usability best practices while maintaining a coherent overall experience throughout the channels. The channels should be designed in a way that users can easily and efficiently perform tasks or continue task processes in similar or familiar ways across appropriate channels.

Can you easily, efficiently, and consistently perform a specific task in all the channels where it exists?



#2: Appropriate channels

Only use channels that add perceived value to the service or task. Perceived value is the difference between the benefits and costs of the experience, when compared with other experiences. The channels must efficiently and effectively facilitate the completion or continuance of a goal or task. Be sure each channel is appropriate to content, users, context of use, service, and tasks.


Are the channels used appropriate to

· content?

· user?

· context of use?

· service?

· task?

Does the channel provide perceived value to the overall experience?

Does the channel help facilitate the service and/or task goals?

Is the content written and delivered appropriate for the channel?


#3: Utilization of channel’s strengths

Utilize each channel’s native features and best practices to optimize UX. The channel should engage the user and not make it difficult to perform any part of the task.

Are the channel’s native features utilized to provide the best possible UX?


#4: Seamless experience

Determine whether a continuous and seamless experience is expected and necessary across channels. Design for an entire journey and integrated experience, not for an individual channel or individual task. If the task is continuous, the channel should recognize the context in which the users left another channel. Integrate and link content across channels to provide a unified extended space that facilitates content discovery, engagement, and task completion.

Does the user want to start where he or she left off?

Should the task be continuous? If so, the channel should recognize the context in which they left another channel.

Is the channel appropriate to continue the task?

Do channels provide integrated supporting services, content, or resources across channels?


#5: Information design and content

All channels should use a single voice (and message) to provide a clear message. Use consistent terms and language.

Is the content written and delivered in a consistent, singular voice?

Is the content organized and presented consistently across channels?



#6: Branding

Branding and design should be consistent across all channels. Build on brand familiarity, value, experience, and culture across all channels.

Is the branding consistent and noticeable across all channels?


#7: Appropriate to audience

Select and design channels that are suitable for the users so that they can quickly and easily use the services offered. Understand the users’ domain and technical knowledge, technology, physical limitations, and preferences when deciding which channels to use. Only use channels that are relevant to the audience.

Are the channels appropriate for the audience?

Appendix D: Examples of Findings

Below are examples of how we communicate findings to clients. We produce summary reports that include the following for each issue we identify: an issue number for reference, a clear description of the issue, an example (often a screenshot), the heuristic violation, a severity, and an actionable recommendation. Depending on the purpose and goals, reports are generally grouped by either (a) types of violations (e.g., navigation, content, visual design and layout, and interactivity) or (b) pages, page types, and components (e.g., site-wide, home page, landing page, and individual components). The issue number refers to more detailed documentation.

Image of a screenshot with the title "Site Wide Issue," listing the issue, violated heuristic, and recommendation for improvement.

Figure D1. Example of a violation.

Image of a screenshot with the title "Home Page and Navigation," with a description of the issue, the issue type (Consistency, Visual Assistance), a severity level, and a recommendation.

Figure D2. Example of a violation.

Appendix E: A “How To” Guide to Performing a Cross-Channel Heuristic Evaluation

The following steps outline the process to conduct a CC heuristic evaluation using this model.

Step 1: Identify Organization Channels

Before starting a CC heuristic evaluation, it is necessary to take an inventory of all channels supported by the product or service. Use Appendix B to document the existing digital media platforms and supported devices. A channel is the intersection of a digital media platform and a device. Place an “X” at every intersection where the program supports the channel. If the plan is to conduct a Channel-based Evaluation or a Heuristic-based Evaluation, skip to Step 5. However, if you do not complete Steps 2–4, you will miss out on the very useful insight gained from identifying and prioritizing tasks and assessing task-channel relationships.
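As an illustration only (not part of the published method), the channel inventory in this step can be sketched in code: a channel is each device-and-platform intersection marked with an “X.” All names below are hypothetical.

```python
# Illustrative sketch of the Appendix B inventory grid. Devices and platforms
# come from the paper's definitions; the example marks are hypothetical.
DEVICES = ["computer", "tablet", "smart mobile phone", "non-smart mobile phone"]
PLATFORMS = ["web browser", "apps", "SMS text", "email", "social media"]

def supported_channels(marks):
    """Return every (device, platform) intersection marked with an 'X'."""
    return [(d, p) for d in DEVICES for p in PLATFORMS if marks.get((d, p)) == "X"]

# Example: a program with a website for computers/tablets and SMS for phones.
marks = {
    ("computer", "web browser"): "X",
    ("tablet", "web browser"): "X",
    ("smart mobile phone", "SMS text"): "X",
    ("non-smart mobile phone", "SMS text"): "X",
}
print(supported_channels(marks))
```

The list of supported intersections is the set of channels carried forward into Steps 2–5.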

Step 2: Identify and Prioritize User Tasks

Identify all priority user tasks supported by each channel. These may be selected based on organizational goals and/or user goals. See Susan Farrell’s (2017) article “UX Research Cheat Sheet” for a list of standard user research methods to identify priority tasks.

Step 3: Identify Appropriate Channels for Each Task

Evaluate each task to check how well existing channels support the heuristics for that task (for more information on usability heuristics for user interface design, see Nielsen, 1995b). For each task, list all channels being evaluated (channel name). Next, assess (a) whether the channel currently supports the task and (b) whether the task is appropriate for the channel, using the scale Always, Sometimes, Rarely, or Never.

Step 4: Assess and Prioritize CC Relationships

From this data, identify the priority channels that best support each task. For example, if the results indicate Never for Currently Supports Task and Never for Appropriate for Channel, then it is clear that the channel does not need to support the task. However, if the results show Rarely for Currently Supports Task and Always for Appropriate for Channel, then it is evident that the channel should be a priority and needs to be assessed to determine the best way to support the task to meet users’ needs.
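The comparison described above can be sketched as a simple rule. This is our own illustration: the numeric scale values and gap thresholds are assumptions, not part of the published method.

```python
# Hypothetical sketch of the Step 4 prioritization: compare how often a channel
# currently supports a task with how appropriate the channel is for that task.
SCALE = {"Never": 0, "Rarely": 1, "Sometimes": 2, "Always": 3}

def priority(currently_supports, appropriate_for_channel):
    if SCALE[appropriate_for_channel] == 0:
        return "ignore"          # channel does not need to support the task
    gap = SCALE[appropriate_for_channel] - SCALE[currently_supports]
    if gap >= 2:
        return "high priority"   # appropriate channel with weak current support
    if gap == 1:
        return "review"
    return "adequate"

print(priority("Never", "Never"))    # the channel does not need the task
print(priority("Rarely", "Always"))  # the gap described in the example above
```

The two printed cases mirror the examples in the paragraph: Never/Never falls out of scope, while Rarely/Always surfaces as a priority gap.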

Step 5: Conduct a Cross-Channel Heuristic Evaluation

Based on the priorities and goals of your evaluation, select either Step 5a: Task and Channel Evaluation, Step 5b: Channel-based Evaluation, or Step 5c: Heuristic-based Evaluation.

Step 5a: Task and Channel Evaluation

This evaluation assesses how each channel can and should support each priority task. Select individual tasks to evaluate and assess usability across channels, then document the findings. For each task being evaluated, record the following: the heuristic being violated within and across each channel; the issue, recommendation, and severity (low, medium, high); and each channel that has been identified as appropriate for the task and is impacted by the heuristic violation. Though it’s not documented in the example, a severity can be set for each channel that violates the heuristic.
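A findings record for this step might be structured as follows. This is a minimal sketch: the field names and example values are hypothetical, though the fields mirror the items the step asks you to record.

```python
# Hypothetical schema for one Task and Channel Evaluation finding.
from dataclasses import dataclass, field

@dataclass
class Finding:
    task: str
    heuristic: str                 # e.g., "#1: Consistent behavior"
    issue: str
    recommendation: str
    severity: str                  # "low", "medium", or "high"
    affected_channels: list = field(default_factory=list)

finding = Finding(
    task="Sign up for a text program",
    heuristic="#1: Consistent behavior",
    issue="Sign-up labels differ between the website and SMS prompts.",
    recommendation="Use the same labels and messaging in both channels.",
    severity="medium",
    affected_channels=["website (browser/computer)", "SMS (smartphone)"],
)
print(finding.severity)
```

A list of such records can then be grouped by task, channel, or heuristic to produce the summary reports shown in Appendix D.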

Step 5b: Channel-based Evaluation

Identify and document priority channels to evaluate and assess the specific channel usability. When conducting a channel-based evaluation, assess the channel for general cross-channel usability issues. Identify instances where each heuristic is being violated within and across each channel in the ecosystem. Issues found using a channel-based evaluation are usually related to consistency, information design, visual design, and branding. For each channel that is being evaluated, record the heuristic being violated, description of issue, recommendation, and severity (low, medium, high).

Step 5c: Heuristic-based Evaluation

After reviewing priority task and channels, if there are any heuristics that are violated consistently across priority channels, assess that individual heuristic across the channels, then document the issues. This approach is best for aligning obvious inconsistencies.


Early in our research, we found that the nomenclature and definitions surrounding CC experiences were not consistent within or across marketing, UX, organization strategy, and other industries. We wanted to clearly define what it meant to create and evaluate a CC experience, so we defined the following terms.


Services

A service can be a campaign (any work toward a specific goal), marketing (promotion of specific products or services), a line of business (products or services related to a business need), an event (a planned occasion with particular goals), or an intervention (actions taken to improve a condition or situation).

Digital Devices

For our evaluations we define digital devices as computers, tablets, smartphones, or mobile phones.

Digital Media Platforms

These platforms include web browsers, apps (for web, phone, tablet), SMS text, email, and social (e.g., Facebook, Twitter, Vine, Instagram, Blog, Pinterest, Periscope).

Digital Channel

The intersection of a digital device and a digital media platform (e.g., the Twitter app on a smartphone or a web browser on a computer).

Digital Cross-Channel Experience

The overall experience of how a person feels when interfacing with a service while arbitrarily jumping through multiple “channels” to complete tasks.

Although we recognize that most CC experiences have relationships with non-digital media, we focused on digital products and services. We believe that this definition clearly supports our goal of creating products and services that enable users to employ all available and appropriate digital services without impediment.

Digital CC Heuristic Evaluation

An approach to evaluate the continuity, consistency, usability, and appropriate use of digital channels to deliver information and services through the user interfaces of digital devices and their media platforms, informed by established CC heuristics.