Although patients express an interest in using mobile health (mHealth) interventions to manage their health and chronic conditions, many current mHealth interventions are difficult to use. Usability testing is critical for the success of novel mHealth interventions. Researchers recognize the utility of using qualitative and quantitative approaches for usability testing, but many mHealth researchers lack awareness of the integration approaches, drawn from advances in mixed methods research, that can add value to mHealth technology.
As efficient usability testing proceeds iteratively, we introduce a novel mixed methods design developed specifically for mHealth researchers.
This study demonstrates how the iterative convergent mixed methods design provides a novel framework for generating unique insights into multifaceted phenomena impacting mHealth usability. Understanding these practices can help developers and researchers leverage the strengths of an integrated mixed methods design.
Published studies indicate that mobile health (mHealth) interventions are beneficial for patients across various diseases and age groups [
This paper advances the existing literature about the combined use of qualitative and quantitative research for mHealth by advancing a specific, integrated approach to mixed methods design appropriate to mHealth. When using qualitative and quantitative procedures without integration, researchers miss the opportunity for added value. Mixed methods methodologists express this as 1+1=2, as the quantitative and qualitative procedures are conducted as 2 independent studies with no particular synergy [
The purpose of this paper is to articulate and illustrate the features of an iterative convergent mixed methods design. As efficient usability testing proceeds iteratively, we introduce this design specifically for mHealth researchers. It offers a framework for generating unique insights into the multifaceted phenomena related to mHealth usability. Understanding these practices can help developers and researchers leverage the strengths of an integrated mixed methods design.
Effective health care strategies are required to ensure the right patient receives the right treatment at the right time. Advancements in mobile phones and tablets have led to the emergence of mHealth. The Global Observatory for eHealth of the World Health Organization defines mHealth as “medical and public health practice supported by mobile devices, such as mobile phones, patient monitoring devices, personal digital assistants, and other wireless devices” [
The significance of mHealth is highlighted by its ability to deliver timely care over distance to manage diseases. It is particularly important for rural areas with limited access to health care [
The International Organization for Standardization (ISO) 9241-210 standard defines human-centered design (HCD) as “an approach to systems design and development that aims to make interactive systems more usable by focusing on the use of the system and applying human factors/ergonomics and usability knowledge and techniques” [
HCD has 4 defined activity phases: (1) identify the user and specify the context of use, (2) specify the user requirements, (3) produce design solutions, and (4) evaluate design solutions against requirements. The process model of HCD as defined in ISO 9241-210 is illustrated in
Researchers advocate for involving the patients who will use the mHealth intervention during its development, both to meet patients’ needs and to facilitate successful uptake. Testing mHealth interventions with patients reveals preferences and concerns unique to the tested population [
This study offers an in-depth account of the HCD’s fourth activity phase, evaluating design solutions against requirements, and thereby clarifies that this framework focuses on usability testing as one component of the more extensive design process.
Human-centered design activity phases (ISO, 2010).
The ISO defines usability as the extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency, and satisfaction in a specified context of use [
mHealth involves the interaction between multiple user groups through a system. As a result, the usability aspect is vital for the effective, efficient, and satisfactory use of mHealth interventions. Although patients express an interest in using mHealth to manage their health and chronic conditions, many mHealth interventions are not easy to use [
Researchers recommend frequent and iterative usability testing to respond to users’ preferences, technical issues, and shortcomings [
Contemporary iterative development methods, such as prototyping, reduce the challenges that evolve during the development lifecycle [
The ISO 9241-11 established usability standards [
Usability constructs and descriptions.
Constructsa | Metrics | Description
Effectiveness | Time to learn and use | Time to read the scenarios and to begin performing tasks
| Data entry time | Time to enter the data necessary for the execution of a task
| Tasks time | Time to accomplish given tasks
| Response time | Time to receive a response to the requested information
| Time to install | Time to install the application or its updates
Efficiency | Number of errors | Number of errors made while reading scenarios and during task execution
| Completion rate | Percentage of participants who correctly complete and achieve the goal of each task
Satisfaction | Usability score | Score on the System Usability Scale (SUS) questionnaire
aAdapted from Moumane et al [
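To make these metrics concrete, the following is a minimal Python sketch, not from the original study, of how task time, error count, and completion rate could be computed from logged usability-test sessions; the session records and field names are hypothetical.

```python
# Minimal sketch (hypothetical data): computing the quantitative usability
# metrics from the table above out of logged usability-test sessions.
from dataclasses import dataclass
from statistics import mean

@dataclass
class TaskLog:
    participant: str
    task: str
    seconds_to_complete: float  # task time
    errors: int                 # errors made during task execution
    completed: bool             # whether the task goal was achieved

logs = [
    TaskLog("P1", "enter_glucose", 42.0, 1, True),
    TaskLog("P2", "enter_glucose", 61.5, 0, True),
    TaskLog("P3", "enter_glucose", 95.0, 3, False),
]

# Efficiency: total number of errors across participants
total_errors = sum(log.errors for log in logs)

# Efficiency: completion rate, the percentage achieving the task goal
completion_rate = 100 * sum(log.completed for log in logs) / len(logs)

# Effectiveness: mean task time over completed attempts
mean_task_time = mean(log.seconds_to_complete for log in logs if log.completed)

print(f"errors={total_errors}, completion={completion_rate:.0f}%, "
      f"mean task time={mean_task_time:.1f}s")
```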
MMR is gaining popularity and acceptance across disciplines and the world [
Although quantitative research historically has predominated in health sciences research, many contemporary phenomena in health care are difficult, if not impossible, to measure using quantitative methods alone [
Understanding the principles and practices of integration is essential for leveraging the strengths of MMR. Fetters and Molina-Azorin [
Relevant dimensions of mixed methods research integration.
Integration dimensionsa | Mixed methods researchers integrate by |
Rationale dimension | Citing a rationale for conducting an integrated mixed methods research study (eg, offsetting strengths and weaknesses, comparing, complementing or expanding, developing or building, and promoting social justice) |
Study purpose, aims, and research questions dimension | Composing an overarching mixed methods research purpose and stating qualitative, quantitative, and mixed methods aims or multiple mixed methods aims with quantitative aims and qualitative questions |
Research design dimension | Scaffolding the work in core (eg, convergent, exploratory sequential, and explanatory sequential), advanced (eg, intervention, case study, evaluation, and participatory), or emergent designs. |
Sampling dimension | Sampling through the type, through the relationship of the sources of the qualitative and quantitative data (eg, identical sample, nested sample, separate samples, and multilevel samples), and through the timing (eg, same or different periods for collection of the qualitative and quantitative data) |
Data collection dimension | Collecting both types of data with an intent relative to the mixed methods research procedures (eg, comparing, matching, diffracting, expanding, constructing a case, connecting, building, generating and validating a model, or embedding). |
Data analysis dimension | Analyzing both types of data using intramethod analytics (eg, analyzing each type of data within the respective qualitative and quantitative methods and core integration analytics), using 1 or more core mixed methods analysis approach (eg, by following a thread, spiraling, and back-and-forth exchanges), or employing advanced mixed methods analysis (eg, qualitative to quantitative data transformation, quantitative to qualitative data transformation, creating joint displays, social network analysis, qualitative comparative analysis, repertory grid/other scale development techniques, geographic information systems mapping techniques, and iterative and longitudinal queries of the data). |
Interpretation dimension | Interpreting the meaning of mixed findings (eg, where there are related data and drawing metainferences or conclusions based on interpreting the qualitative and quantitative findings) and examining for the fit of the 2 types of data (eg, confirmation, complementarity, expansion, or discordance). When the results conflict with each other, using procedures for handling the latter including reconciliation, initiation, bracketing, and exclusion. |
aAdapted from Fetters and Molina-Azorin [
Usability is a complex phenomenon. It is challenging to investigate usability comprehensively using only quantitative methods or qualitative methods in isolation, so-called
Despite the recognized and intuitive value of using mixed methods for mHealth usability testing, mixed methodologists have yet to articulate specific designs that guide the development and testing of mHealth interventions. A core MMR study design that is attractive for usability testing is the convergent design [
Owing to the iterative nature of usability testing, we propose a new variation of the convergent design specifically for mHealth, namely, the iterative convergent mixed methods design.
In the following, we articulate the features of an iterative convergent mixed methods design appropriate for mHealth intervention development and usability testing that incorporates an iterative process and is conducted according to the user’s health care and usability needs. Leveraging a specific mixed methods design can help fully integrate the 2 forms of data to enhance the understanding of the usability of mHealth interventions.
Evolution in an iterative convergent mixed methods design from qualitatively driven to quantitatively driven.
Fetters et al recommend considering the design, data collection procedures, interpretation, and analysis for achieving integration in a mixed methods study [
An iterative convergent mixed methods design should have an MMR aim as well as specific quantitative research aims and qualitative research questions.
The mixed methods aim is to illustrate, explore, and measure how to improve the usability of an mHealth intervention. A mixed methods aim should imply both qualitative and quantitative data collection methods. For example, illustrate and explore imply qualitative data collection, whereas measure implies quantitative data collection [
Appropriate quantitative research aims include measuring effectiveness, efficiency, and satisfaction, as illustrated in
As illustrated in
A recent study by Beatty et al [
The iterative convergent mixed methods research design.
Matching of each construct’s quantitative variables and qualitative questions in a joint display depicting mixed methods data collection.
Construct | Quantitative variables | Qualitative questions
Effectiveness | Time to learn and use | How did you learn to use the app? How can we reduce the time it takes to learn the app? What was your experience using the app? How can we reduce the time it takes to use the app?
| Data entry time | How can we reduce the time it takes to enter the data?
| Tasks time | How can we reduce the time it takes to complete the task?
| Response time | How do you feel about the app response time?
| Time to install | What are your thoughts about the time it took to install the app? The time it took to pair the medical device, if applicable?
Efficiency | Number of errors | What can we do to help users avoid the same error?
| Completion rate | What can we do to enhance the completion rate?
Satisfaction | Usability score | How often would you use the app? Why or why not? How do you feel about the complexity of the app? How can we simplify it? Do you have any recommendations to make the wording and interface easier to use? Would you need the support of a technical person to be able to use this system? How would you contact them: phone, email, or messaging? How did you find the integration of various functions in this app? How can we make it better? How did you feel about the consistency of the app? Did you have any troubles when using the app? Where? How can we fix them? Did you feel confident when using the app? How can we make you more confident? Did the app capture issues of importance to you? Are there other ways to gather similar information?
During usability testing, users will be asked to provide feedback, optionally on paper prototypes and, later, on working prototypes. Testing usability with 5 participants will generally be sufficient for identifying significant issues for each version [
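The adequacy of 5 participants per round is commonly justified with the problem-discovery model of Nielsen and Landauer, in which each user is assumed to independently detect a fixed proportion λ (empirically, about 0.31) of the usability problems. A worked instance of that model, added here for illustration:

```latex
% Problem-discovery model (Nielsen & Landauer): expected problems found by n users
\[
\mathrm{Found}(n) = N\bigl(1 - (1 - \lambda)^{n}\bigr), \qquad
\mathrm{Found}(5) \approx N\bigl(1 - 0.69^{5}\bigr) \approx 0.84\,N
\]
```

That is, a round with 5 users is expected to surface roughly 84% of the detectable problems, with the remainder emerging in later iterations.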
The
The qualitative questions in this table include both general and specific questions. Depending on the development needs, more general questions may be used initially, whereas later, more specific questions may be asked. When data become available, the same table structure can be populated with the findings; see
The
Diffracting can be used to address factors external to the user, such as the ease of connecting to the internet or of connecting medical devices via Bluetooth. It is also important to develop an mHealth intervention that is energy efficient; mHealth interventions that require frequent charging of the smartphone or medical device are not recommended. Finally, developers should ensure that adequate resources are available to address medical and technical difficulties related to the mHealth intervention.
The
Current prototyping platforms, such as InVision and Adobe XD, integrate with Lookback to enable recording of the user’s interaction with a smartphone. These tools record the participant’s voice, nonverbal reactions, and mobile phone screen display. The researcher asks participants to complete a set of tasks and assesses effectiveness, efficiency, and satisfaction. The researcher records the time to learn and use the mHealth technology, data entry time, task completion time, response time, and time to install the mHealth technology, as well as the number of errors and the task completion rate. After the tasks are completed, the researcher administers the SUS to assess the user’s satisfaction with the mHealth technology.
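For reference, the SUS follows Brooke’s fixed scoring rule: each of the 10 items is rated from 1 to 5, odd items contribute (rating − 1), even items contribute (5 − rating), and the sum is multiplied by 2.5 to yield a 0 to 100 score. A minimal Python sketch with hypothetical responses:

```python
# Standard SUS scoring (Brooke, 1996); the responses below are hypothetical.
def sus_score(responses):
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 item responses")
    total = 0
    for i, rating in enumerate(responses, start=1):
        # Odd items are positively worded, even items negatively worded
        total += (rating - 1) if i % 2 == 1 else (5 - rating)
    return total * 2.5  # scale the 0-40 sum to 0-100

print(sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 1]))  # -> 85.0
```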
The methods appropriate for assessment generally involve semistructured interviews during or after the participant’s use of the prototype. Researchers can utilize cognitive testing, also called cognitive interviewing [
Another alternative to these approaches involves a postuse debrief where the interviewer observes the user going through the mHealth intervention, notes decisions made, and, after use, enquires about decisions made along each step of the way. The strength of this approach is that the user can go through the version naturally without disruption as a real user would. However, the downside is the risk that the user may forget what specific thoughts or motivations influenced their decisions during real-time use. Postuse debrief questions may include (1) whether the tool captured issues of importance to the user, (2) whether the tool was easy to use and understand regarding question wording and interface, and (3) whether there were other ways the system could be improved.
A different option involves the collection of observations to record information about behavior. This can be done in real time through the collection of notes while observing or through recordings of the user’s interactions and using video elicitation interviews [
Semistructured interviews and cognitive interviewing are suitable in the early stages of development. The goal is to identify
There are 2 approaches for an integrated analysis: an interactive analysis strategy or an independent intramethod analysis [
The
The
For an iterative convergent design, the researcher can and likely will use both strategies, depending on the stage of testing. The interactive analysis strategy is preferred, especially during early prototype testing, when the number of users will invariably be smaller and there is urgency in identifying major issues. As statistical analyses will not be feasible or necessary, this approach allows rapid assessment of users’ ratings of certain features, for example, using the SUS, as well as of their qualitative experiences with the system.
In later cycles of testing, the analysis may shift to a more independent intramethod analysis strategy. As a higher number of users engage and real-time automated digital user data emerge, the interactive analysis approach may become more challenging to conduct. Moreover, the independent intramethod analysis may be preferred when the scale of testing expands such that blinded quantitative data collection becomes more important. Doing so can enable the researcher to avoid validity threats to the data quality that could occur by changing the data collection approach or by sharing patterns with users in real time. For example, Kron et al [
The third option can involve an iteration of both interactive and independent data analyses, that is, user survey and interview data conducted in real time can be looked at interactively, whereas automated data collection that accumulates as the number of users expands may be brought into the results of the interactive analysis after being examined independently. The exact approach may vary and evolve according to development needs.
Comparing both the qualitative and quantitative findings allows researchers to examine the similarities, differences, or contradictions. This comparison also allows researchers to obtain an expanded understanding of when the qualitative and quantitative findings from the analyses are merged for an interpretation. Similarities occur when there is
A key challenge in mixed methods studies is how to merge the qualitative and quantitative data. A very promising approach of growing popularity among mixed methods researchers is the creation of a joint display [
Joint displays allow researchers to integrate data through visual means to draw out new insights beyond the information gained from the separate quantitative and qualitative results [
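As an illustration only, a basic joint display can also be assembled programmatically by merging the quantitative results and qualitative themes on the shared usability constructs; the data and column names below are hypothetical.

```python
# Minimal sketch (hypothetical data): merging quantitative metrics and
# qualitative themes into one joint display, one row per usability construct.
import pandas as pd

quant = pd.DataFrame({
    "construct": ["Effectiveness", "Efficiency", "Satisfaction"],
    "metric":    ["Mean task time (s)", "Completion rate (%)", "Mean SUS"],
    "value":     [66.2, 67.0, 72.5],
})

qual = pd.DataFrame({
    "construct": ["Effectiveness", "Efficiency", "Satisfaction"],
    "theme": [
        "Data entry felt slow on small screens",
        "Bluetooth pairing failures interrupted tasks",
        "Users liked reminders but found wording complex",
    ],
})

# Side-by-side view supports drawing metainferences from both data types
joint_display = quant.merge(qual, on="construct")
print(joint_display.to_string(index=False))
```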
After merging the data and drawing interpretations about their cumulative meaning (making metainferences), an iterative convergent mixed methods design then involves communicating the results to the developers, who will incorporate the recommendations into the new iteration of the intervention. Moreover, as the developers make changes, they may also have specific questions to be answered in the subsequent cycle of iterative convergent data collection. Thus, newly emerging questions are added into subsequent rounds of data collection. In general, both qualitative and quantitative data (task completion rate, task completion time, number of errors, and the SUS questionnaire) should be compared across iterations for new mHealth versions, as sketched below.
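A minimal sketch of such iteration-over-iteration comparison, with hypothetical version names and values:

```python
# Minimal sketch (hypothetical values): tracking the same quantitative
# usability measures across successive mHealth versions, so each iterative
# convergent cycle can be compared against the first round of testing.
versions = {
    "v1": {"completion_rate": 60, "mean_task_time_s": 95, "errors": 14, "sus": 58},
    "v2": {"completion_rate": 80, "mean_task_time_s": 71, "errors": 6,  "sus": 71},
    "v3": {"completion_rate": 95, "mean_task_time_s": 52, "errors": 2,  "sus": 84},
}

baseline = versions["v1"]
for name, metrics in versions.items():
    deltas = {k: metrics[k] - baseline[k] for k in metrics}
    print(name, metrics, "delta vs v1:", deltas)
```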
As illustrated in
On the basis of the results of the usability test, many changes may be required. The researcher should prioritize these changes while focusing on the user’s needs. Generally, the magnitude and intensity of data collection will change. In the early rounds of development, the qualitative component of the mixed methods evaluation will weigh more heavily for identifying the macrolevel changes (
Many researchers use the concept of saturation when conducting usability testing [
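One simple way to operationalize saturation in this context is to track whether each round of sessions surfaces any previously unseen usability issues; the following Python sketch uses hypothetical issue labels.

```python
# Minimal sketch (hypothetical data): testing of a given version can stop
# once a round of sessions surfaces no previously unseen usability issues.
rounds = [
    {"slow_login", "font_too_small", "pairing_fails"},
    {"slow_login", "unclear_icons"},
    {"unclear_icons"},  # nothing new in this round -> saturation
]

seen = set()
for i, issues in enumerate(rounds, start=1):
    new = issues - seen
    seen |= issues
    print(f"round {i}: {len(new)} new issue(s)")
    if not new:
        print("saturation reached; stop testing this version")
        break
```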
A joint display adapted from Kron et al’s MPathic-VR mixed methods trial comparing a virtual human simulation and a computer-based communications module that illustrates medical students’ attitudes and experiences in both trial arms.
Domains | MPathic-VR: attitudinal item, mean (SD) | MPathic-VR: qualitative reflection; illustrative quotes | Computer-based learning: attitudinal item, mean (SD) | Computer-based learning: qualitative reflection; illustrative quotes | Interpretation of mixed methods findings
Verbal communication | 5.02 (1.62) | “How to introduce myself without making assumptions about the cultural background of the patient and the family” | 3.89 (1.67) | “This educational module was useful for clarifying the use of SBAR and addressing ways that all members of a health care team can improve patient care through better communication skills” | Intervention arm comments suggest deeper understanding of the content than teaching using memorization and mnemonics as in the control, a difference confirmed by higher attitudinal scores |
Nonverbal communication | 4.11 (1.85) | “Effective communication involves non-verbal facial expression like smiling and head nodding” | 2.77 (1.45) | None | Intervention arm comments address the value of learning nonverbal communication, the difference confirmed by attitudinal scores |
Training was engaging | 5.43 (1.55) | “Reviewing the video review was a great way to see my facial expressions and it allowed me to improve on these skills the second time around” | 3.69 (1.62) | “This experience can be improved by incorporating more active participation. For example, there could have been a scenario in which we would have to select the appropriate hand-off information per SBAR guideline” | Intervention arm comments reflect engagement through the after-action review, whereas the control comments suggested the need for interaction, the difference confirmed by higher attitudinal scores |
Effectiveness in learning to handle emotionally charged situations | 5.13 (1.48) | “I tend to try to smile more often than not in emotionally charged situations and that may result in conveying the wrong message” | 2.34 (1.35) | “I anticipate that high-stress situations where time is exceedingly crucial requires modification to the methods presented.” | Intervention arm comments indicate awareness of communication in emotionally charged situations, yet control comments indicate the need for additional training, a difference confirmed in attitudinal scores |
Here, we emphasize the need and process for mHealth researchers to use state-of-the-art mixed methods procedures. Previous single-method usability studies were limited in their findings. Some studies have assessed usability using only qualitative data [
Despite the recognized value of using mixed methods for usability testing [
We suggest the following criteria for evaluating the quality of studies that have used the iterative convergent mixed methods design:
The authors report on an empirical mHealth-related usability study.
The authors use an integrated mixed methods approach, defined as the collection, analysis, and integration of quantitative and qualitative data [
The authors compare the results of various iterations of the mHealth intervention.
The iterative convergent mixed methods design provides a clear framework for integrating quantitative and qualitative data to assess usability. As illustrated in
With this design, researchers will start with a mockup, prototype, or the actual mHealth intervention, represented in the diagram by the circle labeled the mHealth technology version. In each round, the researcher will evaluate aspects of the version using both qualitative and quantitative research aims and, importantly, make overarching interpretations or metainferences based on the findings of both types of data that inform the next steps [
There are potential limitations to the current usability approach. Although a small sample may suffice to identify the majority of usability issues [
Usability testing can be conducted on the Web or in a laboratory setting. The value of Web-based testing is that users can participate from their natural context and use their own devices. It is also more cost-effective, and users can be in any location with an internet connection. In a laboratory setting, the researcher can probe users while they walk through their tasks, gather visual cues, assist stumped users, and ask new questions during the testing session.
We acknowledge that there are other methods, including other mixed methods designs [
A usable mHealth intervention with high user satisfaction can have a significantly positive effect on mHealth adoption, resulting in improved health outcomes and quality of life and reduced overall health care costs. Effective mHealth interventions are critically important for empowering patients to manage their health and also potentially enable them to participate more actively in shared decision making with their health care providers. This study offers a novel framework to guide mHealth research that has the potential to generate unique insights into multifaceted phenomena related to usability. Understanding these practices can help developers and researchers leverage the strengths of an integrated mixed methods design.
System Usability Scale.
HCD: human-centered design
ISO: International Organization for Standardization
mHealth: mobile health
MMR: mixed methods research
SUS: System Usability Scale
None declared.