
Published on 23.03.18 in Vol 6, No 3 (2018): March

Preprints (earlier versions) of this paper are available at http://preprints.jmir.org/preprint/9054, first published Sep 27, 2017.


    Review

    Evaluating the Impact of Physical Activity Apps and Wearables: Interdisciplinary Review

    1Institute of Health and Wellbeing, University of Glasgow, Glasgow, United Kingdom

    2School of Computing Science, University of Glasgow, Glasgow, United Kingdom

    Corresponding Author:

    Claire McCallum, MA (Hons)

    Institute of Health and Wellbeing

    University of Glasgow

    Room 142

    25-29 Bute Gardens

    Glasgow, G12 8RS

    United Kingdom

    Phone: 44 141 330 4615

    Email:


    ABSTRACT

    Background: Although many smartphone apps and wearables have been designed to improve physical activity, their rapidly evolving nature and complexity present challenges for evaluating their impact. Traditional methodologies, such as randomized controlled trials (RCTs), can be slow. To keep pace with rapid technological development, evaluations of mobile health technologies must be efficient. Rapid alternative research designs have been proposed, and efficient in-app data collection methods, including in-device sensors and device-generated logs, are available. Along with effectiveness, it is important to measure engagement (ie, users’ interaction and usage behavior) and acceptability (ie, users’ subjective perceptions and experiences) to help explain how and why apps and wearables work.

    Objectives: This study aimed to (1) explore the extent to which evaluations of physical activity apps and wearables (a) employ rapid research designs, (b) assess engagement and acceptability as well as effectiveness, and (c) use efficient data collection methods; and (2) describe which dimensions of engagement and acceptability are assessed.

    Methods: An interdisciplinary scoping review was conducted using 8 databases from the health and computing sciences. Included studies measured physical activity and evaluated physical activity apps or wearables that provided sensor-based feedback. Results were analyzed using descriptive numerical summaries, chi-square testing, and qualitative thematic analysis.

    Results: A total of 1829 abstracts were screened, and 858 articles were read in full. Of the 111 included studies, 61 (55.0%) were published between 2015 and 2017. Most (55.0%, 61/111) were RCTs, and only 2 studies (1.8%) used rapid research designs: 1 single-case design and 1 multiphase optimization strategy. Other research designs included 23 (22.5%) repeated measures designs, 11 (9.9%) nonrandomized group designs, 10 (9.0%) case studies, and 4 (3.6%) observational studies. Fewer than one-third of the studies (31.5%, 35/111) investigated effectiveness, engagement, and acceptability together. To measure physical activity, most studies (91.0%, 101/111) employed sensors, either in-device (67.6%, 75/111) or external (23.4%, 26/111). RCTs were more likely to employ external sensors (accelerometers; P=.005). Studies that assessed engagement (52.3%, 58/111) mostly used device-generated logs (91%, 53/58) to measure the frequency, depth, and length of engagement. Studies that assessed acceptability (57.7%, 64/111) most often used questionnaires (64%, 41/64) and/or qualitative methods (53%, 34/64) to explore appreciation, perceived effectiveness and usefulness, satisfaction, intention to continue use, and social acceptability. Some studies (14.4%, 16/111) assessed dimensions more closely related to usability (ie, burden of sensor wear and use, interface complexity, and perceived technical performance).

    Conclusions: The rapid increase of research into the impact of physical activity apps and wearables means that evaluation guidelines are urgently needed to promote efficiency through the use of rapid research designs, in-device sensors, and user logs to assess effectiveness, engagement, and acceptability. Screening articles was time-consuming because reporting across the health and computing sciences lacked standardization. Reporting guidelines are therefore needed to facilitate the synthesis of evidence across disciplines.

    JMIR Mhealth Uhealth 2018;6(3):e58

    doi:10.2196/mhealth.9054


    Introduction

    Physical inactivity is a major public health problem [1], with 23% of adults worldwide not meeting recommended levels of physical activity (rising to 35% and 40% in the United States and the United Kingdom, respectively [2]). Many smartphone apps and wearables designed to improve physical activity are available. They often use data from in-device sensors to provide self-monitoring and feedback [3]. The potential of apps and wearables to increase physical activity and ultimately improve health outcomes, such as management of cardiovascular disease, obesity, and type 2 diabetes, has been widely recognized [4-9]. However, evaluating the impact of physical activity technologies can be challenging, because of the rapid rate at which they evolve [10-12]. Randomized controlled trials (RCTs), the “gold standard” of effectiveness evaluations, can take several years to conduct [11] and require interventions to be stable and unchanged throughout this period [12]. Consequently, researchers have emphasized the need for greater “efficiency” (ie, rapid, responsive, and relevant [11], or agile [13] research) when evaluating mobile health (mHealth) technologies.

    Evaluating the effectiveness of mHealth technologies can be particularly challenging because of their “complexity” [14]. Physical activity apps and wearables often contain multiple components, which can interact with context and produce different outcomes for different people in different settings [15,16]. To understand overall effectiveness, studies should evaluate real-world engagement with, and response to, an intervention [17]. Measuring these factors alongside effectiveness can help interpret and explain variation in effectiveness outcomes, (ie, why the intervention worked or did not work [16-19]). Accordingly, mHealth researchers have been encouraged to assess “engagement” and “acceptability” [14,20]. However, how to define and distinguish these constructs is still a subject of debate; for example, some digital health researchers have conceptualized engagement as a behavioral construct [21,22], whereas others propose that it is composed of both behavioral and subjective components [20,23]. The latter view produces overlaps between engagement and acceptability, and therefore for clarity during this review, we define “engagement” as users’ interaction and usage behavior (ie, a purely behavioral construct), and “acceptability” as users’ subjective perceptions and experiences.

    To increase the efficiency of mHealth evaluations, particular research designs and data collection methods have been recommended [11,14,24,25]. Single-case designs or “n-of-1” studies, in which participants serve as their own control, may be conducted relatively quickly and easily using mHealth technology [13,26]. To evaluate overall effectiveness, the Continuous Evaluation of Evolving Behavioral Intervention Technologies was developed to test multiple versions of an app simultaneously [27]. To test the impact of individual components, quick factorial approaches have been developed, including the multiphase optimization strategy (MOST), which rapidly tests many experimental conditions [28,29], and Sequential Multiple Assignment Randomized Trials [30] and micro-randomized trials [31], which both evaluate components that adapt across time.

    To improve the efficiency of data collection, researchers can capitalize on the technological capabilities of consumer devices. In-device sensors (ie, accelerometers, gyroscopes, and other sensors embedded in smartphones and wearables) can be used to measure outcomes objectively [24,26]. Their internet connectivity and ability to collect continuous, high-density data remotely can improve efficiency over other “intermittent and limited” methods [24], such as questionnaires and traditional pedometers. Smartphones and wearables can also automatically record user interactions and app use [20]. Human computer interaction (HCI) researchers have used such device-generated logs to measure engagement objectively and remotely [32,33]. Log data have also been used to explore acceptability when combined with qualitative methods [33].

    Recommended evaluation designs and methods, as well as multidisciplinary approaches, may advance mHealth research [10,25]. Yet, a recent review of registered clinical trials found that evaluations of mHealth apps targeting a range of clinical conditions did not use either rapid research designs or innovative data collection methods [34]. The authors recommended that future reviews should incorporate a broader set of studies beyond those on ClinicalTrials.gov to identify rapid research designs.

    The study team aimed to investigate, across health and HCI disciplines, the extent to which evaluations of physical activity apps and wearables (1) use recommended rapid research designs; (2) assess engagement and acceptability as well as effectiveness; and (3) employ efficient data collection methods (ie, in-device sensors and device-generated logs). The team also aimed to explore those dimensions of engagement and acceptability that are assessed.


    Methods

    Study Design

    The study team conducted an interdisciplinary scoping review of the research designs, objectives, and data collection methods used in evaluations of physical activity apps and wearables. Scoping reviews are used to rigorously and comprehensively map the range of research activities undertaken in an emerging field [35]. In accordance with scoping review methodology [36], the team did not assess quality or reject studies on the basis of research design, as this would have excluded many HCI studies. The team adapted the framework suggested by Arksey and O’Malley [35] and Levac et al [37] to include 4 steps: (1) identification of relevant articles; (2) study selection; (3) charting and extraction of the data; and (4) collation, summarization, and reporting of results.

    Identification of Relevant Articles

    An initial literature search of 8 databases was conducted between August and September 2015 and updated in March 2017. These included 3 health and clinical databases (PubMed, PsycINFO, and Web of Science), 4 computing science databases (Association for Computing Machinery [ACM] Digital Library, Institute of Electrical and Electronics Engineers [IEEE], Springer, and Science Direct), and 1 interdisciplinary database (mHealth Evidence). The search terms used for the different databases are presented in Textbox 1. Articles were restricted to the English language. No time limit was specified. Protocols, conference proceedings, and extended abstracts were all eligible. The reference lists of systematic reviews were hand-searched for further relevant articles.

    Study Selection

    Studies were included if they evaluated mobile technologies that provided sensor-based feedback on physical activity. To describe the full range of data collection methods used to measure physical activity, studies using objective and self-report measures were both included. Exclusion criteria were (1) no empirical data were collected (ie, systematic or methodological reviews, position papers, and articles that only described technologies); (2) physical activity was not measured (ie, studies measured only sedentary time, activity skills, or gait); (3) the study only evaluated sensor or algorithmic performance (ie, accuracy in recognizing or classifying physical activity); (4) the sensor was not mobile; and (5) the only mobile technology used was a pedometer without the capacity to connect to another device or the internet (this criterion was included to focus the review on wearable devices with more advanced feedback capabilities than standard pedometers).

    All abstracts and full-text articles were reviewed by CM, and 5% of abstracts were independently reviewed by CG or JR. Discrepancies were discussed by the 3 authors, and all were resolved. Any articles representing the same study were merged.

    Data Extraction

    A data extraction form was developed to include (1) study characteristics (ie, publication year, country of study, number of participants, age of participants, study duration, whether a protocol or full trial); (2) research design details (ie, experimental or nonexperimental design, number of groups, experimental or control group details, randomization), and intervention characteristics (ie, technologies or devices used to deliver intervention, key intervention features); (3) research objectives and outcomes measured; (4) analyses undertaken (descriptive, inferential, thematic); and (5) data collection methods used (eg, in-device or external sensors, user-logs, questionnaires, interviews, focus groups). All reviewers independently extracted 5 papers (5%) to ensure consistency and reliability of data extraction.

    Collation, Summarization, and Reporting of Results

    The study team adopted a mixed-methods descriptive approach to analyze the extracted data [35]. The team first calculated frequencies for study characteristics and for each research design identified, and mapped intervention characteristics (ie, the components or app features that studies evaluated). Next, the research objectives and outcomes that studies measured, as reported by authors, were used to categorize studies according to whether they investigated effectiveness (ie, changes in physical activity). Categorizing studies according to whether they investigated engagement and acceptability required a more iterative approach, as definitions of these constructs are less widely agreed. Working definitions of engagement (ie, user interaction with the device or usage behavior) and acceptability (ie, users’ subjective perceptions and experiences) were applied to extracted research objectives, outcome measures, and data collection methods to develop a series of broad codes in relation to engagement (ie, engagement, usage, use, adherence, compliance) and acceptability (ie, acceptability, satisfaction, user experience, usability). These codes were applied to all studies to allow them to be categorized according to whether they investigated engagement and/or acceptability. Frequencies are reported for the number of studies in each category.
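    The coding step described above can be sketched as a simple keyword-tagging routine. This is an illustrative sketch, not the authors' actual workflow: the broad codes are those listed in the text, but the helper names and example extracts are invented for illustration.

```python
import re

# Broad codes taken from the review's working definitions; everything else
# in this sketch (function names, example extracts) is hypothetical.
ENGAGEMENT_CODES = ["engagement", "usage", "use", "adherence", "compliance"]
ACCEPTABILITY_CODES = ["acceptability", "satisfaction", "user experience", "usability"]

def matches_any(text, codes):
    """True if any code appears as a whole word or phrase in the text."""
    low = text.lower()
    return any(re.search(r"\b" + re.escape(c) + r"\b", low) for c in codes)

def categorize(extract):
    """Categorize an extracted objective/outcome description by construct."""
    return {
        "engagement": matches_any(extract, ENGAGEMENT_CODES),
        "acceptability": matches_any(extract, ACCEPTABILITY_CODES),
    }

print(categorize("To assess adherence to the app and user satisfaction"))
# → {'engagement': True, 'acceptability': True}
```

    Word-boundary matching avoids false positives such as "use" matching inside "user"; in practice the review's categorization was iterative and manual, so any automated pass like this would only be a first filter.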


    Textbox 1. Search terms used in the scoping review.

    In relation to effectiveness, the team calculated the proportion of studies that used only descriptive statistics (as opposed to inferential statistical analysis) and grouped studies that used sensors to collect physical activity data according to whether they used in-device sensors or external sensors (ie, additional, validated devices). The team then calculated frequencies for the data collection methods used in each group, and a chi-square test of independence was conducted using R statistical software (RStudio, version 1.0.136) to examine whether the type of sensor used was related to the type of research design.
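    The chi-square test of independence above can be computed by hand for a 2x2 cross-tabulation of research design (RCT vs other) against sensor type (external vs none). The counts below are hypothetical, chosen only to illustrate the calculation; the review used R, and this Python sketch omits the Yates continuity correction.

```python
def chi_square_2x2(table):
    """Pearson chi-square statistic (1 df, no continuity correction) for a 2x2 table."""
    (a, b), (c, d) = table
    n = a + b + c + d
    row_totals = [a + b, c + d]
    col_totals = [a + c, b + d]
    chi2 = 0.0
    for i, observed_row in enumerate(table):
        for j, obs in enumerate(observed_row):
            exp = row_totals[i] * col_totals[j] / n  # expected count under independence
            chi2 += (obs - exp) ** 2 / exp
    return chi2

# Hypothetical counts: rows = RCT / other design, cols = external sensor / none.
table = [[21, 40], [5, 45]]
chi2 = chi_square_2x2(table)
# Critical value for 1 df at alpha = .05 is 3.841.
print(round(chi2, 2), "significant" if chi2 > 3.841 else "not significant")
```

    With these invented counts the statistic exceeds the 3.841 critical value, mirroring (but not reproducing) the significant association the review reports.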

    In relation to engagement and acceptability, the data collection methods extracts were first used to calculate frequencies in relation to the data collection methods studies employed (eg, user-logs, questionnaires, focus groups, interviews). Each extract was then read carefully to identify detailed subcodes that described the different elements assessed for each construct (ie, any specific behaviors logged, questionnaire items used, or interview or focus group topics described), and the One Sheet of Paper method [38] was used to generate broad dimensions of engagement and acceptability by grouping these subcodes according to their similarity.

    A random sample of all studies (20.7%, 23/111) was independently coded (by CG) to improve rigor in categorizing studies and generating the dimensions in relation to engagement and acceptability; discrepancies were discussed and consensus was reached on the final dimensions. Discussions suggested that some of the dimensions initially associated with acceptability were specifically related to the properties of the app or device and therefore did not relate to acceptability per se. These dimensions were retained and categorized as “usability.”


    Results

    Summary of Search Results

    A total of 6521 articles were retrieved during the initial database search (see Figure 1). After title screening, we reviewed 1272 abstracts and excluded 645 articles that did not meet the inclusion criteria. The full texts of the remaining 627 articles, and an additional 13 articles identified from reference list searches, were read. Of these, 572 were excluded, leaving 68 articles. An additional 60 articles were included from the updated search in March 2017, in which 557 abstracts were reviewed and 338 excluded, and 219 full texts were then read and 159 excluded. Overall, from the 1829 abstracts and 858 full texts read, a total of 128 articles were included in the review [39-166], representing 111 unique studies.
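    The screening arithmetic above can be checked directly; all counts in this sketch are taken from the text.

```python
# Initial search (2015): abstracts screened -> full texts -> included articles.
initial_abstracts, initial_excluded_at_abstract = 1272, 645
initial_fulltexts = initial_abstracts - initial_excluded_at_abstract   # 627
reference_list_additions = 13
initial_included = initial_fulltexts + reference_list_additions - 572  # 68

# Updated search (March 2017).
update_abstracts, update_excluded_at_abstract = 557, 338
update_fulltexts = update_abstracts - update_excluded_at_abstract      # 219
update_included = update_fulltexts - 159                               # 60

assert initial_abstracts + update_abstracts == 1829   # total abstracts screened
assert initial_included + update_included == 128      # articles in the review
print(initial_included, update_included)
# → 68 60
```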

    Figure 1. Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) diagram.

    Study Characteristics

    The study characteristics are presented in Multimedia Appendix 1. Of the included studies, 22/111 (19.8%) were protocols. Over half (55.0%, 61/111) were published in 2015 or later. Many (42.3%, 47/111) were conducted in the United States. The majority of studies (92.8%, 103/111) involved adult participants; 8/111 studies (7.2%) involved children and adolescents. Participant numbers ranged from 2 [39] to 2980 [40]: 18.9% (21/111) of studies contained fewer than 13 participants. Study duration ranged from less than a day to 52 weeks. Intervention characteristics are included in Multimedia Appendix 2.

    Research Designs

    Of the included studies (see Multimedia Appendix 3), 61/111 (55.0%) used an RCT design. Most of these (66%, 40/61) were 2-group RCTs; 12 (20%, 12/61) were 3-group RCTs, and 9 (15%, 9/61) were 4-group RCTs. Control group participants within RCTs received (1) standard care, minimal contact, or print materials (39%, 24/61); (2) active comparison treatments (26%, 16/61); (3) noninteractive devices that did not display feedback (18%, 11/61); or (4) waitlist or no intervention (16%, 10/61). The remaining studies included 23/111 (22.5%) repeated measures designs; 11/111 (9.9%) nonrandomized group designs; 10/111 (9.0%) case studies (6/10 [60%] of which included an experimental baseline phase); and 4/111 (3.6%) observational studies. Only 2/111 studies (1.8%) used rapid research designs: one single-case design and one MOST.

    As shown in Textbox 2, studies investigated a variety of intervention components, including the addition of apps or wearables to non-technology based interventions delivered by health care professionals, and a range of in-app components, such as automated adaptive goal-setting versus static or manual input of goals, and different social components.

    Objectives and Data Collection Methods

    Multimedia Appendix 3 shows whether each study investigated effectiveness, engagement, acceptability, and/or usability. Almost all studies (96.4%, 107/111) investigated effectiveness, including 14/111 (12.6%) that explored preliminary impact using only descriptive statistics or visual analysis. Only 35/111 studies (31.5%) investigated effectiveness, engagement, and acceptability together, and 14 of these (40%, 14/35) did not use inferential statistical analysis to assess effectiveness. Usability was assessed in 16/111 studies (14.4%).

    Effectiveness

    The majority of studies (91.0%, 101/111) used sensors to measure physical activity. These were most often the in-device sensors used to deliver feedback on physical activity (67.6%, 75/111) (eg, Fitbit [105,162]). Some studies (23.4%, 26/111) used external sensors (eg, ActiGraph GT3X [ActiGraph, Shalimar, FL, USA], SenseWear Armband [BodyMedia, Inc., Pittsburgh, PA], Omron pedometer [Omron Healthcare, Inc., Bannockburn, IL]), instead of, or in triangulation with, in-device sensors. Physical activity data collected via in-device and external sensors included step counts (eg, [159]) and time spent being active (eg, [84,151]). An external device was significantly more likely to be used in RCTs than in other research designs (χ²₁=7.8, P=.005).

    Of the included studies, 10/111 (9.0%) used a questionnaire alone to measure self-reported physical activity, and 17/111 (15.3%) used a questionnaire to triangulate with sensor data. Questionnaires included the International Physical Activity Questionnaire [167], the Community Health Activities Model Program for Seniors [168], the Recent Physical Activity Questionnaire [169], the Godin Leisure-Time Exercise Questionnaire [170], the Active Australia survey [171], the 7-day Sedentary and Light Intensity Physical Activity Log (7-day SLIPA Log) [172], the Yale Physical Activity Scale [173], and the WHO Global Physical Activity Questionnaire [174].

    Engagement

    Engagement (ie, users’ interaction with the device and usage behavior) was measured by 58/111 studies (52.3%) (Multimedia Appendix 3), with most (91%, 53/58) using device-generated logs to do so. Seven (12%, 7/58) used both logs and self-report questionnaires as a form of triangulation, and 5/58 (9%) used self-report questionnaires alone. Three dimensions of engagement were identified: (1) frequency or amount of use; (2) depth of engagement (ie, active vs passive); and (3) length of use. These are described in Textbox 3.
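    The three engagement dimensions above can be derived from a device-generated log with a few lines of code. This is an illustrative sketch, not taken from any included study: the log entries, event labels, and the active/passive split are all invented.

```python
from datetime import date

# Hypothetical device-generated log: (day, event) pairs. "goal_set" is treated
# as active engagement, "screen_view" as passive engagement.
log = [
    (date(2017, 3, 1), "screen_view"),
    (date(2017, 3, 1), "goal_set"),
    (date(2017, 3, 3), "screen_view"),
    (date(2017, 3, 10), "screen_view"),
]

days_used = {day for day, _ in log}
frequency = len(days_used)                                    # distinct days of use
depth = sum(1 for _, e in log if e == "goal_set") / len(log)  # share of active events
length = (max(days_used) - min(days_used)).days + 1           # span of use in days

print(frequency, round(depth, 2), length)
# → 3 0.25 10
```

    Such log-derived metrics are efficient to collect remotely, which is why the review highlights them over self-report measures of engagement.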


    Textbox 2. Intervention components and features investigated for impact on physical activity in included studies.

    Textbox 3. Dimensions of engagement assessed by included studies.

    Acceptability

    Of the studies included, 64/111 (57.7%) investigated acceptability (ie, users’ subjective perceptions and experiences; see Multimedia Appendix 3). Most used questionnaires (64%, 41/64), and just over half (53%, 34/64) used qualitative interviews or focus groups, either alone or in addition to questionnaires. Questionnaires included a range of standardized instruments (eg, the IBM Computer Usability Satisfaction Questionnaire [175], the Persuasive Technology Acceptance Model Questionnaire [176], the Intrinsic Motivation Inventory [177], the Fun Toolkit [178], and the Working Alliance Inventory [179]) or questionnaires developed especially for the study (eg, [73,88]). A few studies (11%, 7/64) employed user logs; of these, 3 used device-generated usage logs as a “proxy” of users’ interest [135] or preferences [143,150], and 4 used user-entered text (eg, the content of social media messages to understand the types of social support that users experienced [86,106,130], and digital diary entries to understand experiences of using the device [106,127]). Studies that used text-based logs also employed face-to-face qualitative methods (ie, interviews, focus groups) or questionnaires, in addition to collecting log data. Five dimensions were identified in relation to measuring acceptability: (1) appreciation; (2) perceived effectiveness and usefulness; (3) user satisfaction; (4) users’ intention to continue use of the app or device; and (5) social acceptability. These are described in Textbox 4.

    Usability

    Usability was investigated by 16 studies (14.4%, 16/111), of which 9 (56%, 9/16) used questionnaires (eg, the System Usability Scale [180]); 4 (25%, 4/16) used interviews; 2 (13%, 2/16) used focus groups; and 1 (6%, 1/16) [70] used observation of participants completing timed tasks. Three dimensions were identified in relation to assessing usability: (1) burden of device wear and use; (2) interface complexity; and (3) perceived technical performance. These are described in Textbox 5.


    Textbox 4. Dimensions of acceptability assessed by included studies.

    Textbox 5. Dimensions of usability assessed by included studies.

    Discussion

    Principal Findings

    Of the 111 studies included, more than half were published between 2015 and 2017, 55.0% were RCTs, and only 2 studies used rapid designs. Almost all studies measured physical activity objectively using sensors (either in-device or external), with RCTs more likely to employ external sensors (accelerometers). Fewer than one-third of the studies investigated effectiveness, engagement, and acceptability together. According to our working definitions, studies that measured engagement mostly used device-generated logs to assess the frequency, depth, and length of engagement. Studies exploring acceptability most often used questionnaires and/or qualitative methods to assess appreciation, perceived effectiveness and usefulness, satisfaction, users’ intention to continue use of the app or device, and social acceptability. A small number of studies explored usability of the device (including burden of sensor wear and use, interface complexity, and perceived technical performance) using questionnaires, qualitative methods, or participant observation.

    The fact that more than half of the included studies were published between 2015 and 2017 demonstrates that research into the impact of physical activity apps and wearables is a growing area of interest, underscoring the timeliness of this review. Despite this, we found that only 2 studies used the rapid research designs that have been recommended for evaluating mHealth technologies (single-case design [131] and the MOST approach [164]). A low uptake of rapid research designs was similarly reported in a recent review of clinical mHealth app evaluations [34]; however, while the vast majority of evaluations of clinical apps were RCTs, our findings show that evaluations of physical activity apps and wearables use alternative research designs (including repeated measures designs, nonrandomized group designs, case studies and observational studies) more often. This may reflect the interdisciplinary nature of our review, and the view held by some HCI researchers that RCTs, as well as being impractical and resource intensive, are of limited usefulness [181]. It is nevertheless surprising that few studies used single-case designs and new factorial approaches, as it has been suggested that mHealth technologies can support the data collection procedures and experimental setup these research designs require (ie, frequent measurement and several experimental conditions) [25,26,182].

    Further research is needed to explore the reasons that rapid research designs are not being used. It could be that the requirements for these designs are not feasible for some research projects. MOST, for example, requires several decisions to be made in advance of conducting the trial (eg, deciding which specific theory-based components of the intervention should be tested, and assessing the feasibility of carrying out a research design that can often require large sample sizes [29]). These requirements can themselves be time and resource intensive [183]. Barriers to using rapid research designs may also be conceptual: preliminary evidence suggests that the value of, and requirements for, single-case designs were not fully understood by clinical health practitioners [184], which may also apply to mHealth researchers.

    In addition to effectiveness, assessing user engagement and acceptability is important to (1) generate a better understanding of the overall impact; (2) explain variation in the outcomes; and (3) reveal (potentially interactive) influences on effectiveness [16,19]. Despite this, only around one-third of the studies (31.5%) investigated all 3 objectives together. Furthermore, 40.0% of these did not use inferential statistics to assess effectiveness (instead using descriptive statistics and visual analysis), and almost one-fifth of all studies (18.9%) contained fewer than 13 participants. These preliminary, small-N studies are typical of iterative HCI research focused on developing novel technologies [185], and are unlikely to be sufficiently powered to test important hypotheses on mediators of effectiveness [17,186]. Although this study did not explore the specific statistical analyses undertaken, Bayesian methods are considered a promising approach for mHealth evaluations [13,25,187] and can be used to investigate mediating variables in small-N studies [188]. As such, Bayesian methods could be key when exploring results from early developmental evaluations to reveal potential relationships between mHealth engagement, acceptability, and effectiveness.

    Many evaluations of physical activity apps and wearables appear to be taking advantage of efficient data collection methods: two-thirds of studies employed in-device sensors in smartphones and wearables to measure physical activity. The fact that RCTs used external, validated sensors more often than other study designs exacerbates their inefficiency (eg, through adding extra resource costs [189]). Furthermore, using external sensors often involves measurement procedures that may reduce the generalizability of findings to real-world contexts (eg, requiring participants to wear additional devices and visit the lab). The coupling of gold standard RCTs and sensors with established validity indicates a well-grounded concern for methodological rigor. Yet, balancing this need for rigor with the need for efficiency requires further investigation. Addressing any “trade-offs” between efficiency and rigor when evaluating physical activity apps and wearables (and mHealth technologies more generally [11]) will require, at the very least, understanding the validity and reliability of internal sensors. Evidence could be quickly accumulated using industry-based “research libraries,” such as Fitabase [190], and then used to inform decision making when designing a pragmatic evaluation. Relatedly, empirical evidence is needed to support recently proposed digital health evaluation models that outline all phases of the research process [191,192]: these frameworks combine HCI and implementation science methods to ensure evaluations are both rigorous and sustainable in real-world settings.

    Most studies that measured engagement used device-generated logs: these can be more efficient than qualitative self-report methods, which can be time-consuming and burdensome [20]. In contrast, acceptability was generally assessed via questionnaires and/or qualitative face-to-face methods. HCI researchers have emphasized the need to collect subjective qualitative data alongside device-generated logs to fully understand not only “what” people are doing but “why” [32,33]. We found a handful of studies (11%) used log data (eg, device-generated usage logs or user-entered text logs) to assess some dimensions of acceptability. The validity of this approach (ie, whether either form of log data can sufficiently capture the rich contextual details typically afforded by traditional qualitative methods) should be explored. For example, device-generated logs showing continued engagement with the app could imply user “satisfaction,” “appreciation,” and “perceived effectiveness or usefulness of the app,” whereas investigating “social acceptability” (eg, user attitudes toward publicly sharing data) may require user-entered text logs (eg, from digital diaries, Web-based questionnaires, and social media posts), or even face-to-face methods.

    In this review, we defined engagement as users’ interaction and usage behavior [21,22] and acceptability as users’ subjective perceptions and experiences. The dimensions of engagement and acceptability that we identified rested upon these working definitions. There is still no consensus in mHealth and related fields on what constitutes engagement and acceptability, and how each should be measured. One recent review [23] proposed that engagement is a multidimensional construct that includes not only dimensions related to “usage” (ie, amount, frequency, depth, and duration of engagement) but also subjective experiences of engagement (eg, affect, attention, and interest). Another review conceptualized engagement as “any process by which patients and the public became aware of or understood a digital health intervention” [193]. In response to varying definitions of engagement, researchers have undertaken valuable consensus-building exercises (and have emphasized the need to focus on “effective engagement” that accounts for engagement with behavior change) [20]. Clarification and consensus will advance our understanding of how engagement and acceptability may individually, or interactively, influence effectiveness.

    A few studies assessed usability. In line with other conceptualizations of usability (ie, whether the device or app is easily used to achieve specified goals successfully and quickly [194,195]), we distinguished usability from acceptability by considering it to be a characteristic of the device. Understanding the degree to which usability varies across users and interacts with context to ultimately influence effectiveness (as opposed to being a stable device characteristic) will determine whether it should be assessed within effectiveness evaluations (or instead optimized beforehand).

    The screening process in this interdisciplinary review required reading a very high number of abstracts and full papers to identify the final studies for inclusion. Many of the articles retrieved from the database searches had ambiguous titles, and many authors omitted key study details from their abstracts. Furthermore, data extraction from the full-text articles involved negotiating different publication formats across disciplines. These challenges meant the review process was far more time-consuming than originally envisaged. Currently, HCI studies are not required to follow health science reporting guidelines that promote the inclusion of specific study details in titles and abstracts [196]. Standardized reporting drawing on existing guidelines (eg, CONSORT-EHEALTH [197]) would allow different disciplines to more easily synthesize the large amount of research being conducted in this area and would also aid current efforts to develop automated processes to increase the accessibility of evidence from digital health publications [198].

    Limitations

    The review was conducted systematically and comprehensively across health, clinical, and computing science databases. However, the scoping methodology followed did not include any assessment of the methodological quality of studies [37]. The focus on physical activity, engagement, and acceptability (and usability) meant that other important aspects of evaluation, such as reach and uptake, secondary clinical and psychological outcomes, cost-effectiveness, and the statistical analysis methods that studies used, were not reported. Furthermore, without established definitions of engagement and acceptability, the dimensions identified in this review are necessarily provisional.

    The review did not examine the context in which apps and wearables were developed and evaluated, such as within academia versus industry. The development context may influence the assessment and reporting of engagement, acceptability, usability, and effectiveness of the apps and wearables. Commercially developed apps, for example, often do not incorporate behavior change techniques that improve effectiveness [199-202] and may focus more on enhancing user experience: therefore, industry professionals may be more likely to assess engagement, acceptability, and usability rather than effectiveness. Finally, to understand whether studies employed in-device sensors to measure physical activity, studies were included only if they evaluated apps and wearables that provide sensor-based feedback on physical activity. Therefore, the findings of the review cannot be generalized to other technologies or health behaviors.

    Future Research

    Future research should investigate why recommended rapid research designs are not yet widely adopted. For example, qualitative explorations of researchers’ and industry professionals’ perceptions and daily research practices and experiences would allow an understanding of the practical challenges in using rapid designs in academia and industry; and feasibility studies should explore the extent to which rapid designs can be supported and automated by mHealth technologies [11]. Consensus is needed on how to define and distinguish engagement and acceptability, and on the specific dimensions of these constructs, which could then be tested as potential mediators and moderators of effectiveness. Finally, the validity and usefulness of logging methods for assessing acceptability should be explored.

    Conclusions

    Despite the rapid increase in evaluations of the impact of physical activity apps and wearables, few are optimized in relation to efficiency and assessment of the key constructs of effectiveness, engagement, and acceptability. The findings of this review will inform future guidance to support health and HCI researchers in making greater use of rapid research designs (eg, single-case designs), in-device sensors, and user logs to collect effectiveness, engagement, and acceptability data. The difficulties encountered in conducting this interdisciplinary review also highlight the need for standardized reporting guidelines. These would facilitate the synthesis of evidence across health and HCI disciplines, and thus support rapid advancement in understanding the extent to which apps and wearables can support users to become more physically active.

    Conflicts of Interest

    None declared.

    Multimedia Appendix 1

    Study characteristics.

    PDF File (Adobe PDF File), 179KB

    Multimedia Appendix 2

    Intervention characteristics.

    PDF File (Adobe PDF File), 161KB

    Multimedia Appendix 3

    Research designs used in included studies and objectives investigated.

    PDF File (Adobe PDF File), 192KB

    References

    1. Blair SN. Physical inactivity: the biggest public health problem of the 21st century. Br J Sports Med 2009 Jan;43(1):1-2. [Medline]
    2. Mendis S. Global status report on noncommunicable diseases 2014. Geneva, Switzerland: World health organization; 2014.
    3. Sanders JP, Loveday A, Pearson N, Edwardson C, Yates T, Biddle SJ, et al. Devices for self-monitoring sedentary time or physical activity: a scoping review. J Med Internet Res 2016 May 04;18(5):e90 [FREE Full text] [CrossRef] [Medline]
    4. Hickey AM, Freedson PS. Utility of consumer physical activity trackers as an intervention tool in cardiovascular disease prevention and treatment. Prog Cardiovasc Dis 2016;58(6):613-619. [CrossRef] [Medline]
    5. Lewis ZH, Lyons EJ, Jarvis JM, Baillargeon J. Using an electronic activity monitor system as an intervention modality: a systematic review. BMC Public Health 2015 Jun 24;15:585 [FREE Full text] [CrossRef] [Medline]
    6. Bort-Roig J, Gilson ND, Puig-Ribera A, Contreras RS, Trost SG. Measuring and influencing physical activity with smartphone technology: a systematic review. Sports Med 2014 May;44(5):671-686. [CrossRef] [Medline]
    7. Muntaner A, Vidal-Conti J, Palou P. Increasing physical activity through mobile device interventions: a systematic review. Health Informatics J 2015 Feb 3;22(3):451-469. [CrossRef] [Medline]
    8. Stephens J, Allen J. Mobile phone interventions to increase physical activity and reduce weight: a systematic review. J Cardiovasc Nurs 2013;28(4):320-329 [FREE Full text] [CrossRef] [Medline]
    9. Direito A, Carraça E, Rawstorn J, Whittaker R, Maddison R. mHealth technologies to influence physical activity and sedentary behaviors: behavior change techniques, systematic review and meta-analysis of randomized controlled trials. Ann Behav Med 2017 Apr;51(2):226-239. [CrossRef] [Medline]
    10. Nilsen W, Kumar S, Shar A, Varoquiers C, Wiley T, Riley WT, et al. Advancing the science of mHealth. J Health Commun 2012;17(Suppl 1):5-10. [CrossRef] [Medline]
    11. Riley WT, Glasgow RE, Etheredge L, Abernethy AP. Rapid, responsive, relevant (R3) research: a call for a rapid learning health research enterprise. Clin Transl Med 2013;2(1):10 [FREE Full text] [CrossRef] [Medline]
    12. Ben-Zeev D, Schueller SM, Begale M, Duffecy J, Kane JM, Mohr DC. Strategies for mHealth research: lessons from 3 mobile intervention studies. Adm Policy Ment Health 2015 Mar;42(2):157-167. [CrossRef] [Medline]
    13. Hekler EB, Klasnja P, Riley WT, Buman MP, Huberty J, Rivera DE, et al. Agile science: creating useful products for behavior change in the real world. Transl Behav Med 2016 Jun;6(2):317-328 [FREE Full text] [CrossRef] [Medline]
    14. Murray E, Hekler EB, Andersson G, Collins LM, Doherty A, Hollis C, et al. Evaluating digital health interventions: key questions and approaches. Am J Prev Med 2016 Nov;51(5):843-851. [CrossRef] [Medline]
    15. Craig P, Dieppe P, Macintyre S, Michie S, Nazareth I, Petticrew M, Medical Research Council Guidance. Developing and evaluating complex interventions: the new Medical Research Council guidance. Br Med J 2008;337:a1655 [FREE Full text] [Medline]
    16. Oakley A, Strange V, Bonell C, Allen E, Stephenson J. Process evaluation in randomised controlled trials of complex interventions. Br Med J 2006 Feb 18;332(7538):413-416 [FREE Full text] [CrossRef] [Medline]
    17. Moore GF, Audrey S, Barker M, Bond L, Bonell C, Hardeman W, et al. Process evaluation of complex interventions: Medical Research Council guidance. Br Med J 2015;350:h1258 [FREE Full text] [Medline]
    18. Donkin L, Christensen H, Naismith SL, Neal B, Hickie IB, Glozier N. A systematic review of the impact of adherence on the effectiveness of e-therapies. J Med Internet Res 2011;13(3):e52 [FREE Full text] [CrossRef] [Medline]
    19. Grant A, Treweek S, Dreischulte T, Foy R, Guthrie B. Process evaluations for cluster-randomised trials of complex interventions: a proposed framework for design and reporting. Trials 2013 Jan 12;14:15 [FREE Full text] [CrossRef] [Medline]
    20. Yardley L, Spring BJ, Riper H, Morrison LG, Crane DH, Curtis K, et al. Understanding and promoting effective engagement with digital behavior change interventions. Am J Prev Med 2016 Nov;51(5):833-842. [CrossRef] [Medline]
    21. Couper MP, Alexander GL, Zhang N, Little RJA, Maddy N, Nowak MA, et al. Engagement and retention: measuring breadth and depth of participant use of an online intervention. J Med Internet Res 2010;12(4):e52 [FREE Full text] [CrossRef] [Medline]
    22. Sharpe EE, Karasouli E, Meyer C. Examining factors of engagement with digital interventions for weight management: rapid review. JMIR Res Protoc 2017 Oct 23;6(10):e205 [FREE Full text] [CrossRef] [Medline]
    23. Perski O, Blandford A, West R, Michie S. Conceptualising engagement with digital behaviour change interventions: a systematic review using principles from critical interpretive synthesis. Transl Behav Med 2016 Dec 13;7(2):254-267. [CrossRef] [Medline]
    24. Kumar S, Nilsen WJ, Abernethy A, Atienza A, Patrick K, Pavel M, et al. Mobile health technology evaluation: the mHealth evidence workshop. Am J Prev Med 2013 Aug;45(2):228-236 [FREE Full text] [CrossRef] [Medline]
    25. Michie S, Yardley L, West R, Patrick K, Greaves F. Developing and evaluating digital interventions to promote behavior change in health and health care: recommendations resulting from an international workshop. J Med Internet Res 2017 Jun 29;19(6):e232 [FREE Full text] [CrossRef] [Medline]
    26. Dallery J, Cassidy RN, Raiff BR. Single-case experimental designs to evaluate novel technology-based health interventions. J Med Internet Res 2013 Feb;15(2):e22 [FREE Full text] [CrossRef] [Medline]
    27. Mohr DC, Cheung K, Schueller SM, Hendricks BC, Duan N. Continuous evaluation of evolving behavioral intervention technologies. Am J Prev Med 2013 Oct;45(4):517-523 [FREE Full text] [CrossRef] [Medline]
    28. Collins LM, Murphy SA, Nair VN, Strecher VJ. A strategy for optimizing and evaluating behavioral interventions. Ann Behav Med 2005 Aug;30(1):65-73. [CrossRef] [Medline]
    29. Collins LM, Murphy SA, Strecher V. The multiphase optimization strategy (MOST) and the sequential multiple assignment randomized trial (SMART): new methods for more potent eHealth interventions. Am J Prev Med 2007 May;32(5 Suppl):S112-S118 [FREE Full text] [CrossRef] [Medline]
    30. Murphy SA. An experimental design for the development of adaptive treatment strategies. Stat Med 2005 May 30;24(10):1455-1481. [CrossRef] [Medline]
    31. Liao P, Klasnja P, Tewari A, Murphy SA. Sample size calculations for micro-randomized trials in mHealth. Stat Med 2016 May 30;35(12):1944-1971 [FREE Full text] [CrossRef] [Medline]
    32. Dumais S, Jeffries R, Russell D, Tang D, Teevan J. Understanding user behavior through log data and analysis. In: Ways of Knowing in HCI. New York: Springer; 2014:349-372.
    33. El-Nasr MS, Drachen A, Canossa A. Game analytics - maximizing the value of player data. London: Springer-Verlag; 2013.
    34. Pham Q, Wiljer D, Cafazzo JA. Beyond the randomized controlled trial: a review of alternatives in mHealth clinical trial methods. JMIR Mhealth Uhealth 2016 Sep 09;4(3):e107 [FREE Full text] [CrossRef] [Medline]
    35. Arksey H, O'Malley L. Scoping studies: towards a methodological framework. Int J Soc Res Methodol 2005 Feb;8(1):19-32. [CrossRef]
    36. Peters MD, Godfrey CM, Khalil H, McInerney P, Parker D, Soares CB. Guidance for conducting systematic scoping reviews. Int J Evid Based Healthc 2015 Sep;13(3):141-146. [CrossRef] [Medline]
    37. Levac D, Colquhoun H, O'Brien KK. Scoping studies: advancing the methodology. Implement Sci 2010;5:69 [FREE Full text] [CrossRef] [Medline]
    38. Ziebland S, McPherson A. Making sense of qualitative data analysis: an introduction with illustrations from DIPEx (personal experiences of health and illness). Med Educ 2006 May;40(5):405-414. [CrossRef] [Medline]
    39. Albaina I, Visser T, van der Mast C, Vastenburg M. Flowie: A persuasive virtual coach to motivate elderly individuals to walk. 2009 Presented at: 3rd International Conference on Pervasive Computing Technologies for Healthcare (PervasiveHealth) 2009; April 1-3; London. [CrossRef]
    40. Gomes N, Merugu D, O'Brien G, Mandayam C, Jia SY, Atikoglu B. Steptacular: An incentive mechanism for promoting wellness. 2012 Presented at: 4th International Conference on Communication Systems and Networks (COMSNETS) 2012; January 3-7; Bangalore. [CrossRef]
    41. Walters DL, Sarela A, Fairfull A, Neighbour K, Cowen C, Stephens B, et al. A mobile phone-based care model for outpatient cardiac rehabilitation: the care assessment platform (CAP). BMC Cardiovasc Disord 2010;10:5 [FREE Full text] [CrossRef] [Medline]
    42. Kharrazi H, Vincz L. Increasing physical activity by implementing a behavioral change intervention using pervasive personal health record system: an exploratory study. 2011 Presented at: 6th International Conference on Universal Access in Human-Computer Interaction (UAHCI). HCI International 2011; July 9-14; Orlando, FL. [CrossRef]
    43. Pellegrini CA, Duncan JM, Moller AC, Buscemi J, Sularz A, DeMott A, et al. A smartphone-supported weight loss program: design of the ENGAGED randomized controlled trial. BMC Public Health 2012;12:1041 [FREE Full text] [CrossRef] [Medline]
    44. Jimenez GJ, Romero N, Keyson D, Havinga P. ESTHER 1.3: integrating in-situ prompts to trigger self-reflection of physical activity in knowledge workers. 2013 Presented at: ChileCHI'13 Chilean Conference on Human-Computer Interaction 2013; November 11-15; Temuco, Chile. [CrossRef]
    45. Geraedts HA, Zijlstra W, Zhang W, Bulstra S, Stevens M. Adherence to and effectiveness of an individually tailored home-based exercise program for frail older adults, driven by mobility monitoring: design of a prospective cohort study. BMC Public Health 2014;14:570 [FREE Full text] [CrossRef] [Medline]
    46. Recio-Rodríguez JI, Martín-Cantera C, González-Viejo N, Gómez-Arranz A, Arietaleanizbeascoa MS, Schmolling-Guinovart Y, et al. Effectiveness of a smartphone application for improving healthy lifestyles, a randomized clinical trial (EVIDENT II): study protocol. BMC Public Health 2014;14:254 [FREE Full text] [CrossRef] [Medline]
    47. Clayton C, Feehan L, Goldsmith CH, Miller WC, Grewal N, Ye J, et al. Feasibility and preliminary efficacy of a physical activity counseling intervention using Fitbit in people with knee osteoarthritis: the TRACK-OA study protocol. Pilot Feasibility Stud 2015 Aug;1:30 [FREE Full text] [CrossRef] [Medline]
    48. Cooper AJ, Dearnley K, Williams KM, Sharp SJ, van Sluijs EM, Brage S, et al. Protocol for Get Moving: a randomised controlled trial to assess the effectiveness of three minimal contact interventions to promote fitness and physical activity in working adults. BMC Public Health 2015 Mar 27;15:296 [FREE Full text] [CrossRef] [Medline]
    49. Granado-Font E, Flores-Mateo G, Sorlí-Aguilar M, Montaña-Carreras X, Ferre-Grau C, Barrera-Uriarte M, et al. Effectiveness of a Smartphone application and wearable device for weight loss in overweight or obese primary care patients: protocol for a randomised controlled trial. BMC Public Health 2015 Jun 04;15:531 [FREE Full text] [CrossRef] [Medline]
    50. Hurley JC, Hollingshead KE, Todd M, Jarrett CL, Tucker WJ, Angadi SS, et al. The Walking Interventions Through Texting (WalkIT) trial: rationale, design, and protocol for a factorial randomized controlled trial of adaptive interventions for overweight and obese, inactive adults. JMIR Res Protoc 2015 Sep 11;4(3):e108 [FREE Full text] [CrossRef] [Medline]
    51. Pellegrini CA, Steglitz J, Johnston W, Warnick J, Adams T, McFadden HG, et al. Design and protocol of a randomized multiple behavior change trial: Make Better Choices 2 (MBC2). Contemp Clin Trials 2015 Mar;41:85-92 [FREE Full text] [CrossRef] [Medline]
    52. Agboola S, Palacholla RS, Centi A, Kvedar J, Jethwani K. A multimodal mHealth intervention (FeatForward) to improve physical activity behavior in patients with high cardiometabolic risk factors: rationale and protocol for a randomized controlled trial. JMIR Res Protoc 2016 May 12;5(2):e84 [FREE Full text] [CrossRef] [Medline]
    53. Amorim AB, Pappas E, Simic M, Ferreira ML, Tiedemann A, Jennings M, et al. Integrating Mobile health and Physical Activity to reduce the burden of Chronic low back pain Trial (IMPACT): a pilot trial protocol. BMC Musculoskelet Disord 2016;17:36 [FREE Full text] [CrossRef] [Medline]
    54. Duncan MJ, Vandelanotte C, Trost SG, Rebar AL, Rogers N, Burton NW, et al. Balanced: a randomised trial examining the efficacy of two self-monitoring methods for an app-based multi-behaviour intervention to improve physical activity, sitting and sleep in adults. BMC Public Health 2016 Jul 30;16:670 [FREE Full text] [CrossRef] [Medline]
    55. Jones D, Skrepnik N, Toselli RM, Leroy B. Incorporating novel mobile health technologies into management of knee osteoarthritis in patients treated with intra-articular hyaluronic acid: rationale and protocol of a randomized controlled trial. JMIR Res Protoc 2016 Aug 09;5(3):e164 [FREE Full text] [CrossRef] [Medline]
    56. Ortiz AM, Tueller SJ, Cook SL, Furberg RD. ActiviTeen: a protocol for deployment of a consumer wearable device in an academic setting. JMIR Res Protoc 2016 Jul;5(3):e153 [FREE Full text] [CrossRef] [Medline]
    57. Shin DW, Joh H, Yun JM, Kwon HT, Lee H, Min H, et al. Design and baseline characteristics of participants in the Enhancing Physical Activity and Reducing Obesity through Smartcare and Financial Incentives (EPAROSFI): a pilot randomized controlled trial. Contemp Clin Trials 2016 Mar;47:115-122. [CrossRef] [Medline]
    58. Taylor D, Murphy J, Ahmad M, Purkayastha S, Scholtz S, Ramezani R, et al. Quantified-self for obesity: physical activity behaviour sensing to improve health outcomes. Stud Health Technol Inform 2016;220:414-416. [Medline]
    59. van Nassau F, van der Ploeg HP, Abrahamsen F, Andersen E, Anderson AS, Bosmans JE, et al. Study protocol of European Fans in Training (EuroFIT): a four-country randomised controlled trial of a lifestyle program for men delivered in elite football clubs. BMC Public Health 2016 Jul 19;16:598. [CrossRef]
    60. Brickwood K, Smith ST, Watson G, Williams AD. The effect of ongoing feedback on physical activity levels following an exercise intervention in older adults: a randomised controlled trial protocol. BMC Sports Sci Med Rehabil 2017 Jan 10;9:1. [CrossRef]
    61. Ridgers ND, Timperio A, Brown H, Ball K, Macfarlane S, Lai SK, et al. A cluster-randomised controlled trial to promote physical activity in adolescents: the Raising Awareness of Physical Activity (RAW-PA) Study. BMC Public Health 2017 Jan 4;17:6. [CrossRef]
    62. Wolk S, Meißner T, Linke S, Müssle B, Wierick A, Bogner A, et al. Use of activity tracking in major visceral surgery—the Enhanced Perioperative Mobilization (EPM) trial: study protocol for a randomized controlled trial. Trials 2017 Feb 21;18(1):77. [CrossRef]
    63. Slootmaker SM, Chin A Paw MJ, Schuit AJ, Seidell JC, van Mechelen W. Promoting physical activity using an activity monitor and a tailored web-based advice: design of a randomized controlled trial [ISRCTN93896459]. BMC Public Health 2005 Dec 15;5:134. [CrossRef]
    64. Slootmaker SM, Chinapaw MJ, Schuit AJ, Seidell JC, Van Mechelen W. Feasibility and effectiveness of online physical activity advice based on a personal activity monitor: randomized controlled trial. J Med Internet Res 2009;11(3):e27 [FREE Full text] [CrossRef] [Medline]
    65. Fujiki Y, Kazakos K, Puri C, Pavlidis I, Starren J, Levine J. NEAT-o-games: ubiquitous activity-based gaming. In: CHI'07 extended abstracts. 2007 Presented at: CHI '07 Conference on Human Factors in Computing Systems 2007; April 30 - May 3; San Jose, CA. [CrossRef]
    66. Hurling R, Catt M, Boni MD, Fairley BW, Hurst T, Murray P, et al. Using internet and mobile phone technology to deliver an automated physical activity program: randomized controlled trial. J Med Internet Res 2007 Apr;9(2):e7 [FREE Full text] [CrossRef] [Medline]
    67. Polzien KM, Jakicic JM, Tate DF, Otto AD. The efficacy of a technology-based system in a short-term behavioral weight loss intervention. Obesity (Silver Spring) 2007 Apr;15(4):825-830. [CrossRef] [Medline]
    68. Consolvo S, Klasnja P, McDonald D, Avrahami D, Froehlich J, LeGrand L. Flowers or a robot army? Encouraging awareness & activity with personal, mobile displays. 2008 Presented at: 10th International Conference on Ubiquitous Computing (UBICOMP) 2008; September 21-24; Florence, Italy. [CrossRef]
    69. Faridi Z, Liberti L, Shuval K, Northrup V, Ali A, Katz DL. Evaluating the impact of mobile telephone technology on type 2 diabetic patients' self-management: the NICHE pilot study. J Eval Clin Pract 2008 Jun;14(3):465-469. [CrossRef] [Medline]
    70. Fujiki Y, Kazakos K, Puri C, Buddharaju P, Pavlidis I, Levine J. NEAT-o-Games. Comput Entertain 2008 Jul 01;6(2):21. [CrossRef]
    71. Goris A, Holmes R. The effect of a lifestyle activity intervention program on improving physical activity behavior of employees. 2008 Presented at: Third International Conference on Persuasive Technology (PERSUASIVE) 2008; June 4-6; Oulu, Finland. [CrossRef]
    72. Lacroix J, Saini P, Holmes R. The relationship between goal difficulty performance in the context of a physical activity intervention program. 2008 Presented at: 10th International Conference on Human Computer Interaction with mobile devices and services 2008; September 2-5; Amsterdam, NL. [CrossRef]
    73. Bickmore TW, Mauer D, Brown T. Context awareness in a handheld exercise agent. Pervasive Mob Comput 2009 Jun 01;5(3):226-235 [FREE Full text] [CrossRef] [Medline]
    74. Fialho A, van den Heuval H, Shahab Q, Liu Q, Li L, Saini P. ActiveShare: sharing challenges to increase physical activities. In: CHI'09 Extended Abstracts. 2009 Presented at: CHI'09 Conference on Human Factors in Computing Systems 2009; April 4-9; Boston, MA. [CrossRef]
    75. Arsand E, Tatara N, Østengen G, Hartvigsen G. Mobile phone-based self-management tools for type 2 diabetes: the few touch application. J Diabetes Sci Technol 2010 Mar;4(2):328-336 [FREE Full text] [Medline]
    76. Mattila E, Pärkkä J, Hermersdorf M, Kaasinen J, Vainio J, Samposalo K, et al. Mobile diary for wellness management--results on usage and usability in two user studies. IEEE Trans Inf Technol Biomed 2008 Jul;12(4):501-512. [CrossRef] [Medline]
    77. Mattila E, Lappalainen R, Pärkkä J, Salminen J, Korhonen I. Use of a mobile phone diary for observing weight management and related behaviours. J Telemed Telecare 2010;16(5):260-264. [CrossRef] [Medline]
    78. Leal Penados A, Gielen M, Stappers P, Jongert T. Get up and move: an interactive cuddly toy that stimulates physical activity. Pers Ubiquit Comput 2009 Dec 17;14(5):397-406. [CrossRef]
    79. Lim B, Shick A, Harrison C, Hudson S. Pediluma: Motivating Physical Activity Through Contextual Information and Social Influence. 2011 Presented at: 4th International Conference on Tangible, Embedded and Embodied Interaction (TEI) 2011; January 22-26; Funchal, Portugal. [CrossRef]
    80. Shuger SL, Barry VW, Sui X, McClain A, Hand GA, Wilcox S, et al. Electronic feedback in a diet- and physical activity-based lifestyle intervention for weight loss: a randomized controlled trial. Int J Behav Nutr Phys Act 2011 May 18;8:41 [FREE Full text] [CrossRef] [Medline]
    81. Burns P, Lueg C, Berkovsky S. Activmon: encouraging physical activity through ambient social awareness. In: CHI'12 Extended Abstracts.: ACM; 2012 Presented at: CHI'12 Conference on Human Factors in Computing Systems 2012; May 5-10; Austin, TX. [CrossRef]
    82. Pellegrini CA, Verba SD, Otto AD, Helsel DL, Davis KK, Jakicic JM. The comparison of a technology-based system and an in-person behavioral weight loss intervention. Obesity (Silver Spring) 2012 Feb;20(2):356-363 [FREE Full text] [CrossRef] [Medline]
    83. Reijonsaari K, Vehtari A, Kahilakoski O, van Mechelen W, Aro T, Taimela S. The effectiveness of physical activity monitoring and distance counseling in an occupational setting - results from a randomized controlled trial (CoAct). BMC Public Health 2012;12:344 [FREE Full text] [CrossRef] [Medline]
    84. Van HK, Boen F, Lefevre J. The effects of physical activity feedback on behavior and awareness in employees: study protocol for a randomized controlled trial. Int J Telemed Appl 2012;2012:460712 [FREE Full text] [CrossRef] [Medline]
    85. Xu Y, Poole E, Miller A, Eiriksdottir E, Catrambone R, Mynatt E. Designing pervasive health games for sustainability, adaptability and sociability. 2012 Presented at: International Conference on the Foundations of Digital Games 2012; May 29 - June 1; Raleigh, NC. [CrossRef]
    86. Poole E, Eiríksdóttir E, Miller A, Xu Y, Catrambone R, Mynatt E. Designing for spectators and coaches: social support in pervasive health games for youth. 2013 Presented at: 7th International Conference on Pervasive Computing Technologies for Healthcare (PervasiveHealth) 2013; May 5-8; Venice, Italy. [CrossRef]
    87. Barwais FA, Cuddihy TF, Tomson LM. Physical activity, sedentary behavior and total wellness changes among sedentary adults: a 4-week randomized controlled trial. Health Qual Life Outcomes 2013;11:183 [FREE Full text] [CrossRef] [Medline]
    88. Bentley F, Tollmar K, Stephenson P, Levy L, Jones B, Robertson S, et al. Health Mashups. ACM Trans Comput Hum Interact 2013 Nov 01;20(5):1-27. [CrossRef]
    89. Chatterjee S, Byun J, Pottathil A, Moore M, Dutta K, Xie H. Persuasive sensing: a novel in-home monitoring technology to assist elderly adult diabetic patients. 2012 Presented at: 7th International Conference on Persuasive Technology (PERSUASIVE). Design for Health and Safety 2012; June 6-8; Linköping, Sweden. [CrossRef]
    90. Chatterjee S, Dutta K, Xie H, Jongbok B, Pottathil A, Moore M. Persuasive and pervasive sensing: a new frontier to monitor, track and assist older adults suffering from type-2 diabetes. 2013 Presented at: 46th Hawaii International Conference on System Sciences (HICSS) 2013; January 7-10; Maui, Hawaii. [CrossRef]
    91. Fitzsimons CF, Kirk A, Baker G, Michie F, Kane C, Mutrie N. Using an individualised consultation and activPAL™ feedback to reduce sedentary time in older Scottish adults: results of a feasibility and pilot study. Prev Med 2013 Nov;57(5):718-720. [CrossRef] [Medline]
    92. Harries T, Eslambolchilar P, Stride C, Rettie R, Walton S. Walking in the wild - using an always-on smartphone application to increase physical activity. 2013 Presented at: 14th IFIP International Conference on Human-Computer Interaction INTERACT 2013; September 2-6; Cape Town, South Africa. [CrossRef]
    93. Harries T, Eslambolchilar P, Rettie R, Stride C, Walton S, van Woerden HC. Effectiveness of a smartphone app in increasing physical activity amongst male adults: a randomised controlled trial. BMC Public Health 2016 Sep 02;16:925 [FREE Full text] [CrossRef] [Medline]
    94. Hirano S, Farrell R, Danis C, Kellogg W. WalkMinder: encouraging an active lifestyle using mobile phone interruptions. In: CHI'13 Extended Abstracts. 2013 Presented at: CHI'13 Conference on Human Factors in Computing Systems 2013; April 27 - May 2; Paris, France. [CrossRef]
    95. Khalil A, Abdallah S. Harnessing social dynamics through persuasive technology to promote healthier lifestyle. Comput Human Behav 2013 Nov;29(6):2674-2681. [CrossRef]
    96. Khan A, Lee S. Need for a Context-Aware Personalized Health Intervention System to Ensure Long-Term Behavior Change to Prevent Obesity. 2013 Presented at: 35th International Conference on Software Engineering (ICSE), 5th International Workshop on Software Engineering in Health Care (SEHC) 2013; May 18-26; San Francisco, CA p. 71-74. [CrossRef]
    97. King AC, Hekler EB, Grieco LA, Winter SJ, Sheats JL, Buman MP, et al. Harnessing different motivational frames via mobile phones to promote daily physical activity and reduce sedentary behavior in aging adults. PLoS One 2013 Apr;8(4):e62613 [FREE Full text] [CrossRef] [Medline]
    98. King AC, Hekler EB, Grieco LA, Winter SJ, Sheats JL, Buman MP, et al. Effects of three motivationally targeted mobile device applications on initial physical activity and sedentary behavior change in midlife and older adults: a randomized trial. PLoS One 2016;11(6):e0156370 [FREE Full text] [CrossRef] [Medline]
    99. Nakajima T, Lehdonvirta V. Designing motivation using persuasive ambient mirrors. Pers Ubiquit Comput 2011 Oct 4;17(1):107-126. [CrossRef]
    100. Tabak M, op den Akker H, Hermens H. Motivational cues as real-time feedback for changing daily activity behavior of patients with COPD. Patient Educ Couns 2014 Mar;94(3):372-378. [CrossRef]
    101. Tabak M, Vollenbroek-Hutten MM, van der Valk P, van der Palen J, Hermens HJ. A telerehabilitation intervention for patients with Chronic Obstructive Pulmonary Disease: a randomized controlled pilot trial. Clin Rehabil 2014 Jun;28(6):582-591. [CrossRef] [Medline]
    102. Valentin G, Howard A. Dealing with childhood obesity: passive versus active activity monitoring approaches for engaging individuals in exercise. 2013 Presented at: ISSNIP Biosignals and Biorobotics Conference (BRC) 2013; February 18-20; Rio de Janeiro, Brazil p. 166-170. [CrossRef]
    103. Bond DS, Thomas JG, Raynor HA, Moon J, Sieling J, Trautvetter J, et al. B-MOBILE--a smartphone-based intervention to reduce sedentary time in overweight/obese individuals: a within-subjects experimental trial. PLoS One 2014;9(6):e100821 [FREE Full text] [CrossRef] [Medline]
    104. Thomas JG, Bond DS. Behavioral response to a just-in-time adaptive intervention (JITAI) to reduce sedentary behavior in obese adults: implications for JITAI optimization. Health Psychol 2015 Dec;34S:1261-1267 [FREE Full text] [CrossRef] [Medline]
    105. Caulfield B, Kaljo I, Donnelly S. Use of a consumer market activity monitoring and feedback device improves exercise capacity and activity levels in COPD. Conf Proc IEEE Eng Med Biol Soc 2014;2014:1765-1768. [CrossRef] [Medline]
    106. Chen Y, Pu P. HealthyTogether: exploring social incentives for mobile fitness applications. 2014 Presented at: 2nd International Symposium of Chinese CHI 2014; April 26-27; Toronto, ON, Canada. [CrossRef]
    107. Glynn LG, Hayes PS, Casey M, Glynn F, Alvarez-Iglesias A, Newell J, et al. SMART MOVE - a smartphone-based intervention to promote physical activity in primary care: study protocol for a randomized controlled trial. Trials 2013;14:157 [FREE Full text] [CrossRef] [Medline]
    108. Glynn LG, Hayes PS, Casey M, Glynn F, Alvarez-Iglesias A, Newell J, et al. Effectiveness of a smartphone application to promote physical activity in primary care: the SMART MOVE randomised controlled trial. Br J Gen Pract 2014 Jul;64(624):e384-e391. [CrossRef] [Medline]
    109. Casey M, Hayes PS, Glynn F, OLaighin G, Heaney D, Murphy AW, et al. Patients' experiences of using a smartphone application to increase physical activity: the SMART MOVE qualitative study in primary care. Br J Gen Pract 2014 Aug;64(625):e500-e508 [FREE Full text] [CrossRef] [Medline]
    110. Miller A, Mynatt E. StepStream: a school-based pervasive social fitness system for everyday adolescent health. 2014 Presented at: CHI '14 Conference on Human Factors in Computing Systems 2014; April 26 - May 1; Toronto, ON, Canada. [CrossRef]
    111. Thompson WG, Kuhle CL, Koepp GA, McCrady-Spitzer SK, Levine JA. “Go4Life” exercise counseling, accelerometer feedback, and activity levels in older people. Arch Gerontol Geriatr 2014 May;58(3):314-319. [CrossRef] [Medline]
    112. Thorndike AN, Mills S, Sonnenberg L, Palakshappa D, Gao T, Pau CT, et al. Activity monitor intervention to promote physical activity of physicians-in-training: randomized controlled trial. PLoS One 2014 Jun;9(6):e100251 [FREE Full text] [CrossRef] [Medline]
    113. Verwey R, van der Weegen S, Spreeuwenberg M, Tange H, van der Weijden T, de Witte L. A pilot study of a tool to stimulate physical activity in patients with COPD or type 2 diabetes in primary care. J Telemed Telecare 2014 Jan 10;20(1):29-34. [CrossRef]
    114. Walsh G, Golbeck J. A preliminary investigation of a personal informatics-based social game on behavior change. In: CHI'14 Extended Abstracts. 2014 Presented at: CHI'14 Conference on Human Factors in Computing Systems 2014; April 26 - May 1; Toronto, ON, Canada. [CrossRef]
    115. Zuckerman O, Gal-Oz A. Deconstructing gamification: evaluating the effectiveness of continuous measurement, virtual rewards, and social comparison for promoting physical activity. Pers Ubiquit Comput 2014 Jul 5;18(7):1705-1719. [CrossRef]
    116. Cadmus-Bertram LA, Marcus BH, Patterson RE, Parker BA, Morey BL. Randomized trial of a Fitbit-based physical activity intervention for women. Am J Prev Med 2015 Sep;49(3):414-418. [CrossRef] [Medline]
    117. Cadmus-Bertram L, Marcus BH, Patterson RE, Parker BA, Morey BL. Use of the Fitbit to measure adherence to a physical activity intervention among overweight or obese, postmenopausal women: self-monitoring trajectory during 16 weeks. JMIR Mhealth Uhealth 2015;3(4):e96 [FREE Full text] [CrossRef] [Medline]
    118. Direito A, Jiang Y, Whittaker R, Maddison R. Apps for IMproving FITness and increasing physical activity among young people: the AIMFIT pragmatic randomized controlled trial. J Med Internet Res 2015 Aug 27;17(8):e210 [FREE Full text] [CrossRef] [Medline]
    119. Finkelstein EA, Sahasranaman A, John G, Haaland BA, Bilger M, Sloan RA, et al. Design and baseline characteristics of participants in the TRial of Economic Incentives to Promote Physical Activity (TRIPPA): a randomized controlled trial of a six month pedometer program with financial incentives. Contemp Clin Trials 2015 Mar;41:238-247. [CrossRef] [Medline]
    120. Finkelstein EA, Haaland BA, Bilger M, Sahasranaman A, Sloan RA, Nang EE, et al. Effectiveness of activity trackers with and without incentives to increase physical activity (TRIPPA): a randomised controlled trial. Lancet Diabetes Endocrinol 2016 Dec;4(12):983-995. [CrossRef]
    121. Frederix I, Van Driessche N, Hansen D, Berger J, Bonne K, Alders T, et al. Increasing the medium-term clinical benefits of hospital-based cardiac rehabilitation by physical activity telemonitoring in coronary artery disease patients. Eur J Prev Cardiol 2015 Feb;22(2):150-158. [CrossRef] [Medline]
    122. Frederix I, Hansen D, Coninx K, Vandervoort P, Vandijck D, Hens N, et al. Medium-term effectiveness of a comprehensive internet-based and patient-specific telerehabilitation program with text messaging support for cardiac patients: randomized controlled trial. J Med Internet Res 2015;17(7):e185 [FREE Full text] [CrossRef] [Medline]
    123. Garde A, Umedaly A, Abulnaga SM, Robertson L, Junker A, Chanoine JP, et al. Assessment of a mobile game ('MobileKids Monster Manor') to promote physical activity among children. Games Health J 2015 Apr;4(2):149-158. [CrossRef] [Medline]
    124. Gouveia R, Karapanos E, Hassenzahl M. How do we engage with activity trackers?: A longitudinal study of Habito. 2015 Presented at: ACM International Joint Conference on Pervasive and Ubiquitous Computing (UBICOMP) 2015; September 7-11; Osaka, Japan. [CrossRef]
    125. Guthrie N, Bradlyn A, Thompson SK, Yen S, Haritatos J, Dillon F, et al. Development of an accelerometer-linked online intervention system to promote physical activity in adolescents. PLoS One 2015 May;10(5):e0128639 [FREE Full text] [CrossRef] [Medline]
    126. Komninos A, Dunlop M, Rowe D, Hewitt A, Coull S. Using Degraded Music Quality to Encourage a Health Improving Walking Pace: BeatClearWalker. 2015 Presented at: 9th International Conference on Pervasive Computing Technologies for Healthcare (PervasiveHealth) 2015; May 20-23; Istanbul, Turkey p. 57-64. [CrossRef]
    127. Lee M, Kim J, Forlizzi J, Kiesler S. Personalization revisited: a reflective approach helps people better personalize health services and motivates them to increase physical activity. 2015 Presented at: ACM International Joint Conference on Pervasive and Ubiquitous Computing (UBICOMP) 2015; September 7-11; Osaka, Japan. [CrossRef]
    128. Lee M, Cha S, Nam T. Patina engraver: Visualizing activity logs as patina in fashionable trackers. 2015 Presented at: CHI '15 Conference on Human Factors in Computing Systems 2015; April 18-23; Seoul, Korea. [CrossRef]
    129. Martin SS, Feldman DI, Blumenthal RS, Jones SR, Post WS, McKibben RA, et al. mActive: a randomized clinical trial of an automated mHealth intervention for physical activity promotion. J Am Heart Assoc 2015 Nov 09;4(11):e002239 [FREE Full text] [CrossRef] [Medline]
    130. Munson S, Krupka E, Richardson C, Resnick P. Effects of public commitments and accountability in a technology-supported physical activity intervention. 2015 Presented at: CHI'15 Conference on Human Factors in Computing Systems 2015; April 18-23; Seoul, Korea. [CrossRef]
    131. Rabbi M, Aung M, Zhang M, Choudhury T. MyBehavior: Automatic Personalized Health Feedback from User Behaviors and Preferences using Smartphones. 2015 Presented at: Proceedings of the ACM International Joint Conference on Pervasive and Ubiquitous Computing (UBICOMP) 2015; September 7-11; Osaka, Japan. [CrossRef]
    132. Verwey R, van der Weegen S, Spreeuwenberg M, Tange H, van der Weijden T, de Witte L. A monitoring and feedback tool embedded in a counselling protocol to increase physical activity of patients with COPD or type 2 diabetes in primary care: study protocol of a three-arm cluster randomised controlled trial. BMC Fam Pract 2014;15:93 [FREE Full text] [CrossRef] [Medline]
    133. van der Weegen S, Verwey R, Spreeuwenberg M, Tange H, van der Weijden T, de Witte L. It's LiFe! Mobile and web-based monitoring and feedback tool embedded in primary care increases physical activity: a cluster randomized controlled trial. J Med Internet Res 2015 Jul 24;17(7):e184 [FREE Full text] [CrossRef] [Medline]
    134. Verwey R, van der Weegen S, Spreeuwenberg M, Tange H, van der Weijden T, de Witte L. Process evaluation of physical activity counselling with and without the use of mobile technology: a mixed methods study. Int J Nurs Stud 2016 Jan;53:3-16. [CrossRef] [Medline]
    135. Wadhwa R, Chugh A, Kumar A, Singh M, Yadav K, Eswaran S. SenseX: Design and Deployment of a Pervasive Wellness Monitoring Platform for Workplaces. 2015 Presented at: 13th International Conference on Service-Oriented Computing (ICSOC) 2015; November 16-19; Goa, India p. 427-443. [CrossRef]
    136. Wang JB, Cadmus-Bertram LA, Natarajan L, White MM, Madanat H, Nichols JF, et al. Wearable sensor/device (Fitbit One) and SMS text-messaging prompts to increase physical activity in overweight and obese adults: a randomized controlled trial. Telemed J E Health 2015 Oct;21(10):782-792. [CrossRef] [Medline]
    137. Watson S, Woodside JV, Ware LJ, Hunter SJ, McGrath A, Cardwell CR, et al. Effect of a web-based behavior change program on weight loss and cardiovascular risk factors in overweight and obese adults at high risk of developing cardiovascular disease: randomized controlled trial. J Med Internet Res 2015;17(7):e177 [FREE Full text] [CrossRef] [Medline]
    138. Broekhuizen K, de Gelder J, Wijsman CA, Wijsman LW, Westendorp RGJ, Verhagen E, et al. An internet-based physical activity intervention to improve quality of life of inactive older adults: a randomized controlled trial. J Med Internet Res 2016;18(4):e74 [FREE Full text] [CrossRef] [Medline]
    139. Butryn ML, Arigo D, Raggio GA, Colasanti M, Forman EM. Enhancing physical activity promotion in midlife women with technology-based self-monitoring and social connectivity: a pilot study. J Health Psychol 2014 Dec 8;21(8):1548-1555. [CrossRef] [Medline]
    140. Choi J, Lee JH, Vittinghoff E, Fukuoka Y. mHealth physical activity intervention: a randomized pilot study in physically inactive pregnant women. Matern Child Health J 2016 May;20(5):1091-1101. [CrossRef] [Medline]
    141. Ciman M, Donini M, Gaggi O, Aiolli F. Stairstep recognition and counting in a serious Game for increasing users’ physical activity. Pers Ubiquit Comput 2016 Sep 22;20(6):1015-1033. [CrossRef]
    142. Darvall JN, Parker A, Story DA. Feasibility and acceptability of remotely monitored pedometer-guided physical activity. Anaesth Intensive Care 2016 Jul;44(4):501-506 [FREE Full text] [Medline]
    143. Ding X, Xu J, Wang H, Chen G, Thind H, Zhang Y. WalkMore: Promoting Walking with Just-in-Time Context-Aware Prompts. 2016 Presented at: IEEE Wireless Health (WH) 2016; October 25-27; Bethesda, MD p. 65-72. [CrossRef]
    144. Fennell C, Gerhart H, Seo Y, Hauge K, Glickman EL. Combined incentives versus no-incentive exercise programs on objectively measured physical activity and health-related variables. Physiol Behav 2016 Sep 01;163:245-250. [CrossRef] [Medline]
    145. Garde A, Umedaly A, Abulnaga SM, Junker A, Chanoine JP, Johnson M, et al. Evaluation of a novel mobile exergame in a school-based environment. Cyberpsychol Behav Soc Netw 2016 Mar;19(3):186-192. [CrossRef] [Medline]
    146. Gilson ND, Pavey TG, Vandelanotte C, Duncan MJ, Gomersall SR, Trost SG, et al. Chronic disease risks and use of a smartphone application during a physical activity and dietary intervention in Australian truck drivers. Aust N Z J Public Health 2016 Feb;40(1):91-93. [CrossRef] [Medline]
    147. Glance D, Ooi E, Berman Y, Glance C, Barrett P. Impact of a Digital Activity Tracker-based Workplace Activity Program on Health and Wellbeing. 2016 Presented at: DH'16: Proceedings of 6th International Digital Health Conference 2016; April 11-13; Montreal, Québec, Canada p. 37-41. [CrossRef]
    148. H-Jennings F, Clément M, Brown M, Leong B, Shen L, Dong C. Promote students' healthy behavior through sensor and game: a randomized controlled trial. Med Sci Educ 2016 May 3;26(3):349-355. [CrossRef]
    149. Hartman SJ, Nelson SH, Cadmus-Bertram LA, Patterson RE, Parker BA, Pierce JP. Technology- and phone-based weight loss intervention: pilot RCT in women at elevated breast cancer risk. Am J Prev Med 2016 Nov;51(5):714-721. [CrossRef] [Medline]
    150. Herrmanny K, Ziegler J, Dogangün A. Supporting Users in Setting Effective Goals in Activity Tracking. 2016 Presented at: 11th International Conference on Persuasive Technology (PERSUASIVE) 2016; April 5-7; Salzburg, Austria p. 11-26. [CrossRef]
    151. Melton BF, Buman MP, Vogel RL, Harris BS, Bigham LE. Wearable devices to improve physical activity and sleep. J Black Stud 2016 Jul 27;47(6):610-625. [CrossRef]
    152. Patel MS, Asch DA, Rosin R, Small DS, Bellamy SL, Eberbach K, et al. Individual versus team-based financial incentives to increase physical activity: a randomized, controlled trial. J Gen Intern Med 2016 Jul;31(7):746-754 [FREE Full text] [CrossRef] [Medline]
    153. Patel MS, Volpp KG, Rosin R, Bellamy SL, Small DS, Fletcher MA, et al. A randomized trial of social comparison feedback and financial incentives to increase physical activity. Am J Health Promot 2016 Jul;30(6):416-424. [CrossRef] [Medline]
    154. Patel MS, Asch DA, Rosin R, Small DS, Bellamy SL, Heuer J, et al. Framing financial incentives to increase physical activity among overweight and obese adults: a randomized, controlled trial. Ann Intern Med 2016 Mar 15;164(6):385-394. [CrossRef] [Medline]
    155. Paul L, Wyke S, Brewster S, Sattar N, Gill JM, Alexander G, et al. Increasing physical activity in stroke survivors using STARFISH, an interactive mobile phone application: a pilot study. Top Stroke Rehabil 2016 Jun;23(3):170-177. [CrossRef] [Medline]
    156. Quintiliani LM, Mann DM, Puputti M, Quinn E, Bowen DJ. Pilot and feasibility test of a mobile health-supported behavioral counseling intervention for weight management among breast cancer survivors. JMIR Cancer 2016;2(1) [FREE Full text] [CrossRef] [Medline]
    157. Vorrink SN, Kort HS, Troosters T, Zanen P, Lammers JJ. Efficacy of an mHealth intervention to stimulate physical activity in COPD patients after pulmonary rehabilitation. Eur Respir J 2016 Oct;48(4):1019-1029. [CrossRef] [Medline]
    158. Walsh JC, Corbett T, Hogan M, Duggan J, McNamara A. An mHealth intervention using a smartphone app to increase walking behavior in young adults: a pilot study. JMIR Mhealth Uhealth 2016 Sep 22;4(3):e109 [FREE Full text] [CrossRef] [Medline]
    159. Yingling LR, Brooks AT, Wallen GR, Peters-Lawrence M, McClurkin M, Cooper-McCann R, et al. Community engagement to optimize the use of web-based and wearable technology in a cardiovascular health and needs assessment study: a mixed methods approach. JMIR Mhealth Uhealth 2016 Apr 25;4(2):e38 [FREE Full text] [CrossRef] [Medline]
    160. Ashton LM, Morgan PJ, Hutchesson MJ, Rollo ME, Collins CE. Feasibility and preliminary efficacy of the 'HEYMAN' healthy lifestyle program for young men: a pilot randomised controlled trial. Nutr J 2017 Jan 13;16(1):2 [FREE Full text] [CrossRef] [Medline]
    161. Chen Y, Chen Y, Randriambelonoro M, Geissbuhler A, Pu P. Design Considerations for Social Fitness Applications: Comparing Chronically Ill Patients and Healthy Adults. 2017 Presented at: Proceedings of the 2017 ACM Conference on Computer Supported Cooperative Work and Social Computing (CSCW) 2017; February 25 - Mar 1; Portland, OR p. 1753-1762. [CrossRef]
    162. Chung AE, Skinner AC, Hasty SE, Perrin EM. Tweeting to health: a novel mHealth intervention using Fitbits and Twitter to foster healthy lifestyles. Clin Pediatr (Phila) 2016 Jun 16;56(1):26-32. [CrossRef] [Medline]
    163. Gell NM, Grover KW, Humble M, Sexton M, Dittus K. Efficacy, feasibility, and acceptability of a novel technology-based intervention to support physical activity in cancer survivors. Support Care Cancer 2017 Apr;25(4):1291-1300. [CrossRef] [Medline]
    164. McMahon SK, Lewis B, Oakes JM, Wyman JF, Guan W, Rothman AJ. Assessing the effects of interpersonal and intrapersonal behavior change strategies on physical activity in older adults: a factorial experiment. Ann Behav Med 2017 Jun;51(3):376-390. [CrossRef] [Medline]
    165. Neil-Sztramko SE, Gotay CC, Sabiston CM, Demers PA, Campbell KC. Feasibility of a telephone and web-based physical activity intervention for women shift workers. Transl Behav Med 2017 Jun;7(2):268-276 [FREE Full text] [CrossRef] [Medline]
    166. Valle CG, Deal AM, Tate DF. Preventing weight gain in African American breast cancer survivors using smart scales and activity trackers: a randomized controlled pilot study. J Cancer Surviv 2017 Feb;11(1):133-148. [CrossRef] [Medline]
    167. Craig CL, Marshall AL, Sjöström M, Bauman AE, Booth ML, Ainsworth BE, et al. International physical activity questionnaire: 12-country reliability and validity. Med Sci Sports Exerc 2003 Aug;35(8):1381-1395. [CrossRef] [Medline]
    168. Stewart AL, Mills KM, King AC, Haskell WL, Gillis D, Ritter PL. CHAMPS physical activity questionnaire for older adults: outcomes for interventions. Med Sci Sports Exerc 2001 Jul;33(7):1126-1141. [Medline]
    169. Besson H, Brage S, Jakes RW, Ekelund U, Wareham NJ. Estimating physical activity energy expenditure, sedentary time, and physical activity intensity by self-report in adults. Am J Clin Nutr 2010 Jan;91(1):106-114 [FREE Full text] [CrossRef] [Medline]
    170. Godin G, Shephard RJ. A simple method to assess exercise behavior in the community. Can J Appl Sport Sci 1985 Sep;10(3):141-146. [Medline]
    171. Australian Institute of Health and Welfare. AIHW. 2003. The Active Australia Survey: A guide and manual for implementation, analysis and reporting   URL: https://www.aihw.gov.au/getmedia/ff25c134-5df2-45ba-b4e1-6c214ed157e6/aas.pdf.aspx?inline=true [accessed 2018-02-15] [WebCite Cache]
    172. Barwais FA, Cuddihy TF, Washington T, Tomson LM, Brymer E. Development and validation of a new self-report instrument for measuring sedentary behaviors and light-intensity physical activity in adults. J Phys Act Health 2014 Aug;11(6):1097-1104. [CrossRef] [Medline]
    173. Dipietro L, Caspersen CJ, Ostfeld AM, Nadel ER. A survey for assessing physical activity among older adults. Med Sci Sports Exerc 1993 May;25(5):628-642. [Medline]
    174. Armstrong T, Bull F. Development of the World Health Organization Global Physical Activity Questionnaire (GPAQ). J Public Health 2006 Mar 2;14(2):66-70. [CrossRef]
    175. Lewis JR. IBM computer usability satisfaction questionnaires: psychometric evaluation and instructions for use. Int J Hum Comput Interact 1995 Jan;7(1):57-78. [CrossRef]
    176. Connelly K. On developing a technology acceptance model for pervasive computing. In: Workshop of Ubiquitous System Evaluation (USE). 2007 Presented at: 9th International Conference on Ubiquitous Computing (UBICOMP) 2007; September 16-19; Innsbruck, Austria.
    177. Ryan RM. Control and information in the intrapersonal sphere: an extension of cognitive evaluation theory. J Pers Soc Psychol 1982;43(3):450-461. [CrossRef]
    178. Read JC. Validating the Fun Toolkit: an instrument for measuring children’s opinions of technology. Cogn Tech Work 2007 May 22;10(2):119-128. [CrossRef]
    179. Horvath AO, Greenberg LS. Development and validation of the Working Alliance Inventory. J Couns Psychol 1989;36(2):223-233. [CrossRef]
    180. Brooke J. SUS: A quick and dirty usability scale. In: Usability evaluation in industry. London: Taylor & Francis; 1996.
    181. Klasnja P, Consolvo S, Pratt W. How to evaluate technologies for health behavior change in HCI research. 2011 Presented at: CHI'11 Conference on Human Factors in Computing Systems 2011; May 7-12; Vancouver, BC, Canada. [CrossRef]
    182. Hekler E, Klasnja P, Froehlich J, Buman M. Mind the theoretical gap: interpreting, using, and developing behavioral theory in HCI research. 2013 Presented at: CHI'13 Conference on Human Factors in Computing Systems 2013; April 27 - May 2; Paris, France. [CrossRef]
    183. Whittaker R, Merry S, Dorey E, Maddison R. A development and evaluation process for mHealth interventions: examples from New Zealand. J Health Commun 2012;17(Suppl 1):11-21. [CrossRef] [Medline]
    184. Kravitz RL, Duan N, Niedzinski EJ, Hay MC, Subramanian SK, Weisner TS. What ever happened to N-of-1 trials? Insiders' perspectives and a look to the future. Milbank Q 2008 Dec;86(4):533-555 [FREE Full text] [CrossRef] [Medline]
    185. Kay M, Nelson G, Hekler E. Researcher-centered design of statistics: Why Bayesian statistics better fit the culture and incentives of HCI. 2016 Presented at: CHI'16 Conference on Human Factors in Computing 2016; May 7-12; San Jose, CA. [CrossRef]
    186. Petticrew M, Tugwell P, Kristjansson E, Oliver S, Ueffing E, Welch V. Damned if you do, damned if you don't: subgroup analysis and equity. J Epidemiol Community Health 2012 Jan;66(1):95-98. [CrossRef] [Medline]
    187. Dobkin BH, Dorsch A. The promise of mHealth: daily activity monitoring and outcome assessments by wearable sensors. Neurorehabil Neural Repair 2011;25(9):788-798 [FREE Full text] [CrossRef] [Medline]
    188. Miočević M, MacKinnon DP, Levy R. Power in Bayesian mediation analysis for small sample research. Struct Equ Model 2017 Apr 25;24(5):666-683. [CrossRef]
    189. Ferguson T, Rowlands AV, Olds T, Maher C. The validity of consumer-level, activity monitors in healthy adults worn in free-living conditions: a cross-sectional study. Int J Behav Nutr Phys Act 2015;12:42 [FREE Full text] [CrossRef] [Medline]
    190. Mack H. Mobihealthnews. 2016. Fitbit creates research library with Fitabase, publishes results of corporate wellness study   URL: http://www.mobihealthnews.com/content/fitbit-creates-research-library-fitabase-publishes-results-corporate-wellness-study [accessed 2017-06-14] [WebCite Cache]
    191. Mohr DC, Schueller SM, Riley WT, Brown CH, Cuijpers P, Duan N, et al. Trials of intervention principles: evaluation methods for evolving behavioral intervention technologies. J Med Internet Res 2015;17(7):e166 [FREE Full text] [CrossRef] [Medline]
    192. Mohr DC, Lyon AR, Lattie EG, Reddy M, Schueller SM. Accelerating digital mental health research from early design and creation to successful implementation and sustainment. J Med Internet Res 2017 May 10;19(5):e153 [FREE Full text] [CrossRef] [Medline]
    193. O'Connor S, Hanlon P, O'Donnell CA, Garcia S, Glanville J, Mair FS. Understanding factors affecting patient and public engagement and recruitment to digital health interventions: a systematic review of qualitative studies. BMC Med Inform Decis Mak 2016 Sep 15;16(1):120 [FREE Full text] [CrossRef] [Medline]
    194. International Organization for Standardization (ISO). ISO. 1998. 9241-11 Ergonomic requirements for office work with visual display terminals (VDTs)   URL: https://www.iso.org/standard/16883.html [accessed 2018-02-26] [WebCite Cache]
    195. Quesenbery W. The five dimensions of usability. In: Content and complexity: Information design in technical communication. Mahwah, NJ, USA: Lawrence Erlbaum Associates; 2003:81-102.
    196. Schulz KF, Altman DG, Moher D. CONSORT 2010 statement: updated guidelines for reporting parallel group randomised trials. Br Med J 2010;340:c332 [FREE Full text] [Medline]
    197. Eysenbach G, CONSORT-EHEALTH Group. CONSORT-EHEALTH: improving and standardizing evaluation reports of web-based and mobile health interventions. J Med Internet Res 2011;13(4):e126 [FREE Full text] [CrossRef] [Medline]
    198. Michie S, Thomas J, Johnston M, Aonghusa PM, Shawe-Taylor J, Kelly MP, et al. The Human Behaviour-Change Project: harnessing the power of artificial intelligence and machine learning for evidence synthesis and interpretation. Implement Sci 2017 Oct 18;12(1):121 [FREE Full text] [CrossRef] [Medline]
    199. Direito A, Dale LP, Shields E, Dobson R, Whittaker R, Maddison R. Do physical activity and dietary smartphone applications incorporate evidence-based behaviour change techniques? BMC Public Health 2014;14:646 [FREE Full text] [CrossRef] [Medline]
    200. Cowan LT, Van Wagenen SA, Brown BA, Hedin RJ, Seino-Stephan Y, Hall PC, et al. Apps of steel: are exercise apps providing consumers with realistic expectations?: a content analysis of exercise apps for presence of behavior change theory. Health Educ Behav 2013 Apr;40(2):133-139. [CrossRef] [Medline]
    201. Lyons EJ, Lewis ZH, Mayrsohn BG, Rowland JL. Behavior change techniques implemented in electronic lifestyle activity monitors: a systematic content analysis. J Med Internet Res 2014;16(8):e192 [FREE Full text] [CrossRef] [Medline]
    202. Winter SJ, Sheats JL, King AC. The use of behavior change techniques and theory in technologies for cardiovascular disease prevention and treatment in adults: a comprehensive review. Prog Cardiovasc Dis 2016;58(6):605-612. [CrossRef] [Medline]


    Abbreviations

    ACM: Association for Computing Machinery Digital Library
    HCI: human-computer interaction
    IEEE: Institute of Electrical and Electronics Engineers
    mHealth: mobile health
    MOST: multiphase optimization strategy
    RCTs: randomized controlled trials


    Edited by A Aguilera; submitted 27.09.17; peer-reviewed by S Winter, L Yardley; comments to author 09.11.17; revised version received 01.01.18; accepted 07.01.18; published 23.03.18

    ©Claire McCallum, John Rooksby, Cindy M Gray. Originally published in JMIR mHealth and uHealth (http://mhealth.jmir.org), 23.03.2018.

    This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR mHealth and uHealth, is properly cited. The complete bibliographic information, a link to the original publication on http://mhealth.jmir.org/, as well as this copyright and license information must be included.