
Published in Vol 14 (2026)

Preprints (earlier versions) of this paper are available at https://preprints.jmir.org/preprint/77616.
Clinical Usability of Exercise Prescription Apps for Professional Use: Systematic Review and Multidimensional Evaluation

1Department of Family Medicine, Taipei Veterans General Hospital, Taipei, Taiwan

2Department of Family Medicine, MacKay Memorial Hospital, Taipei, Taiwan

3Center for Traditional Chinese Medicine, Chang Gung Memorial Hospital, Guishan, Taiwan

4Taipei Veterans General Hospital, Yuli Branch, No. 91, Xinxing St., Yuli Township, Hualien County, Taiwan

*these authors contributed equally

Corresponding Author:

Yu-Chun Chen, MD, Dr med


Background: Exercise prescription is a structured and individualized intervention that requires appropriate progression, tailoring, and behavioral support to ensure safety and long-term effectiveness. With the expansion of mobile health technologies, exercise prescription apps are increasingly used to support the remote delivery of prescribed exercise programs. However, the extent to which widely adopted apps align with established clinical standards remains unclear.

Objective: This study aimed to evaluate the clinical usability of popular, no-cost exercise prescription apps from a professional perspective, focusing on clinical integrity, intervention fidelity, behavioral mechanisms, and clinician-assessed digital usability.

Methods: A systematic search of Google Play and the Apple App Store identified widely adopted apps that enable clinician-directed exercise prescription. Eligible apps were evaluated using established frameworks, including the frequency, intensity, time, and type (FITT) and FITT, volume, and progression (FITT-VP) principles; the Consensus on Exercise Reporting Template (CERT); the Behavior Change Technique Taxonomy version 1 (BCTTv1); and the Mobile App Rating Scale (MARS). Descriptive analyses and interrater reliability assessments were performed.

Results: Six apps met the inclusion criteria. All satisfied the basic FITT requirements; however, none incorporated explicit guidance on exercise progression or individualized adjustment consistent with the FITT-VP principles. CERT evaluation demonstrated comprehensive reporting of structural components but a consistent absence of progression logic, tailoring strategies, and adverse event documentation. Although multiple behavior change techniques were identified, several techniques considered important for graded progression and sustained adherence in unsupervised settings were infrequently implemented or absent. Overall app quality was moderate, characterized by strong functionality but limited engagement. Only 2 apps reported evidence of scientific evaluation.

Conclusions: Widely adopted exercise prescription apps meet fundamental structural requirements but do not fully support the progressive and individualized processes central to clinical exercise prescription. These findings highlight a gap between structural prescription delivery and independent clinical exercise management. Exercise prescription apps may therefore be most appropriately positioned as adjunctive tools within clinician-guided or hybrid care models. Future development should prioritize transparent progression mechanisms, individualized adjustment, and the implementation of clinically relevant behavior change strategies to enhance safety and long-term effectiveness.

JMIR Mhealth Uhealth 2026;14:e77616

doi:10.2196/77616

Keywords

Introduction

Exercise prescription is a structured and individualized program of exercise, systematically designed to improve health or functional capacity; such programs are commonly developed by specialists on the basis of patients’ clinical status, goals, and needs [1]. Physicians in physical medicine and rehabilitation, as well as in primary care, routinely prescribe exercise as both treatment and secondary prevention for a broad range of conditions, including musculoskeletal, cardiovascular, neurological, and pulmonary diseases [2-9]. Given this wide scope of application, precision is essential to ensure both safety and effectiveness. According to the American College of Sports Medicine (ACSM), exercise prescription should specify exercise frequency, intensity, time, and type (FITT), with additional consideration of volume and progression (FITT-VP) to support appropriate physiological adaptation and minimize the risk of adverse events [10]. When these elements are appropriately defined and individualized, exercise therapy can deliver evidence-based benefits while reducing the likelihood of exercise-related harm [11].

With the expansion of mobile health (mHealth) technologies, exercise prescription apps have emerged as tools to support the remote delivery of prescribed exercise programs. These systems typically consist of a clinician-facing prescription platform and a patient-facing app, enabling clinicians to design structured exercise programs that patients can access via mobile devices or computers. Prior studies indicate that digitally supported exercise interventions can be feasible, cost-effective, and minimally disruptive to daily life while maintaining patient satisfaction with outpatient services [12-15]. Randomized controlled trials have further demonstrated improvements in adherence to home exercise programs and increased confidence in exercising [16]. Since the COVID-19 pandemic, the growing adoption of digital health solutions has also expanded the use of online training platforms and home-based exercise apps [17-20]. Moreover, app-supported exercise prescription has shown potential benefits in specific clinical populations, including individuals with sarcopenia [21], myocardial infarction [22], or long COVID [23].

Despite growing evidence supporting the feasibility and effectiveness of digitally delivered exercise interventions, concerns remain regarding the professionalism and completeness of exercise prescriptions generated by mobile apps [24]. Compared with face-to-face care, mobile platforms may limit the delivery of detailed instructions, real-time biomechanical feedback, and hands-on correction, all of which are integral to safe and effective exercise prescription. From a professional perspective, the suitability of exercise prescription apps depends on their ability to support clinicians in prescribing, monitoring, and adjusting exercise programs in accordance with established clinical standards. Key considerations include whether exercise prescriptions adequately address essential ACSM components, whether intervention delivery is transparent and reproducible, and whether behavioral strategies are incorporated to support adherence in unsupervised or remote settings. In addition, the rapid development cycles of commercial apps raise concerns regarding usability, consistency, and quality from a clinical perspective [25]. While existing evaluations of health apps often focus on general users, single-disease outcomes, or basic activity tracking [26-30], evidence remains limited regarding the extent to which widely used exercise prescription apps meet professional requirements for clinical prescription and implementation.

To address these gaps, this study adopted a clinician-centered, multidimensional evaluation framework to assess whether popular, no-cost exercise prescription apps are suitable for professional clinical use. The evaluation focused on 4 complementary domains reflecting core requirements of exercise prescription in real-world practice: clinical integrity of exercise prescriptions, intervention fidelity and transparency, behavioral mechanisms supporting adherence, and digital usability and overall app quality. These domains were operationalized using established and validated instruments, including the FITT and FITT-VP principles, the Consensus on Exercise Reporting Template (CERT), the Behavior Change Technique Taxonomy version 1 (BCTTv1), and the Mobile App Rating Scale (MARS). By integrating these perspectives, this study extends beyond general app quality or user-centered evaluations and provides a structured professional assessment of exercise prescription apps intended for clinician-guided care.

Methods


Study Design and Conceptual Framework

This study adopted a clinician-centered, cross-sectional evaluation design to assess exercise prescription apps intended for professional use. A multidimensional evaluation framework was applied to reflect the key components of clinical exercise prescription practice (Figure 1).

Figure 1. Conceptual framework for clinician-centered evaluation of exercise prescription apps. BCTTv1: Behavior Change Technique Taxonomy version 1; CERT: Consensus on Exercise Reporting Template; FITT: frequency, intensity, time, and type; FITT-VP: frequency, intensity, time, type, volume, and progression; MARS: Mobile App Rating Scale.

The framework included 4 domains: (1) clinical integrity, (2) intervention fidelity, (3) behavioral mechanism, and (4) digital usability. These domains were operationalized using established instruments: the FITT and FITT-VP principles, CERT, BCTTv1, and MARS.

Ethical Considerations

This study did not involve human participants, human data, or human biological materials. Therefore, institutional ethics board review and approval were not required.

App Identification and Search Strategy

This review adhered to the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) protocol, as detailed in the PRISMA 2020 statement (Checklist 1) [31]. A systematic search of Google Play and the Apple App Store was conducted on July 13, 2024, using Taiwanese regional settings. The following search terms were used: “exercise prescription AND doctor,” “exercise prescription AND therapist,” and “exercise prescription AND rehabilitation.” Duplicate apps were removed.

Apps were eligible if they were free, available in English or Chinese, designed for adult users, and not disease specific; did not require additional equipment; and allowed clinicians to prescribe exercise programs. To ensure market relevance, only apps available on both platforms with more than 10,000 downloads were included. A comprehensive and transparent description of this search process is provided in Multimedia Appendix 1.

Evaluation Framework and Domains

Clinical Integrity (FITT and FITT-VP)

Clinical integrity was assessed using the FITT and FITT-VP principles [10]. Apps were considered to meet the FITT criteria if all 4 core components were explicitly specified within the prescribed exercise programs. In addition, adherence to FITT-VP required explicit progression rules or defined mechanisms for adjusting the exercise volume or intensity over time.
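These two eligibility rules reduce to simple completeness checks. The following sketch illustrates the logic only; the prescription fields and example values are hypothetical and not drawn from any evaluated app:

```python
# Illustrative encoding of the FITT and FITT-VP decision rules used in this review.
# Field names and the example prescription are hypothetical.

FITT_FIELDS = {"frequency", "intensity", "time", "type"}
FITT_VP_FIELDS = FITT_FIELDS | {"volume", "progression_rule"}

def meets_fitt(prescription: dict) -> bool:
    """All 4 core FITT components must be explicitly specified."""
    return all(prescription.get(field) is not None for field in FITT_FIELDS)

def meets_fitt_vp(prescription: dict) -> bool:
    """FITT plus explicit volume and a defined progression/adjustment mechanism."""
    return all(prescription.get(field) is not None for field in FITT_VP_FIELDS)

example = {
    "frequency": "3 sessions/week",
    "intensity": "moderate (RPE 12-13)",
    "time": "30 minutes",
    "type": "resistance training",
    "volume": None,            # not specified by the app
    "progression_rule": None,  # no decision rule for advancing the program
}

print(meets_fitt(example))     # True
print(meets_fitt_vp(example))  # False
```

Under this rule, an app that specifies all 4 FITT elements but omits any progression mechanism meets FITT while failing FITT-VP, which is exactly the pattern reported in the Results.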

Intervention Fidelity (CERT)

Intervention fidelity was evaluated using CERT, a 16-item checklist designed to assess the completeness and transparency of exercise interventions [32,33]. The CERT domains include materials, provider qualifications, delivery procedures, setting, dosage, tailoring, and adherence. Apps were assessed using a checklist approach to determine whether each CERT item was addressed; the primary outcome was the achievement rate for individual CERT items.

Behavioral Mechanism (BCTTv1)

Behavioral mechanisms were evaluated using BCTTv1, a standardized framework for identifying discrete behavior change techniques (BCTs) within interventions [34]. The taxonomy includes 93 techniques and has been extended in mHealth apps to encompass 102 techniques [35]. Each app was systematically reviewed to identify the presence of BCTs according to BCTTv1 definitions. Achievement rates were calculated as the proportion of apps implementing each technique.
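As a minimal illustration of this calculation (the app labels and coded technique sets below are hypothetical, not the study's actual coding), the achievement rate of each technique is the proportion of apps in which it was identified:

```python
from collections import Counter

# Hypothetical coding: each app is mapped to the set of BCTTv1 codes identified in it.
codings = {
    "App A": {"1.1", "1.4", "2.2"},
    "App B": {"1.1", "2.2"},
    "App C": {"1.1", "5.1"},
}

# Count how many apps implement each code, then divide by the number of apps.
counts = Counter(code for bcts in codings.values() for code in bcts)
n_apps = len(codings)
achievement = {code: count / n_apps for code, count in counts.items()}

print(achievement["1.1"])            # 1.0 -> reported as 100%
print(round(achievement["5.1"], 2))  # 0.33 -> reported as 33%
```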

In this study, a predefined subset of BCTs within the BCTTv1 was designated as critical BCTs. These techniques were selected on the basis of prior studies examining strategies associated with exercise initiation, maintenance, and physical activity promotion in digital or remote interventions [36-38]. The selected techniques were grouped into 3 functional categories: (1) behavior initiation, (2) behavior maintenance, and (3) promotion of physical activity (Multimedia Appendix 2). Achievement rates for this predefined subset were calculated separately.

Digital Usability (MARS)

Digital usability and overall app quality were evaluated using MARS, which assesses engagement, functionality, aesthetics, and information quality on a 5-point scale [39,40]. To assess the evidence base component of information quality, supporting scientific literature was identified through database searches in PubMed and Google Scholar, as well as through app descriptions and developer websites. Each item was independently rated by 2 reviewers.

Sample Assessment Procedure

All included apps were evaluated using their full range of patient-facing functions. Web-based prescription platforms were not assessed.

Each app was independently reviewed by 2 clinicians (CHW and CNC), both resident physicians with more than 3 years of clinical experience and extensive experience using Android and iOS systems. Prior to evaluation, both reviewers completed formal BCTTv1 training and certification and familiarized themselves with the CERT and MARS protocols. To calibrate scoring interpretations, 5 additional health and fitness apps not included in the final sample were jointly reviewed before data collection.

All apps were installed on an Apple iPhone 8 running iOS version 16.3.1 and a Samsung SM A5360 running Android version 13. Reviewers examined all available features for at least 30 minutes per app. Discrepancies in coding were resolved through discussion, and a third reviewer (YCC) was consulted when consensus could not be reached.

Data Extraction and Statistical Analysis

For our study, descriptive data were gathered from Google Play on July 29, 2024, using a Taiwanese account and regional settings. We collected information on app titles, developers, versions, and download numbers directly from the app listings. We chose not to include data from the App Store, as it closely mirrored the information available on Google Play but was less extensive. Ratings and review counts were also excluded from our analysis because they vary substantially across countries, which complicates comparisons given the apps’ broad international user base. All statistical analyses were conducted using MedCalc Statistical Software (version 22.0.21; MedCalc Software Ltd). Given the exploratory design, descriptive statistics were primarily used to present the findings. Interrater reliability was assessed using the Cohen κ or intraclass correlation coefficient (ICC) to determine the level of agreement between the 2 raters [41].
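For binary item-level codings such as the CERT checklist, Cohen κ has a simple closed form. The sketch below is illustrative only; the ratings are invented and do not reproduce the study's data:

```python
def cohen_kappa(rater1, rater2):
    """Cohen's kappa for two raters' binary (0/1) codings of the same items."""
    n = len(rater1)
    assert n == len(rater2) and n > 0
    p_obs = sum(a == b for a, b in zip(rater1, rater2)) / n  # observed agreement
    p1, p2 = sum(rater1) / n, sum(rater2) / n                # each rater's "item met" rate
    p_exp = p1 * p2 + (1 - p1) * (1 - p2)                    # agreement expected by chance
    return (p_obs - p_exp) / (1 - p_exp)

# Invented codings for 8 checklist items (1 = item met, 0 = not met)
r1 = [1, 1, 0, 1, 0, 1, 1, 0]
r2 = [1, 1, 0, 0, 0, 1, 1, 0]
print(round(cohen_kappa(r1, r2), 2))  # 0.75
```

κ corrects the raw agreement rate for the agreement two raters would reach by chance alone, which is why it is preferred over simple percent agreement for checklist coding.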

Results

App Identification and Selection

The initial search yielded a total of 1020 apps (each search term yielded 240 apps on Google Play and 100 on the App Store). After removing duplicates and applying the inclusion and exclusion criteria, 6 exercise prescription apps were included in the final analysis (Figure 2). All included apps were available on both platforms; were free to download; supported clinical exercise prescription; and had exceeded 10,000 downloads at the time of assessments.

Figure 2. Flow diagram of app identification, screening, eligibility assessment, and multidimensional professional evaluation. BCTTv1: Behavior Change Technique Taxonomy version 1; CERT: Consensus on Exercise Reporting Template; FITT: frequency, intensity, time, and type; FITT-VP: frequency, intensity, time, type, volume, and progression; MARS: Mobile App Rating Scale.

App Characteristics

As of July 29, 2024, the 6 apps included in our study demonstrated a wide range of installations, with numbers ranging from 10,000 to 1 million downloads (Table 1). The app developers were based in the United Kingdom, United States, Canada, and Australia. All apps were initially released between 2017 and 2020 and had all been updated in 2023 or 2024.

Table 1. Characteristics of included exercise prescription apps (N=6; July 29, 2024).
App (installs; developer, country; date of release; version; date of update):
PhysiApp: >1 million installs; Physitrack PLC, United Kingdom; released Oct 30, 2018; version 4.20.0; updated Jul 22, 2024
MedBridge GO for Patients: >1 million installs; MedBridge, United States; released May 15, 2017; version 4.6.3; updated Apr 3, 2024
Wibbi: >100,000 installs; Wibbi, Canada; released Feb 20, 2017; version 1.3.1; updated Jul 16, 2024
TrackActive Pro - Patient App: >10,000 installs; Active Health Tech Pty Ltd, Australia; released Feb 17, 2017; version 1.6.7; updated Feb 13, 2024
Rehab Guru Client: >10,000 installs; Rehab Guru Team, United Kingdom; released Aug 1, 2018; version 3.1.0; updated Oct 26, 2023
Telehab: >10,000 installs; VALD, Australia; released May 26, 2020; version 1.3.0; updated Mar 3, 2023

Clinical Integrity Assessment Based on FITT and FITT-VP Principles

All 6 apps met the FITT criteria, as they explicitly specified the exercise frequency, intensity, time, and type within prescribed exercise programs (Table 2). In contrast, none of the apps met the FITT-VP criteria. Across all apps, explicit guidance on progression of exercise volume or intensity over time was absent, and no app provided decision rules for adjusting exercise prescriptions based on user performance or program duration.

Table 2. Multidimensional evaluation results of included exercise prescription apps (N=6; July 29, 2024).
App order: PhysiApp / MedBridge GO for Patients / Wibbi / TrackActive Pro - Patient App / Rehab Guru Client / Telehab
Clinical integrity
FITTa: Yes / Yes / Yes / Yes / Yes / Yes
FITT-VPb: No / No / No / No / No / No
Intervention fidelity
CERTc: 10 / 10 / 10 / 8 / 9 / 10
Behavioral mechanism
BCTTv1d: 18 / 16 / 15 / 14 / 15 / 19
Digital usability
MARSe: 3.82 / 3.85 / 3.36 / 3.36 / 3.55 / 3.99

aFITT: Frequency, intensity, type, time. “Yes” denotes full adherence to all 4 elements.

bFITT-VP: Frequency, intensity, type, time, volume, progression. “Yes” denotes full adherence to all 6 elements.

cCERT: Consensus on Exercise Reporting Template. Score indicates number of checklist items met (maximum=16).

dBCTTv1: Behavior Change Technique Taxonomy version 1. Number of identified behavior change techniques.

eMARS: Mobile App Rating Scale. Mean overall score on a 5-point scale.

Intervention Fidelity Assessment Based on CERT

Intervention fidelity varied across CERT domains (Table 3 and Figure 3). All 6 apps (100%) achieved complete reporting for core structural elements, including materials, provider qualifications, exercise setting, and dosage. Within the delivery domain, all apps specified whether exercises were performed individually or in groups and whether sessions were supervised or unsupervised. Of the 6 apps, 5 apps (83%) reported methods for assessing exercise adherence, and 4 apps (67%) included nonexercise components such as educational or dietary content.

Figure 3. Domain-level reporting patterns of the Consensus on Exercise Reporting Template across exercise prescription apps (N=6; 2024).
Table 3. Item-level achievement rates for the Consensus on Exercise Reporting Template (CERT) across included exercise prescription apps (N=6; 2024).
Abbreviated CERT item description: achievement rate
Materials (what)
Detailed description of the type of exercise equipment: 100%
Provider (who)
Description of qualifications/expertise/training of instructor: 100%
Delivery (how)
Description of whether exercises are performed individually or in a group: 100%
Description of whether exercises are supervised/unsupervised: 100%
Description of the measurement/reporting of adherence to exercise: 83%
Detailed description of motivation strategies: 0%
Detailed description of the decision rule(s) of exercise progression: 0%
Detailed description of how the exercise program was progressed: 0%
Description of each exercise to enable replication (eg, illustrations and photographs): 0%
Description of any home program component: 100%
Description of any nonexercise component: 67%
Description of the type/number of adverse events that occurred during exercise: 0%
Location (where)
Description of exercise setting: 100%
Dosage (when and how much)
Description of exercise intervention and dosage: 100%
Tailoring (what and how)
Description of whether exercises are generic or tailored to the individual: 0%
Detailed description of how exercises are tailored to the individual: 0%
Decision rule for starting level of exercise: 0%
Adherence/fidelity (how well)
Description of how adherence or fidelity to the exercise intervention is assessed/measured: 100%
Description of the extent to which the intervention was delivered as planned: 0%

In contrast, no app (0%) reported motivational strategies, decision rules for exercise progression, or descriptions of how exercise programs were progressed over time. Similarly, none of the apps described whether or how exercises were tailored to individual users or how starting exercise levels were determined. Reporting of adverse events was absent across all apps, and none documented whether the prescribed intervention was delivered as planned. The comprehensive table with respective scoring can be found in Multimedia Appendix 3. Interrater reliability for CERT assessment was high (Cohen κ=0.88).

Behavioral Mechanism Assessment Based on BCTTv1

Overall Implementation of BCTs

Across the 6 included apps, a limited but heterogeneous set of BCTs was identified. When assessed against the original 93-item BCTTv1 taxonomy, the mean number of techniques implemented by apps was 13.2 (SD 1.77; range 11‐16). When extended mHealth categories were included (102 techniques), the mean number of techniques increased to 16.2 (SD 1.77; range 14‐19). The interrater reliability for BCT coding was high (Cohen κ=0.85).

Fourteen techniques were implemented across all apps (100% achievement), including goal setting (behavior), action planning, feedback on behavior, self-monitoring of behavior, instruction on how to perform the behavior, demonstration of the behavior, and behavioral practice or rehearsal. Additional techniques were implemented in a subset of apps. The full list of identified BCTs and their achievement rates is presented in Table 4, and detailed coding results, including the extended 102-technique classification, are provided in Multimedia Appendix 4.

Table 4. Achievement rates of behavior change techniques (BCTs) identified from the full BCTTv1a taxonomy across included exercise prescription apps (N=6; 2024), ordered by frequency of implementation.
BCTTv1 code and BCT; critical BCTb?; achievement rate
1.1 Goal setting (behavior); critical: Yes; 100%
1.4 Action planning; critical: Yes; 100%
1.5 Review behavior goal(s); critical: No; 100%
2.1 Monitoring of behavior by others without feedback; critical: No; 100%
2.2 Feedback on behavior; critical: Yes; 100%
2.3 Self-monitoring of behavior; critical: Yes; 100%
4.1 Instruction on how to perform a behavior; critical: Yes; 100%
6.1 Demonstration of the behavior; critical: Yes; 100%
8.1 Behavioral practice/rehearsal; critical: Yes; 100%
9.1 Credible source; critical: No; 100%
12.6 Body changes; critical: No; 100%
17.1 Tailoring to demographic characteristics; critical: No; 100%
17.2 Tailoring to health status; critical: No; 100%
17.4 Adjusting intervention content to performance; critical: No; 100%
5.1 Information about health consequences; critical: No; 66%
10.4 Social reward; critical: No; 50%
1.3 Goal setting (outcome); critical: Yes; 33%
1.7 Review outcome goal(s); critical: No; 33%
2.5 Monitoring outcome(s) of behavior by others without feedback; critical: No; 33%

aBCTTv1: Behavior Change Technique Taxonomy version 1; with its mHealth extension, it comprises 102 distinct techniques. Only techniques identified in at least 1 app are presented; all remaining BCTTv1 items had an achievement rate of 0% across the evaluated exercise prescription apps.

bCritical BCTs were defined a priori as techniques with established relevance to exercise initiation, long-term maintenance, or physical activity promotion in digital or remote intervention contexts.

Predefined Clinically Relevant BCTs

Table 5 shows the predefined critical BCTs grouped into 3 functional categories: behavior initiation, behavior maintenance, and promotion of physical activity. Within behavior initiation, demonstration of the behavior and behavioral practice or rehearsal were incorporated by all apps (6/6, 100%), whereas biofeedback and graded tasks were not identified (0/6, 0%). For behavior maintenance, action planning and instruction on how to perform the behavior were universally implemented (6/6, 100%), while prompts or cues, graded tasks, and self-reward were absent (0/6, 0%). Among techniques associated with physical activity promotion, goal setting (behavior), action planning, feedback on behavior, and self-monitoring were consistently implemented, whereas social support, restructuring the physical environment, and framing or reframing were not identified.

Table 5. Achievement rates of predefined critical behavior change techniques across functional roles among included exercise prescription apps (N=6; 2024).
Functional category and behavior change technique (BCTTv1a code): achievement rate
Behavior initiation
6.1 Demonstration of the behavior: 100%
8.1 Behavioral practice/rehearsal: 100%
2.6 Biofeedback: 0%
8.7 Graded tasks: 0%
Behavior maintenance
1.4 Action planning: 100%
4.1 Instruction on how to perform a behavior: 100%
8.1 Behavioral practice/rehearsal: 100%
7.1 Prompts/cues: 0%
8.7 Graded tasks: 0%
10.9 Self-reward: 0%
Physical activity promotion
1.1, 1.3 Goal setting (behavior and outcome): 100%
1.4 Action planning: 100%
2.2 Feedback on behavior: 100%
2.3, 2.4 Self-monitoring of behavior with its outcome(s): 100%
10.1-10.11 Items in group “Reward and threat”: 50%
3.1-3.3 Items in group “Social support”: 0%
12.1 Restructuring the physical environment: 0%
13.2 Framing/reframing: 0%

aBCTTv1: The Behavior Change Technique Taxonomy version 1.

Digital Usability Based on MARS

Assessment of overall app quality using MARS yielded a mean score of 3.66 (SD 0.25) across the 6 apps, with individual app scores ranging from 3.36 to 4.00 (Table 6 and Figure 4). Among MARS domains, engagement received the lowest ratings, with a mean score of 2.50 (SD 0.28; range 2.10‐2.80), indicating limited use of interactive, motivational, or personalized features. In contrast, functionality was rated highest, achieving a mean score of 4.36 (SD 0.22; range 4.00‐4.63), reflecting stable performance, ease of navigation, and technical reliability across all apps. Aesthetic quality was rated moderately to highly, with a mean score of 3.89 (SD 0.48; range 3.17‐4.50), and information quality was rated similarly, with a mean score of 3.90 (SD 0.18; range 3.58‐4.07). Comprehensive tables with both reviewer and respective scoring results are provided in Multimedia Appendices 5 and 6. The interrater reliability of MARS was assessed using the ICC (2-way random effects, absolute agreement, average measures). The resulting ICC was 0.71 (95% CI 0.59‐0.81), indicating good reliability.

Figure 4. Radar chart illustrating the Mobile App Rating Scale scores of the 6 included exercise prescription apps (N=6; 2024).
Table 6. Digital usability and quality scores of included exercise prescription apps (N=6; 2024) assessed using the Mobile App Rating Scale (MARS).
MARS domain, app scores in order (PhysiApp / MedBridge GO for Patients / Wibbi / TrackActive Pro - Patient App / Rehab Guru Client / Telehab), and MARS score, mean (SD)a
Engagement: 2.70 / 2.80 / 2.10 / 2.20 / 2.40 / 2.80; mean 2.50 (SD 0.28)
Functionality: 4.50 / 4.38 / 4.13 / 4.00 / 4.50 / 4.63; mean 4.36 (SD 0.22)
Aesthetics: 4.17 / 4.33 / 3.17 / 3.67 / 3.50 / 4.50; mean 3.89 (SD 0.48)
Information: 3.93 / 3.92 / 4.08 / 3.58 / 3.83 / 4.07; mean 3.90 (SD 0.17)
Overall quality: 3.82 / 3.86 / 3.37 / 3.36 / 3.56 / 4.00; mean 3.66 (SD 0.25)

aThe maximum possible MARS score is 5.0, based on a 5-point Likert scale ranging from 1 (inadequate) to 5 (excellent).
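The domain summaries in Table 6 are plain descriptive statistics over the 6 app scores. As a worked check using the engagement row (values taken from Table 6), the reported mean and SD are reproduced when the population form of the SD is used:

```python
from statistics import mean, pstdev

# Engagement scores of the 6 apps, in Table 6 order
engagement = [2.70, 2.80, 2.10, 2.20, 2.40, 2.80]

print(round(mean(engagement), 2))    # 2.5
print(round(pstdev(engagement), 2))  # 0.28
```

The same check on the functionality row also yields the reported values (mean 4.36, population SD 0.22), suggesting the table's SDs follow the population rather than the sample formula.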

Our review of the MARS evidence item revealed that, as of July 2024, only 2 of the 6 apps had undergone scientific evaluation, suggesting a scarcity of empirically validated exercise prescription apps in the current market.

Discussion

Principal Results

This study uses an integrated, clinician-centered, multidimensional evaluation framework to provide a novel professional perspective on how exercise prescription apps operate in real-world clinical contexts. Overall, popular, no-cost exercise prescription apps were found to meet the basic structural requirements for exercise prescription but fell short of key standards required for independent clinical use. Although prescriptions consistently specified core FITT elements, the absence of explicit progression and individualized adjustment limited alignment with established clinical practice. While multiple BCTs were identified, those critical for sustaining exercise behavior in unsupervised or remote settings were frequently lacking. Despite strong technical functionality and high information quality, low engagement scores suggest that usability alone is insufficient to support long-term adherence.

When considered collectively across domains, these apps nonetheless demonstrate operational strengths, including structured translation of prescriptions into actionable tasks, continuous visibility of adherence, low-cognitive-load instruction delivery, and delegation of routine monitoring. Taken together, these findings highlight a gap between clinical expectations for exercise prescription and the current capabilities of widely used digital tools, supporting the role of exercise prescription apps as adjuncts within clinician-guided care rather than as stand-alone prescribing solutions.

Conceptual Foundation and Innovation of the Multidimensional Evaluation Framework

This study introduces a clinician-centered evaluation framework that integrates clinical exercise prescription standards, behavior change theory, and digital usability into a unified, practice-oriented model. Unlike prior evaluations of physical activity and mHealth apps that have primarily examined usability, engagement, or behavior change features in isolation [26-28], this framework situates app assessment within the logic of real-world clinical exercise prescription and decision-making. By aligning established instruments with sequential stages of professional practice, including prescription structure and progression logic (FITT and FITT-VP), transparency and reproducibility of intervention delivery (CERT), support for exercise initiation and maintenance (BCTTv1), and interface quality and engagement (MARS), the framework enables clinicians to directly compare app features with clinical expectations. In doing so, it provides a shared reference to guide both informed clinical selection and future app development.

Adherence to FITT Principles and the Need for Enhanced Program Progression in Exercise Apps

Unlike general physical activity apps that often fail to meet the basic FITT criteria [24,42], the specialized apps in our study successfully incorporated these fundamental elements. However, the primary clinical limitation lies in their failure to satisfy the full FITT-VP framework, specifically regarding structured progression and individualized adjustment. In established exercise prescription practice, progression is essential for achieving physiological adaptation while minimizing injury risk, particularly by avoiding abrupt increases in exercise volume or intensity [43,44]. Clear guidance on how and when to advance exercises is also critical for informed consent and may facilitate patient engagement and adherence.

In addition, limited support for tailoring exercise programs to individual needs constrains the safe use of these tools in populations such as older adults and individuals with chronic diseases, for whom inappropriate exercise loading may increase the risk of adverse events [45]. The absence of explicit information regarding potential exercise-related risks further complicates informed decision-making by both clinicians and patients. To optimize the integration of exercise prescription apps into clinical practice, more comprehensive and transparent intervention descriptions are required to support individualized care and the effective translation of evidence-based exercise prescription into real-world settings.

Intervention Fidelity and Clinical Transparency in App-Based Exercise Prescription

The evaluation of intervention fidelity via the CERT domains revealed a stark disparity between the reporting of static structural elements and dynamic clinical processes. While all apps achieved 100% compliance in describing core components such as materials, provider qualifications, and settings, there was a universal failure (0%) to report motivational strategies, progression logic, or tailoring mechanisms. This deficiency mirrors findings from previous systematic reviews in pregnancy [46] and musculoskeletal rehabilitation [47], which similarly identified a pervasive lack of customization features in mHealth tools. Crucially, the absence of documented decision rules for exercise progression limits the transparency of the intervention. As highlighted by Hansford et al [48] and Conrado Ignacio et al [49], detailed reporting of intervention components—beyond simple dosage parameters—is essential for ensuring reproducibility and clearly defining the functional mechanisms of the prescribed therapy. Consequently, while digital platforms successfully deliver standardized content, the lack of explicit reporting on how exercises are adjusted obscures the clinical reasoning necessary to ensure safety and effectiveness throughout the rehabilitative trajectory.

Furthermore, the total lack of information regarding baseline determination and individual tailoring (0%) highlights a fundamental challenge in digital health. According to updated preparticipation health screening recommendations [50], a comprehensive evaluation of a patient’s physical status and functional capacity is the indispensable prerequisite for ensuring safety before initiating any exercise program. The inability of current apps to document how starting levels are calibrated or adjusted to individual constraints suggests a reliance on generic protocols rather than precise clinical titration. Coupled with the complete omission of adverse event reporting, these findings indicate that current apps function as digital exercise libraries rather than sophisticated therapeutic tools, falling short of the safety and individualization standards required for clinical populations.

Enhanced BCTs With Challenges in Addressing Supervision and Adherence in Exercise Apps

The exercise prescription apps reviewed in this study incorporated a higher number of BCTs than reported in previous app evaluations [29,30]; however, the specific selection of strategies remains highly consistent with the broader mHealth literature [29,51-54]. The predominance of self-monitoring, feedback, goal setting, and action planning in our sample mirrors the most frequently identified features in recent content analyses, confirming that these specialized tools adhere to the standard design patterns observed in general physical activity interventions. However, consistent with established behavior change theory, the mere presence of multiple techniques does not ensure effective behavioral support if their selection and configuration are not aligned with the clinical requirements of long-term exercise adherence [55].

From a clinical perspective, several predefined critical BCTs for sustaining exercise behavior in unsupervised or remote settings were absent or infrequently implemented. These included graded tasks, biofeedback, prompts or cues, and self-reward—mechanisms closely associated with progression, self-regulation, and adaptive feedback over time. Their limited representation suggests that current exercise prescription apps are primarily optimized for exercise initiation and short-term compliance rather than for dynamic adjustment and sustained behavior change, which are central to clinical exercise prescription.

Conversely, several frequently implemented techniques that were not classified as critical were consistently present across all apps, including presentation of a credible source, routine review of behavioral goals, monitoring without feedback, and repeated behavioral practice. Although these techniques may have limited independent effects on long-term adherence, they may provide complementary clinical value by reinforcing professional authority, standardizing instruction, and supporting structured follow-up [56,57]. Taken together, these findings indicate that while current exercise prescription apps insufficiently implement several clinically critical BCTs, their operational strengths may still augment clinician-guided or hybrid care models, although they may not effectively function as stand-alone behavioral interventions.

Digital Usability in Exercise Apps

In our analysis, the exercise prescription apps achieved an average MARS quality score of 3.66 (SD 0.25), consistent with prior studies by Paganini et al [58] and Simões et al [59], which reported moderate overall quality scores for physical activity apps. Paganini et al [58] found the average quality score to be 3.60, with variations across different subcategories: information averaged 3.24; engagement, 3.19; aesthetics, 3.65; and functionality, the highest at 4.35. Simões et al [59] reported a slightly higher overall MARS score of 3.88, with functionality again scoring the highest at 4.30, followed by aesthetics, information quality, and engagement.

Notably, in our study, the score for information quality was higher at 3.90, whereas the engagement score was considerably lower at 2.50. This discrepancy may reflect the more specific and task-oriented nature of our evaluated apps compared to the broader range of apps included in other studies, which might engage users differently. Despite substantial download figures, the apps we reviewed showed limited evidence of effectiveness, a finding that corroborates previous evaluations [60-62], which consistently highlighted a shortage of evidence-based content and professional guideline adherence in the mHealth landscape.

Clinical Implications

From a clinical perspective, the findings of this study indicate that currently available exercise prescription apps should not be used as stand-alone prescribing tools but rather incorporated into clinician-guided or hybrid care models. Although these apps consistently deliver structured exercise instructions and provide visibility into patient adherence, the absence of explicit progression logic, individualized adjustment, and several clinically critical BCTs limits their capacity to independently support safe and effective exercise prescription. Consequently, clinicians remain essential for determining appropriate starting levels, explaining progression criteria, and managing exercise-related risks, particularly in older adults and individuals with chronic diseases [63].

At the same time, the operational features that are consistently implemented across apps (standardized exercise demonstrations, routine goal review, adherence tracking, and low-cognitive-load instruction delivery) may still offer complementary clinical value. When used alongside professional judgment, these features can reduce instructional burden, support treatment consistency, and facilitate structured follow-up between clinical encounters. In this context, exercise prescription apps may function as digital extensions of clinician-led care rather than autonomous therapeutic systems.

For developers, these findings underscore the need to move beyond technically feasible features toward closer alignment with clinical exercise prescription standards. Enhancing transparency around progression rules, incorporating mechanisms for individualized adjustment, and prioritizing BCTs associated with long-term adherence may improve the clinical relevance of future applications. Most importantly, adopting a co-design approach—actively involving both clinical practitioners and end users in the development process—is essential to bridge the gap between technical implementation and real-world clinical utility [64,65].

Limitations

This study has several limitations. First, search results were inherently transient and geographically specific due to localized app store algorithms and the rapidly evolving mobile market [66]. While our semantic search strategy targeted popular, clinically relevant apps rather than an exhaustive catalog, the findings represent a 2024 snapshot, and the currency of these results may diminish, as apps undergo frequent updates. Second, while the interrater reliability was high for CERT and BCT assessments, the MARS evaluation yielded relatively lower consistency, reflecting the subjective nature of aesthetic and engagement metrics; however, all discrepancies were resolved through rigorous consensus-based discussion to ensure data validity.

Third, the CERT and BCTTv1 frameworks, originally designed for research interventions, may not fully capture modern app functionalities such as automated feedback, potentially creating a structural mismatch in scoring. Fourth, we identified the presence of BCTs but did not evaluate the quality of their implementation or their actual impact on user behavior. Furthermore, the lack of longitudinal adherence and clinical outcome data limits our assessment of real-world therapeutic impact.

Future Directions

While our study establishes fundamental professional quality and usability benchmarks for exercise prescription apps, it serves primarily as a foundational step mapping the current landscape. Future work should build upon these findings by transitioning from quality assessment to rigorous clinical verification.

Specifically, longitudinal randomized controlled trials are needed to determine if the high-scoring, professionally vetted apps identified in this study translate to better patient adherence and health outcomes compared to traditional methods. Furthermore, considering that the effectiveness of digital exercise prescription systems may vary significantly across conditions, these investigations should target specific clinical cohorts—such as patients with cardiac conditions, neurological disorders (eg, poststroke rehabilitation), or musculoskeletal issues (eg, sarcopenia)—rather than general users.

Crucially, to match the complexity of clinical needs, efficacy trials must move beyond self-reported measures and incorporate objective physiological outcomes, such as laboratory data (eg, inflammatory markers and glycemic control) and body composition metrics assessed via bioelectrical impedance analysis. Finally, integrating wearable sensors for real-time biofeedback represents a promising frontier for ensuring that these digital prescriptions are delivered safely and effectively.

Conclusions

Using a clinician-centered, multidimensional evaluation framework, this study systematically assessed the clinical usability of popular, no-cost exercise prescription apps. Although these apps consistently met basic structural requirements and demonstrated strong technical functionality, they lacked essential features related to progression, individualization, intervention transparency, and sustained behavior change support. As a result, their capacity to independently deliver clinically appropriate exercise prescriptions remains limited.

Nevertheless, the standardized delivery of exercise instructions, structured translation of prescriptions into actionable tasks, and support for adherence monitoring suggest that these apps may still add value within clinician-guided or hybrid care models. Future development should prioritize a co-design approach involving both clinicians and users to bridge existing gaps. This collaborative strategy is essential to align apps with clinical standards and behavior change principles, thereby enhancing their safety, effectiveness, and clinical relevance.

Funding

The article processing fee was supported by intramural funding from Taipei Veterans General Hospital, Taiwan, and Chang Gung Memorial Hospital, Taiwan. These funding sources had no role in the interpretation of the study results. No external funding was received for this study.

Data Availability

The data underlying the analyses and results reported in this paper are available in the electronic appendix accompanying the manuscript.

Conflicts of Interest

None declared.

Multimedia Appendix 1

Detailed search strategy and selection protocol.

DOCX File, 23 KB

Multimedia Appendix 2

The behavior change techniques in prior studies coded in the Behavior Change Technique Taxonomy version 1 framework.

DOCX File, 18 KB

Multimedia Appendix 3

The Consensus on Exercise Reporting Template checklist with detailed description across included exercise prescription apps (N=6; 2024).

DOCX File, 22 KB

Multimedia Appendix 4

The behavior change techniques coded in Behavior Change Technique Taxonomy version 1 framework across included exercise prescription apps (N=6; 2024).

DOCX File, 34 KB

Multimedia Appendix 5

The Mobile App Rating Scale scores of included exercise prescription apps on the Android system (N=6; 2024).

DOCX File, 21 KB

Multimedia Appendix 6

The Mobile App Rating Scale scores of included exercise prescription apps on the iOS system (N=6; 2024).

DOCX File, 21 KB

Checklist 1

PRISMA 2020 checklist.

PDF File, 175 KB

  1. Luan X, Tian X, Zhang H, et al. Exercise as a prescription for patients with various diseases. J Sport Health Sci. Sep 2019;8(5):422-441. [CrossRef] [Medline]
  2. Hayden JA, Ellis J, Ogilvie R, Malmivaara A, van Tulder MW. Exercise therapy for chronic low back pain. Cochrane Database Syst Rev. Sep 28, 2021;9(9):CD009790. [CrossRef] [Medline]
  3. Cools AM, Dewitte V, Lanszweert F, et al. Rehabilitation of scapular muscle balance: which exercises to prescribe? Am J Sports Med. Oct 2007;35(10):1744-1751. [CrossRef] [Medline]
  4. Billinger SA, Arena R, Bernhardt J, et al. Physical activity and exercise recommendations for stroke survivors: a statement for healthcare professionals from the American Heart Association/American Stroke Association. Stroke. Aug 2014;45(8):2532-2553. [CrossRef] [Medline]
  5. Garvey C, Bayles MP, Hamm LF, et al. Pulmonary rehabilitation exercise prescription in chronic obstructive pulmonary disease: review of selected guidelines: an official statement from the American Association of Cardiovascular and Pulmonary Rehabilitation. J Cardiopulm Rehabil Prev. 2016;36(2):75-83. [CrossRef] [Medline]
  6. Hansen D, Abreu A, Ambrosetti M, et al. Exercise intensity assessment and prescription in cardiovascular rehabilitation and beyond: why and how: a position statement from the Secondary Prevention and Rehabilitation Section of the European Association of Preventive Cardiology. Eur J Prev Cardiol. Feb 19, 2022;29(1):230-245. [CrossRef] [Medline]
  7. Petrella RJ, Koval JJ, Cunningham DA, Paterson DH. Can primary care doctors prescribe exercise to improve fitness? The Step Test Exercise Prescription (STEP) project. Am J Prev Med. May 2003;24(4):316-322. [CrossRef] [Medline]
  8. Booth FW, Roberts CK, Laye MJ. Lack of exercise is a major cause of chronic diseases. Compr Physiol. Apr 2012;2(2):1143-1211. [CrossRef] [Medline]
  9. Moore SC, Lee IM, Weiderpass E, et al. Association of leisure-time physical activity with risk of 26 types of cancer in 1.44 million adults. JAMA Intern Med. Jun 1, 2016;176(6):816-825. [CrossRef] [Medline]
  10. Garber CE, Blissmer B, Deschenes MR, et al. American College of Sports Medicine position stand. Quantity and quality of exercise for developing and maintaining cardiorespiratory, musculoskeletal, and neuromotor fitness in apparently healthy adults: guidance for prescribing exercise. Med Sci Sports Exerc. Jul 2011;43(7):1334-1359. [CrossRef] [Medline]
  11. Hoffmann TC, Maher CG, Briffa T, et al. Prescribing exercise interventions for patients with chronic conditions. CMAJ. Apr 19, 2016;188(7):510-518. [CrossRef] [Medline]
  12. Lawford BJ, Delany C, Bennell KL, Hinman RS. “I was really pleasantly surprised”: firsthand experience and shifts in physical therapist perceptions of telephone‐delivered exercise therapy for knee osteoarthritis–a qualitative study. Arthritis Care Res (Hoboken). Apr 2019;71(4):545-557. [CrossRef] [Medline]
  13. Negrini S, Donzelli S, Negrini A, Negrini A, Romano M, Zaina F. Feasibility and acceptability of telemedicine to substitute outpatient rehabilitation services in the COVID-19 emergency in Italy: an observational everyday clinical-life study. Arch Phys Med Rehabil. Nov 2020;101(11):2027-2032. [CrossRef] [Medline]
  14. Tenforde AS, Borgstrom H, Polich G, et al. Outpatient physical, occupational, and speech therapy synchronous telemedicine: a survey study of patient satisfaction with virtual visits during the COVID-19 pandemic. Am J Phys Med Rehabil. Nov 2020;99(11):977-981. [CrossRef] [Medline]
  15. Suso-Martí L, La Touche R, Herranz-Gómez A, Angulo-Díaz-Parreño S, Paris-Alemany A, Cuenca-Martínez F. Effectiveness of telerehabilitation in physical therapist practice: an umbrella and mapping review with meta–meta-analysis. Phys Ther. May 4, 2021;101(5):pzab075. [CrossRef] [Medline]
  16. Bennell KL, Marshall CJ, Dobson F, Kasza J, Lonsdale C, Hinman RS. Does a web-based exercise programming system improve home exercise adherence for people with musculoskeletal conditions? A randomized controlled trial. Am J Phys Med Rehabil. Oct 2019;98(10):850-858. [CrossRef] [Medline]
  17. Kalgotra P, Raja U, Sharda R. Growth in the development of health and fitness mobile apps amid COVID-19 pandemic. Digit Health. 2022;8:20552076221129070. [CrossRef] [Medline]
  18. Cottrell MA, Russell TG. Telehealth for musculoskeletal physiotherapy. Musculoskelet Sci Pract. Aug 2020;48:102193. [CrossRef] [Medline]
  19. Thompson WR. Worldwide survey of fitness trends for 2021. ACSMs Health Fit J. 2021;25(1):10-19. [CrossRef]
  20. Thompson WR. Worldwide survey of fitness trends for 2022. ACSMs Health Fit J. 2022;26(1):11-20. [CrossRef]
  21. Bonato M, Marmondi F, Mastropaolo C, et al. A digital platform for home-based exercise prescription for older people with sarcopenia. Sensors (Basel). Jul 24, 2024;24(15):15. [CrossRef] [Medline]
  22. Jo HS, Kim HM, Go CH, Yu HY, Park HK, Han JY. Effectiveness of home-based cardiac rehabilitation with optimized exercise prescriptions using a mobile healthcare app in patients with acute myocardial infarction: a randomized controlled trial. Life (Basel). Sep 5, 2024;14(9):1122. [CrossRef] [Medline]
  23. Espinoza-Bravo C, Arnal-Gómez A, Martínez-Arnau FM, et al. Effectiveness of functional or aerobic exercise combined with breathing techniques in telerehabilitation for patients with long COVID: a randomized controlled trial. Phys Ther. Nov 4, 2023;103(11):11. [CrossRef] [Medline]
  24. Chen S, Wu Y, Bushey EL, Pescatello LS. Evaluation of exercise mobile applications for adults with cardiovascular disease risk factors. J Cardiovasc Dev Dis. Nov 28, 2023;10(12):477. [CrossRef] [Medline]
  25. Gell NM, Smith PA, Wingood M. Physical therapist and patient perspectives on mobile technology to support home exercise prescription for people with arthritis: a qualitative study. Cureus. Mar 2024;16(3):e55899. [CrossRef] [Medline]
  26. De Santis KK, Jahnel T, Matthias K, Mergenthal L, Al Khayyal H, Zeeb H. Evaluation of digital interventions for physical activity promotion: scoping review. JMIR Public Health Surveill. May 23, 2022;8(5):e37820. [CrossRef] [Medline]
  27. Bearne LM, Sekhon M, Grainger R, et al. Smartphone apps targeting physical activity in people with rheumatoid arthritis: systematic quality appraisal and content analysis. JMIR Mhealth Uhealth. Jul 21, 2020;8(7):e18495. [CrossRef] [Medline]
  28. Milne-Ives M, Lam C, De Cock C, Van Velthoven MH, Meinert E. Mobile apps for health behavior change in physical activity, diet, drug and alcohol use, and mental health: systematic review. JMIR Mhealth Uhealth. Mar 18, 2020;8(3):e17046. [CrossRef] [Medline]
  29. Kebede M, Steenbock B, Helmer SM, Sill J, Möllers T, Pischke CR. Identifying evidence-informed physical activity apps: content analysis. JMIR Mhealth Uhealth. Dec 18, 2018;6(12):e10314. [CrossRef] [Medline]
  30. Bondaronek P, Alkhaldi G, Slee A, Hamilton FL, Murray E. Quality of publicly available physical activity apps: review and content analysis. JMIR Mhealth Uhealth. Mar 21, 2018;6(3):e53. [CrossRef] [Medline]
  31. Page MJ, McKenzie JE, Bossuyt PM, et al. The PRISMA 2020 statement: an updated guideline for reporting systematic reviews. Int J Surg. Apr 2021;88:105906. [CrossRef] [Medline]
  32. Slade SC, Dionne CE, Underwood M, Buchbinder R. Consensus on Exercise Reporting Template (CERT): explanation and elaboration statement. Br J Sports Med. Dec 2016;50(23):1428-1437. [CrossRef] [Medline]
  33. Slade SC, Dionne CE, Underwood M, et al. Consensus on Exercise Reporting Template (CERT): modified Delphi study. Phys Ther. Oct 2016;96(10):1514-1524. [CrossRef] [Medline]
  34. Michie S, Richardson M, Johnston M, et al. The behavior change technique taxonomy (v1) of 93 hierarchically clustered techniques: building an international consensus for the reporting of behavior change interventions. Ann Behav Med. Aug 2013;46(1):81-95. [CrossRef] [Medline]
  35. Dugas M, Gao GG, Agarwal R. Unpacking mHealth interventions: a systematic review of behavior change techniques used in randomized controlled trials assessing mHealth effectiveness. Digit Health. 2020;6:2055207620905411. [CrossRef] [Medline]
  36. Yang X, Ma L, Zhao X, Kankanhalli A. Factors influencing user’s adherence to physical activity applications: a scoping literature review and future directions. Int J Med Inform. Feb 2020;134:104039. [CrossRef] [Medline]
  37. Howlett N, Trivedi D, Troop NA, Chater AM. Are physical activity interventions for healthy inactive adults effective in promoting behavior change and maintenance, and which behavior change techniques are effective? A systematic review and meta-analysis. Transl Behav Med. Jan 1, 2019;9(1):147-157. [CrossRef] [Medline]
  38. Sullivan AN, Lachman ME. Behavior change with fitness technology in sedentary adults: a review of the evidence for increasing physical activity. Front Public Health. 2016;4:289. [CrossRef] [Medline]
  39. Stoyanov SR, Hides L, Kavanagh DJ, Zelenko O, Tjondronegoro D, Mani M. Mobile app rating scale: a new tool for assessing the quality of health mobile apps. JMIR Mhealth Uhealth. Mar 11, 2015;3(1):e27. [CrossRef] [Medline]
  40. Terhorst Y, Philippi P, Sander LB, et al. Validation of the Mobile Application Rating Scale (MARS). PLoS One. 2020;15(11):e0241480. [CrossRef] [Medline]
  41. Cicchetti DV. Guidelines, criteria, and rules of thumb for evaluating normed and standardized assessment instruments in psychology. Psychol Assess. 1994;6(4):284-290. [CrossRef]
  42. Elder A, Guillen G, Isip R, Zepeda R, Lewis ZH. A deeper look into exercise intensity tracking through mobile applications: a brief report. Technologies (Basel). 2023;11(3):66. [CrossRef]
  43. Collado-Mateo D, Lavín-Pérez AM, Peñacoba C, et al. Key factors associated with adherence to physical exercise in patients with chronic diseases and older adults: an umbrella review. Int J Environ Res Public Health. Feb 19, 2021;18(4):2023. [CrossRef] [Medline]
  44. Lund ML, Tamm M, Bränholm IB. Patients’ perceptions of their participation in rehabilitation planning and professionals’ view of their strategies to encourage it. Occup Ther Int. 2001;8(3):151-167. [CrossRef] [Medline]
  45. Franklin BA, Thompson PD, Al-Zaiti SS, et al. Exercise-related acute cardiovascular events and potential deleterious adaptations following long-term exercise training: placing the risks into perspective–an update: a scientific statement from the American Heart Association. Circulation. Mar 31, 2020;141(13):e705-e736. [CrossRef] [Medline]
  46. Hayman MJ, Alfrey KL, Waters K, et al. Evaluating evidence-based content, features of exercise instruction, and expert involvement in physical activity apps for pregnant women: systematic search and content analysis. JMIR Mhealth Uhealth. Jan 19, 2022;10(1):e31607. [CrossRef] [Medline]
  47. Ryan S, Ní Chasaide N, O’ Hanrahan S, Corcoran D, Caulfield B, Argent R. mHealth apps for musculoskeletal rehabilitation: systematic search in app stores and content analysis. JMIR Rehabil Assist Technol. Aug 1, 2022;9(3):e34355. [CrossRef] [Medline]
  48. Hansford HJ, Wewege MA, Cashin AG, et al. If exercise is medicine, why don’t we know the dose? An overview of systematic reviews assessing reporting quality of exercise interventions in health and disease. Br J Sports Med. Jun 2022;56(12):692-700. [CrossRef] [Medline]
  49. Conrado Ignacio A, Oliveira NL, Xavier Neves da Silva L, et al. Methodological rigor and quality of reporting of clinical trials published with physical activity interventions: a report from the Strengthening the Evidence in Exercise Sciences Initiative (SEES Initiative). PLoS One. 2024;19(8):e0309087. [CrossRef] [Medline]
  50. Riebe D, Franklin BA, Thompson PD, et al. Updating ACSM’s recommendations for exercise preparticipation health screening. Med Sci Sports Exerc. Nov 2015;47(11):2473-2479. [CrossRef] [Medline]
  51. Patterson K, Davey R, Keegan R, Kunstler B, Woodward A, Freene N. Behaviour change techniques in cardiovascular disease smartphone apps to improve physical activity and sedentary behaviour: systematic review and meta-regression. Int J Behav Nutr Phys Act. Jul 7, 2022;19(1):81. [CrossRef] [Medline]
  52. Yang CH, Maher JP, Conroy DE. Implementation of behavior change techniques in mobile applications for physical activity. Am J Prev Med. Apr 2015;48(4):452-455. [CrossRef] [Medline]
  53. Middelweerd A, Mollee JS, van der Wal CN, Brug J, Te Velde SJ. Apps to promote physical activity among adults: a review and content analysis. Int J Behav Nutr Phys Act. Jul 25, 2014;11:97. [CrossRef] [Medline]
  54. Wang Y, Wang Y, Greene B, Sun L. An analysis and evaluation of quality and behavioral change techniques among physical activity apps in China. Int J Med Inform. Jan 2020;133:104029. [CrossRef] [Medline]
  55. Schroé H, Van Dyck D, De Paepe A, et al. Which behaviour change techniques are effective to promote physical activity and reduce sedentary behaviour in adults: a factorial randomized trial of an e- and m-health intervention. Int J Behav Nutr Phys Act. Oct 7, 2020;17(1):127. [CrossRef] [Medline]
  56. Carraça E, Encantado J, Battista F, et al. Effective behavior change techniques to promote physical activity in adults with overweight or obesity: a systematic review and meta-analysis. Obes Rev. Jul 2021;22 Suppl 4(Suppl 4):e13258. [CrossRef] [Medline]
  57. Spring B, Champion KE, Acabchuk R, Hennessy EA. Self-regulatory behaviour change techniques in interventions to promote healthy eating, physical activity, or weight loss: a meta-review. Health Psychol Rev. Dec 2021;15(4):508-539. [CrossRef] [Medline]
  58. Paganini S, Terhorst Y, Sander LB, et al. Quality of physical activity apps: systematic search in app stores and content analysis. JMIR Mhealth Uhealth. Jun 9, 2021;9(6):e22587. [CrossRef] [Medline]
  59. Simões P, Silva AG, Amaral J, Queirós A, Rocha NP, Rodrigues M. Features, behavioral change techniques, and quality of the most popular mobile apps to measure physical activity: systematic search in app stores. JMIR Mhealth Uhealth. Oct 26, 2018;6(10):e11281. [CrossRef] [Medline]
  60. Higgins JP. Smartphone applications for patients’ health and fitness. Am J Med. Jan 2016;129(1):11-19. [CrossRef] [Medline]
  61. Knight E, Stuckey MI, Prapavessis H, Petrella RJ. Public health guidelines for physical activity: is there an app for that? A review of android and apple app stores. JMIR Mhealth Uhealth. May 21, 2015;3(2):e43. [CrossRef] [Medline]
  62. Modave F, Bian J, Leavitt T, Bromwell J, Harris III C, Vincent H. Low quality of free coaching apps with respect to the American College of Sports Medicine guidelines: a review of current mobile apps. JMIR Mhealth Uhealth. Jul 24, 2015;3(3):e77. [CrossRef] [Medline]
  63. Soto-Bagaria L, Eis S, Pérez LM, et al. Mobile applications to prescribe physical exercise in frail older adults: review of the available tools in app stores. Age Ageing. Dec 1, 2023;52(12):afad227. [CrossRef] [Medline]
  64. Mrklas KJ, Barber T, Campbell-Scherer D, et al. Co-design in the development of a mobile health app for the management of knee osteoarthritis by patients and physicians: qualitative study. JMIR Mhealth Uhealth. Jul 10, 2020;8(7):e17893. [CrossRef] [Medline]
  65. Song T, Yu P, Bliokas V, et al. A clinician-led, experience-based co-design approach for developing mHealth services to support the patient self-management of chronic conditions: development study and design case. JMIR Mhealth Uhealth. Jul 20, 2021;9(7):e20650. [CrossRef] [Medline]
  66. Grundy QH, Wang Z, Bero LA. Challenges in assessing mobile health app quality: a systematic review of prevalent and innovative methods. Am J Prev Med. Dec 2016;51(6):1051-1059. [CrossRef] [Medline]


ACSM: American College of Sports Medicine
BCT: behavior change technique
BCTTv1: Behavior Change Technique Taxonomy version 1
CERT: Consensus on Exercise Reporting Template
FITT: frequency, intensity, time, and type
FITT-VP: frequency, intensity, time, type, volume, and progression
ICC: intraclass correlation coefficient
MARS: Mobile App Rating Scale
mHealth: mobile health
PRISMA: Preferred Reporting Items for Systematic Reviews and Meta-Analyses


Edited by Georgian Badicu; submitted 16.May.2025; peer-reviewed by Amie Woodward, Melitta Mcnarry; final revised version received 13.Feb.2026; accepted 14.Feb.2026; published 25.Mar.2026.

Copyright

© Cheng-Hao Wu, Che-Ning Chang, Chu-Fang Chang, Ming-Hwai Lin, Hsing-Yu Chen, Yu-Chun Chen. Originally published in JMIR mHealth and uHealth (https://mhealth.jmir.org), 25.Mar.2026.

This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR mHealth and uHealth, is properly cited. The complete bibliographic information, a link to the original publication on https://mhealth.jmir.org/, as well as this copyright and license information must be included.