Published on 27.07.15 in Vol 3, No 3 (2015): Jul-Sep

    Review

    Expert Involvement and Adherence to Medical Evidence in Medical Mobile Phone Apps: A Systematic Review

    1Centre for Clinical Education, The Capital Region of Denmark, Copenhagen, Denmark

    2Clinical Eye Research Unit, Department of Ophthalmology, Copenhagen University Hospital Roskilde, Roskilde, Denmark

    3Department of Ophthalmology, Copenhagen University Hospital Glostrup, Glostrup, Denmark

    Corresponding Author:

    Yousif Subhi, MD

    Centre for Clinical Education

    The Capital Region of Denmark

    Copenhagen University Hospital Rigshospitalet, Section 5404

    Blegdamsvej 9

    Copenhagen, DK-4300

    Denmark

    Phone: 45 28746055

    Fax: 45 35454437

    Email:


    ABSTRACT

    Background: Both clinicians and patients use medical mobile phone apps. Anyone can publish medical apps, which results in content of variable quality that may have a serious impact on human lives. We herein provide an overview of the prevalence of expert involvement in app development and of whether app contents adhere to current medical evidence.

    Objective: To systematically review studies evaluating expert involvement or adherence of app content to medical evidence in medical mobile phone apps.

    Methods: We systematically searched 3 databases (PubMed, The Cochrane Library, and EMBASE), and included studies evaluating expert involvement or adherence of app content to medical evidence in medical mobile phone apps. Two authors performed data extraction independently. Qualitative analysis of the included studies was performed.

    Results: Based on inclusion criteria, 52 studies were included in this review. These studies assessed a total of 6520 apps. Studies dealt with a variety of medical specialties and topics. Twenty-eight studies assessed expert involvement, which was found in 9-67% of the assessed apps. Thirty studies (including 6 studies that also assessed expert involvement) assessed adherence of app content to current medical evidence. Thirteen studies found that 10-87% of the assessed apps adhered fully to the compared evidence (published studies, recommendations, and guidelines). Seventeen studies found that none of the assessed apps (n=2237) adhered fully to the compared evidence.

    Conclusions: Most medical mobile phone apps lack expert involvement and do not adhere to relevant medical evidence.

    JMIR mHealth uHealth 2015;3(3):e79

    doi:10.2196/mhealth.4169

    KEYWORDS



    Introduction

    Background

    Mobile health is growing [1]. Mobile apps are frequently used in daily clinical practice and enable immediate on-the-go access to key clinical information that supports clinical decision making [2-5]. Patients use apps for disease information, screening, self-treatment, and management [6-9]. One may rightly ask, “Who provides us our app content?” Currently, anyone can publish medical apps. Although some app stores check for fulfillment of a number of technical criteria (eg, whether the app crashes upon launch), no one validates the medical content and no expert approval or peer-review systems exist. Consequently, there are apps with variable quality: opioid-conversion apps suggest medication doses that may threaten patient safety [10], asthma self-treatment apps contain potentially life-threatening information [11], and very few apps on cardiopulmonary resuscitation are actually designed according to existing basic life-support guidelines [12].

    Objective

    From the aforementioned discussion, it is obvious that we need an overview of the literature to understand the extent of this problem. In this paper, we review studies that evaluate the quality of medical apps by assessing expert involvement or adherence of app content to medical evidence. We relate our findings to current initiatives that seek to counter this problem.


    Methods

    Eligibility Criteria

    We followed the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines for reporting systematic reviews [13]. We included studies evaluating expert involvement or adherence of app content to medical evidence in medical mobile phone apps. The following studies were considered eligible: (1) investigating medical mobile phone apps within a predefined topic using a search strategy, and (2) assessing expert involvement or adherence to relevant medical evidence. Given that the definition of an expert and acceptable credentials may vary widely, we did not restrict the inclusion of studies to our own definitions of these concepts. Similarly, the degree of adherence to relevant medical evidence was not defined in advance; instead, we noted the included studies’ own definitions and judgments. Language was restricted to English. Case studies and reviews of a single app were excluded because they did not include a search strategy to systematically review available apps.

    Search Strategy and Study Selection

    We searched existing literature through the bibliographic databases PubMed, The Cochrane Library, and EMBASE using the following search terms: (“smartphone” OR “iPhone” OR “Android”) AND (“app” OR “application”). This broad search string was used to identify as many relevant studies as possible. The last search was performed March 17, 2015. One researcher (YS) removed all duplicates and screened all abstracts. All potentially eligible studies were read in full by 2 independent researchers (YS and SRB). Disagreements were resolved by discussion. References of all included studies were read to find additional eligible studies. We only included studies with original data.
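The duplicate-removal step described above can be sketched in code. This is an illustrative sketch only: the record fields and sample titles are invented, and in the review itself duplicates were removed manually by one researcher.

```python
# Illustrative sketch of duplicate removal across database exports.
# Record fields and sample data are invented; the review's screening
# was performed manually.

def normalize(title: str) -> str:
    """Lowercase and strip non-alphanumerics so near-identical titles match."""
    return "".join(ch for ch in title.lower() if ch.isalnum())

def deduplicate(records):
    """Keep the first occurrence of each normalized title."""
    seen, unique = set(), []
    for rec in records:
        key = normalize(rec["title"])
        if key not in seen:
            seen.add(key)
            unique.append(rec)
    return unique

records = [
    {"db": "PubMed", "title": "Apps for asthma self-management."},
    {"db": "EMBASE", "title": "Apps for Asthma Self-Management"},
    {"db": "Cochrane", "title": "Smartphone apps for opioid conversion"},
]
print(len(deduplicate(records)))  # 2 unique records remain
```

Normalizing on a title key is a common first pass when merging exports from PubMed, EMBASE, and The Cochrane Library, since the same article often appears with minor punctuation or casing differences.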

    Data Collection and Synthesis of Results

    The research group piloted a data-extraction form. We extracted information on topic, app stores searched, methods used for assessment of expert involvement/adherence to medical evidence, and study results. Two researchers (YS and SHB) extracted data independently. Disagreements were solved through discussion and consensus. Microsoft Excel (Redmond, WA, USA) was used for data collection and management. The heterogeneity of the studies did not permit pooling of study results to conduct a meta-analysis. All studies were included in a qualitative analysis.


    Results

    Studies Identified

    The broad search strategy yielded 1936 records, many of which were duplicates or irrelevant (eg, mobile-phone-assisted data collection in biomedical research). Fifty-two studies were identified as relevant and included in this review. These studies assessed 6520 apps. Details on search and study selection are presented in Figure 1.

    Included studies are presented in Tables 1 and 2. Topics tended to be broader for studies of expert involvement (eg, dermatology [14], ophthalmology [15], or pain management [16-18]), whereas studies on adherence to medical evidence tended to be more specific (eg, asthma self-management [11], prostate cancer [19], pediatric obesity [20,21]). Studies included a median of 71 apps (interquartile range 41-148). Studies of expert involvement tended to include a slightly higher number of apps (median 85, interquartile range 39-192) than studies of adherence to medical evidence (median 63, interquartile range 40-104). The reviewed studies mostly included apps from the Apple App Store (n=49, 98%) and Google Android Market (n=36, 71%). Fewer studies included apps from less popular app stores such as BlackBerry Market (n=19, 38%), Windows Market (n=16, 32%), Nokia Ovi (n=11, 22%), and Samsung Market (n=9, 18%). Studies that included a search in these less popular app stores were often unable to find any relevant apps for inclusion [10,16,22-26].
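Summary statistics of this kind (median and interquartile range of included-app counts) can be computed with Python's standard library. The app counts in the sketch below are invented for illustration and are not the review's actual per-study data.

```python
# Sketch of median/IQR descriptive statistics, as reported for the
# included-app counts. The counts below are invented for illustration.
import statistics

def median_iqr(counts):
    """Return the median and (Q1, Q3) using inclusive quartiles."""
    q1, q2, q3 = statistics.quantiles(counts, n=4, method="inclusive")
    return q2, (q1, q3)

counts = [41, 63, 71, 85, 148, 192]
med, (q1, q3) = median_iqr(counts)
print(f"median {med}, IQR {q1}-{q3}")
```

`statistics.quantiles` (Python 3.8+) with `method="inclusive"` interpolates between data points, matching the quartile convention most spreadsheet software uses.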

    Figure 1. Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) flow diagram of search results and study selection.

    Studies on Expert Involvement

    Twenty-eight studies assessed 3852 apps for expert involvement (Table 1). These studies covered a variety of medical specialties and topics. The following 2 topics were assessed more than once: pain management (n=3) [16-18] and bariatric surgery (n=2) [22,23]. Studies mostly used the app stores’ app description (n=28, 100%) and the developers’ website (n=15, 54%) to determine whether an app had expert involvement. Nine studies (32%) also downloaded the apps. All studies found that at least some of the assessed apps had expert involvement, and none found expert involvement in all assessed apps. Overall, expert involvement was found in 9-67% of assessed apps.

    Table 1. Included studies with assessment of expert involvement.a,b,c,d

    Studies on Adherence to Medical Evidence

    Thirty studies assessed 3051 apps for adherence to medical evidence (Table 2). Six topics were investigated in more than 1 study: weight loss (n=4) [44-47], smoking cessation (n=3) [48-50], disease self-management (n=3) [11,51,52], pediatric obesity (n=2) [20,21], physical activity (n=2) [53,54], and sports injury (n=2) [55,56]. Remaining studies investigated apps on a diverse range of topics. Assessment was mostly based on downloaded app content (n=24, 86%). In 2 studies, it was unclear whether the assessment was based on downloaded app content [52,57]. Three studies only used the app stores’ app description for the assessment [38,44,58]. Studies compared the apps with a variety of forms of medical evidence. For example, smoking cessation apps were compared with US Public Health Service’s clinical practice guidelines for treating tobacco use and dependence [48,49]. Several studies correlated the app contents with available Cochrane reviews, other systematic reviews, or other published evidence [10,11,28,38,41,55,58-60]. In 6 studies, the assessment relied on criteria for ideal app contents as defined by the authors [33,61] or whether the app contents adhered to the general knowledge of the authors [19,43,46,62]. In 17 studies, none of the assessed apps (n=2237) adhered fully to the compared evidence [11,20,21,33,38,44-49,51-54,58,61]. In the remaining 13 studies, 10-87% of the assessed apps showed complete adherence to medical evidence [10,12,19,28,41,43,50,55-57,59,60,62]. Of these, only 5 studies found that more than half of the assessed apps showed complete adherence to medical evidence [19,41,56,60,62]; of note, 2 of these were based on the authors’ own self-stated expertise [19,62]. In most studies, a number of apps adhered partly to the assessed evidence. No topic was clearly associated with a higher or lower prevalence of adherence to available evidence—lack of adherence was highly prevalent in all studied topics.

    Table 2. Included studies with assessment of adherence to available evidence.a-d

    Discussion

    Principal Findings

    Medical apps may save lives; with no regulation of the content, however, we fear that they may also do harm. Studies in this review focused on a wide range of medical topics, app platforms, and assessment methods and all reached one general conclusion: medical mobile phone apps generally lack expert involvement and do not adhere to relevant medical evidence. Expert involvement was found in 9-67% of assessed apps. Adherence to medical evidence was found in 10-87% of the assessed apps in 13 studies, and in none of the assessed apps in 17 studies. Medical professionals and patients should be aware of this, as mobile phones increasingly play a role in medical education [5], clinical decision making [2], and patient empowerment [6-9].

    For the common user, it may be practically impossible to assess whether or not an app adheres to current evidence and guidelines. In some cases, the app descriptions include references to the publications on which the content is based. The levels of evidence defined by the Oxford Centre for Evidence-Based Medicine state that systematic reviews and individual studies rank higher than the opinion of an expert, but an expert opinion ranks better than nothing [63]. Hence, although expert involvement does not guarantee adherence to relevant medical evidence, it may be safer to have an expert involved than none.

    Cheap and technically simple methods enable experts and clinicians to develop medical apps on their own [64-67]. These methods are based on Web apps developed using tools with a simple interface, hosted online, and distributed by the experts and clinicians [64-67]. Published examples include 1 Web app with clinical instructional videos for joint examination and 1 Web app with videos on psychiatric assessments and psychopathology lessons [65,67]. These works demonstrate that it is possible for experts to develop Web apps on their own with useful results [64,66]. However, 1 study in our review assessed both expert involvement and adherence of content to published evidence among opioid-conversion apps, and found that expert involvement per se does not necessarily lead to medical correctness of the content [10].

    Apps can be considered an interactive way of communicating knowledge. We already use peer-review systems for such purposes—at least in scholarly journals—and one way of ensuring medically correct apps could be through peer review, which, owing to the unregulated nature of app stores, would arrive after app publication. There are examples of short publications in medical journals reviewing 1 or more apps [68,69], and app developers are able to get an independent app review by submitting a request to Journal of Medical Internet Research mHealth and uHealth [70]. In addition, dedicated Web pages for app reviews exist [71,72]. One example of this is the Health Apps Library, which is developed and supported by the National Health Service in the United Kingdom [72]. The Health Apps Library enables developers to submit their app for review by clinicians, who assess whether the app is relevant to people in the United Kingdom, provides information from trusted sources, and complies with relevant data protection regulations [72]. The clinician then decides whether the app can be approved and published on the Health Apps Library [72]. However, even if a review exists, the user may not be aware of it. If the review is unfavorable, the app developer may refrain from referring to it, which creates a bias. Previous studies on health information on the Internet reported similar results—some sources provide medically correct information, and some do not [73]—therefore, the problem highlighted in our systematic review is not new. However, some differences do exist when dealing with apps, which may make it possible to address this problem in the future. Apps are already reviewed by app stores before publication, and app stores provide streamlined access to content. Therefore, one possible way of addressing this problem could involve collaboration between app stores and a regulatory third party such as the Health Apps Library when publishing apps with medical content.

    Limitations

    Limitations of our approach should be noted. Apps can have expert involvement without stating it to the user, and app content may be accurate without referring to medical publications. In addition, apps with expert involvement can also contain inaccurate information, and referring to medical publications does not prevent out-of-date or inaccurate content. None of the studies included assessment of the actual use of the apps, which would add an interesting dimension to our research question, as owning an app does not necessarily mean that the app is used. These dimensions may be explored by future studies. Our review found that different methods were used for the assessment of expert involvement and medical adherence. Some studies assessed expert involvement or adherence to medical evidence only by reviewing the app descriptions in the app stores and by visiting the app developers’ websites (Tables 1 and 2). For example, 1 study reviewed apps dealing with alcohol abuse and categorized each app’s approach using the app description [58]. We acknowledge that in some cases, this approach may provide sufficient results. However, one should note that app descriptions do not necessarily reflect the actual app content. Therefore, future studies are encouraged to download and review actual app content. A clear consensus on a methodological gold standard does not exist, but we are currently seeing inspiring studies that explore different methods of evaluating authorship and content [47,74]. One recent example is the Mobile App Rating Scale (MARS), a 23-item assessment tool that provides quality scores for an app within 5 dimensions (engagement, functionality, aesthetics, information quality, and subjective quality) and that has demonstrated a high level of internal consistency and inter-rater reliability [74]. Reliable tools such as the MARS are important for the future direction of how and what to review, and may help future research provide more comparable results.
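To illustrate how a multi-dimension rating tool of this kind yields a quality score, the hypothetical sketch below averages item ratings per dimension and then overall. The dimension names follow the five listed above, but the item counts and 1-5 ratings are invented and do not reproduce the actual MARS instrument.

```python
# Hypothetical tally of a MARS-style assessment: mean score per
# dimension, then an overall mean. Item counts and ratings are invented
# for illustration and are not the actual MARS items.
from statistics import mean

ratings = {
    "engagement": [4, 3, 5, 4, 4],
    "functionality": [5, 4, 4, 5],
    "aesthetics": [3, 4, 3],
    "information quality": [2, 3, 2, 3],
    "subjective quality": [3, 3, 4, 3],
}

dimension_scores = {dim: mean(items) for dim, items in ratings.items()}
overall = mean(dimension_scores.values())
```

Per-dimension scores make it possible to compare apps on, say, information quality alone, which is closer to the question this review asks than a single overall number.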

    In conclusion, most medical mobile phone apps lack expert involvement and do not adhere to relevant medical evidence. Because mobile phones are highly prevalent among medical professionals and patients, this poses a significant problem. Review services do exist, but additional effort is needed, and attention to the problem may help the community develop the solutions of the future.

    Acknowledgments

    This study was supported by the Quality in Education grant “Undervisningskvalitetspuljen” from the University of Copenhagen, Copenhagen, Denmark. The sponsors had no role in the design and conduct of the study, collection, management, analysis, and interpretation of the data, nor in preparation, review, and approval of the manuscript.

    Conflicts of Interest

    None declared.

    References

    1. Becker S, Miron-Shatz T, Schumacher N, Krocza J, Diamantidis C, Albrecht U. mHealth 2.0: Experiences, possibilities, and perspectives. JMIR Mhealth Uhealth 2014;2(2):e24 [FREE Full text] [CrossRef] [Medline]
    2. Payne KB, Wharrad H, Watts K. Smartphone and medical related app use among medical students and junior doctors in the United Kingdom (UK): A regional survey. BMC Med Inform Decis Mak 2012;12:121 [FREE Full text] [CrossRef] [Medline]
    3. Franko OI, Tirrell TF. Smartphone app use among medical providers in ACGME training programs. J Med Syst 2012 Oct;36(5):3135-3139. [CrossRef] [Medline]
    4. O'Connor P, Byrne D, Butt M, Offiah G, Lydon S, Mc Inerney K, et al. Interns and their smartphones: Use for clinical practice. Postgrad Med J 2014 Feb;90(1060):75-79. [CrossRef] [Medline]
    5. Hardyman W, Bullock A, Brown A, Carter-Ingram S, Stacey M. Mobile technology supporting trainee doctors' workplace learning and patient care: An evaluation. BMC Med Educ 2013;13:6 [FREE Full text] [CrossRef] [Medline]
    6. Lee J, Nguyen AL, Berg J, Amin A, Bachman M, Guo Y, et al. Attitudes and preferences on the use of mobile health technology and health games for self-management: Interviews with older adults on anticoagulation therapy. JMIR Mhealth Uhealth 2014;2(3):e32 [FREE Full text] [CrossRef] [Medline]
    7. Mirkovic J, Kaufman DR, Ruland CM. Supporting cancer patients in illness management: Usability evaluation of a mobile app. JMIR Mhealth Uhealth 2014;2(3):e33 [FREE Full text] [CrossRef] [Medline]
    8. Bender JL, Yue RY, To MJ, Deacken L, Jadad AR. A lot of action, but not in the right direction: Systematic review and content analysis of smartphone applications for the prevention, detection, and management of cancer. J Med Internet Res 2013;15(12):e287 [FREE Full text] [CrossRef] [Medline]
    9. Martínez-Pérez B, de la Torre-Díez I, López-Coronado M, Herreros-González J. Mobile apps in cardiology: Review. JMIR Mhealth Uhealth 2013;1(2):e15 [FREE Full text] [CrossRef] [Medline]
    10. Haffey F, Brady RR, Maxwell S. A comparison of the reliability of smartphone apps for opioid conversion. Drug Saf 2013 Feb;36(2):111-117. [CrossRef] [Medline]
    11. Huckvale K, Car M, Morrison C, Car J. Apps for asthma self-management: A systematic assessment of content and tools. BMC Med 2012;10:144 [FREE Full text] [CrossRef] [Medline]
    12. Kalz M, Lenssen N, Felzen M, Rossaint R, Tabuenca B, Specht M, et al. Smartphone apps for cardiopulmonary resuscitation training and real incident support: A mixed-methods evaluation study. J Med Internet Res 2014;16(3):e89 [FREE Full text] [CrossRef] [Medline]
    13. Moher D, Liberati A, Tetzlaff J, Altman DG, PRISMA Group. Preferred reporting items for systematic reviews and meta-analyses: The PRISMA statement. BMJ 2009;339:b2535 [FREE Full text] [CrossRef] [Medline]
    14. Hamilton AD, Brady RR. Medical professional involvement in smartphone 'apps' in dermatology. Br J Dermatol 2012 Jul;167(1):220-221. [CrossRef] [Medline]
    15. Cheng NM, Chakrabarti R, Kam JK. iPhone applications for eye care professionals: A review of current capabilities and concerns. Telemed J E Health 2014 Apr;20(4):385-387. [CrossRef] [Medline]
    16. Rosser BA, Eccleston C. Smartphone applications for pain management. J Telemed Telecare 2011;17(6):308-312. [CrossRef] [Medline]
    17. Reynoldson C, Stones C, Allsop M, Gardner P, Bennett MI, Closs SJ, et al. Assessing the quality and usability of smartphone apps for pain self-management. Pain Med 2014 Jun;15(6):898-909. [CrossRef] [Medline]
    18. Wallace LS, Dhingra LK. A systematic review of smartphone applications for chronic pain available for download in the United States. J Opioid Manag 2014 Feb;10(1):63-68. [CrossRef] [Medline]
    19. Landau D, Hanjdenberg J. Characterization of prostate cancer information available on smartphones. In: J Clin Oncol. 2012 Presented at: 2012 ASCO Annual Meeting; Jun 1-5, 2012; Chicago, IL p. e15102   URL: http://meeting.ascopubs.org/cgi/content/abstract/30/15_suppl/e15102?sid=e56b78c1-d8ce-4810-b566-8e0a6b6c4f39
    20. Schoffman DE, Turner-McGrievy G, Jones SJ, Wilcox S. Mobile apps for pediatric obesity prevention and treatment, healthy eating, and physical activity promotion: Just fun and games? Transl Behav Med 2013 Sep;3(3):320-325 [FREE Full text] [CrossRef] [Medline]
    21. Wearing JR, Nollen N, Befort C, Davis AM, Agemy CK. iPhone app adherence to expert-recommended guidelines for pediatric obesity prevention. Child Obes 2014 Apr;10(2):132-144 [FREE Full text] [CrossRef] [Medline]
    22. Connor K, Brady RR, Tulloh B, de Beaux A. Smartphone applications (apps) for bariatric surgery. Obes Surg 2013 Oct;23(10):1669-1672. [CrossRef] [Medline]
    23. Stevens DJ, Jackson JA, Howes N, Morgan J. Obesity surgery smartphone apps: A review. Obes Surg 2014 Jan;24(1):32-36. [CrossRef] [Medline]
    24. Visvanathan A, Hamilton A, Brady RR. Smartphone apps in microbiology—Is better regulation required? Clin Microbiol Infect 2012 Jul;18(7):E218-E220 [FREE Full text] [CrossRef] [Medline]
    25. Rodrigues MA, Visvanathan A, Murchison JT, Brady RR. Radiology smartphone applications; current provision and cautions. Insights Imaging 2013 Oct;4(5):555-562 [FREE Full text] [CrossRef] [Medline]
    26. Carter T, O'Neill S, Johns N, Brady RR. Contemporary vascular smartphone medical applications. Ann Vasc Surg 2013 Aug;27(6):804-809. [CrossRef] [Medline]
    27. Savic M, Best D, Rodda S, Lubman DI. Exploring the focus and experiences of smartphone applications for addiction recovery. J Addict Dis 2013;32(3):310-319. [CrossRef] [Medline]
    28. Mobasheri MH, Johnston M, King D, Leff D, Thiruchelvam P, Darzi A. Smartphone breast applications: What's the evidence? Breast 2014 Oct;23(5):683-689. [CrossRef] [Medline]
    29. Edlin JC, Deshpande RP. Caveats of smartphone applications for the cardiothoracic trainee. J Thorac Cardiovasc Surg 2013 Dec;146(6):1321-1326. [CrossRef] [Medline]
    30. O'Neill S, Brady RR. Colorectal smartphone apps: Opportunities and risks. Colorectal Dis 2012 Sep;14(9):e530-e534. [CrossRef] [Medline]
    31. Gal N, Zite NB, Wallace LS. Evaluation of smartphone oral contraceptive reminder applications. Res Social Adm Pharm 2015;11(4):584-587. [CrossRef] [Medline]
    32. Shen N, Levitan MJ, Johnson A, Bender JL, Hamilton-Page M, Jadad AA, et al. Finding a depression app: A review and content analysis of the depression app marketplace. JMIR Mhealth Uhealth 2015;3(1):e16 [FREE Full text] [CrossRef] [Medline]
    33. Hundert AS, Huguet A, McGrath PJ, Stinson JN, Wheaton M. Commercially available mobile phone headache diary apps: A systematic review. JMIR Mhealth Uhealth 2014;2(3):e36 [FREE Full text] [CrossRef] [Medline]
    34. Cantudo-Cuenca MR, Robustillo-Cortés MA, Cantudo-Cuenca MD, Morillo-Verdugo R. A better regulation is required in viral hepatitis smartphone applications. Farm Hosp 2014;38(2):112-117 [FREE Full text] [Medline]
    35. Connor K, Brady RR, de Beaux A, Tulloh B. Contemporary hernia smartphone applications (apps). Hernia 2014 Aug;18(4):557-561. [CrossRef] [Medline]
    36. Cuenca MRC, Cuenca MDC, Verdugo RM. Availability and medical professional involvement in mobile healthcare applications related to pathophysiology and pharmacotherapy of HIV/AIDS. Eur J Hosp Pharm 2013 Sep 03;20(6):356-361. [CrossRef]
    37. Sucala M, Schnur JB, Glazier K, Miller SJ, Green JP, Montgomery GH. Hypnosis—There's an app for that: A systematic review of hypnosis apps. Int J Clin Exp Hypn 2013;61(4):463-474 [FREE Full text] [CrossRef] [Medline]
    38. Kassianos AP, Emery JD, Murchie P, Walter FM. Smartphone applications for melanoma detection by community, patient and generalist clinician users: A review. Br J Dermatol 2015 Jun;172(6):1507-1518. [CrossRef] [Medline]
    39. Zaki M, Drazin D. Smartphone use in neurosurgery? APP-solutely!. Surg Neurol Int 2014;5:113 [FREE Full text] [CrossRef] [Medline]
    40. Haffey F, Brady RR, Maxwell S. Smartphone apps to support hospital prescribing and pharmacology education: A review of current provision. Br J Clin Pharmacol 2014 Jan;77(1):31-38. [CrossRef] [Medline]
    41. Dubey D, Amritphale A, Sawhney A, Amritphale N, Dubey P, Pandey A. Smart phone applications as a source of information on stroke. J Stroke 2014 May;16(2):86-90 [FREE Full text] [CrossRef] [Medline]
    42. Kulendran M, Lim M, Laws G, Chow A, Nehme J, Darzi A, et al. Surgical smartphone applications across different platforms: Their evolution, uses, and users. Surg Innov 2014 Aug;21(4):427-440. [CrossRef] [Medline]
    43. Stevens DJ, McKenzie K, Cui HW, Noble JG, Turney BW. Smartphone apps for urolithiasis. Urolithiasis 2015 Feb;43(1):13-19. [CrossRef] [Medline]
    44. Breton ER, Fuemmeler BF, Abroms LC. Weight loss-there is an app for that! But does it adhere to evidence-informed practices? Transl Behav Med 2011 Dec;1(4):523-529 [FREE Full text] [CrossRef] [Medline]
    45. Pagoto S, Schneider K, Jojic M, DeBiasse M, Mann D. Evidence-based strategies in weight-loss mobile apps. Am J Prev Med 2013 Nov;45(5):576-582. [CrossRef] [Medline]
    46. Alnasser A, Sathiaseelan A, Al-Khalifa A, Marais D. Do arabic weight loss mobile applications comply with evidence-based weight management guidelines? In: Ann Nutr Metab. 2013 Presented at: 20th International Congress of Nutrition; Sept 15-20, 2013; Granada, Spain p. 1130-1130. [CrossRef]
    47. Jeon E, Park HA, Min YH, Kim HY. Analysis of the information quality of korean obesity-management smartphone applications. Healthc Inform Res 2014 Jan;20(1):23-29 [FREE Full text] [CrossRef] [Medline]
    48. Abroms LC, Padmanabhan N, Thaweethai L, Phillips T. iPhone apps for smoking cessation: A content analysis. Am J Prev Med 2011 Mar;40(3):279-285 [FREE Full text] [CrossRef] [Medline]
    49. Abroms LC, Lee Westmaas J, Bontemps-Jones J, Ramani R, Mellerson J. A content analysis of popular smartphone apps for smoking cessation. Am J Prev Med 2013 Dec;45(6):732-736 [FREE Full text] [CrossRef] [Medline]
    50. Choi J, Noh GY, Park DJ. Smoking cessation apps for smartphones: Content analysis with the self-determination theory. J Med Internet Res 2014;16(2):e44 [FREE Full text] [CrossRef] [Medline]
    51. Breland JY, Yeh VM, Yu J. Adherence to evidence-based guidelines among diabetes self-management apps. Transl Behav Med 2013 Sep;3(3):277-286 [FREE Full text] [CrossRef] [Medline]
    52. Korn D, Makowsky M. An evaluation of hypertension self-management applications for the iPhone. J Pharm Pharm Sci 2013;16(3):118s [FREE Full text]
    53. Cowan LT, Van Wagenen SA, Brown BA, Hedin RJ, Seino-Stephan Y, Hall PC, et al. Apps of steel: Are exercise apps providing consumers with realistic expectations?: A content analysis of exercise apps for presence of behavior change theory. Health Educ Behav 2013 Apr;40(2):133-139. [CrossRef] [Medline]
    54. Middelweerd A, Mollee JS, van der Wal NC, Brug J, Te Velde SJ. Apps to promote physical activity among adults: A review and content analysis. Int J Behav Nutr Phys Act 2014;11:97 [FREE Full text] [CrossRef] [Medline]
    55. van Mechelen DM, van Mechelen W, Verhagen EA. Sports injury prevention in your pocket?! Prevention apps assessed against the available scientific evidence: A review. Br J Sports Med 2014 Jun;48(11):878-882. [CrossRef] [Medline]
    56. Lee H, Sullivan SJ, Schneiders AG, Ahmed OH, Balasundaram AP, Williams D, et al. Smartphone and tablet apps for concussion road warriors (team clinicians): A systematic review for practical users. Br J Sports Med 2015 Apr;49(8):499-505. [CrossRef] [Medline]
    57. Floyd C, Parmesar K, Ferro A. Monitoring of hypertension using smartphone applications: A systematic review. Can J Cardiol 2014 Oct;30(10):S213. [CrossRef]
    58. Cohn AM, Hunter-Reel D, Hagman BT, Mitchell J. Promoting behavior change from alcohol use through mobile technology: The future of ecological momentary assessment. Alcohol Clin Exp Res 2011 Dec;35(12):2209-2215 [FREE Full text] [CrossRef] [Medline]
    59. Fairburn CG, Rothwell ER. Apps and eating disorders: A systematic clinical appraisal. Int J Eat Disord 2015 Feb 27 (forthcoming). [CrossRef] [Medline]
    60. Rozati H, Shah SP, Shah N. Smartphone applications for the clinical oncologist in UK practice. J Cancer Educ 2015 Jun;30(2):367-373. [CrossRef] [Medline]
    61. Dayer L, Heldenbrand S, Anderson P, Gubbins PO, Martin BC. Smartphone medication adherence apps: Potential benefits to patients and providers. J Am Pharm Assoc (2003) 2013;53(2):172-181 [FREE Full text] [CrossRef] [Medline]
    62. Pandey A, Hasan S, Dubey D, Sarangi S. Smartphone apps as a source of cancer information: Changing trends in health information-seeking behavior. J Cancer Educ 2013 Mar;28(1):138-142. [CrossRef] [Medline]
    63. OCEBM Levels of Evidence Working Group. The Oxford 2011 Levels of Evidence. Oxford, UK: Oxford Centre for Evidence-Based Medicine; 2011.   URL: http://www.cebm.net/index.aspx?o=5653 [accessed 2014-12-20] [WebCite Cache]
    64. Subhi Y, Todsen T, Ringsted C, Konge L. Designing web-apps for smartphones can be easy as making slideshow presentations. BMC Res Notes 2014;7:94 [FREE Full text] [CrossRef] [Medline]
    65. Subhi Y, Foss KT, Henriksen MJV, Todsen T. Development and use of web-based apps. Tidsskr Læring Medier 2014;7(12):12 [FREE Full text]
    66. Zhang M, Cheow E, Ho C, Ng BY, Ho R, Cheok CC. Application of low-cost methodologies for mobile phone app development. JMIR Mhealth Uhealth 2014;2(4):e55 [FREE Full text] [CrossRef] [Medline]
    67. Zhang MW, Tsang T, Cheow E, Ho C, Yeong NB, Ho RC. Enabling psychiatrists to be mobile phone app developers: Insights into app development methodologies. JMIR Mhealth Uhealth 2014;2(4):e53 [FREE Full text] [CrossRef] [Medline]
    68. Shih G. App review series introduction. J Digit Imaging 2014 Dec 17;28(1):7-9. [CrossRef]
    69. Christensen S, Morelli R. MyPapp: A mobile app to enhance understanding of pap testing. Comput Inform Nurs 2012 Dec;30(12):627-631. [Medline]
    70. JMIR Publications. Contribute to mHealth Research and Development of the New JMIR mHealth Peer-Review Tool for Mobile Apps!. Toronto, Canada: JMIR mHealth and uHealth   URL: http://mhealth.jmir.org/announcement/view/78 [accessed 2014-12-20] [WebCite Cache]
    71. iMedicalApps.   URL: http://www.imedicalapps.com/ [accessed 2014-12-20] [WebCite Cache]
    72. National Health Service. The Health Apps Library. London, UK: National Health Service   URL: http://apps.nhs.uk/about/ [accessed 2015-05-31] [WebCite Cache]
    73. Eysenbach G, Powell J, Kuss O, Sa ER. Empirical studies assessing the quality of health information for consumers on the world wide web: A systematic review. JAMA 2002;287(20):2691-2700. [Medline]
    74. Stoyanov SR, Hides L, Kavanagh DJ, Zelenko O, Tjondronegoro D, Mani M. Mobile app rating scale: A new tool for assessing the quality of health mobile apps. JMIR Mhealth Uhealth 2015;3(1):e27 [FREE Full text] [CrossRef] [Medline]


    Abbreviations

    MARS: Mobile App Rating Scale


    Edited by G Eysenbach; submitted 22.12.14; peer-reviewed by M Zhang, K Muessig; comments to author 12.02.15; revised version received 24.02.15; accepted 31.05.15; published 27.07.15

    ©Yousif Subhi, Sarah Hjartbro Bube, Signe Rolskov Bojsen, Ann Sofia Skou Thomsen, Lars Konge. Originally published in JMIR Mhealth and Uhealth (http://mhealth.jmir.org), 27.07.2015.

    This is an open-access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR mhealth and uhealth, is properly cited. The complete bibliographic information, a link to the original publication on http://mhealth.jmir.org/, as well as this copyright and license information must be included.