JMIR mHealth and uHealth


Published on 06.10.17 in Vol 5, No 10 (2017): October

    Viewpoint

    A Call to Digital Health Practitioners: New Guidelines Can Help Improve the Quality of Digital Health Evidence

    Smisha Agarwal; Amnesty E LeFevre; Alain B Labrique

    1Bloomberg School of Public Health, Johns Hopkins University, Baltimore, MD, United States

    2Global mHealth Initiative, Johns Hopkins University, Baltimore, MD, United States

    Corresponding Author:

    Alain B Labrique, MPH, PhD

    Bloomberg School of Public Health

    Johns Hopkins University

    615 N. Wolfe Street

    Baltimore, MD,

    United States

    Phone: 1 443 287 4744

    Fax: 1 410 510 1055

    Email:


    Abstract

    Background: Despite the rapid proliferation of health interventions that employ digital tools, the evidence on the effectiveness of such approaches remains insufficient and of variable quality. To address existing gaps in the comprehensiveness and quality of reporting on the effectiveness of digital health programs, the mHealth Technical Evidence Review Group (mTERG), convened by the World Health Organization, proposed the mHealth Evidence Reporting and Assessment (mERA) checklist.

    Objective: We present an overview of the mERA checklist and encourage researchers working in the digital health space to use it when reporting their research.

    Methods: The development of the mERA checklist consisted of convening an expert group to recommend an appropriate approach, convening a global expert review panel for checklist development, and pilot-testing the checklist.

    Results: The mERA checklist consists of 16 core mHealth items that define what the mHealth intervention is (content), where it is being implemented (context), and how it was implemented (technical features). Additionally, a 29-item methodology checklist guides authors on reporting critical aspects of the research methodology employed in the study. We recommend that the core mERA checklist be used in conjunction with an appropriate study design-specific checklist.

    Conclusions: The mERA checklist aims to assist authors in reporting on digital health research, guide reviewers and policy makers in synthesizing evidence, and guide journal editors in assessing the completeness of reporting on digital health studies. An increase in transparent and rigorous reporting can help identify gaps in the conduct of research and improve understanding of the effects of digital health interventions as a field of inquiry.

    JMIR Mhealth Uhealth 2017;5(10):e136

    doi:10.2196/mhealth.6640


    Introduction

    Over the last decade, there has been a dramatic increase in health programs employing digital tools, such as mobile phones and tablets, to stimulate demand for, or support the delivery of, health care services. This is especially true in low- and middle-income countries, where public health practitioners are tapping into the unprecedented growth in mobile phone use to overcome information and communication challenges [1,2]. Donors have rallied around digital approaches, and much has been invested in developing, testing, and deploying digital systems. However, after nearly a decade of concerted effort, widely available evidence in support of digital health remains limited [1,3,4]. Because the field is still emerging, there is substantial variability in the reporting of digital program implementations, evaluations, and outcomes. Inconsistent reporting is problematic because it limits policy makers’ ability to understand precise program details and to extract, compare, and synthesize linkages (if any) between digital investments and consequent health effects.

    To address gaps in the comprehensiveness and quality of reporting on the effectiveness of digital programs, the mHealth Technical Evidence Review Group (mTERG)—an expert committee convened by the World Health Organization (WHO) to advise on approaches to strengthening digital health evidence—proposed guidelines for reporting evidence on the development and evaluation of digital health interventions. These guidelines—presented as the mHealth Evidence Reporting and Assessment (mERA) checklists—were published in March 2016 [5] and have since been widely accessed [1,6-10].


    Methods

    The design of the mERA checklist followed a systematic process for the development of reporting guidelines [11]. In October 2012, WHO convened an expert working group, led by the Johns Hopkins Global mHealth Initiative, to develop an approach for the mERA guidelines. In December 2012, this working group presented an initial draft of the checklist to a global panel of 18 experts convened by WHO during a 3-day meeting in Montreux, Switzerland. At this meeting, the approach and checklist underwent intensive analysis for improvement, and a quality of information (QoI) taskforce was established to pilot-test the checklist. The QoI taskforce then applied the checklist and associated item descriptions to 10 English-language reports to test the applicability of each criterion to a range of existing mHealth literature. Readers may refer to the complete manuscript for further details about the methodology [5].


    Results

    The mERA checklist comprises 2 components. The core mHealth checklist (see Table 1) identifies the minimum information needed to define what the mHealth intervention is (content), where it is being implemented (context), and how it was implemented (technical features). This checklist may be valuable to researchers in reporting program and research results in peer-reviewed journals and reports, to policy makers in consolidating evidence and understanding the quality of the information used to generate that evidence, and to program implementers thinking through and selecting core elements for new digital health projects. L’Engle et al [12] applied the mERA checklist to evaluate the quality of evidence on the use of digital health approaches to improve sexual and reproductive health outcomes for adolescents. The study found that, on average, 7 out of 16 (41%) of the core mHealth checklist items were reported on, suggesting that clear descriptions of digital health interventions are often unavailable [12]. During the development and testing phase, the mERA checklist was applied to literature on the use of digital devices in reducing drug stockouts and the use of digital protocols to improve provider adherence to treatment protocols. Interested authors should refer to the definitions and examples for the core mHealth checklist, which are freely available online [5].

    Table 1. mHealth Evidence Reporting and Assessment (mERA) core checklist items.

    Textbox 1. mHealth Evidence Reporting and Assessment (mERA) methodology.

    The methodology checklist (see Textbox 1) outlines 29 items that highlight the key study design features that should be reported by researchers and evaluators of digital health interventions. Authors interested in using this checklist should note that other checklists are recommended for specific study designs—for example, Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) for observational studies [13] and Consolidated Standards of Reporting Trials (CONSORT) for randomized trials [14]. We recommend that the core mHealth checklist be used in conjunction with the extant checklist appropriate to the study design being reported. However, we also recognize that many studies evaluating early-stage digital health interventions are more exploratory in nature, and the extant guidelines might be less relevant to them. In such cases, authors may decide to use the mERA methodology checklist, which was developed to be study design agnostic, for reporting on the study design and results. A detailed explanation of the mERA methodology checklist items is available as a Web appendix [5].
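
    As a purely illustrative aside, not drawn from the mERA materials themselves, the sketch below shows how a reviewer or journal editor might tabulate how many of the 16 core checklist items each study reports and summarize average completeness across a body of literature, in the spirit of the summary figure reported by L'Engle et al [12]. The study names and reported-item sets are hypothetical.

    # Hypothetical sketch: tabulating core mERA item completeness across studies.
    # The study names and reported-item sets below are illustrative only.

    CORE_ITEM_COUNT = 16  # number of items on the core mERA checklist

    # Each study maps to the set of core item numbers (1-16) it reports on.
    studies = {
        "Study A": {1, 2, 3, 5, 8, 11, 14},
        "Study B": {1, 2, 4, 6, 7, 9, 10, 12, 16},
        "Study C": {1, 3, 5, 6},
    }

    def completeness(reported_items):
        """Proportion of the 16 core items reported by one study."""
        return len(reported_items) / CORE_ITEM_COUNT

    for name, items in sorted(studies.items()):
        print(f"{name}: {len(items)}/{CORE_ITEM_COUNT} items ({completeness(items):.0%})")

    average = sum(completeness(items) for items in studies.values()) / len(studies)
    print(f"Average completeness across studies: {average:.0%}")

    A tabulation of this kind makes it straightforward to see which checklist items are most often omitted and to report an overall completeness figure alongside a review.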


    Discussion

    We present an overview of the mERA checklist. For details about each of the core checklist items and the methodology items, we refer readers to the complete publication [5]. The mERA checklist marks the culmination of several years of multi-institutional collaboration, led by WHO, to determine appropriate standards for reporting on digital health evidence—standards that not only address issues of methodological and reporting rigor but also are responsive to the current state of the digital health space. We recognize that the digital health space is constantly evolving and is somewhat unique in its multidisciplinary nature, borrowing approaches from the fields of health care and technology and often engaging innovators who are unfamiliar with scientific methodologies. The mERA core and methodology checklists were pragmatically developed to be useful to a wide audience of innovators. We expect that the detailed explanations and examples will make the checklist easy to use for individuals with varying levels of experience in academic reporting.

    Even as the number of digital health interventions continues to increase, the evidence supporting such interventions remains sparse. Without the support and shared commitment of the diverse digital health community in advancing the quality of evidence, the state of the much-critiqued “pilotitis” in mHealth will not change [15]. Transparency in reporting what constitutes a digital health intervention and clarity on evaluation methods are both critical to determining whether a digital strategy might be scalable to an entire population. To support widespread adoption of the checklist, we encourage digital health researchers and program managers to ensure conformity with the checklist items. Additionally, we call upon editors of journals publishing mHealth literature to encourage the use of the mERA checklist by presenting a link to the guidelines under their Instructions to Authors and by requesting the inclusion of a statement in the manuscript that “this manuscript was developed in conformity with the recommended criteria for reporting digital health as described in the mERA guidelines.”

    Conflicts of Interest

    None declared.

    References

    1. Zhao J, Freeman B, Li M. Can mobile phone apps influence people's health behavior change? An evidence review. J Med Internet Res 2016 Oct 31;18(11):e287 [FREE Full text] [CrossRef] [Medline]
    2. Agarwal S, Perry HB, Long L, Labrique AB. Evidence on feasibility and effective use of mHealth strategies by frontline health workers in developing countries: systematic review. Trop Med Int Health 2015 Aug;20(8):1003-1014 [FREE Full text] [CrossRef] [Medline]
    3. Agarwal S, Labrique A. Newborn health on the line: the potential mHealth applications. JAMA 2014 Jul 16;312(3):229-230. [CrossRef] [Medline]
    4. Lee SH, Nurmatov UB, Nwaru BI, Mukherjee M, Grant L, Pagliari C. Effectiveness of mHealth interventions for maternal, newborn and child health in low- and middle-income countries: systematic review and meta-analysis. J Glob Health 2016 Jun;6(1):010401 [FREE Full text] [CrossRef] [Medline]
    5. Agarwal S, LeFevre AE, Lee J, L'Engle K, Mehl G, Sinha C, et al. Guidelines for reporting of health interventions using mobile phones: mobile health (mHealth) evidence reporting and assessment (mERA) checklist. BMJ 2016;352:i1174. [Medline]
    6. Altmetric. Guidelines for reporting of health interventions using mobile phones: mobile health (mHealth) evidence reporting and assessment (mERA) checklist: overview of attention for article published in British Medical Journal.   URL: https://www.altmetric.com/details/6208109 [accessed 2017-02-28] [WebCite Cache]
    7. Fottrell E, Jennings H, Kuddus A, Ahmed N, Morrison J, Akter K, et al. The effect of community groups and mobile phone messages on the prevention and control of diabetes in rural Bangladesh: study protocol for a three-arm cluster randomised controlled trial. Trials 2016 Dec 19;17(1):600 [FREE Full text] [CrossRef] [Medline]
    8. Rothstein JD, Jennings L, Moorthy A, Yang F, Gee L, Romano K, et al. Qualitative assessment of the feasibility, usability, and acceptability of a mobile client data app for community-based maternal, neonatal, and child care in rural Ghana. Int J Telemed Appl 2016;2016:2515420 [FREE Full text] [CrossRef] [Medline]
    9. Pagliari C. Digital support for childbirth in developing countries: seeds of hope in an evidential desert. JAMA Pediatr 2016 Aug 01;170(8):737-739. [CrossRef] [Medline]
    10. Torous J, Firth J, Mueller N, Onnela JP, Baker JT. Methodology and reporting of mobile health and smartphone application studies for schizophrenia. Harv Rev Psychiatry 2017 Feb 23. [CrossRef] [Medline]
    11. Moher D, Schulz KF, Simera I, Altman DG. Guidance for developers of health research reporting guidelines. PLoS Med 2010 Feb;7(2):e1000217 [FREE Full text] [CrossRef] [Medline]
    12. L'Engle KL, Mangone ER, Parcesepe AM, Agarwal S, Ippoliti NB. Mobile phone interventions for adolescent sexual and reproductive health: a systematic review. Pediatrics 2016 Sep;138(3). [CrossRef] [Medline]
    13. von Elm E, Altman DG, Egger M, Pocock SJ, Gøtzsche PC, Vandenbroucke JP. The strengthening the reporting of observational studies in epidemiology (STROBE) statement: guidelines for reporting observational studies. Lancet 2007 Oct 20;370(9596):1453-1457. [CrossRef] [Medline]
    14. Schulz KF, Altman DG, Moher D. CONSORT 2010 statement: updated guidelines for reporting parallel group randomised trials. BMC Med 2010;8:18 [FREE Full text] [CrossRef] [Medline]
    15. Labrique A, Vasudevan L, Chang LW, Mehl G. H_pe for mHealth: more “y” or “o” on the horizon? Int J Med Inform 2013 May;82(5):467-469 [FREE Full text] [CrossRef] [Medline]


    Abbreviations

    CONSORT: Consolidated Standards of Reporting Trials
    mERA: mHealth Evidence Reporting and Assessment
    mTERG: mHealth Technical Evidence Review Group
    QoI: quality of information
    STROBE: Strengthening the Reporting of Observational Studies in Epidemiology
    WHO: World Health Organization


    Edited by CL Parra-Calderón; submitted 10.09.16; peer-reviewed by K Reuter, P Seth, C Auschra, S Clough, A Garg, R Scott, M Honary, I Montagni; comments to author 23.10.16; revised version received 10.11.16; accepted 28.03.17; published 06.10.17

    ©Smisha Agarwal, Amnesty E Lefevre, Alain B Labrique. Originally published in JMIR Mhealth and Uhealth (http://mhealth.jmir.org), 06.10.2017.

    This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR mhealth and uhealth, is properly cited. The complete bibliographic information, a link to the original publication on http://mhealth.jmir.org/, as well as this copyright and license information must be included.