JMIR Publications

JMIR mHealth and uHealth



Published on 01.11.17 in Vol 5, No 11 (2017): November


    Original Paper

    Recruitment and Ongoing Engagement in a UK Smartphone Study Examining the Association Between Weather and Pain: Cohort Study

    1Arthritis Research UK Centre for Epidemiology, University of Manchester, Manchester, United Kingdom

    2NIHR Manchester Musculoskeletal Biomedical Research Unit, Central Manchester University Hospitals NHS Foundation Trust, Manchester, United Kingdom

    3Department of Statistics, University of Warwick, Coventry, United Kingdom

    4Oxford Internet Institute, University of Oxford, Oxford, United Kingdom

    5School of Informatics, University of Edinburgh, Edinburgh, United Kingdom

    6uMotif, London, United Kingdom

    7Centre for Atmospheric Science, School of Earth and Environmental Sciences, University of Manchester, Manchester, United Kingdom

    8Medical Sociology, Division of Population Health, Health Services Research and Primary Care, University of Manchester, Manchester, United Kingdom

    Corresponding Author:

    Katie L Druce, PhD

    Arthritis Research UK Centre for Epidemiology

    University of Manchester

    Stopford Building

    Oxford Road

    Manchester, M13 9PT

    United Kingdom

    Phone: 44 161 275 1604



    ABSTRACT

    Background: The huge increase in smartphone use heralds an enormous opportunity for epidemiology research, but there is limited evidence regarding long-term engagement and attrition in mobile health (mHealth) studies.

    Objective: The objective of this study was to examine how representative the Cloudy with a Chance of Pain study population is of wider chronic-pain populations and to explore patterns of engagement among participants during the first 6 months of the study.

    Methods: Participants in the United Kingdom who had chronic pain (≥3 months) and enrolled between January 20, 2016 and January 29, 2016 were eligible if they were aged ≥17 years and used the study app to report any of 10 pain-related symptoms during the study period. Participant characteristics were compared with data from the Health Survey for England (HSE) 2011. Distinct clusters of engagement over time were determined using first-order hidden Markov models, and participant characteristics were compared between the clusters.

    Results: Compared with the data from the HSE, our sample comprised a higher proportion of women (80.52%, 5129/6370 vs 55.61%, 4782/8599) and fewer persons at the extremes of age (16-34 and 75+ years). Four clusters of engagement were identified: high (13.60%, 865/6370), moderate (21.76%, 1384/6370), low (39.35%, 2503/6370), and tourists (25.44%, 1618/6370), between which the median number of days of data entry ranged from 1 (interquartile range [IQR] 1-1; tourists) to 149 (IQR 124-163; high). Those in the high-engagement cluster were typically older, whereas those in the tourist cluster were more likely to be male. Few other differences distinguished the clusters.

    Conclusions: Cloudy with a Chance of Pain demonstrates a rapid and successful recruitment of a large, representative, and engaged sample of people with chronic pain and provides strong evidence to suggest that smartphones could provide a viable alternative to traditional data collection methods.

    JMIR Mhealth Uhealth 2017;5(11):e168

    doi:10.2196/mhealth.8162


    Introduction

    In the United Kingdom, 70% of adults own a smartphone, over half of whom use apps [1]. This growth in smartphone use within the general population heralds an enormous opportunity for epidemiology and population-health research [2-4], allowing data collection to be integrated into people’s lives. Smartphone apps for health monitoring can potentially deliver frequent and regular self-reported symptoms, whereas sensors on smartphones can aid collection of new data types, including position, movement, and environmental exposures [5].

    Despite high expectations about mobile health (or mHealth) [4] studies and initial evidence that mHealth studies can recruit at scale [5], limited evidence exists on the representativeness of populations who participate in digital health studies and their patterns of engagement over time [6-8]. This is particularly pertinent given the known existence of both primary and secondary digital divides, in which younger adults from higher socioeconomic backgrounds are not only more likely to have access to a smartphone device but also use it differently from older adults [1,9]. Thus, although younger adults are more likely to download apps and play games on their devices, older users primarily view their smartphones as a means of communication [1].

    Although smartphones appear to offer a more rapid and mobile method of data collection without compromising completion rates obtained by traditional methods [10,11], relatively little detailed information is available regarding participant recruitment and retention, or engagement, in smartphone studies, particularly when compared with other traditional methods [12] or Web-based studies [13]. Engagement has previously been defined in ways which fail to account for the potentially variable patterns of use through time, including continuity of data entry [5,14-16], and this nonuniformity in definitions makes it difficult to draw conclusions regarding the viability of mHealth studies for longitudinal research.

    Cloudy with a Chance of Pain is a UK smartphone-based, prospective cohort study investigating the link between the weather and pain in people with chronic pain. Specifically, Cloudy with a Chance of Pain seeks to investigate whether self-reported pain severity is associated with weather variables and whether the observed relationships differ between specific patient groups. Earlier research on this topic has been inconclusive [17], despite more than two-thirds of patients with musculoskeletal pain believing that there is an association between the weather and pain [18,19]. The numerous methodological challenges that have traditionally contributed to this ambiguity include small sample sizes, a lack of temporally rich data, and poor availability of data pertaining to geographical and meteorological variability. However, smartphone apps have the capacity to overcome these challenges, if they can recruit and continue to engage a representative study population.

    The two aims of this paper were to examine how representative the Cloudy with a Chance of Pain study population is of wider chronic-pain populations and to explore patterns of engagement among participants during the first 6 months of the study.


    Methods

    From January 20, 2016 to January 20, 2017, Cloudy with a Chance of Pain aimed to recruit over 1000 UK residents aged 17 years or over who owned an Android or iOS (Apple Inc) smartphone and who had experienced pain for at least the preceding 3 months. The study was advertised through national and regional television, radio and newspaper media, social media, and via charity and patient partner organizations (Multimedia Appendix 1). Further information for interested participants was available on the study website [20].

    To enroll in the study, participants downloaded the uMotif app [21] on their smartphone from the Apple App Store or Google Play Store. After completion of digital consent, the app enabled participants to report their symptoms daily for 6 months, or longer if willing. In the background, the smartphone’s Global Positioning System (GPS) reported hourly location, allowing linkage to local weather data from the Met Office (the UK’s national weather service) and investigation of the association between weather and pain. More details on the app and data collection are provided below.

    Participants included in this analysis were those recruited between January 20, 2016 and February 29, 2016, with patterns of engagement examined through to July 20, 2016, 6 months from the study launch date. Participants provided a year of birth through the consent process to confirm that they were 17 years of age or older. Not everyone who downloaded the app used it, so eligibility was further restricted to those who had reported their symptoms at least once between enrollment and July 20, 2016.

    Ethical approval was obtained in December 2015 from the University of Manchester Research Ethics Committee 4 (ref: ethics/15522).

    Data

    Baseline Data

    The baseline questionnaire collected demographic data: sex, year of birth, and the first half of the participant’s postcode. Participants reported the site of pain (eg, head, face, knee) and were able to report pain at multiple sites or having pain all over the body. Participants were asked to record whether they had been diagnosed (by a doctor) with rheumatoid arthritis, ankylosing spondylitis or spondyloarthropathy, gout or other calcium-crystal arthritis (eg, pseudogout), arthritis (type not specified), fibromyalgia or chronic widespread pain, chronic headache, or neuropathic pain. A free-text entry box was provided for any diagnoses not otherwise listed. Due to a coding error, diagnoses of osteoarthritis (OA) were not collected for the first 9 weeks of data collection, after which OA was included within the above list. A push notification was sent out on March 24, 2016 asking existing participants to indicate whether or not they had the condition. Responses were received from 1157 of 8267 (13.99%) participants recruited by March 24, 2016. For this reason, prevalence rates of OA are not provided in this paper.

    Participants reported their use of paracetamol, nonsteroidal anti-inflammatory drugs (NSAIDs), simple analgesics, weak opiates, strong opiates, and drugs for neuropathic pain. Participants reported their use of glucocorticoids (steroids), synthetic disease modifying antirheumatic drugs (DMARDs), and biologic DMARDS. Participants could also report the use of other medications. If “other” was selected, a free-text entry box was provided.

    Participants reported how likely they thought it was that the weather was associated with pain using a 0 to 10 numerical rating scale (NRS), where 0 indicated not at all likely and 10 indicated extremely likely. Participants were also asked which weather conditions they most felt were associated with pain, selecting from damp/rain, cold, heat, change in barometric pressure, change in temperature, and other (free-text box provided to specify belief). Examples of data-entry screens are shown in Figure 1.

    Daily Symptom Domains

    Following completion of the baseline questionnaire, participants were asked to report 10 symptoms every day using the uMotif app (Figure 2), prompted by a daily notification at 6:24 p.m. Each symptom was scored in five ordinal categories (eg, pain was scored as no pain, mild, moderate, severe, or very severe). The symptoms were pain severity, fatigue, morning stiffness, the impact of pain on activities, sleep quality, time spent outside, feeling tired on waking, physical activity, mood, and well-being. A study motif was considered complete when all 10 variables were reported at a single time point. The app was codesigned with a patient and public involvement group and refined after a feasibility study of 20 participants with rheumatoid arthritis [22].
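As a concrete illustration, a daily report can be represented as a mapping from symptom to ordinal category, with a motif counted as complete only when all 10 symptoms are present at a single time point. The symptom and category labels below follow the text; the data representation itself is an assumption for illustration, not the app's actual schema:

```python
# Hypothetical encoding of the app's five ordinal categories (pain example)
PAIN_LEVELS = ["no pain", "mild", "moderate", "severe", "very severe"]

# The 10 daily symptom domains listed in the text
SYMPTOMS = ["pain severity", "fatigue", "morning stiffness",
            "impact of pain on activities", "sleep quality",
            "time spent outside", "feeling tired on waking",
            "physical activity", "mood", "well-being"]

def motif_complete(report):
    """A study motif is complete when all 10 symptoms are reported
    at a single time point."""
    return all(s in report for s in SYMPTOMS)

print(motif_complete({s: "moderate" for s in SYMPTOMS}))  # → True
print(motif_complete({"pain severity": "mild"}))          # → False
```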

    Figure 1. Screenshot of example baseline data collection.

    Analysis

    Representativeness of Participants

    To explore the representativeness of participants recruited to this study, we compared the age and sex distribution of participants with that of a sample of persons with chronic pain (≥3 months) from the Health Survey for England (2011) [23]. The Health Survey for England is a large-scale annual survey that has been conducted since 1994 and recruits a stratified random probability sample of private households within England. A full description of the data collection methods is available elsewhere [24].

    Engagement

    We sought to define common patterns of engagement (ie, data entry) using a three-step process.

    Following recruitment, individuals were labeled as engaged if they reported any of the 10 symptoms on a given day. A first-order hidden Markov model [25,26] was then used to estimate participants’ levels of engagement, using the depmixS4 R package (I Visser, Netherlands) [27] (Multimedia Appendix 2). The model assumed three latent engagement states: high, low, and disengaged. The model was initialized assuming every participant started highly engaged. Furthermore, the model assumed that disengagement was an “absorbing state,” so that participants entering this state could not reengage with the study. Finally, clusters were defined according to different probabilities of transitioning between high engagement, low engagement, and disengagement during the study. The optimal number of clusters between 2 and 10 was identified visually using the “elbow method” [28]: the log-likelihood is plotted against the number of clusters, and the location of a bend (“elbow”) in the plot is taken to identify the best number of clusters. The clusters were generated by a “blind” algorithmic process; therefore, to assign names to the clusters, the engagement patterns of a random selection of users within each cluster were inspected.
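The elbow heuristic can also be applied numerically: with a log-likelihood per candidate cluster count in hand, the elbow is the point where the gain from adding one more cluster drops most sharply. A minimal sketch follows; the function and the log-likelihood values are illustrative only, not the study's fitted values:

```python
def pick_elbow(ks, loglik):
    """Pick the k at the 'elbow': the interior point where the gain in
    log-likelihood from adding one more cluster drops the most
    (ie, the largest second difference)."""
    # first differences: improvement gained by moving to ks[i]
    gains = [loglik[i] - loglik[i - 1] for i in range(1, len(loglik))]
    # second differences: how sharply the improvement falls off after ks[i]
    drops = [gains[i - 1] - gains[i] for i in range(1, len(gains))]
    return ks[1 + drops.index(max(drops))]

# Illustrative values only: the curve bends (flattens) after k=4
ks = [2, 3, 4, 5, 6, 7]
ll = [-9000, -8100, -7300, -7200, -7150, -7120]
print(pick_elbow(ks, ll))  # → 4
```

In practice the study's choice was made visually from the plotted curve (Figure 3); an automated rule like the above is one possible formalization of the same idea.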

    Comparisons were then made between the clusters regarding duration of study engagement, defined as (1) the median number of days “in study” (defined as the number of days from first to last symptom report) and (2) the median number of days of data entry (defined as days on which any symptoms were reported). Data completion was compared between the clusters, defined as (1) the total number of segments reported, (2) the total number of complete motifs, (3) the proportion of days in the study (days between enrollment and July 20, 2016) on which complete motifs were reported, and (4) the proportion of days of data entry on which complete motifs were reported. Baseline data were then compared between the clusters, with data presented as median and interquartile range (IQR), or proportion and 95% CI, as appropriate. Due to the initial configuration of the app, data regarding the mobile-phone platform used by participants are not available for all participants, and we are unable to compare or draw conclusions about how app use differs between Apple and Android platforms.
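The per-participant engagement metrics described above can be illustrated with a small sketch. The data layout (a list of day-index/symptom-count pairs per participant, with day 0 as enrollment) is an assumption for illustration, not the study's actual schema:

```python
def engagement_summary(entries):
    """Summarize one participant's engagement from a list of
    (day_index, n_symptoms_reported) tuples, where day 0 is enrollment.
    A motif is 'complete' when all 10 symptoms are reported that day."""
    days_reported = sorted(d for d, n in entries if n > 0)
    if not days_reported:
        return {"days_in_study": 0, "data_entry_days": 0, "complete_motifs": 0}
    return {
        # days from first to last symptom report, inclusive
        "days_in_study": days_reported[-1] - days_reported[0] + 1,
        # days on which any symptoms were reported
        "data_entry_days": len(days_reported),
        # days on which all 10 symptoms were reported
        "complete_motifs": sum(1 for d, n in entries if n == 10),
    }

# Hypothetical participant: reports on days 0, 1, and 4; full motif twice
print(engagement_summary([(0, 10), (1, 10), (4, 6)]))
# → {'days_in_study': 5, 'data_entry_days': 3, 'complete_motifs': 2}
```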

    Figure 2. Screenshot of motif for daily symptom collection.

    Results

    Of 7972 participants enrolled in the study between January 20, 2016 and February 29, 2016, 6370 (79.90%) were eligible for the analysis in this paper (Table 1). Reasons for failing eligibility included no baseline data (n=802), indeterminate age (n=308), and age <17 years (n=3). A further 489 participants had downloaded the app but never reported symptoms. Those who installed the app but did not prospectively record symptoms did not differ from those who recorded symptoms in terms of age (median 51, IQR 41-61 vs 49, IQR 41-59) or strength of belief in the association between the weather and pain (median 7, IQR 5-9 vs 7, IQR 6-9). However, a larger proportion were male (30.3%, 95% CI 26.2-34.4 vs 19.5%, 95% CI 18.5-20.5).

    Eligible participants were 80.52% (5129/6370) female, with a mean age of 49 years. The majority of those included in the analysis reported pain at more than one site (73.39%, 4675/6370). A further 16.62% (1059/6370) reported pain “all over” and 9.49% (605/6370) reported pain at a single site. The most common diagnosis was arthritis (40.29% type unspecified [2567/6370]; 19.12% rheumatoid arthritis [1218/6370]), followed by fibromyalgia/chronic widespread pain (23.75%, 1513/6370) and “other pain diagnosis” (22.64%, 1442/6370). Beliefs about the existence of a relationship between the weather and pain were strong, with a median belief score of 7 (IQR: 6-9). Participants most commonly believed that pain was affected by damp/rain (74.43%, 4741/6370) and cold (68.67%, 4374/6370) but least commonly believed that hot weather affected pain (14.76%, 940/6370).

    Table 1. Baseline characteristics of participants eligible for analysis.

    Comparison With Other Chronic Pain Populations

    Compared with data from the Health Survey for England (2011) [23], a greater proportion of participants in this study were women (80.52%, 5129/6370 compared with 55.61%, 4782/8599 expected). The age bands 35 to 64 years were over-represented in this study (73.11%, 4657/6370 compared with 51.18%, 4401/8599 expected; Table 2), with fewer participants at the extremes of age (<35 years: 14.65%, 933/6370 compared with 24.72%, 2126/8599 expected; ≥75 years: 1.19%, 76/6370 compared with 11.13%, 957/8599 expected).

    Identifying Clusters of Engagement

    Following inspection of the log-likelihood plot (Figure 3), a four-cluster solution was retained. The clusters (Figure 4) were allocated names based on the best description of their engagement patterns: high engagement (14%, 865/6370; red), moderate engagement (22%, 1384/6370; purple), low engagement (39%, 2503/6370; green), and tourists (25%, 1618/6370; teal).

    The proportion of days on which data were entered and rates of data completion varied substantially between clusters (Table 3). The median days “in study” ranged from 175 days (IQR: 152-177) in the high-engagement cluster to 1 day (IQR: 1-1) in the tourist cluster. Participants in the moderate-engagement cluster stayed in the study 10 times longer than those in the low-engagement cluster (88 days, IQR: 42-163 vs 8 days, IQR: 4-16).

    Those in the high-engagement cluster provided data on most days throughout follow-up (Figure 4). The high-engagement cluster reported complete motifs on 89.13% (106,360/119,332) of the days on which they provided data, and the moderate-engagement cluster provided complete motifs on 87.51% (67,704/77,368) of all data-entry days. Rates of completion were slightly lower in the other clusters, with the low-engagement cluster and tourists recording complete motifs on 82.88% (13,415/16,186) and 64.85% (1947/2848) of the days on which any data were reported, respectively (Table 3).

    Table 2. Comparison of the sex and age distribution of persons with chronic pain from the Health Survey for England (2011) and participants recruited to Cloudy with a Chance of Pain.
    Figure 3. Plot of the log-likelihood of different numbers of clusters in hidden Markov sequences; the elbow indicates the optimal number of clusters which should be accepted.
    Table 3. Data provided by 6370 Cloudy with a Chance of Pain participants clustered by levels of engagement.
    Figure 4. Examples of participants from clusters: high engagement (red), moderate engagement (purple), low engagement (green), tourists (teal).

    Between-Cluster Differences

    Higher engagement was associated with increased age, with a difference of more than 5 years between the median age of those in the low-engagement (47 years, IQR: 39-57) or tourist clusters (49 years, IQR: 40-58) and those in the high-engagement cluster (56 years, IQR: 47-63). A substantially lower proportion of those in the tourist cluster were women (76.27%, 1234/1618; 95% CI 74.2-78.3) than in any other cluster (high engagement: 82.31%, 712/865; 95% CI 79.6-84.7; moderate engagement: 84.10%, 1164/1384; 95% CI 82.1-85.9; low engagement: 80.66%, 2019/2503; 95% CI 79.1-82.2).

    There were no differences between clusters with respect to the site of pain or in the prevalence of rheumatic disease diagnoses (eg, rheumatoid arthritis, fibromyalgia). The proportion of people in the tourist cluster (17.74%, 287/1618; 95% CI 15.88-19.60) who reported “other” pain conditions was also lower than in the high-engagement (23.70%, 205/865; 95% CI 20.87-26.55), moderate-engagement (24.49%, 339/1384; 95% CI 22.22-26.76), and low-engagement (24.41%, 611/2503; 95% CI 22.73-26.09) groups.

    Table 4. Characteristics of the 6370 Cloudy with a Chance of Pain participants clustered by levels of engagement.

    No differences were observed between the clusters regarding the use of analgesics and steroids. Only the use of synthetic DMARDs differed substantially between the clusters, with fewer of those in the tourist cluster (16.63%, 269/1618; 95% CI 14.82-18.44) reporting taking the medication than those in the other engagement clusters (high engagement: 22.20%, 192/865; 95% CI 19.43-24.97; moderate engagement: 21.10%, 292/1384; 95% CI 18.95-23.25; low engagement: 21.13%, 529/2503; 95% CI 19.53-22.73). Comparable proportions were using biologic or other DMARDs.

    There were no differences in the strength of belief that the weather affected pain, but fewer of those in the high-engagement cluster believed the cold affected their pain (62.31%, 539/865; 95% CI 59.08-65.54) when compared with those in the low-engagement and tourist clusters (71.87%, 1799/2503; 95% CI 70.11-73.63 and 68.29%, 1105/1618; 95% CI 66.02-70.56, respectively). Conversely, more of those who were highly engaged (35.49%, 307/865; 95% CI 32.30-38.68) believed that changes in barometric pressure were associated with pain than those in the low-engagement and tourist clusters (28.53%, 714/2503; 95% CI 26.76-30.30 and 28.99%, 469/1618; 95% CI 26.78-31.20, respectively). There were no observed differences in the proportions of participants who believed their pain was associated with damp or rain, heat, or changes in temperature (Table 4).


    Discussion

    Principal Findings

    Cloudy with a Chance of Pain is the first mHealth study to demonstrate successful and rapid mass recruitment of a largely representative sample of highly engaged participants. Among our sample, patterns of ongoing engagement showed that around 1 in 7 participants provided data on most days in the first 6 months, completing full data entry on 89% of those days.

    A major strength of Cloudy with a Chance of Pain is the rapid mass recruitment of eligible participants. Our study benefitted from wide promotion by the UK national media at the time of the study launch, which emphasizes the power of national media to drive recruitment. Indeed, as a result of coverage including, among others, the BBC2 television show Trust Me, I’m a Doctor on January 20, 2016 and BBC Breakfast on January 26, 2016, 90% of participants enrolled in the study by July 20 were recruited within 1 month of the study launch.

    Furthermore, ongoing engagement within Cloudy with a Chance of Pain was high. More than 30% of participants were in the high-engagement or moderate-engagement cluster, entering data on at least half of days throughout the 6 months. In comparison, fewer than 25% of participants in Apple’s ResearchKit studies were active by 10 weeks [29], with similar proportions active in a physical-activity study by 42 days [14]. In one of the largest mHealth studies reported to date (mPower study of people with Parkinson disease and healthy controls), less than 10% of enrolled participants completed 5 or more days within the first 6 months of the study [5]. One in 7 participants were in the high-engagement cluster and provided data on most days throughout the 6 months; we are not aware of other mHealth studies that have reported such high levels of ongoing engagement to date.

    Previous analyses have used arbitrary definitions that fail to capture the patterns of use through time and may ignore the importance of continuity of data entry [5,14-16]. In contrast, this analysis attempted to account fully for data complexity and made no a priori decisions to define engagement. Thus, this study has improved understanding of the extent to which participants remain engaged over time and provides a promising method for future engagement studies.

    Our recruitment strategy enrolled a sample with an under-representation of males and of persons at the extremes of age (<35 years and ≥75 years) relative to what would have been expected from the general population data of the Health Survey for England (2011) [23]. Although women are more likely to respond to more traditional population surveys [30-33], we recruited a much higher proportion of women than would have been expected using traditional recruitment methods. A possible explanation is that women recruited to this study more commonly viewed the television programs or may have perceived the potential for additional benefits from participation. However, we also note that self-selection likely accounts for the observed differences in our population. For example, not only are women known to use social media [34] and health apps [35] more than men, but they also use digital content differently [36,37].

    Therefore, future mHealth studies may benefit from the use of supplementary and targeted recruitment strategies used by other digital health interventions [38] in which it would be possible to oversample men, such as the use of health professionals, friends, and families, or work-based campaigns, as well as outreach programs designed to access hard-to-reach groups. Although similar methods could also promote the recruitment of younger adults, other opportunities to promote participation among this group include the use of social networks, community components, and app gamification [39-41].

    Nevertheless, the internal validity of the study results (ie, the relationship between the weather and pain within our sample) is unlikely to be influenced by the excess of women entering the study, as there is no reason to suspect the relationship differs by sex. Analysis of the relationship between the weather and pain, and whether the relationship differs between the sexes, is underway and will be reported separately.

    The impact on external validity (ie, the generalizability) is unclear, as people with a particular belief may have been more inclined to participate (which may, in turn, differ by sex). That said, our findings about beliefs align with prior research that suggests that as many as 92% of patients with arthritis believe in an association between weather and pain [42].

    The reasons for the unprecedented rates of engagement observed in this study are worth exploring, particularly as we sought to collect a large amount of daily data over a long period, a burden that might well have been expected to increase loss to follow-up through time. This study found that older participants were more likely to remain engaged in the study. One possible explanation is that older persons are less likely to use smartphone apps [1] and therefore may be less likely to experience “app fatigue” than younger participants. They may also feel a greater responsibility to complete the ongoing data entry once registered or have more time to give to the study. Furthermore, functionalities such as geolocation consume battery power, which may have a greater impact on younger persons, who use their smartphones for a greater number of varied tasks, than on older persons [1]. We note, however, that reasons for declining engagement are likely numerous.

    Earlier studies have sought to examine possible mechanisms of engagement, including the complexity of tasks [3], the time of day data are entered [43], and various functionality features such as reminders, interactivity, tailored content, and delivery of feedback [14,15]. In a feasibility study [22], we reported that key motivators for ongoing engagement were the simple graphical user interface, automated reminders for data entry, a desire to contribute to answering an understandable and engaging research question, and visualization of data. However, limited information was available in this larger study to delineate the motivators of engagement in this population. We also acknowledge that the study did not capture education and income, which would have enabled this study to investigate the potential impact of the digital divide on recruitment and engagement.

    Conclusions

    In summary, Cloudy with a Chance of Pain demonstrates a rapid and successful recruitment of a large and engaged sample of people with chronic pain. Although there may be selection bias toward older females in our study, younger men are also less likely to participate in studies using traditional data-collection methods. Thus, our study provides strong evidence to suggest that smartphones could provide a viable alternative to traditional data collection methods, particularly for collecting daily data over long periods.

    Acknowledgments

    This work was supported by Arthritis Research UK (grant number 21225); the Medical Research Council (MRC)’s Confidence in Concept Scheme (Grant number MC_PC_13070); the Arthritis Research UK Centre for Epidemiology and the Farr Institute @Health eResearch Centre (HeRC) (Grant number MR/K006665/1); MRC Clinician Scientist Award (Grant number G0902272 to W.G.D); and the National Institute for Health Research (NIHR) Manchester Musculoskeletal Biomedical Research Unit (for support of JS). This report includes independent research supported by the National Institute for Health Research Biomedical Research Unit Funding Scheme. The views expressed in this publication are those of the authors and not necessarily those of the National Health Service (NHS), the National Institute for Health Research, or the Department of Health.

    Conflicts of Interest

    None declared.

    Multimedia Appendix 1

    Charity and patient partner organisations who facilitated participant recruitment.

    PDF File (Adobe PDF File), 12KB

    Multimedia Appendix 2

    Hidden Markov model.

    PDF File (Adobe PDF File), 23KB

    References

    1. Ofcom. Ofcom: adults media use and attitudes report 2016   URL: https://www.ofcom.org.uk/__data/assets/pdf_file/0026/80828/2016-adults-media-use-and-attitudes.pdf [WebCite Cache]
    2. Salathé M, Bengtsson L, Bodnar TJ, Brewer DD, Brownstein JS, Buckee C, et al. Digital epidemiology. PLoS Comput Biol 2012;8(7):e1002616 [FREE Full text] [CrossRef] [Medline]
    3. Check Hayden E. Mobile-phone health apps deliver data bounty. Nature 2016 Mar 24;531(7595):422-423. [CrossRef] [Medline]
    4. Who.int. 2011. mHealth: new horizons for health through mobile technologies   URL: http://www.who.int/goe/publications/goe_mhealth_web.pdf [WebCite Cache]
    5. Bot BB, Suver C, Neto EC, Kellen M, Klein A, Bare C, et al. The mPower study, Parkinson disease mobile data collected using ResearchKit. Sci Data 2016;3:160011. [Medline]
    6. Eysenbach G. The law of attrition. J Med Internet Res 2005 Mar 31;7(1):e11 [FREE Full text] [CrossRef] [Medline]
    7. Sanders C, Rogers A, Bowen R, Bower P, Hirani S, Cartwright M, et al. Exploring barriers to participation and adoption of telehealth and telecare within the Whole System Demonstrator trial: a qualitative study. BMC Health Serv Res 2012;12:220 [FREE Full text] [CrossRef] [Medline]
    8. Rixon L, Hirani S, Cartwright M, Beynon M, Selva A, Sanders C, et al. What influences withdrawal because of rejection of telehealth - the whole system demonstrator evaluation. J Assist Technol 2013;7(4):219-227 [FREE Full text] [CrossRef]
    9. Latulippe K, Hamel C, Giroux D. Social health inequalities and eHealth: a literature review with qualitative synthesis of theoretical and empirical studies. J Med Internet Res 2017 Apr 27;19(4):e136 [FREE Full text] [CrossRef] [Medline]
    10. Buskirk TD, Andrus CH. Making mobile browser surveys smarter: results from a randomized experiment comparing online surveys completed via computer or smartphone. Field Methods 2014 Apr 14;26(4):322-342. [CrossRef]
    11. Dale O, Hagen KB. Despite technical problems personal digital assistants outperform pen and paper when collecting patient diary data. J Clin Epidemiol 2007 Jan;60(1):8-17. [CrossRef] [Medline]
    12. Marcano Belisario JS, Jamsek J, Huckvale K, O'Donoghue J, Morrison CP, Car J. Comparison of self-administered survey questionnaire responses collected using mobile apps versus other methods. Cochrane Database Syst Rev 2015 Jul 27(7):MR000042. [CrossRef] [Medline]
    13. Lane TS, Armin J, Gordon JS. Online recruitment methods for web-based and mobile health studies: a review of the literature. J Med Internet Res 2015;17(7):e183 [FREE Full text] [CrossRef] [Medline]
    14. Guertler D, Vandelanotte C, Kirwan M, Duncan MJ. Engagement and nonusage attrition with a free physical activity promotion program: the case of 10,000 steps Australia. J Med Internet Res 2015;17(7):e176 [FREE Full text] [CrossRef] [Medline]
    15. Bort-Roig J, Gilson ND, Puig-Ribera A, Contreras RS, Trost SG. Measuring and influencing physical activity with smartphone technology: a systematic review. Sports Med 2014 May;44(5):671-686. [CrossRef] [Medline]
    16. Zan S, Agboola S, Moore SA, Parks KA, Kvedar JC, Jethwani K. Patient engagement with a mobile web-based telemonitoring system for heart failure self-management: a pilot study. JMIR Mhealth Uhealth 2015;3(2):e33 [FREE Full text] [CrossRef] [Medline]
    17. Smedslund G, Hagen KB. Does rain really cause pain? A systematic review of the associations between weather factors and severity of pain in people with rheumatoid arthritis. Eur J Pain 2011 Jan;15(1):5-10. [CrossRef] [Medline]
    18. Jamison RN, Anderson KO, Slater MA. Weather changes and pain: perceived influence of local climate on pain complaint in chronic pain patients. Pain 1995 May;61(2):309-315. [Medline]
    19. Ng J, Scott D, Taneja A, Gow P, Gosai A. Weather changes and pain in rheumatology patients. APLAR J Rheumatol 2004;7(3):204-206. [CrossRef]
    20. Cloudy with a Chance of Pain.   URL: https://www.cloudywithachanceofpain.com/ [WebCite Cache]
    21. uMotif. Next-generation patient data capture platform   URL: http://umotif.com/ [accessed 2017-06-07] [WebCite Cache]
    22. Reade S, Spencer K, Sergeant JC, Sperrin M, Schultz DM, Ainsworth J, et al. Cloudy with a Chance of Pain: engagement and subsequent attrition of daily data entry in a smartphone pilot study tracking weather, disease severity, and physical activity in patients with rheumatoid arthritis. JMIR Mhealth Uhealth 2017 Mar 24;5(3):e37 [FREE Full text] [CrossRef] [Medline]
    23. Bridges S. NHS Digital. Health Survey for England 2011. Chronic pain   URL: http://digital.nhs.uk/media/22200/Health-Survey-for-England-2011-Chapter-9-Chronic-pain/Any/HSE2011-Ch9-Chronic-Pain [accessed 2017-06-07] [WebCite Cache]
    24. Boniface S, Bridges S, Craig R, Fuller E, Hancock R, Henderson C, et al. NHS Digital. 2011. Health Survey for England 2011. Methods and documentation   URL: http://digital.nhs.uk/media/22203/Health-Survey-for-England-2011-Methods-and-documentation/Any/HSE2011-Methods-and-docs [accessed 2017-06-07] [WebCite Cache]
    25. Rabiner L. 1989. A tutorial on hidden Markov models and selected applications in speech recognition   URL: http://www.ece.ucsb.edu/Faculty/Rabiner/ece259/Reprints/tutorial%20on%20hmm%20and%20applications.pdf [accessed 2017-10-19] [WebCite Cache]
    26. Barber D. Discrete-state Markov models. In: Bayesian Reasoning and Machine Learning. Cambridge, UK: Cambridge University Press; 2012:492-508.
    27. Visser I, Speekenbrink M. 2010. depmixS4: an R package for hidden Markov models   URL: https://r-forge.r-project.org/scm/viewvc.php/*checkout*/papers/jss/V1/depmixS4.pdf?root=depmix [accessed 2017-06-07] [WebCite Cache]
    28. Alpaydin E. Clustering. In: Introduction to Machine Learning. Cambridge, Massachusetts: MIT Press; 2004.
    29. Dorsey ER, Yvonne CY, McConnell MV, Shaw SY, Trister AD, Friend SH. The use of smartphones for health research. Acad Med 2017 Feb;92(2):157-160. [CrossRef] [Medline]
    30. Flüß E, Bond CM, Jones GT, Macfarlane GJ. The effect of an internet option and single-sided printing format to increase the response rate to a population-based study: a randomized controlled trial. BMC Med Res Methodol 2014 Sep 09;14:104 [FREE Full text] [CrossRef] [Medline]
    31. Ayorinde AA, Bhattacharya S, Druce KL, Jones GT, Macfarlane GJ. Chronic pelvic pain in women of reproductive and post-reproductive age: a population-based study. Eur J Pain 2017 Mar;21(3):445-455. [CrossRef] [Medline]
    32. Korkeila K, Suominen S, Ahvenainen J, Ojanlatva A, Rautava P, Helenius H, et al. Non-response and related factors in a nation-wide health survey. Eur J Epidemiol 2001;17(11):991-999. [Medline]
    33. Torrance N, Smith BH, Bennett MI, Lee AJ. The epidemiology of chronic pain of predominantly neuropathic origin. Results from a general population survey. J Pain 2006 Apr;7(4):281-289. [CrossRef] [Medline]
    34. Vermeren I. Brandwatch. Men vs. women: who is more active on social media?   URL: https://www.brandwatch.com/blog/men-vs-women-active-social-media/ [accessed 2017-06-07] [WebCite Cache]
    35. Statista. 2013. Usage penetration of medical, health, and fitness-related apps in Great Britain in 2013, by demographic group   URL: https://www.statista.com/statistics/286875/fitness-health-and-medical-app-usage-in-great-britain-by-demographic-group/ [accessed 2017-06-07] [WebCite Cache]
    36. Seale C, Charteris-Black J, MacFarlane A, McPherson A. Interviews and internet forums: a comparison of two sources of qualitative data. Qual Health Res 2010 May;20(5):595-606. [CrossRef] [Medline]
    37. Seale C, Ziebland S, Charteris-Black J. Gender, cancer experience and internet use: a comparative keyword analysis of interviews and online cancer support groups. Soc Sci Med 2006 May;62(10):2577-2590. [CrossRef] [Medline]
    38. O'Connor S, Hanlon P, O'Donnell C, Garcia S, Glanville J, Mair F. Understanding factors affecting patient and public engagement and recruitment to digital health interventions: a systematic review of qualitative studies. BMC Med Inform Decis Mak 2016;16(1):120. [Medline]
    39. Thornton L, Batterham PJ, Fassnacht DB, Kay-Lambkin F, Calear AL, Hunt S. Recruiting for health, medical or psychosocial research using Facebook: systematic review. Internet Interv 2016 May;4:72-81. [CrossRef]
    40. Cafazzo JA, Casselman M, Hamming N, Katzman DK, Palmert MR. Design of an mHealth app for the self-management of adolescent type 1 diabetes: a pilot study. J Med Internet Res 2012;14(3):e70 [FREE Full text] [CrossRef] [Medline]
    41. Herschman J, Kasenberg T, Levy D, Ruth N, Taberner C, Kaufman M, et al. Development of a smartphone app for adolescents with lupus: a collaborative meeting-based methodology inclusive of a wide range of stakeholders. Rev Panam Salud Publica 2014 Jun;35(5-6):471-476 [FREE Full text] [Medline]
    42. Aikman H. The association between arthritis and the weather. Int J Biometeorol 1997 Jun;40(4):192-199. [Medline]
    43. Whitehead L, Seaton P. The effectiveness of self-management mobile phone and tablet apps in long-term condition management: a systematic review. J Med Internet Res 2016;18(5):e97. [Medline]


    Abbreviations

    DMARDs: disease-modifying antirheumatic drugs
    GPS: global positioning system
    HSE: Health Survey for England
    iOS: iPhone operating system
    IQR: interquartile range
    mHealth: mobile health
    NRS: numerical rating scale
    NSAIDs: nonsteroidal anti-inflammatory drugs
    OA: osteoarthritis


    Edited by C Dias; submitted 07.06.17; peer-reviewed by A Elmessiry, M Numans, M Keller; comments to author 13.07.17; revised version received 18.08.17; accepted 27.08.17; published 01.11.17

    ©Katie L Druce, John McBeth, Sabine N van der Veer, David A Selby, Bertie Vidgen, Konstantinos Georgatzis, Bruce Hellman, Rashmi Lakshminarayana, Afiqul Chowdhury, David M Schultz, Caroline Sanders, Jamie C Sergeant, William G Dixon. Originally published in JMIR mHealth and uHealth (http://mhealth.jmir.org), 01.11.2017.

    This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR mHealth and uHealth, is properly cited. The complete bibliographic information, a link to the original publication on http://mhealth.jmir.org/, as well as this copyright and license information, must be included.