
Published on 19.07.16 in Vol 4, No 3 (2016): Jul-Sept

This paper is in the following e-collection/theme issue:

    Original Paper

    Variations in the Use of mHealth Tools: The VA Mobile Health Study

    Connected Health, Office of Informatics and Analytics, Veterans Health Administration, Department of Veterans Affairs, Washington, DC, United States

    Corresponding Author:

    Kathleen L Frisbee, MPH, PhD

    Connected Health

    Office of Informatics and Analytics

    Veterans Health Administration, Department of Veterans Affairs

    810 Vermont Ave (102PD)

    Washington, DC, 20420

    United States

    Phone: 1 202 461 5840

    Fax: 1 202 461 5840

    Email:


    ABSTRACT

    Background: Mobile health (mHealth) technologies exhibit promise for offering patients and their caregivers point-of-need tools for health self-management. This research study involved the dissemination of iPads containing a suite of mHealth apps to family caregivers of veterans who receive care from the Veterans Affairs (VA) Health Administration and have serious physical or mental injuries.

    Objective: The goal of the study was to identify factors and characteristics of veterans and their family caregivers that predict the use of mHealth apps.

    Methods: Veteran/family caregiver dyads (N=882) enrolled in VA’s Comprehensive Assistance for Family Caregivers program were recruited to participate in an mHealth pilot program. Veterans and caregivers who participated and received an iPad agreed to have their use of the apps monitored and were asked to complete a survey assessing Caregiver Preparedness, Caregiver Traits, and Caregiver Zarit Burden Inventory baseline surveys.

    Results: Of the 882 dyads, 94.9% (837/882) of caregivers were women and 95.7% (844/882) of veteran recipients were men. Mean caregiver age was 40 (SD 10.2) years and mean veteran age was 39 (SD 9.15) years, and 39.8% (351/882) lived in rural locations. Most (89%, 788/882) of the caregivers were spouses. Overall, the most frequently used app was Summary of Care, followed by RX Refill, then Journal, Care4Caregivers, VA Pain Coach, and last, VA PTSD Coach. App use was significantly predicted by the caregiver being a spouse, increased caregiver computer skills, a rural living location, lower levels of caregiver preparedness, veteran mental health diagnosis (other than posttraumatic stress disorder), and veteran age.

    Conclusions: This mHealth Family Caregiver pilot project effectively establishes the VA’s first patient-facing mHealth apps that are integrated within the VA data system. Use varied considerably, and the most used apps were those that assisted caregivers in their caregiving responsibilities.

    JMIR Mhealth Uhealth 2016;4(3):e89

    doi:10.2196/mhealth.3726

    KEYWORDS



    Introduction

    The US health care system is under tremendous pressure to find ways to reduce costs and improve the quality of care. The responsibility for managing health is shifting from health care providers to patients and their families. This shift reflects an overall trend in health care, moving from a provider-centered delivery system to a patient- and family-centered participatory model of care [1]. This places greater emphasis on patients and family members to assist in the provision of health care. A variety of technologies are being developed in the commercial health market to support self-management, but these technologies need to be available at the point of need to be most useful. One specific group of technologies, the mobile health (mHealth) technologies, shows promise for offering patients and their caregivers point-of-need tools for the self-management of health. These mHealth technologies are defined as apps that run on mobile devices for the purpose of assisting consumers or health care providers in monitoring health status or improving health outcomes [2]. mHealth also encompasses sensors, phones, or other devices worn on the body or carried that transmit and receive data wirelessly. mHealth is a subset of the larger field of electronic health (eHealth), which involves the information technologies used in health care delivery [3].

    mHealth technologies that run on accessible mobile platforms may be able to accelerate the transformation of health care by empowering patients and their families with the tools and information that have historically resided with health care professionals. Studies have been published that involve the use of mHealth technologies to improve access to care, improve communication between patients and providers, assist patients in their disease management, and support disease monitoring [4-7]. However, research into the factors that influence use and acceptance of mHealth technology has not kept pace with the rapid proliferation of mHealth tools [2,7]. The factors influencing mHealth use and acceptance may be similar to the factors driving other consumer-based eHealth technologies, but evaluations of mHealth tools have been limited to small studies where key variations in use have not been assessed [6].

    Technology-based interventions designed to support caregivers and their care recipients have been used with mostly positive results. mCARE, a mobile phone‒based secure messaging system designed for veterans, encompasses several assistive components for patient and caregiver self-management [8]. These components include appointment reminders, self-report assessments, health tips, and secure messaging with the provider. More than 90% of users found the mCARE system somewhat easy or easy to use [9], demonstrating that this mHealth app was feasible and effective for this population. A randomized trial assessed the impact of the Comprehensive Health Enhancement Support System (CHESS), a Web-based lung cancer information, communication, and coaching system for caregivers, on caregiver burden, disruptiveness, and mood [10]. Caregivers randomized to CHESS reported lower burden and negative mood than those in the Internet group, suggesting that eHealth and mHealth interventions similar to CHESS may improve caregivers’ coping skills and, in turn, decrease their perceived burden levels. Tele-Savvy, an Internet-based version of the in-person, evidence-based psychoeducational Savvy Caregiver Program for caregivers of veterans with dementia, used synchronous (teleconferences) and asynchronous (video modules) components to provide program access to caregivers in their homes [11]. In an effectiveness trial, caregivers demonstrated moderately high initial levels of burden, anxiety, and depressive symptoms, all of which decreased significantly at follow-up, and there were marginally significant increases in caregiver competence. While there is notable literature on the positive outcomes associated with already developed eHealth interventions [12], it is critical to continue to understand the needs of the caregiver users.

    Numerous studies have shown that in order for technology to be accepted by consumers it must be perceived as beneficial, be easy to use, fit into the workflow of the end user, and be help desk‒supported [13-15]. Understanding what caregivers want from technology-based interventions is important for designing mHealth interventions, as well as for understanding the factors that will likely drive adoption. Focus groups conducted with community-dwelling patients with complex chronic disease and disability and their caregivers revealed that open two-way communication and dialogue between patients and their providers, and better information sharing between providers to support continuity and coordination of care, were the issues that eHealth interventions could most beneficially address [16]. Additionally, privacy and data security, accessibility, the loss of necessary visits, increased social isolation, provider burden, shifting responsibility onto patients for care management, entry errors, training requirements, and potentially confusing interfaces were all identified as patient concerns [16] and therefore need to be taken into consideration when developing eHealth/mHealth technologies. Despite these concerns, upwards of 95% of caregivers who use mobile systems find that interactive features of communication technologies assist in their caregiving [13].

    The National Alliance for Caregiving reports that caregivers consistently convey a need for more information, including information on keeping the care recipient safe at home (37%), managing their own stress (34%), identifying easy activities to do for their care recipient (34%), and finding time for themselves (32%). Only 24% of caregivers of veterans reported receiving the formal training they need to perform their caregiver responsibilities, and a majority felt ill-equipped to deal with the veteran’s condition, both in terms of confidence in their own skills and knowing how to seek out additional sources of information or support [17]. In a recent survey of 1000 technology-using family caregivers by the National Alliance for Caregiving [18], caregivers were asked to rate 12 technologies on their potential helpfulness to the caregiver. The technologies ranked highest were Personal Health Record Tracking, Medications Support System, and Symptom Monitoring and Transmission. The technologies rated lowest were Caregiving Coaching Software, Transportation Display, and Caregiver Mentor Matching Service. The top benefits expected from the technology included saving time, easing the organizational logistics of caregiving, making the care recipient feel safer, increasing the feeling of being effective, and reducing stress. The overriding barrier expected was the expense of the technology, which is echoed in other studies [13].

    Using the organizing framework for caregiver interventions devised by Van Houtven et al [19] as a guide, the purpose of this study was to generate new knowledge on the relative rates of use of different mHealth tools and the characteristics of veterans and their family caregivers that would predict their use of mHealth tools. The Caregiver Intervention Organizing Framework has three main directives: (1) interventions should assess the quantity and/or quality of care provided, (2) consider a broader range of caregiver and care recipient outcomes, and (3) consider a common set of caregiver and care recipient outcomes to facilitate comparison across studies and over time [19]. As suggested by the aforementioned framework, the quality of the intervention was assessed by using validated caregiving quality measures, as well as the quantity of care (usage rates). In considering a broader range of caregiver and care recipient outcomes, we assessed several different veteran and caregiver factors that we believed may contribute to use of the intervention. Our caregiver outcomes were measured at several points in time to allow for a longitudinal assessment. The results of this study advance our understanding of the potential for adoption of mHealth tools within the context of caregiving.


    Methods

    Summary

    This research study involved the dissemination of iPads (N=881) containing a specific suite of mHealth apps to family caregivers of veterans who receive care in the Veterans Affairs (VA) Health Administration and have serious physical or mental injuries resulting from the post-9/11 wars. Veterans in the study had a combination of physical injuries, mental health diagnoses, and chronic medical conditions, and all were supported by a family caregiver. Thus, these patients exhibit complexities along several axes of the Vector Model of Complexity, a conceptual model that defines patient complexity along axes representing major determinants of health [20]. The suite of mHealth tools was designed by the VA to assist the caregiver in managing veteran posttraumatic stress disorder (PTSD) and pain, as well as provide support with health care-related tasks and help caregivers manage their own stress.

    Study Design and Setting

    This study was designed as a prospective cohort study with the objective of better understanding the factors that influence the use of a suite of mHealth tools (apps). The study participants were enrollees in the VA Comprehensive Assistance for Family Caregivers program as of May 2013, who agreed to participate in the VA Family Caregiver Mobile Health Pilot program. The VA Comprehensive Assistance for Family Caregivers program supports the care of post-9/11 veterans and service members who have sustained serious physical or mental injuries because of their service in the military. As part of this program, family caregivers provide personal care services to the eligible veteran in the veteran’s home. The caregivers are eligible to receive a stipend and health insurance if they do not already qualify for it. In addition, the program provides training, counseling, and respite care to support the caregivers in their caregiving role. The Family Caregiver program is staffed by VA Caregiver Support Coordinators who are located at each VA facility and are responsible for making quarterly home visits to families enrolled in the program and providing ongoing support and assistance to these families.

    The VA Family Caregiver Mobile Health Pilot is a program that distributed government-furnished iPads loaded with VA mHealth tools to VA family caregivers and the veterans they care for. A 1-year data and service plan was provided with the iPads. The mHealth apps were developed by the VA for this mHealth pilot and were available only to pilot participants. This mHealth Family Caregiver Pilot project established the VA’s first patient-facing mHealth apps that are integrated with the VA data system and allowed for the exchange of health-related data between the VA and veterans and their family caregivers.

    Study Population and Recruitment

    The study population comprised a cohort of 882 caregiver/veteran dyads that received the iPads, which were loaded with a suite of mHealth apps. A dyad is defined as each caregiver and the unique veteran they provide care for. There were two layers of participation within this study group. The first were caregivers who agreed to participate in the VA mHealth pilot program (N=882). VA administrative data were available for this dyad group, and consent was waived based on its use for secondary data analysis. The second was a subset of caregivers from the study group that completed three baseline surveys (n=577) and consented to participate in this research study. This group will be referred to as the survey group. The Institutional Review Boards of both George Washington University and the Veterans Administration approved the study.

    The study group participants were recruited by a letter sent in August 2012 to all 4501 caregivers enrolled in the VA Family Caregiver program, inviting them to participate in the VA Family Caregiver Mobile Health Pilot program. The VA received affirmative responses from 23.22% (1045/4501) of caregivers. Prior to distributing the iPads, caregivers were excluded from the original 1045 if they (1) were no longer enrolled in the Family Caregiver program or (2) could not verbally confirm their shipping address. A total of 84.31% (881/1045) of iPads were distributed in late May to June 2013 to caregivers, which represented 882 unique caregiver/veteran dyads (one caregiver had 2 veterans under care, resulting in an additional unique dyad). A second letter was sent to the 881 caregivers in the study group who had agreed to participate in the VA Family Caregiver Mobile Health Pilot program, asking them if they would like to participate in a research study that was intended to help the VA better understand the needs and challenges experienced by those using the mHealth apps. The letter indicated that by completing the initial survey the study participant was giving their consent to participate in the research study. An opt-out postcard was also provided, and study participants were asked to return the card if they were not interested in participating in the study. Survey information, from three different surveys, was collected on 65.4% (577/882) of study participants (see Figure 1). The surveys completed by this survey group included the Caregiver Preparedness, Caregiver Traits, and Caregiver Zarit Burden Inventory surveys, which are provided in Multimedia Appendix 1.

    Figure 1. Consort diagram: how the study cohort was formed.

    Intervention

    The intervention consisted of supplying an iPad loaded with a suite of mHealth apps designed to support caregivers in their caregiving role. Support was provided to users in the form of a quick start guide for setting up the iPad, a website with answers to frequently asked questions, a monthly newsletter, and a Help Desk that received call inquiries. All of the caregivers participating in the study were also called early on to facilitate obtaining a DS Logon (the Department of Veterans Affairs’ self-service account) and were referred to the VA Mobile Health Help Desk for additional assistance.

    Several family caregivers/veteran focus groups and usability tests were conducted to assist VA in selecting the types of apps that they would develop and in designing the apps provided in the mHealth pilot. The apps were developed as native iOS apps for the iOS 6 operating system.

    The suite of apps was bundled within the Launchpad app, which functioned as the “container” that housed all of the mHealth apps in the study. The Launchpad enabled the user to log on once rather than having to log on to each individual mHealth app. The logon credential used for the mHealth apps was the Department of Defense’s “DS Logon” premium account credential. In many cases, caregivers reported using the veteran’s credentials to log on to the VA mHealth apps instead of their own, thus making it difficult to distinguish whether the caregiver or the veteran was using the app. Figure 2 displays the Launchpad app and the apps as they appeared within it.

    Figure 2. Descriptions and screenshots of the mHealth apps.

    Data Collection

    Distribution of the mHealth iPad tools began in late May 2013 and continued through June. Data were collected on the use of these tools for each study participant during their intervention assessment period. The intervention assessment period was defined as the time between when the iPad was received by the study subject and the study end date of September 18, 2013. All the iPads distributed to caregivers were loaded with mobile device management software that allowed the VA to track the use and location of the devices and wipe the devices if they were stolen or manipulated to remove Apple’s security controls. The VA mHealth apps were developed with back-end data metrics that enabled the VA to see the utilization of each VA mHealth app by individual pilot participants and the duration of each use session. Survey data were collected by having study subjects complete three survey instruments that were rendered on the iPads, or by collecting the information verbally over the phone (see Multimedia Appendix 1 for survey items). The survey data fed a back-end database that recorded the date and results of the survey by individual study participant identifier. Descriptive data about the study participants was taken from the VA’s administrative databases. Nonusers of the iPad/mHealth apps intervention were contacted in the early part of the study to determine the reasons for nonuse.

    Study Variables

    The study outcome variable was the use of the mHealth/iPad tools. Use was measured in two ways: (1) a binary outcome representing at least one use of the apps versus no use, and (2) the frequency of app use, for those participants using the apps at least once. Frequency of app use was computed as the number of times the app was used during the intervention assessment period. App use was measured for each individual app and for the entire group of apps.
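The two outcome measures above can be derived mechanically from per-session use logs. A minimal sketch (the function name, participant IDs, and log format are illustrative assumptions, not the study's actual data pipeline):

```python
def use_outcomes(participants, use_events):
    """Derive the two outcome measures from per-session app-use logs.

    participants: iterable of participant (dyad) IDs.
    use_events: list of (participant_id, app_name) tuples, one per use session.
    """
    freq = {pid: 0 for pid in participants}
    for pid, _app in use_events:
        freq[pid] += 1
    # (1) binary outcome: at least one use of the apps vs no use
    binary = {pid: int(n > 0) for pid, n in freq.items()}
    # (2) frequency outcome: number of uses, kept only for those who used at least once
    frequency = {pid: n for pid, n in freq.items() if n > 0}
    return binary, frequency
```

Per-app measures follow the same pattern after filtering `use_events` by app name.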

    The predictor variables for the study group dyads (N=882) comprised veteran and caregiver characteristics that were obtained from VA administrative databases and are described in Multimedia Appendix 2. We received a waiver of Health Insurance Portability and Accountability Act authorization to collect these data, as it was deemed infeasible to obtain consent from all caregivers enrolled in the VA Caregiver program (N=4501). The predictor variables for the survey group dyads (n=577) consisted of the same administrative predictor variables as the study group, augmented with variables derived from the three self-administered survey instruments. Surveys could be completed on the iPad. If study participants had not completed the surveys on the iPad within 2 weeks of receiving it and had not returned the opt-out postcard, they were contacted by research staff and given the opportunity to complete the surveys by telephone interview. The survey instruments are listed in Multimedia Appendix 1.

    The caregiver characteristic survey questions represent a subset of questions derived from the 2009 National Alliance for Caregiving survey [17]. These questions include self-reported demographics, activities of daily living, caregiver stress/strain, and computer skills. The caregiver preparedness questions were taken from the Preparedness for Caregiving Scale [21], which asks caregivers to rate themselves on their perceived readiness for the multiple domains of caregiving. The final summary question of the preparedness survey, “Overall, how well prepared do you think you are to care for your Veteran?,” was used as the measure of preparedness because it correlated with the other preparedness questions and had good face validity. The 4-question Zarit Caregiver Burden screening inventory was the survey instrument used to obtain information about caregiver burden levels [22]. The Zarit Caregiver Burden screening inventory is scored with values ranging from 0-4 for each of the four questions. The total possible score is 16. The total score was used as the measure of burden.
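As described, the burden measure is simply the sum of the four item scores. A minimal sketch of the scoring rule (the function name is illustrative):

```python
def zarit_screen_total(responses):
    """Total score for the 4-question Zarit Caregiver Burden screening inventory.

    responses: four item scores, each in the range 0-4; the total ranges 0-16.
    """
    if len(responses) != 4 or any(not (0 <= r <= 4) for r in responses):
        raise ValueError("expected exactly four responses scored 0-4")
    return sum(responses)
```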

    Statistical Analysis

    Analysis began by comparing the baseline caregiver/veteran dyad characteristics of the study and survey groups using a chi-square test to determine if the groups differed from one another. Next, a parsimonious set of predictor variables was selected by examining the bivariate relationships between caregiver and veteran dyad characteristics and app use. The strength of each bivariate association was assessed, and those variables strongly associated with the outcome variable were retained as potential predictor variables. Next, a correlation analysis between pairs of potential predictor variables was performed. When two variables were highly correlated, one was dropped or a composite variable was created in order to reduce model multicollinearity. Finally, multivariate modeling was undertaken using SAS version 9.3 software to predict app use. Logistic regression modeling was performed to predict the binary use/nonuse outcome for the seven apps as a whole. The analysis was then repeated using negative binomial regression modeling. The binary use/nonuse analysis was intended to provide information on the factors associated with initial interest in using the apps, while the frequency analysis was intended to provide information on the factors driving sustained use once app use was established.
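For illustration, the chi-square statistic used for the baseline group comparison can be computed from an observed groups-by-categories count table. The study used SAS, so this is only a sketch of the underlying arithmetic:

```python
def chi_square_statistic(table):
    """Pearson chi-square statistic for a contingency table.

    table: list of rows of observed counts (e.g., study group vs survey group
    as rows, levels of a baseline characteristic as columns).
    """
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand_total = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / grand_total
            stat += (observed - expected) ** 2 / expected
    return stat
```

Identical proportions across groups yield a statistic of 0; the statistic is then referred to a chi-square distribution with (rows-1)(columns-1) degrees of freedom.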

    Multivariate models were assessed for fit. Logistic regression models were evaluated using a Hosmer-Lemeshow goodness-of-fit test P value of .05 or greater and a C statistic greater than 0.65. Negative binomial fit was assessed by evaluating whether the value of the Pearson chi-square statistic divided by the degrees of freedom was close to 1 and by ensuring that the dispersion parameter was not equal to 0. Model results were assessed using odds ratios in the logistic regression model. Since our models were guided by a specific research purpose, we report each P value “as is” without further adjustment for the total number of tests conducted.
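The negative binomial fit criteria described above amount to a simple check on two fitted quantities. A sketch (the 0.2 tolerance around 1 is an assumption for illustration; the paper states only "close to 1"):

```python
def neg_binomial_fit_ok(pearson_chi2, dof, dispersion, tolerance=0.2):
    """Heuristic fit check for a negative binomial model.

    Passes when Pearson chi-square divided by the degrees of freedom is
    close to 1 and the estimated dispersion parameter is nonzero.
    """
    ratio = pearson_chi2 / dof
    return abs(ratio - 1.0) <= tolerance and dispersion != 0
```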


    Results

    Table 1 displays the characteristics of the study and survey groups. The chi-square analysis of the study group (N=882) and the survey group (n=577) showed that they were not significantly different from one another with respect to their baseline characteristics. In the study group, the majority of caregivers (94.9%, 837/882) were women and the majority of veteran recipients were men (95.7%, 844/882). The average age of the caregiver was 40 years, and the average age of the veteran was 39 years. The caregivers were primarily spouses (89.3%, 788/882) and were geographically dispersed across the United States with 60.0% (529/882) living in urban locations and 39.8% (351/882) living in rural locations.

    Table 1. Baseline characteristics of caregivers and veterans dyads (N=882).

    Table 2 displays the outcome variable, App Use, as both distinct users (used at least once) and as frequency of use in the study and survey groups. Table 2 shows that 29.7% (262/882) of the study group never used any of the seven mHealth apps. In the survey group (n=577), the number of nonusers was 13.5% (78/577). An analysis of these nonusers was conducted to understand how many of the caregiver/veteran dyads lacked the DS Logon credentials required to access the VA mHealth apps. In the study group, 43.1% of the nonusers (113/262) did not have a DS Logon credential, and 29.5% of the nonusers (23/78) in the survey group did not have a DS Logon credential.

    Table 2. mHealth app use in study and survey groups (N=882).

    A subset of nonuser caregivers (n=96) were contacted by phone in the early phase of the study to understand the reasons for nonuse of the apps. Main reasons for nonuse included having DS Logon issues (55%, 53/96), having issues with the apps (22%, 21/96), or experiencing other usability issues (9%, 9/96).

    The frequency of app use followed a negative binomial distribution with zero inflation. Figure 3 displays the frequency distribution of app use for the seven mHealth apps as a whole in the survey group (n=577).

    The results of the bivariate analysis that crossed each potential predictor variable in the survey group (n=577) with the outcome variable, frequency of mHealth app use, are displayed in Tables 3, 4, and 5. Tables 3 and 4 contain caregiver-specific variables and Table 5 contains veteran-specific variables. mHealth app use was categorized into four levels: high (>18 uses), medium (>7 and ≤18 uses), low (>0 and ≤7 uses), and no use. Use categories were constructed by selecting use ranges that produced three relatively equal groupings among the app users.
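The four use levels map directly onto count ranges. A minimal sketch of the categorization (the function and label names are illustrative):

```python
def use_level(total_uses):
    """Map a participant's total app-use count to the four analysis levels."""
    if total_uses == 0:
        return "no use"
    if total_uses <= 7:
        return "low"      # >0 and <=7 uses
    if total_uses <= 18:
        return "medium"   # >7 and <=18 uses
    return "high"         # >18 uses
```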

    Table 3. Results of bivariate analysis of caregiver characteristics and frequency of total app use in the survey group.

    The influence of caregiver strain, burden, preparedness, and health was most notable in the bivariate analysis, with a high usage associated with poor health, low preparedness, high burden, and high strain. Caregiver age and education showed an association with high use, with middle-aged and lower-educated caregivers showing higher use. Those with higher reported computer skills tended to be higher users of the apps.

    Table 4. Results of bivariate analysis of caregiving behaviors and frequency of total app use in the survey group.

    Similar to caregivers, veterans in the middle-age range were higher users of the apps. Veterans assessed at a monthly stipend level of Tier 1 were higher users. The Tier level represents the amount of work required of the caregiver to meet the care needs of the veteran: Tier 3 represents the highest amount of work and Tier 1 the lowest. Mental health conditions other than PTSD were associated with higher app use. Veterans with a higher percentage of service-connected injuries showed lower app use.

    Table 5. Results of bivariate analysis of veteran characteristics and frequency of app use in the survey group.

    A correlation analysis was performed on the set of potential predictor variables that had a strong association with the outcome variable. Many variables were strongly correlated with one another: for example, caregiver age with veteran age, relationship with marital status, caregiver stress, burden, health, and preparedness with one another, and education with computer skills. A parsimonious set of predictor variables was selected based on the results of the bivariate and correlation analyses. The final set of variables selected for modeling included veteran age, caregiver-veteran relationship, urban-rural living location, other mental health diagnosis, receiving polytrauma care, the overall preparedness survey question, and computer skills.

    Logistic regression modeling was performed to predict at least one use of the mHealth apps. Table 6 displays the results of modeling the administrative explanatory variables for the study group (N=882) (Model 1a) and the administrative explanatory variables plus two additional survey variables, Caregiver Preparedness and Computer Skills, for the survey group (n=577) (Model 2b). The negative binomial analysis demonstrated similar associations (data not shown).

    Table 6. Logistic regression model predicting at least one use of a clinical mHealth app.

    For the study group (N=882), significant predictors of using an mHealth app at least once included (1) assessment period—the longer the caregiver/veteran dyad had the mHealth/iPad intervention, the more likely it was they would use it, (2) veteran age—for every one unit increase in age, the likelihood of using a clinical app declined by 0.02%, (3) if the caregiver of the veteran was a spouse, the odds of using at least one clinical app were 2.4 times greater, (4) those living in a rural location had a 1.5 times greater chance of using a clinical app than those living in an urban location, and (5) veterans with a mental health diagnosis other than PTSD were 1.6 times more likely to use a clinical app. When a dummy variable was added to the model to reflect whether the caregiver/veteran dyad had a logon credential, veteran age was no longer significant. This suggests that veteran age is likely a proxy for the likelihood of having a DS Logon credential: younger veterans tend to have DS Logon credentials issued when they separate from the service, while this was not true for older veterans.

    For the survey group (n=577), the only administrative predictor that remained significant was the diagnosis of “Other Mental Health,” which resulted in a 12% increase in the likelihood of using a clinical app compared with those who do not have this diagnosis. Two survey variables were significant predictors. Each one unit increase in caregiver computer skill competency increased the likelihood of using a clinical app by 28%, and each one unit increase in caregiver preparedness decreased the chances of using a clinical app by 42%.
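The percentage interpretations above are standard transformations of logistic regression output. A small sketch of the arithmetic (illustrative only, not the study's SAS output):

```python
import math

def odds_ratio_from_coefficient(beta):
    """Odds ratio implied by a logistic regression coefficient."""
    return math.exp(beta)

def percent_change_in_odds(odds_ratio):
    """Percent change in the odds implied by an odds ratio, e.g. 1.28 -> +28%."""
    return (odds_ratio - 1.0) * 100.0
```

For example, an odds ratio of 1.28 per unit of computer skill corresponds to a 28% increase in the odds of use, and an odds ratio of 0.58 to a 42% decrease.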

    Figure 3. The frequency distribution of total mHealth app use of the caregivers who completed the baseline surveys.

    Discussion

    Principal Considerations

    To the author’s knowledge, this is the first study that has looked at factors that predict the use of mHealth apps in the context of caregiving. The study provided a number of key insights. It was found that the mHealth apps used most frequently in this population of caregivers of seriously injured veterans were the Summary of Care, Rx Refill, and Notification apps. Apps used less frequently included the Care4Caregiver Journal, PTSD Coach, and Pain Coach apps. The implication of this finding, based on IT acceptance models, is that use is driven by the perceived usefulness of the app and ease of use [23-25].

    The picture that emerged from the bivariate analysis is that four principal components drive mHealth app usage: (1) the amount of time and effort required for the caregiver to manage the veteran's medical condition, (2) caregiver strain and preparedness for caregiving, (3) the demographics of the caregivers and veterans, and (4) computer skills and technology adoption. Providing care for seriously injured veterans, such as those in polytrauma care or those with a high percentage of service-connected conditions as reflected by a high Tier rating (ie, Tier 3) in the caregiver stipend, was associated with decreased app use. This may reflect fewer hours available to the caregiver to use the apps, or it may reflect that recorded use combined caregiver and veteran use and that seriously injured veterans were less likely to use the apps. The variable selected to represent this dimension was Polytrauma Care. The second component relates to the physical and mental health of the caregiver and the veteran care recipient: lower health and caregiver preparedness scores coupled with higher strain scores were associated with higher app use. The variable selected to represent the state of the caregiver was Overall Preparedness for Caregiving. The veteran's medical condition was also an important factor, with a diagnosis of a mental health condition other than PTSD being associated with higher usage. Consistent with other studies of factors driving eHealth, demographics were important drivers of app use: increased age of both the veteran and caregiver decreased app use, as did having a nonspouse caregiver. The fourth and final component was caregiver computer skills: those with poor computer skills and low technology adoption rates were less likely to use the apps. The variable Computer Skills was chosen to represent this dimension.

    The results of the logistic regression modeling predicting use versus nonuse of the apps revealed that the odds of using at least one of the seven study apps were increased by living in a rural location, being a spouse caregiver, being younger, caring for a veteran with a mental health condition (excluding PTSD), having better computer skills, and feeling less prepared for caregiving. The finding that older individuals and those with lower computer literacy make less use of consumer health technologies is consistent with other research [26,27]. Rural living locations have often been associated with lower eHealth use due to lower Internet access in rural areas [28]. In this study, however, rural living was associated with increased odds of using the mHealth intervention, likely because the data plans provided to study participants removed the need for home Internet access.

    The 30% nonuse rate found in the study group deserves further investigation. About 50% of these nonusers did not obtain the logon credentials required to use the mHealth intervention; the barriers created by the credentialing requirement are an important consideration when designing future mHealth apps. Another 30% of nonuse was accounted for by issues users had with the apps. Although the design of many of the apps was informed by feedback from caregiver focus groups, this finding highlights the need to collect regular feedback from app users so that usability issues can be addressed in subsequent releases. A surprising finding from this study was the low use of the PTSD app among patients with PTSD. This may be because the VA had already released a PTSD app to public app stores before this study; it was the only study app that had been released to the public during the course of the study.

    Limitations

    It should be noted that this was a pragmatic study examining a target population that is dissimilar to the general patient and caregiver populations, and therefore care must be exercised in extrapolating the results. The study population was restricted to veterans with multiple comorbidities who sustained serious injuries during their military service. The prevalence of mental health conditions in this population was high, and the population was young, with an average age of 39 years. The caregivers in this study were also young, with an average age of 40 years, and therefore do not reflect the typical family caregiver found in the general population. Given these unique health care needs, future research, both qualitative and quantitative, should evaluate the effects that programs like the Comprehensive Assistance for Family Caregivers program have on veteran/caregiver dyads.

    Conclusions

    This study was designed to contribute to our understanding of the factors that drive veteran and caregiver mHealth use within the caregiving context. The mHealth apps most used by family caregivers and their veteran recipients were those that provided information from the health care record and assisted with caregiving responsibilities, specifically filling prescriptions and setting medication reminders. This is consistent with previous research indicating that patients value having health information electronically in one place so that it can be shared and used to manage their health care. Another key finding was that when tablets with data service plans were provided to health care consumers, those in rural areas were more likely to use the technology than those in urban locations. Computer skills and age continue to matter in mHealth usage as they have in other consumer health technologies, reinforcing the need to provide age-targeted support to avoid disenfranchising older, less computer-savvy individuals. A final key finding was that caregivers reporting that they were less prepared for caregiving were more likely to use mHealth tools to support their caregiving responsibilities. This mHealth family caregiver VA pilot project was the first to identify predictors of the use of patient-facing mHealth apps that are integrated within the VA data system and that facilitate the exchange of health-related data between the VA and veterans and their family caregivers.

    Acknowledgments

    The author would like to acknowledge Thomas Houston and his staff for their help, Kevin Todd for his assistance in pulling data, and Brian Olinger and Jessica Bralley for their help in distributing the iPads.

    Conflicts of Interest

    None declared.

    Multimedia Appendix 1

    Survey instruments used for baseline data collection.

    PDF File (Adobe PDF File), 56KB

    Multimedia Appendix 2

    Description of variables obtained from VA administrative databases.

    PDF File (Adobe PDF File), 34KB

    References

    1. Laine C, Davidoff F. Patient-centered medicine. A professional evolution. JAMA 1996 Jan 10;275(2):152-156. [Medline]
    2. Kumar S, Nilsen WJ, Abernethy A, Atienza A, Patrick K, Pavel M, et al. Mobile health technology evaluation: the mHealth evidence workshop. Am J Prev Med 2013 Aug;45(2):228-236 [FREE Full text] [CrossRef] [Medline]
    3. Wang J, Wang Y, Wei C, Yao NA, Yuan A, Shan Y, et al. Smartphone interventions for long-term health management of chronic diseases: an integrative review. Telemed J E Health 2014 Jun;20(6):570-583. [CrossRef] [Medline]
    4. Logan AG. Transforming hypertension management using mobile health technology for telemonitoring and self-care support. Can J Cardiol 2013 May;29(5):579-585. [CrossRef] [Medline]
    5. Quinn CC, Shardell MD, Terrin ML, Barr EA, Ballew SH, Gruber-Baldini AL. Cluster-randomized trial of a mobile phone personalized behavioral intervention for blood glucose control. Diabetes Care 2011 Sep;34(9):1934-1942 [FREE Full text] [CrossRef] [Medline]
    6. Varnfield M, Karunanithi M, Lee C, Honeyman E, Arnold D, Ding H, et al. Smartphone-based home care model improved use of cardiac rehabilitation in postmyocardial infarction patients: results from a randomised controlled trial. Heart 2014 Nov;100(22):1770-1779. [CrossRef] [Medline]
    7. Barr P. mEvidence, please. Mobile health tech is the rage, but does it work? Hosp Health Netw 2013 Sep;87(9):22, 24. [Medline]
    8. Poropatich R, Pavliscsak HH, Tong JC, Little JR, McVeigh FL. mCare: using secure mobile technology to support soldier reintegration and rehabilitation. Telemed J E Health 2014 Jun;20(6):563-569. [CrossRef] [Medline]
    9. Pavliscsak H. mCare: Development, Deployment and Evaluation of a Mobile Telephony-based Patient Secure Messaging System. 2010.   URL: http://www.webcitation.org/6ipIyq9sO [WebCite Cache]
    10. DuBenske L, Gustafson DH, Namkoong K, Hawkins RP, Atwood AK, Brown RL, et al. CHESS improves cancer caregivers' burden and mood: results of an eHealth RCT. Health Psychol 2014 Oct;33(10):1261-1272 [FREE Full text] [CrossRef] [Medline]
    11. Griffiths P, Whitney MK, Kovaleva M, Hepburn K. Development and Implementation of Tele-Savvy for Dementia Caregivers: A Department of Veterans Affairs Clinical Demonstration Project. Gerontologist 2016 Feb;56(1):145-154. [CrossRef] [Medline]
    12. Chi N, Demiris G. A systematic review of telehealth tools and interventions to support family caregivers. J Telemed Telecare 2015 Jan;21(1):37-44 [FREE Full text] [CrossRef] [Medline]
    13. Bossen A, Kim H, Williams KN, Steinhoff AE, Strieker M. Emerging roles for telemedicine and smart technologies in dementia care. Smart Homecare Technol Telehealth 2015;3:49-57 [FREE Full text] [CrossRef] [Medline]
    14. Hu P, Chau P, Sheng O, Tam K. Examining the Technology Acceptance Model Using Physician Acceptance of Telemedicine Technology. Journal of Management Information Systems 2015 Dec 02;16(2):91-112. [CrossRef]
    15. Davis F. User acceptance of information technology: system characteristics, user perceptions and behavioral impacts. International Journal of Man-Machine Studies 1993 Mar;38(3):475-487. [CrossRef]
    16. Steele GC, Miller D, Kuluski K, Cott C. Tying eHealth Tools to Patient Needs: Exploring the Use of eHealth for Community-Dwelling Patients With Complex Chronic Disease and Disability. JMIR Res Protoc 2014;3(4):e67 [FREE Full text] [CrossRef] [Medline]
    17. Alwan M, Orlov L, Schultz R, Vuckovic N. National Alliance for Caregiving, in collaboration with AARP. 2009. Caregiving in the US   URL: http://www.caregiving.org/data/Caregiving_in_the_US_2009_full_report.pdf [accessed 2016-07-07] [WebCite Cache]
    18. e-Connected Family Caregiver: Bringing Caregiving into the 21st Century.   URL: http://www.caregiving.org/data/FINAL_eConnected_Family_Caregiver_Study_Jan%202011.pdf [accessed 2016-07-07] [WebCite Cache]
    19. Van Houtven CH, Voils CI, Weinberger M. An organizing framework for informal caregiver interventions: detailing caregiving activities and caregiver and care recipient outcomes to optimize evaluation efforts. BMC Geriatr 2011;11:77 [FREE Full text] [CrossRef] [Medline]
    20. Safford MM, Allison JJ, Kiefe CI. Patient complexity: more than comorbidity. The vector model of complexity. J Gen Intern Med 2007 Dec;22 Suppl 3:382-390 [FREE Full text] [CrossRef] [Medline]
    21. Archbold PG, Stewart BJ, Greenlick MR, Harvath T. Mutuality and preparedness as predictors of caregiver role strain. Res Nurs Health 1990 Dec;13(6):375-384. [Medline]
    22. Higginson IJ, Gao W, Jackson D, Murray J, Harding R. Short-form Zarit Caregiver Burden Interviews were valid in advanced conditions. J Clin Epidemiol 2010 May;63(5):535-542. [CrossRef] [Medline]
    23. Wilson EV, Lankton NK. Modeling patients' acceptance of provider-delivered e-health. J Am Med Inform Assoc 2004;11(4):241-248 [FREE Full text] [CrossRef] [Medline]
    24. Taylor S, Todd PA. Understanding Information Technology Usage: A Test of Competing Models. Information Systems Research 1995 Jun;6(2):144-176. [CrossRef]
    25. Venkatesh V, Morris M, Davis G, Davis F. User acceptance of information technology: Toward a unified view. MIS Q 2003;27(3):425-478. [CrossRef]
    26. Turvey C, Klein D, Fix G, Hogan TP, Woods S, Simon SR, et al. Blue Button use by patients to access and share health record information using the Department of Veterans Affairs' online patient portal. J Am Med Inform Assoc 2014;21(4):657-663 [FREE Full text] [CrossRef] [Medline]
    27. Kontos E, Blake KD, Chou WS, Prestin A. Predictors of eHealth usage: insights on the digital divide from the Health Information National Trends Survey 2012. J Med Internet Res 2014;16(7):e172 [FREE Full text] [CrossRef] [Medline]
    28. Lustria MLA, Smith SA, Hinnant CC. Exploring digital divides: an examination of eHealth technology use in health information seeking, communication and personal health information management in the USA. Health Informatics J 2011 Sep;17(3):224-243. [CrossRef] [Medline]


    Abbreviations

    CHESS: Comprehensive Health Enhancement Support System
    PTSD: posttraumatic stress disorder
    TBI: traumatic brain injury
    VA: Veterans Affairs Health Administration


    Edited by G Eysenbach; submitted 24.07.14; peer-reviewed by W Nilsen, C Turvey, B Wakefield; comments to author 25.09.14; revised version received 13.07.15; accepted 28.03.16; published 19.07.16

    ©Kathleen L Frisbee. Originally published in JMIR Mhealth and Uhealth (http://mhealth.jmir.org), 19.07.2016.

    This is an open-access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR mhealth and uhealth, is properly cited. The complete bibliographic information, a link to the original publication on http://mhealth.jmir.org/, as well as this copyright and license information must be included.