This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR mHealth and uHealth, is properly cited. The complete bibliographic information, a link to the original publication on https://mhealth.jmir.org/, as well as this copyright and license information must be included.
As computerized drug–drug interaction (DDI) alert systems have not been widely implemented in China, health care providers rely on mobile health (mHealth) apps as references for checking drug information, including DDIs.
The main objective of this study was to evaluate the quality and content of mHealth apps supporting DDI checking in Chinese app stores.
A systematic review was carried out in November 2020 to identify mHealth apps providing DDI checking in both Chinese iOS and Android platforms. We extracted the apps’ general information (including the developer, operating system, costs, release date, size, number of downloads, and average rating), scientific or clinical basis, and accountability, based on a multidimensional framework for evaluation of apps. The quality of mHealth apps was evaluated by using the Mobile App Rating Scale (MARS). Descriptive statistics, including numbers and percentages, were calculated to describe the characteristics of the apps. For each app selected for evaluation, the section-specific MARS scores were calculated by taking the arithmetic mean, while the overall MARS score was described as the arithmetic mean of the section scores. In addition, the Cohen kappa (κ) statistic was used to evaluate the interrater agreement.
A total of 7 apps met the selection criteria, and only 3 included citations. The average rating score for Android apps was 3.5, with a minimum of 1.0 and a maximum of 4.9, while the average rating score for iOS apps was 4.7, with a minimum of 4.2 and a maximum of 4.9. The mean MARS score was 3.69 out of 5 (95% CI 3.34-4.04), with the lowest score of 1.96 for Medication Guidelines and the highest score of 4.27 for MCDEX mobile. The greatest variation was observed in the information section, which ranged from 1.41 to 4.60. The functionality section showed the highest mean score of 4.05 (95% CI 3.71-4.40), whereas the engagement section resulted in the lowest average score of 3.16 (95% CI 2.81-3.51). For the information quality section, which was the focus of this analysis, the average score was 3.42, with the MCDEX mobile app having the highest score of 4.6 and the Medication Guidelines app having the lowest score of 1.9. For the overall MARS score, the Cohen interrater κ was 0.354 (95% CI 0.236-0.473), the Fleiss κ was 0.353 (95% CI 0.234-0.472), and the Krippendorff α was 0.356 (95% CI 0.237-0.475).
This study systematically reviewed mHealth apps with a DDI check feature in China. The majority of investigated apps demonstrated high quality, with accurate and comprehensive information on DDIs. However, a few apps with massive numbers of downloads in the Chinese market provided incorrect information. Given that these apps might be used by health care providers to check potential DDIs, this creates a substantial threat to patient safety.
Medications are generally safe when used appropriately, but there are risks associated with medication use. Adverse drug events (ADEs), defined as injuries caused by the use of a drug [
DDIs are avoidable, however, and preventing DDIs remains a patient safety challenge in many countries, including China. It has been widely reported that physicians and pharmacists, who are on the front line of detecting DDIs, cannot recognize clinically important DDIs [
With the advance of smartphones and mobile apps, using mobile health (mHealth) apps with a DDI checking function seems promising, especially considering the convenience of searching for drug information. However, mHealth apps supporting DDI checks are not subject to National Medical Products Administration (NMPA) regulations, posing a substantial threat to patient safety. A Canadian study recently reported the results of an assessment of mHealth apps supporting DDI checks and found a lack of high-quality apps [
To the best of our knowledge, there is no published study evaluating the DDI-related mHealth apps available in Chinese app stores. As using incorrect drug information can have serious consequences, the aim of this study was to systematically evaluate DDI-related mHealth apps in Chinese app stores.
This review followed the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) systematic review protocol [
We used the “crawling” method to interact directly with app stores’ mHealth repository to avoid any personalized search results that might have been determined by a previous search query. Compared to traditional methods using the search query, creating a health-related app repository allowed us to perform a more thorough and reliable search [
To avoid potential omissions, we also carried out an extensive keyword search in both iOS and Android app stores. The search terms included “drug interaction*,” “pill interaction*,” “medication interaction*,” and “DDI*,” following previous app review studies [
CW and MG independently searched on an iPhone or Android phone and selected the apps for inclusion according to the selection criteria. In situations where there was a discrepancy, a third senior rater (CYS) reviewed the app description, and consensus was reached after a thorough discussion. All 3 raters are pharmacists currently practicing in the hospital.
We first extracted data on the general information about the apps, including the developer, operating system (iOS, Android, or both), costs (free or paid), release date, size (in megabytes), number of downloads, and average rating in the app stores. For mHealth apps supporting a DDI check, the information quality and content accountability are critical for patient safety. Hence, we also extracted specific information related to the scientific or clinical basis and accountability [
The scientific or clinical basis refers to the scientific quality of the content and was evaluated by the following metrics: accordance with existing evidence (yes or no), presence of citations (yes or no), clinician involvement (yes or no), affiliation with credible organization (yes or no), and expert assessment (yes or no) [
Accountability relates to the credibility of the app developer and was assessed by the presence of the following information: regulatory approval (yes or no), copyright information (yes or no), date of the last update, contact information, and disclaimer [
To assess the different factors related to app quality, we used the Mobile App Rating Scale (MARS), a multidimensional instrument for systematically evaluating the quality of mHealth apps [
Descriptive statistics, including numbers and percentages, were calculated to describe the characteristics of the apps. For each app selected for evaluation, the section-specific MARS scores were calculated by taking the arithmetic mean, while the overall MARS score was calculated as the arithmetic mean of the section scores [
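As a concrete illustration of this scoring scheme, the sketch below computes section-specific scores as the arithmetic mean of the items in each MARS section, and the overall score as the mean of the four section scores (not of all items pooled). The item names and values are hypothetical, purely for illustration; they are not data from this study.

```python
from statistics import mean

# Hypothetical MARS item scores (1-5) for one app, grouped by section;
# the number of items per section and the values are illustrative only.
mars_items = {
    "engagement":    [3, 4, 3, 4, 3],
    "functionality": [4, 4, 5, 4],
    "aesthetics":    [4, 3, 4],
    "information":   [4, 4, 3, 4, 4, 3, 4],
}

# Section-specific score: arithmetic mean of the items in that section.
section_scores = {section: mean(items) for section, items in mars_items.items()}

# Overall MARS score: arithmetic mean of the section scores,
# so each section carries equal weight regardless of its item count.
overall_score = mean(section_scores.values())
```

Averaging section means, rather than pooling all items, keeps a section with many items (such as information) from dominating the overall score.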
A total of 296 and 498 apps were identified in the iOS and Android app stores, respectively (
Flowchart of the mobile health app selection. DDI: drug–drug interaction; MARS: Mobile App Rating Scale.
Of the 6 apps with a download count available, 3 (50%) had been downloaded more than 1 million times. The average rating score for Android apps was 3.5, with a minimum of 1.0 and a maximum of 4.9, while the average rating score for iOS apps was 4.7, with a minimum of 4.2 and a maximum of 4.9.
As shown in
General information on the mobile health apps included in the reviewa.
| App name in English | Target market | Platform | Size (MB) | Cost per month, RMBb (USD) | Release date | Downloads, nc | Mean user rating (Android) | Mean user rating (iOS) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| MCDEX mobile | Health care professionals | iOS & Android | 13.06 | 89 (13.97) | 3/5/2015 | 11,193 | 3.4 | 4.2 |
| Medication Assistant by DXY | Health care professionals | iOS & Android | 169.16 | 30 (4.71) | 11/17/2012 | 55,585,879 | 4.5 | 4.8 |
| Medication Reference | Doctors, pharmacists, and other HCPsd | iOS & Android | 334.96 | Free | 7/10/2012 | —e | 4.9 | 4.5 |
| Medication Assistant of People's Health | Doctors, pharmacists, nurses, and other HCPs | iOS & Android | 86.52 | 12 (1.88) | 6/9/2017 | 227,021 | 2.0 | 4.9 |
| Medication Guidelines | Health care professionals | Android | 2.38 | Free | — | 750,760 | 1.0 | — |
| DXY | Health care professionals | iOS & Android | 262.67 | Free | 1/29/2012 | 66,298,525 | 4.7 | 4.9 |
| Yi Mai Tong | Health care professionals | iOS & Android | 73.67 | Free | 9/26/2013 | 8,900,870 | 4.1 | 4.8 |
aApps in Chinese app stores were searched on November 15, 2020.
bRMB: yuan renminbi.
cNumber of downloads was not available for the iOS platform.
dHCP: health care provider.
eData not available.
Information quality and accountability of the mobile health apps.
Overall, the mean MARS score was 3.69 (95% CI 3.34-4.04;
MARS scores by section. The box plot shows the mean, IQR, and minimum and maximum scores: the left and right edges of the boxes represent the first and third quartiles, the line within the boxes represents the mean, and the left and right whiskers represent the minimum and maximum scores. The scatter plot shows the distribution of MARS scores evaluated by the 2 raters. MARS: Mobile App Rating Scale.
For the information quality section, which was the focus of this analysis, the average score was 3.42; the MCDEX mobile app had the highest mean score of 4.6, while the Medication Guidelines app had the lowest average score of 1.9. Of note, in the evaluation of information accuracy, the average score was 2.2 out of 5, or only 44% of the DDI pairs were described correctly. Only 1 of the 7 apps (14%), MCDEX, identified all 35 DDI pairs correctly, while 4 (57%) failed to describe half of the DDI pairs. For those DDI pairs with interactions, the average score was 2.35 out of 5, while the average score was 1.70 out of 5 for those drug pairs without DDIs, indicating these apps had a relatively higher false-positive rate. In the evaluation of the comprehensiveness of information, the average score was 2.9. Out of the 7 apps, 6 (86%) provided incomplete DDI information, while 4 apps covered less than half of the DDIs. The detailed results are presented in
For the overall MARS score, the κ coefficient was 0.354 (95% CI 0.236-0.473), the Fleiss κ was 0.353 (95% CI 0.234-0.472), and the Krippendorff α was 0.356 (95% CI 0.237-0.475). Based on the cutoff level of the κ statistic commonly cited in the literature, a κ of 0.354 was interpreted as fair agreement [
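For reference, Cohen's κ for two raters is computed from the observed agreement and the agreement expected by chance under each rater's marginal distribution. The sketch below implements this definition; the ratings are made up for illustration, not the study's data.

```python
from collections import Counter

def cohen_kappa(rater1, rater2):
    """Cohen's kappa for two raters scoring the same items on a nominal scale."""
    assert len(rater1) == len(rater2)
    n = len(rater1)
    # Observed agreement: fraction of items on which the raters match.
    p_o = sum(a == b for a, b in zip(rater1, rater2)) / n
    # Chance-expected agreement, from each rater's marginal category counts.
    counts1, counts2 = Counter(rater1), Counter(rater2)
    p_e = sum(counts1[cat] * counts2[cat] for cat in counts1) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Illustrative item ratings from 2 hypothetical raters (1-5 scale).
r1 = [3, 4, 4, 2, 5, 3, 4, 2]
r2 = [3, 4, 3, 2, 5, 4, 4, 2]
kappa = cohen_kappa(r1, r2)
```

Because κ subtracts chance agreement, it is lower than raw percent agreement; a κ near 0.35 corresponds to fair agreement under the commonly cited cutoffs.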
Correlation matrix among MARS scores, price, number of downloads, and average user rating.

| Characteristic | MARSa engagement | MARSa functionality | MARSa aesthetics | MARSa information | Price | Number of downloadsb | Mean user rating |
| --- | --- | --- | --- | --- | --- | --- | --- |
| MARSa engagement | —c | | | | | | |
| MARSa functionality | 0.68d | — | | | | | |
| MARSa aesthetics | 0.95e | 0.80f | — | | | | |
| MARSa information | 0.68g | 0.94h | 0.82i | — | | | |
| Price | 0.36 | 0.48 | 0.54 | 0.70j | — | | |
| Number of downloadsb | 0.30 | 0.32 | 0.25 | 0.18 | –0.40 | — | |
| Average user rating | 0.09 | 0.12 | 0.06 | 0.18 | 0.00 | 0.73k | — |
aMARS: Mobile App Rating Scale.
bNumber of downloads was not available for the iOS platform.
cNot applicable.
This systematic review found the quality of mHealth apps with a DDI check feature in Chinese app stores to be acceptable overall, with an average MARS score of 3.69. However, the quality of the information section was polarized among the apps included in the review. Specifically, nearly half of the investigated apps, which aimed to identify any significant interaction associated with concurrently administered drugs, showed relatively poor quality of scientific information. On the other hand, the MCDEX mobile app, developed under the supervision of the China National Health Committee, demonstrated high content accuracy and comprehensiveness, highlighting the importance of the scientific or clinical basis and accountability dimensions. To the best of our knowledge, this study was the first to systematically evaluate apps for DDI checks in China, and it underscores the importance of regulating mHealth apps, which are becoming a major source of information for health care providers in China, based on our ongoing survey exploring physicians’ knowledge and sources of information on DDIs.
In this in-depth analysis, the information quality of mHealth apps with a DDI check feature varied widely. The total MARS score ranged from 1.96 to 4.27, whereas the MARS score for the information section ranged from 1.41 to 4.60, suggesting that a certain proportion of apps were of relatively low quality. These findings are consistent with another app review conducted in Canadian app stores [
Our study also suggested that price could be an important factor influencing the information quality of mHealth apps with DDI features. The MCDEX mobile app, which required the highest subscription fee, scored highest in the information section by providing accurate and comprehensive information on DDIs. In contrast, 4 apps available for free provided unsatisfactory drug information. Of further note, these free apps were very popular in the market, making patient safety a serious concern.
In this review, only 7 mHealth apps were identified in Chinese app stores, fewer than those available to Canadians in English (n=26) according to a similar review conducted in 2018 [
This study has the following strengths. First, it is the first systematic review of DDI-related apps to use the advanced crawling method, ensuring a more comprehensive search [
There are also several limitations to our study. First, we searched the apps at a certain time point, and we cannot exclude the possibility that newly released apps were missed in the search. Second, our search might not have been sufficiently thorough. However, 2 raters performed independent reviews in consultation with a third rater, so the possibility that certain apps were missed should be minimal. Third, despite the efforts to familiarize the raters with the MARS scale by watching videos and reading protocols, the rating scores might have been subjective, which makes it difficult to compare across different apps. A higher score reported for a certain app may not indicate higher quality; instead, it may suggest that the app was overscored by the raters. To address this concern, we also used 35 drug pairs to assess information quality in a more objective way.
In conclusion, this study provided a comprehensive overview of the mHealth apps with a DDI check function available in Chinese app stores. Using the multidimensional framework for the evaluation of apps, we found the quality of mHealth apps to be acceptable, although a limited number of apps provided inaccurate and incomplete information about DDIs. The majority of investigated apps provided accurate and comprehensive information, but a few apps with large numbers of downloads offered relatively low-quality drug information. As most of the apps found in Chinese app stores target health care professionals, who may use them as a reference for DDI information, our findings underscore the importance of providing accurate scientific information in mHealth apps, as DDIs can have serious consequences.
Mobile App Rating Scale (MARS).
List of drug–drug interaction pairs for the Mobile App Rating Scale (MARS) number 15 and 16.
Detailed results of information quality and accountability.
Detailed results of Mobile App Rating Scale (MARS) number 15 and 16.
Detailed interrater reliability results.
adverse drug event
drug–drug interaction
Mobile App Rating Scale
mobile health
National Medical Products Administration
Preferred Reporting Items for Systematic Reviews and Meta-Analyses
None declared.